Implementing Linear Regression in Python

1. Linear Regression Principles

y = w1*x1 + w2*x2 + … + b

The loss is the mean squared error (MSE): error = mean((y_predict − y_true)²)
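The mean squared error can be written out directly. A minimal Python sketch (the values here are illustrative only):

```python
# Mean squared error: average of squared differences between
# predictions and true values.
def mse(y_pred, y_true):
    return sum((p - t) ** 2 for p, t in zip(y_pred, y_true)) / len(y_true)

# only the third prediction is off, by 1, so MSE = 1/3
print(mse([1.0, 2.0, 3.0], [1.0, 2.0, 4.0]))  # 0.333...
```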

Gradient descent:
Gradient descent is used to optimize the loss. The weight and bias corresponding to the minimum loss are the model parameters we want.
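The idea can be sketched in plain NumPy (a toy version of the same model, not the article's TensorFlow code; the data and hyperparameters here are assumptions for illustration):

```python
import numpy as np

np.random.seed(0)
x = np.random.randn(100, 1)          # 100 samples, 1 feature
y_true = 0.8 * x + 0.7               # data generated from y = 0.8x + 0.7

w, b = 0.0, 0.0                      # initial parameters
lr = 0.1                             # learning rate
for _ in range(200):
    y_pred = w * x + b
    # gradients of the mean squared error with respect to w and b
    grad_w = 2 * np.mean((y_pred - y_true) * x)
    grad_b = 2 * np.mean(y_pred - y_true)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should end up close to 0.8 and 0.7
```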

2. Design

Assume 100 randomly generated points with a single feature, where x and y satisfy y = kx + b.
x: shape (100, 1)
true values y_true: shape (100, 1)

The data follow y = 0.8*x + 0.7:
x(100,1) * weight(1,1) + bias(1,1) = y_true(100,1)
prediction: y_predict = tf.matmul(x, weight) + bias

error = tf.reduce_mean(tf.square(y_predict-y_true))
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(error)
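The shape bookkeeping above can be checked with a quick NumPy sketch (NumPy's `@` stands in for `tf.matmul` here):

```python
import numpy as np

x = np.random.randn(100, 1)    # 100 samples, 1 feature
weight = np.array([[0.8]])     # shape (1, 1)
bias = 0.7
y = x @ weight + bias          # (100,1) @ (1,1) -> (100,1), bias broadcasts
print(y.shape)                 # (100, 1)
```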

Finally, run the optimizer repeatedly in a session; when the loss reaches its minimum, read out the model parameters (weight and bias).

3. Code Implementation

import tensorflow as tf  # TensorFlow 1.x API

def linear_regression():
    # 100 samples with a single feature
    X = tf.random_normal(shape=[100, 1])
    # true values generated from y = 0.8 * x + 0.7
    y_true = tf.matmul(X, [[0.8]]) + 0.7

    # model parameters, randomly initialized
    weight = tf.Variable(initial_value=tf.random_normal(shape=[1, 1]))
    bias = tf.Variable(initial_value=tf.random_normal(shape=[1, 1]))
    y_predict = tf.matmul(X, weight) + bias

    # mean squared error loss
    error = tf.reduce_mean(tf.square(y_predict - y_true))

    # gradient descent on the loss
    optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(error)

    init = tf.global_variables_initializer()
    with tf.Session() as sess:
        sess.run(init)
        print('Before training - weight: %f, bias: %f, loss: %f'
              % (weight.eval(), bias.eval(), error.eval()))

        for i in range(100):
            sess.run(optimizer)
            print('After step %d - weight: %f, bias: %f, loss: %f'
                  % (i + 1, weight.eval(), bias.eval(), error.eval()))

if __name__ == '__main__':
    linear_regression()

4. Summary

| Learning rate | Iterations | Weight | Bias | Loss |
| --- | --- | --- | --- | --- |
| 0.01 | 1000 | 0.799999 | 0.699999 | 0.00000 |
| 0.1 | 100 | 0.79838 | 0.699569 | 0.00000 |
| 5 | 100 | nan | nan | nan |

Clearly:
with a learning rate of 0.01, the loss reaches 0 after 398 iterations, with the weight stabilizing at 0.8 and the bias at 0.7;
with a learning rate of 0.1, the loss reaches 0 after 33 iterations, with the weight stabilizing at 0.8 and the bias at 0.7;
with a learning rate of 5, training explodes (the loss becomes nan).
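The explosion at a large learning rate can be reproduced with a small NumPy experiment (a sketch, not the article's TensorFlow code; data and step counts are assumptions for illustration):

```python
import numpy as np

np.random.seed(0)
x = np.random.randn(100, 1)
y_true = 0.8 * x + 0.7

def train(lr, steps=200):
    """Run gradient descent and return the final mean squared error."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        y_pred = w * x + b
        w -= lr * 2 * np.mean((y_pred - y_true) * x)
        b -= lr * 2 * np.mean(y_pred - y_true)
    return np.mean((w * x + b - y_true) ** 2)

print(train(0.1))  # tiny loss: training converges
print(train(5.0))  # loss blows up: training explodes
```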

A higher learning rate is not always better: if the learning rate is too large, training diverges instead of converging. The learning rate should be tuned according to the actual situation.

Original: https://blog.csdn.net/Prototype___/article/details/119931355
Author: 极客范儿
Title: Python实现线性回归

