Python Learning Diary 3: My First TensorFlow Program

Target function: y = 2x

1 Generate noisy data from y = 2x + 0.8, with 100 values of x in (-1, 1), and plot it
Matplotlib turned out to be missing, so install the packages into the tensorflow environment from the Anaconda Prompt:
pip install cython
pip install matplotlib
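
A quick way to confirm the installs landed in the active tensorflow environment (a minimal sanity check run from the same prompt; adjust if your environment is named differently):

python -c "import matplotlib; print(matplotlib.__version__)"
python -c "import tensorflow as tf; print(tf.__version__)"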

2 Complete code and run results

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
import time
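
# The plotting code near the end calls moving_average(), which is never defined
# in this listing. The sliding-window helper below is a minimal sketch added so
# the script runs end to end; the window size w=10 is an assumption.
def moving_average(a, w=10):
    if len(a) < w:
        return a[:]
    return [val if idx < w else sum(a[(idx - w):idx]) / w
            for idx, val in enumerate(a)]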
# Generate simulated data
train_X = np.linspace(-1, 1, 100)
train_Y = 2 * train_X + np.random.randn(*train_X.shape) * 0.3 + 0.8  # y = 2x + 0.8
# Display the raw data
plt.plot(train_X, train_Y, 'ro', label='Original data')
plt.legend()
plt.show()

# Linear regression
# Placeholders define the input nodes
X_holder = tf.placeholder("float")
Y_holder = tf.placeholder("float")
# Model variables; the regression estimates their values
W_variable = tf.Variable(tf.random_normal([1]), name="weight")
b_variable = tf.Variable(tf.zeros([1]), name="bias")
#print(W_variable)
#print(b_variable)

# Forward pass: this only defines the graph, nothing is computed yet
z_pred = tf.multiply(X_holder, W_variable) + b_variable
#print(z_pred)

# Backward pass
# Cost function
cost = tf.reduce_mean(tf.square(Y_holder - z_pred))
#print(cost)
# Learning rate
learning_rate = 0.01
# Optimization strategy: gradient descent
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
# Variable initializer
init = tf.global_variables_initializer()
# Training schedule
training_epochs = 1000
display_step = 200

# Start the session
with tf.Session() as sess:
    sess.run(init)
    plotdata = {"batchsize": [], "loss": []}
    icount_feeddic = 0  # unused in this version
    start_time = time.clock()  # time.clock() is deprecated (removed in Python 3.8); kept to match the runs below
    print('start time is', start_time)
    print(time.clock())
    # Feed the training data
    for epoch in range(training_epochs):  # number of passes over all samples
        # Method 1: feed all samples at once
        sess.run(optimizer, feed_dict={X_holder: train_X, Y_holder: train_Y})
        # Method 2: feed one sample at a time
        # for (x, y) in zip(train_X, train_Y):  # one pass over all samples
        #     sess.run(optimizer, feed_dict={X_holder: x, Y_holder: y})
        # Display the loss
        if epoch % display_step == 0:
            print(time.clock())
            # Identical for both methods: evaluate the cost on all samples
            loss = sess.run(cost, feed_dict={X_holder: train_X, Y_holder: train_Y})
            print("Epoch:", epoch + 1, "cost=", loss, "W=",
                  sess.run(W_variable), "b=", sess.run(b_variable))
            if not (loss == "NA"):
                plotdata["batchsize"].append(epoch)
                plotdata["loss"].append(loss)
    end_time = time.clock()
    print('it takes', end_time - start_time, 'seconds')
    print("Finished!")
    # Note: loss still holds the value from the last display step,
    # while W and b are read after the final epoch
    print("cost=", loss, "W=",
          sess.run(W_variable), "b=", sess.run(b_variable))
    # Plot the samples and the fitted line
    plt.plot(train_X, train_Y, 'ro', label='Original data')
    plt.plot(train_X, sess.run(W_variable) * train_X + sess.run(b_variable), label='Fitted line')
    plt.legend()
    plt.show()
    # Plot the training progress against the averaged cost
    plotdata["avgloss"] = moving_average(plotdata["loss"])
    plt.figure(1)
    plt.subplot(211)
    plt.plot(plotdata["batchsize"], plotdata["avgloss"], 'b--')
    plt.xlabel('Minibatch number')
    plt.ylabel('Loss')
    plt.title('Minibatch run vs. Training loss')
    plt.show()
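
To reproduce the runs below, save the listing as a script and launch it from the same Anaconda Prompt (the file name linear_regression.py is only an example; on newer conda versions use conda activate instead of activate):

activate tensorflow
python linear_regression.py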


Results with per-sample feeding (method 2, one sess.run per sample):

start time is 1059.7321915769558
1059.7323751144845
1059.8880759169178
Epoch: 1 cost= 0.34107947 W= [1.2368563] b= [0.97965276]
1072.2689446652462
Epoch: 201 cost= 0.09215031 W= [2.0805874] b= [0.82373667]
1084.4980321033702
Epoch: 401 cost= 0.09215031 W= [2.0805874] b= [0.82373667]
1096.0414710291445
Epoch: 601 cost= 0.09215031 W= [2.0805874] b= [0.82373667]
1107.8997963578668
Epoch: 801 cost= 0.09215031 W= [2.0805874] b= [0.82373667]
it takes 60.73443387426573 seconds
Finished!
cost= 0.09215031 W= [2.0805874] b= [0.82373667]

Results with full-batch feeding (method 1, all samples per sess.run):

start time is 1231.3632895962596
1231.3634683038536
1231.4680297548143
Epoch: 1 cost= 0.82909894 W= [1.3106842] b= [0.01552008]
1231.6652172836807
Epoch: 201 cost= 0.12748799 W= [1.7747071] b= [0.76262873]
1231.8060370564685
Epoch: 401 cost= 0.11925615 W= [1.8932198] b= [0.77576894]
1231.9389858464792
Epoch: 601 cost= 0.118730776 W= [1.9234884] b= [0.77599984]
1232.062556714002
Epoch: 801 cost= 0.11869651 W= [1.9312189] b= [0.7760025]
it takes 0.8303877773666954 seconds
Finished!
cost= 0.11869651 W= [1.9331884] b= [0.7760025]
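
The gap between the two runs (about 60 s versus 0.8 s) is dominated by the number of sess.run calls rather than by the arithmetic itself. A rough count using the values from the script (the variable names below are only illustrative):

epochs = 1000
samples = 100
per_sample_runs = epochs * samples  # method 2: one sess.run per sample -> 100,000 calls
full_batch_runs = epochs            # method 1: one sess.run per epoch  ->   1,000 calls
print(per_sample_runs // full_batch_runs)  # 100x more Python/graph-dispatch overhead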

