This article explains how to use TensorBoard to display a neural network's graph. The walkthrough is short and straightforward, so follow along with the example below and try it yourself.
# Build a neural network and display its graph with TensorBoard
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt  # pip install matplotlib if it is missing

# Define one network layer
def add_layer(inputs, in_size, out_size, activation_function=None):
    # add one more layer and return the output of this layer
    with tf.name_scope('layer'):
        with tf.name_scope('Weights'):
            Weights = tf.Variable(tf.random_normal([in_size, out_size]), name='W')
        with tf.name_scope('biases'):
            biases = tf.Variable(tf.zeros([1, out_size]) + 0.1, name='b')
        with tf.name_scope('Wx_plus_b'):
            Wx_plus_b = tf.matmul(inputs, Weights) + biases
        if activation_function is None:
            outputs = Wx_plus_b
        else:
            outputs = activation_function(Wx_plus_b)
        return outputs

# Make up some real data
x_data = np.linspace(-1, 1, 300)[:, np.newaxis]  # 300 samples in [-1, 1]; np.newaxis adds a dimension, giving a 300x1 column
noise = np.random.normal(0, 0.05, x_data.shape)  # mean 0, standard deviation 0.05, same shape as x_data
y_data = np.square(x_data) - 0.5 + noise

# Define placeholders for inputs to the network
with tf.name_scope('inputs'):
    xs = tf.placeholder(tf.float32, [None, 1], name='x_input1')  # None: any number of samples may be fed in
    ys = tf.placeholder(tf.float32, [None, 1], name='y_input1')

# Add hidden layer
l1 = add_layer(xs, 1, 10, activation_function=tf.nn.relu)
# Add output layer
prediction = add_layer(l1, 10, 1, activation_function=None)

# The error between prediction and real data
with tf.name_scope('loss'):
    loss = tf.reduce_mean(
        tf.reduce_sum(tf.square(ys - prediction),
                      reduction_indices=[1]))  # reduction_indices=[1] sums along each row (per sample), then the mean is taken
with tf.name_scope('train'):
    train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)  # correct the error with a learning rate of 0.1

# Two ways to initialize the variables
# init = tf.initialize_all_variables()  # deprecated older API
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)

# Dump the whole graph to a file, then load that file in the browser to inspect it
# writer = tf.train.SummaryWriter("logs/", sess.graph)  # old API
# On Windows, locate tensorboard.exe (e.g. C:\Anaconda\Scripts) and run:
# tensorboard.exe --logdir=<path where the graph was written> (the path must not contain Chinese characters)
writer = tf.summary.FileWriter("../../logs/", sess.graph)

fig = plt.figure()
ax = fig.add_subplot(1, 1, 1)
ax.scatter(x_data, y_data)
plt.ion()  # plt.show() on its own draws once and blocks; plt.ion() keeps the plot updating continuously
plt.show()

for i in range(1000):
    sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
    if i % 50 == 0:
        # To watch the loss improve step by step:
        # print(sess.run(loss, feed_dict={xs: x_data, ys: y_data}))
        try:
            # Erase the previous line before drawing a new one; lines holds a single
            # line, so remove lines[0]. On the first pass no line exists yet, hence try/except.
            ax.lines.remove(lines[0])
        except Exception:
            pass
        # Show the predicted values
        prediction_value = sess.run(prediction, feed_dict={xs: x_data})
        lines = ax.plot(x_data, prediction_value, 'r-', lw=5)  # draw in red with line width 5
        plt.pause(0.1)  # pause 0.1 s before drawing the next line
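Once the FileWriter has written the graph, TensorBoard can be started from a shell. A minimal sketch, assuming the log directory matches the "../../logs/" path used above and that the tensorboard command is on your PATH (6006 is TensorBoard's default port):

tensorboard --logdir=../../logs/
# then open http://localhost:6006 in a browser and switch to the GRAPHS tab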
The generated TensorBoard graph (the tf.name_scope calls above group the nodes into inputs, layer, loss, and train):
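Note that the script uses the TensorFlow 1.x API (tf.placeholder, tf.Session, tf.summary.FileWriter). If your environment runs TensorFlow 2.x, a minimal sketch of one way to keep it working is the 1.x compatibility layer; the two lines below would replace the plain import at the top (this is an assumption about your setup, not part of the original tutorial):

import tensorflow.compat.v1 as tf  # 1.x-style API shipped with TensorFlow 2.x
tf.disable_v2_behavior()           # turn off eager execution so placeholders and sessions work
# the rest of the script then runs unchanged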
Thank you for reading. That covers how to use TensorBoard to display a neural network's graph. After working through this article you should have a firmer grasp of the topic, though nothing replaces verifying it in practice yourself. This is 創(chuàng)新互聯(lián); more articles on related topics are on the way, so stay tuned!