Using TensorBoard
 
×÷Õߣº ºìÓãÓã
 2020-2-12
 
±à¼­ÍƼö:
±¾ÎĽ²½âTensorBoardÊÇTensorflow×Ô´øµÄÍøÂçÄ£ÐÍ¿ÉÊÓ»¯µÄ¹¤¾ß£¬Ê¹ÓÃÕß¿ÉÒÔÇå³þµØ¿´µ½ÍøÂçµÄ½á¹¹¡¢Êý¾ÝÁ÷Ïò¡¢Ëðʧº¯Êý±ä»¯µÈһϵÁÐÓëÍøÂçÄ£ÐÍÓйصÄÊý¾Ý¡£Ï£Íû¶ÔÄúÓÐËù°ïÖú¡£
±¾ÎÄÀ´×ÔÓÚcsdn£¬ÓÉ»ðÁú¹ûÈí¼þDelores±à¼­¡¢ÍƼö¡£

Network structure

To simply view the network structure in TensorBoard, you only need to add the following line to your program:

writer = tf.summary.FileWriter('C:\\Users\\45374\\logs', sess.graph)

ʹÓÃÍêºó¶ÔӦ·¾¶½«»áÉú³ÉÒ»¸ö»·¾³Îļþ£¬ÈçÏÂͼ¡£

ÕâʱÎÒÃÇ´ò¿ªcmd£¨ÓÉÓÚÎÒÊÇÔÚanacondaϽ¨µÄtensorflow»·¾³£¬ËùÒÔÎÒÓõÄanaconda prompt£¬²¢ÏȽøÈëµ½tensorflowµÄ»·¾³ÖУ©£¬ÊäÈëtensorboard --logdir='·¾¶' µÃµ½ÈçϽá¹û£º

(If you did not create a separate tensorflow environment yourself, you do not need the activate tensorflow step; also note that "tensorflow" here is simply the name I gave my environment.) Most readers only need to enter tensorboard --logdir='path' in cmd.

This returns a URL; copy it and open it in a browser (Firefox or Chrome is recommended, as some browsers may fail to open it), as shown below:

Õâ¸öͼ½á¹û½ÏΪ»ìÂÒ£¬Í¼ÖÐ4¸ö±äÁ¿·Ö±ðÊÇÁ½×éwºÍb¡£ÎªÁËÈÃÍøÂç½á¹¹¸üÇåÎú£¬ÎÒÃÇ¿ÉÒÔ¶Ô²¿·Ö±äÁ¿ºÍ²Ù×÷ÉèÖÃÃüÃû¿Õ¼ä£¬Ê¹½á¹¹¸üÈÝÒ׹۲죬´úÂë¿É×÷ÈçÏÂÐ޸ģº

with tf.name_scope("input"):
    x = tf.placeholder(tf.float32, [None, 784], name='x_input')
    y = tf.placeholder(tf.float32, [None, 10], name='y_input')

# input layer to hidden layer
with tf.name_scope('layer'):
    with tf.name_scope('weights'):
        W = tf.Variable(tf.zeros([784, 10]))  # initializing W to 0 here converges faster
    with tf.name_scope('biases'):
        b = tf.Variable(tf.zeros([10]))
    with tf.name_scope('Wx_plus_b_L1'):
        Wx_plus_b_L1 = tf.matmul(x, W) + b

# hidden layer to output layer
with tf.name_scope('output'):
    with tf.name_scope('weights'):
        W2 = tf.Variable(tf.random_normal([10, 10]))  # the hidden layer must not be initialized to 0
    with tf.name_scope('biases'):
        b2 = tf.Variable(tf.zeros([10]))
    with tf.name_scope('softmax'):
        prediction = tf.nn.softmax(tf.matmul(Wx_plus_b_L1, W2) + b2)

# quadratic cost function
with tf.name_scope('loss'):
    loss = tf.reduce_mean(tf.square(y - prediction))

# gradient descent training, learning rate 0.2
train_step = tf.train.GradientDescentOptimizer(0.2).minimize(loss)
init = tf.global_variables_initializer()
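As a sanity check on the quadratic cost above: it is just a mean squared error, which can be verified with plain NumPy (toy values, independent of the network):

```python
import numpy as np

# toy one-hot targets and predictions (made-up values)
y = np.array([[0.0, 1.0], [1.0, 0.0]])
prediction = np.array([[0.5, 0.5], [1.0, 0.0]])

# mean of the element-wise squared differences
loss = np.mean(np.square(y - prediction))
print(loss)  # 0.125
```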

ÕâʱtensorboardÖв鿴µÄÍøÂç½á¹¹ÈçÏ£º

ÿһ¸öÃüÃû¿Õ¼ä¿ÉÒÔË«»÷´ò¿ª²é¿´ÄÚ²¿½á¹¹£º

You can also right-click to pull a part out of the graph or put it back in:

²»¹ý×¢Ò⣬Èç¹û¶à´ÎÔËÐгÌÐò£¬±ØÐëÏȽ«³ÌÐò¹Ø±Õ£¬È»ºóÖØÐÂÔËÐУ¬²¢ÇÒ½«logsÎļþÖеÄenventoutÎļþɾ³ý¡£²»È»¿ÉÄÜ»á³öÏÖ¶à¸öÍøÂçͬʱÏÔʾµÄÇé¿ö£¬ÈçÏ£º

Parameter changes

ÔÚÍøÂçѵÁ·¹ý³ÌÖУ¬»áÓкܶà²ÎÊýµÄ±ä»¯¹ý³Ì£¬ÎÒÃÇ¿ÉÒÔ¶ÔÕâЩ²ÎÊýµÄ±ä»¯¹ý³Ì½øÐÐÏÔʾ¡£

ÎÒÃÇ¿ÉÒÔÏȶ¨ÒåÒ»¸öº¯Êý£º

def variable_summarise(var):
    with tf.name_scope('summarise'):
        mean = tf.reduce_mean(var)
        tf.summary.scalar('mean', mean)
        with tf.name_scope('stddev'):
            stddev = tf.sqrt(tf.reduce_mean(tf.square(var - mean)))
        # tf.summary.scalar logs a scalar
        # tf.summary.histogram logs a histogram
        tf.summary.scalar('stddev', stddev)
        tf.summary.scalar('max', tf.reduce_max(var))
        tf.summary.scalar('min', tf.reduce_min(var))
        tf.summary.histogram('histogram', var)

º¯ÊýÀﶨÒå×ÅÎÒÃÇÏëÖªµÀµÄÐÅÏ¢£¬ÎÒÃÇÏëÖªµÀÄǸö±äÁ¿ÐÅÏ¢£¬¾Íµ÷ÓÃÕâ¸öº¯Êý¡£

Ëðʧº¯ÊýºÍ׼ȷÂÊÿ´ÎÖ»ÓÐÒ»¸ö±êÁ¿Öµ£¬ËùÒÔÖ»ÐèÒ»¸ösummary.scalarº¯Êý£º

with tf.name_scope('loss'):
    loss = tf.reduce_mean(tf.square(y - prediction))
    tf.summary.scalar('loss', loss)
with tf.name_scope('accuracy'):
    # tf.equal tests element-wise equality; tf.argmax returns the index of the tensor's maximum
    correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(prediction, 1))
    # cast the booleans to floats before averaging
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
    tf.summary.scalar('accuracy', accuracy)
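The argmax/equal/cast pipeline for the accuracy can be illustrated with plain NumPy (made-up labels and predictions, independent of TensorFlow):

```python
import numpy as np

# two one-hot labels and two softmax-style predictions (made-up values)
y = np.array([[0, 1, 0], [1, 0, 0]])
prediction = np.array([[0.1, 0.8, 0.1], [0.2, 0.5, 0.3]])

# compare the predicted class index against the label index
correct_prediction = np.equal(np.argmax(y, 1), np.argmax(prediction, 1))  # [True, False]
# cast booleans to floats and average: fraction of correct predictions
accuracy = np.mean(correct_prediction.astype(np.float32))

print(accuracy)  # 0.5
```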

Then merge all the summaries:

merged = tf.summary.merge_all()

Run the merged op together with the training step:

summary, _ = sess.run([merged, train_step], feed_dict={x: batch_xs, y: batch_ys})

Finally, write the data in summary to the event file (the epoch number serves as the step on TensorBoard's horizontal axis):

writer.add_summary(summary,epoch)

You can now view all of the recorded data in TensorBoard:

The complete code is as follows:

import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

# load the data
mnist = input_data.read_data_sets("E:/mnist", one_hot=True)
# size of each batch
batch_size = 200
# number of batches per epoch (integer division)
n_batch = mnist.train.num_examples // batch_size

def variable_summarise(var):
    with tf.name_scope('summarise'):
        mean = tf.reduce_mean(var)
        tf.summary.scalar('mean', mean)
        with tf.name_scope('stddev'):
            stddev = tf.sqrt(tf.reduce_mean(tf.square(var - mean)))
        tf.summary.scalar('stddev', stddev)
        tf.summary.scalar('max', tf.reduce_max(var))
        tf.summary.scalar('min', tf.reduce_min(var))
        tf.summary.histogram('histogram', var)

with tf.name_scope("input"):
    x = tf.placeholder(tf.float32, [None, 784], name='x_input')
    y = tf.placeholder(tf.float32, [None, 10], name='y_input')

# input layer to hidden layer
with tf.name_scope('layer'):
    with tf.name_scope('weights'):
        W = tf.Variable(tf.zeros([784, 10]))  # initializing W to 0 here converges faster
        variable_summarise(W)
    with tf.name_scope('biases'):
        b = tf.Variable(tf.zeros([10]))
        variable_summarise(b)
    with tf.name_scope('Wx_plus_b_L1'):
        Wx_plus_b_L1 = tf.matmul(x, W) + b

# hidden layer to output layer
with tf.name_scope('output'):
    with tf.name_scope('weights'):
        W2 = tf.Variable(tf.random_normal([10, 10]))  # the hidden layer must not be initialized to 0
    with tf.name_scope('biases'):
        b2 = tf.Variable(tf.zeros([10]))
    with tf.name_scope('softmax'):
        prediction = tf.nn.softmax(tf.matmul(Wx_plus_b_L1, W2) + b2)

# quadratic cost function
with tf.name_scope('loss'):
    loss = tf.reduce_mean(tf.square(y - prediction))
    tf.summary.scalar('loss', loss)

# gradient descent training, learning rate 0.2
train_step = tf.train.GradientDescentOptimizer(0.2).minimize(loss)
init = tf.global_variables_initializer()

# compute the accuracy
with tf.name_scope('accuracy'):
    # tf.equal tests element-wise equality; tf.argmax returns the index of the tensor's maximum
    correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(prediction, 1))
    # cast the booleans to floats before averaging
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
    tf.summary.scalar('accuracy', accuracy)

merged = tf.summary.merge_all()

with tf.Session() as sess:
    sess.run(init)
    writer = tf.summary.FileWriter('C:\\Users\\45374\\logs', sess.graph)
    # train for 50 epochs
    for epoch in range(50):
        for batch in range(n_batch):
            # training data and labels
            batch_xs, batch_ys = mnist.train.next_batch(batch_size)
            summary, _ = sess.run([merged, train_step], feed_dict={x: batch_xs, y: batch_ys})
        writer.add_summary(summary, epoch)
        acc = sess.run(accuracy, feed_dict={x: mnist.test.images, y: mnist.test.labels})
        print("Iter " + str(epoch) + " Accuracy " + str(acc))
   