A First Look at TensorBoard, TensorFlow's Visualization Tool
 
×÷Õߣº ÍõС²Ý
  3001  次浏览      28
 2020-2-7
 
Editor's recommendation:
This article walks through a small handwritten-digit-recognition example to explain how to get started with TensorBoard, TensorFlow's visualization tool. We hope you find it helpful.
This article comes from Script Home; edited and recommended by Delores of Huolongguo Software.

When training large, deep neural networks with TensorFlow, we often want to track information over the whole course of training: how the parameters of each layer change and are distributed across iterations, how the model's accuracy on the training and test sets evolves after each parameter update, how the loss value changes, and so on. If we could record some of this information during training and visualize it, wouldn't that help us explore and understand the model much more deeply?

TensorFlow ships with an official visualization tool, TensorBoard, that does exactly this: it aggregates the various kinds of data produced during model training into log files under a user-defined path, and then displays that information visually in a web interface.

1. Introduction to TensorBoard

1.1 The data types TensorBoard can handle

TensorBoard can record and display the following kinds of data:

(1) Scalars

(2) Images

(3) Audio

(4) Computation graphs (Graph)

(5) Data distributions (Distribution)

(6) Histograms

(7) Embedding vectors (Embeddings)

1.2 The TensorBoard visualization workflow

(1) First, build a graph; this is the graph from which you want to extract information.

(2) Decide at which nodes of the graph to place summary operations to record information:

Use tf.summary.scalar to record scalars.

Use tf.summary.histogram to record histograms of the data.

(The DISTRIBUTIONS tab is populated from the same tf.summary.histogram data; there is no separate distribution op.)

Use tf.summary.image to record image data.

(3) Operations do not actually execute any computation until you run them, or until they are depended on by some other operation that gets run. The summary operations created in the previous step are not depended on by any other node, so we would have to run each summary node explicitly. Since a single program can easily contain a great many such summary nodes, starting them one by one by hand would be extremely tedious. Instead, tf.summary.merge_all merges all summary nodes into a single node; running that one node produces all of the summary data we set up earlier.

(4) Use tf.summary.FileWriter to save the data produced by the run to local disk.

(5) Run the whole program, launch TensorBoard from the command line, and then open the web interface to view the visualized results.

2. A TensorBoard usage example

Unsurprisingly, we will once again use the most basic example of all: handwritten-digit recognition.

This example does not aim for a particularly good model; we just build a simple neural network so you can learn how to use TensorBoard.

2.1 Import packages, define hyperparameters, load the data

(1) First, import the required packages:

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import argparse
import sys
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

(2) Define fixed hyperparameters so they can be passed in directly when needed. If you ask why these particular values were chosen, and how one finds optimal hyperparameters, we will not discuss that here; in machine-learning practice the most common approach is cross-validation. For now, assume we have already found the optimal hyperparameters: a learning rate of 0.001, a dropout keep probability of 0.9, and a maximum of 1000 training steps.

We also need to set two paths: one for where the downloaded data is stored, and one for where the summary output is saved.

max_steps = 1000  # maximum number of training steps
learning_rate = 0.001  # learning rate
dropout = 0.9  # probability of keeping a neuron active during dropout
data_dir = ''  # path where the sample data is stored
log_dir = ''  # path where the output logs are saved

(3) Next, load the data. Downloading is handled by the read_data_sets function provided by TensorFlow, which takes two arguments: the first is the path where the downloaded data is stored; the second, one_hot, indicates whether the class labels should be one-hot encoded. The function first checks whether the data files already exist in the specified directory, downloads them only if they don't, and otherwise reads them directly. The first run of this command is therefore relatively slow.

mnist = input_data.read_data_sets(data_dir, one_hot=True)
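As an aside, one-hot encoding simply turns a class label into a vector with a single 1 at the label's position. A minimal plain-Python sketch (the helper name one_hot is our own, not part of TensorFlow):

```python
def one_hot(label, num_classes=10):
    """Return a list with 1.0 at position `label` and 0.0 elsewhere."""
    vec = [0.0] * num_classes
    vec[label] = 1.0
    return vec

# The MNIST label 3 becomes a 10-dimensional indicator vector.
print(one_hot(3))  # [0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```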

2.2 Create placeholders for features and labels, and save the input images to summaries

(1) Create TensorFlow's default interactive session:

sess = tf.InteractiveSession()

(2) Create placeholders for the input data: the feature data x and the label data y_.

tf.placeholder() takes three arguments here. The first is the data type, float32. The second is the shape of the data: each feature vector has 784 elements and each label vector has 10; None means the size along that dimension is not fixed, so any number of samples can be fed in later. The third argument is the placeholder's name.

with tf.name_scope('input'):
    x = tf.placeholder(tf.float32, [None, 784], name='x-input')
    y_ = tf.placeholder(tf.float32, [None, 10], name='y-input')

(3) Use tf.summary.image to save image information.

The feature data are simply the image's pixels flattened into a 1×784 vector. To have TensorBoard reconstruct the original image from the input features, we need to turn the flattened vector back into the original 28×28×1 pixel layout, which tf.reshape() can do directly:

Reshape the input data to shape [28, 28, 1] and store the result in another tensor named image_shaped_input.

To make the images viewable in TensorBoard, use tf.summary.image to hand the image data over to TensorBoard.

The first argument to tf.summary.image() is a name, the second is the image data, and the third is the maximum number of images to display, here 10.

with tf.name_scope('input_reshape'):
    image_shaped_input = tf.reshape(x, [-1, 28, 28, 1])
    tf.summary.image('input', image_shaped_input, 10)
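The reshape is pure bookkeeping: pixel (row r, column c) of the 28×28 image sits at index r*28 + c of the flattened 784-vector. A small plain-Python sketch of that correspondence (no TensorFlow required; the helper name is ours):

```python
WIDTH = 28  # MNIST images are 28x28

def unflatten(flat, width=WIDTH):
    """Rebuild a width x width image (as a list of rows) from a flat pixel list."""
    return [flat[r * width:(r + 1) * width] for r in range(width)]

flat = list(range(28 * 28))  # stand-in for one flattened MNIST image
image = unflatten(flat)
# Pixel (row 2, column 5) is flat index 2*28 + 5 = 61.
print(image[2][5])  # 61
```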

2.3 Create methods for initializing parameters and for summarizing parameter information

(1) When building a neural network, every layer's parameters w and b need to be initialized. To keep the code clean and readable, it is best to wrap the initialization in helper functions.

Create a function that initializes the weights w: it generates random values of the requested shape from a truncated normal distribution with standard deviation 0.1 and wraps them in a TensorFlow Variable before returning.

def weight_variable(shape):
    """Create a weight variable with appropriate initialization."""
    initial = tf.truncated_normal(shape, stddev=0.1)
    return tf.Variable(initial)
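For intuition: tf.truncated_normal draws from a normal distribution but discards and redraws any sample more than two standard deviations from the mean. A rough plain-Python sketch of that resampling idea (our own helper, not the TensorFlow implementation):

```python
import random

def truncated_normal(n, mean=0.0, stddev=0.1):
    """Draw n samples from N(mean, stddev), redrawing any sample that
    falls more than 2 standard deviations from the mean."""
    samples = []
    while len(samples) < n:
        v = random.gauss(mean, stddev)
        if abs(v - mean) <= 2 * stddev:
            samples.append(v)
    return samples

values = truncated_normal(1000)
# Every sample is guaranteed to lie within [-0.2, 0.2].
print(all(abs(v) <= 0.2 for v in values))  # True
```

Keeping initial weights small and bounded like this helps avoid saturating activations early in training.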

Create a function that initializes the bias b: it fills a tensor of the given shape with the constant 0.1 and wraps it in a TensorFlow Variable before returning.

def bias_variable(shape):
    """Create a bias variable with appropriate initialization."""
    initial = tf.constant(0.1, shape=shape)
    return tf.Variable(initial)

(2) As we know, the parameters keep changing and improving during training, and we usually want to know what changed after each iteration. We can display this parameter information in TensorBoard, so we write a dedicated function to collect the parameter statistics each time.

def variable_summaries(var):
    """Attach a lot of summaries to a Tensor (for TensorBoard visualization)."""
    with tf.name_scope('summaries'):
        # Compute the mean of the parameter and record it with tf.summary.scalar
        mean = tf.reduce_mean(var)
        tf.summary.scalar('mean', mean)
        # Compute the standard deviation of the parameter
        with tf.name_scope('stddev'):
            stddev = tf.sqrt(tf.reduce_mean(tf.square(var - mean)))
        # Record the standard deviation, maximum and minimum with tf.summary.scalar
        tf.summary.scalar('stddev', stddev)
        tf.summary.scalar('max', tf.reduce_max(var))
        tf.summary.scalar('min', tf.reduce_min(var))
        # Record the distribution of the parameter with a histogram
        tf.summary.histogram('histogram', var)
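The statistics being summarized are just the mean, standard deviation, maximum and minimum of the tensor's values. In plain Python, for a small illustrative parameter vector (the helper name is ours):

```python
import math

def summarize(values):
    """Return the statistics variable_summaries records: mean, stddev, max, min."""
    mean = sum(values) / len(values)
    stddev = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return {'mean': mean, 'stddev': stddev, 'max': max(values), 'min': min(values)}

stats = summarize([0.1, -0.1, 0.3, -0.3])
print(stats['mean'], stats['max'], stats['min'])  # 0.0 0.3 -0.3
```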

2.4 Build the neural-network layers

(1) Create the first hidden layer

Create a function that builds a hidden layer. Its parameters are:

input_tensor: the feature data

input_dim: the dimensionality of the input

output_dim: the dimensionality of the output (= the number of neurons in the hidden layer)

layer_name: the name scope

act=tf.nn.relu: the activation function (ReLU by default)

def nn_layer(input_tensor, input_dim, output_dim, layer_name, act=tf.nn.relu):
    """Reusable code for making a simple neural net layer.

    It does a matrix multiply, bias add, and then uses relu to nonlinearize.
    It also sets up name scoping so that the resultant graph is easy to read,
    and adds a number of summary ops.
    """
    # Set up the name scope
    with tf.name_scope(layer_name):
        # Initialize the weights w with the earlier helper, and record
        # their statistics with variable_summaries
        with tf.name_scope('weights'):
            weights = weight_variable([input_dim, output_dim])
            variable_summaries(weights)
        # Initialize the biases b with the earlier helper, and record
        # their statistics with variable_summaries
        with tf.name_scope('biases'):
            biases = bias_variable([output_dim])
            variable_summaries(biases)
        # Compute the linear transform wx + b and record it with a histogram
        with tf.name_scope('linear_compute'):
            preactivate = tf.matmul(input_tensor, weights) + biases
            tf.summary.histogram('linear', preactivate)
        # Pass the linear output through the activation function,
        # recording the activations with a histogram as well
        activations = act(preactivate, name='activation')
        tf.summary.histogram('activations', activations)
    # Return the layer's final output
    return activations

Call the layer-building function to create a hidden layer: the input dimensionality is the feature dimensionality, 784, and the number of neurons, i.e. the output dimensionality, is 500.

hidden1 = nn_layer(x, 784, 500, 'layer1')

(2) Create a dropout layer that randomly switches off some of hidden1's neurons, and record keep_prob:

with tf.name_scope('dropout'):
    keep_prob = tf.placeholder(tf.float32)
    tf.summary.scalar('dropout_keep_probability', keep_prob)
    dropped = tf.nn.dropout(hidden1, keep_prob)

(3) Create an output layer: its input dimensionality is the previous layer's output, 500, and its output dimensionality is the number of classes, 10. The activation is set to the identity mapping tf.identity. (We don't apply softmax here yet; it will be computed together with the loss function later.)

y = nn_layer(dropped, 500, 10, 'layer2', act=tf.identity)

2.5 Create the loss function

Use tf.nn.softmax_cross_entropy_with_logits to apply softmax and compute the cross-entropy loss, then take the mean as the final loss value.

with tf.name_scope('loss'):
    # Compute the cross-entropy loss (one loss value per sample)
    diff = tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y)
    with tf.name_scope('total'):
        # Average the cross-entropy loss over all samples
        cross_entropy = tf.reduce_mean(diff)
tf.summary.scalar('loss', cross_entropy)
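Numerically, softmax cross-entropy is just the negative log of the probability the softmax assigns to the true class. A plain-Python worked example (our own helper, mirroring what tf.nn.softmax_cross_entropy_with_logits computes, without its internal numerical-stability machinery):

```python
import math

def softmax_cross_entropy(labels, logits):
    """Cross-entropy between a one-hot label and softmax(logits)."""
    m = max(logits)                                  # shift logits for stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return -sum(l * math.log(p) for l, p in zip(labels, probs))

# Three classes; the true class (index 1) gets the largest logit,
# so the loss is small.
loss = softmax_cross_entropy([0.0, 1.0, 0.0], [0.1, 2.0, -1.0])
print(round(loss, 4))  # small, because most probability mass lands on the true class
```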

2.6 Train, and compute the accuracy

(1) Use the AdamOptimizer to train the model by minimizing the cross-entropy loss:

with tf.name_scope('train'):
    train_step = tf.train.AdamOptimizer(learning_rate).minimize(cross_entropy)

(2) Compute the accuracy, and record it with tf.summary.scalar:

with tf.name_scope('accuracy'):
    with tf.name_scope('correct_prediction'):
        # Take the index of the maximum value in the predictions and in the
        # true labels; matching indices give 1 (true), others 0 (false)
        correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
    with tf.name_scope('accuracy'):
        # The mean of these 0/1 values is the accuracy
        accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
tf.summary.scalar('accuracy', accuracy)
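In other words: a prediction counts as correct when the argmax of the predicted scores matches the argmax of the one-hot label, and accuracy is the fraction of correct predictions. A plain-Python sketch of the same computation (the names are ours):

```python
def accuracy(predictions, labels):
    """Fraction of rows where the predicted argmax matches the label argmax."""
    argmax = lambda row: row.index(max(row))
    correct = [argmax(p) == argmax(l) for p, l in zip(predictions, labels)]
    return sum(correct) / len(correct)

preds = [[0.1, 0.7, 0.2],   # predicts class 1
         [0.8, 0.1, 0.1],   # predicts class 0
         [0.3, 0.3, 0.4]]   # predicts class 2
labels = [[0, 1, 0],
          [0, 0, 1],        # true class is 2: wrong
          [0, 0, 1]]
print(accuracy(preds, labels))  # 0.6666666666666666
```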

2.7 Merge the summary operations and initialize the variables

Merge all the summaries, and write them to the log_dir path defined earlier:

# Merge the summaries
merged = tf.summary.merge_all()
# Write them to the specified paths on disk
train_writer = tf.summary.FileWriter(log_dir + '/train', sess.graph)
test_writer = tf.summary.FileWriter(log_dir + '/test')
# Initialize all variables
tf.global_variables_initializer().run()

2.8 Prepare the training and test data, and run the whole graph in a loop for training and evaluation

(1) First we need a helper that produces the data to feed in.

If train == True, fetch a batch of samples from mnist.train and set the dropout keep probability;

if train == False, use the mnist.test data and set keep_prob to 1, i.e. keep all neurons active.

def feed_dict(train):
    """Make a TensorFlow feed_dict: maps data onto Tensor placeholders."""
    if train:
        xs, ys = mnist.train.next_batch(100)
        k = dropout
    else:
        xs, ys = mnist.test.images, mnist.test.labels
        k = 1.0
    return {x: xs, y_: ys, keep_prob: k}

(2) Start training the model.

Every 10 steps, run the merged summaries, print the accuracy on the test set, and write the test set's summary information to the log.

Every 100 steps, additionally record run metadata (execution stats).

On every other step, record the training set's summary information and write it to the log.
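The branching schedule described above can be captured by a tiny helper that maps a step number to the action taken (our own illustration, mirroring the i % 10 and i % 100 tests in the loop):

```python
def action_for_step(i):
    """Which kind of logging the training loop performs at step i."""
    if i % 10 == 0:
        return 'test-summary'      # evaluate and log the test set
    if i % 100 == 99:
        return 'train-metadata'    # log a training summary plus run metadata
    return 'train-summary'         # log an ordinary training summary

print([action_for_step(i) for i in (0, 5, 99, 100, 101)])
# ['test-summary', 'train-summary', 'train-metadata', 'test-summary', 'train-summary']
```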

for i in range(max_steps):
    if i % 10 == 0:  # record the test set's summaries and accuracy
        summary, acc = sess.run([merged, accuracy], feed_dict=feed_dict(False))
        test_writer.add_summary(summary, i)
        print('Accuracy at step %s: %s' % (i, acc))
    else:  # record the training set's summaries
        if i % 100 == 99:  # record execution stats
            run_options = tf.RunOptions(trace_level=tf.RunOptions.FULL_TRACE)
            run_metadata = tf.RunMetadata()
            summary, _ = sess.run([merged, train_step],
                                  feed_dict=feed_dict(True),
                                  options=run_options,
                                  run_metadata=run_metadata)
            train_writer.add_run_metadata(run_metadata, 'step%03d' % i)
            train_writer.add_summary(summary, i)
            print('Adding run metadata for', i)
        else:  # record an ordinary summary
            summary, _ = sess.run([merged, train_step], feed_dict=feed_dict(True))
            train_writer.add_summary(summary, i)
train_writer.close()
test_writer.close()

2.9 Run the program and generate the TensorBoard visualizations

(1) Run the whole program. The summary nodes defined in it save all the recorded information under the specified log_dir path: the training records go into one file and the test records into another.

(2) Open a Linux command line and run the following command, appending the path where the summary logs are saved (defined at the start of the program) after the equals sign:

tensorboard --logdir=

After the command executes, a message appears containing a URL; open that URL in a browser to see the visualization information we defined:

Starting TensorBoard 41 on port 6006
(You can navigate to http://127.0.1.1:6006)

Open http://127.0.1.1:6006 in a browser; if everything worked, the TensorBoard page loads.

From this web interface we can now see all the visualization information defined in the program.

2.10 A tour of the TensorBoard web interface

The orange menu bar at the top has seven tabs, each corresponding to one of the kinds of information defined in our program.

(1) SCALARS

This tab shows scalar information: everything defined with tf.summary.scalar() in the program appears in this window.

Recall the scalars defined in this program: the accuracy, the dropout keep probability, the parameter statistics of the hidden layers, and the cross-entropy loss. All of them show up under the SCALARS tab.

Open accuracy: the red line shows the result on the test set, and the blue line the result on the training set. As the number of steps increases, both accuracies trend upward. Notably, accuracy rises sharply during the first 100 steps, then keeps climbing slowly, reaching about 0.967 by step 1000.

Open dropout: the red line shows that the keep probability on the test set is always 1, while the blue line is always 0.9.

Open layer1 to inspect the first hidden layer's parameter statistics.

In the first row is the bias b: as the iterations progress, the maximum grows and the minimum shrinks, and the variance grows along with them. This is exactly what we want to see: the parameters of different neurons diverge more and more. Ideally, each neuron should attend to different features, so their parameters should differ.

The second row shows the weights w: likewise, the maximum, minimum and standard deviation all follow the same trend as for b, with the differences between neurons becoming ever more pronounced. The mean of w is 0 at initialization, and its absolute value also grows over the iterations.

Open layer2 for the same view of the second layer.

Open loss to see the downward trend of the loss.

(2) IMAGES

In the program we saved image information in one place: after reshaping the input features we recorded them with tf.summary.image, so TensorBoard can reconstruct the original images.

The window shows 10 images in total (per the parameter 10 in the code).

(3) AUDIO

This tab shows audio information, but this example involves no audio.

(4) GRAPHS

This tab shows the computation graph of the whole training process, from which we can clearly see the program's logic and flow.

Click a node to view its attributes, inputs, outputs and other information.

Click the "+" on a node to see its internal structure.

You can also choose between two coloring modes for the graph: in the structure-based mode, identical nodes get the same color; in the device-based mode, nodes placed on the same compute device get the same color.

(5) DISTRIBUTIONS

This tab shows the distributions of the neurons' outputs, for example the distribution before the activation function and the distribution after it.

(6) HISTOGRAMS

The same data can also be viewed as histograms.

(7) EMBEDDINGS

This tab visualizes embedding vectors; this example does not use that feature. It will be covered in detail in later examples.

   