Editor's note:
This article, from CSDN, introduces the basic architecture of the Dataset API: the Dataset class and the Iterator class, along with their basic usage.
The Dataset API is a new module introduced in TensorFlow 1.3 that is dedicated to reading data and building input pipelines.
Previously, there were two common ways to read data in TensorFlow:
1. Using a placeholder to read data from memory
2. Using queues to read data from disk (for this approach, see my earlier article: "TensorFlow's Data Reading Mechanism Explained in Ten Diagrams")
The Dataset API supports reading from both memory and disk, and its syntax is more concise and readable than either of the earlier approaches. Moreover, if you want to use TensorFlow's new Eager mode, you must use the Dataset API to read data.
This article describes the usage of the Dataset API in detail, covering both non-Eager (graph) mode and Eager mode.
Importing the Dataset API
In TensorFlow 1.3, the Dataset API lives in the contrib package, as tf.contrib.data.Dataset.
In TensorFlow 1.4, the Dataset API has been moved out of contrib and is now part of the core API, as tf.data.Dataset.
The sample code below targets TensorFlow 1.4; if you are using TensorFlow 1.3, a simple modification is needed (add contrib to the module path).
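Concretely, the difference is only in the module path; a minimal sketch (the commented line shows the 1.3 form):
import numpy as np
import tensorflow as tf

data = np.array([1.0, 2.0, 3.0])
# TensorFlow 1.3: the Dataset class lives under contrib
# dataset = tf.contrib.data.Dataset.from_tensor_slices(data)
# TensorFlow 1.4+: Dataset is part of the core tf.data module
dataset = tf.data.Dataset.from_tensor_slices(data)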
Basic Concepts: Dataset and Iterator
Let's get to know the Dataset API through its basic classes, starting from the class diagram given in Google's official documentation:
[Figure: class diagram of the Dataset API]
When starting out, you only need to pay attention to the two most important base classes: Dataset and Iterator.
A Dataset can be viewed as an ordered list of "elements" of the same type. In practice, a single "element" can be a vector, a string, an image, or even a tuple or a dict.
Let's begin with the simplest case, where every element of the Dataset is a single number:
import tensorflow as tf
import numpy as np

dataset = tf.data.Dataset.from_tensor_slices(np.array([1.0, 2.0, 3.0, 4.0, 5.0]))
This creates a dataset containing five elements: 1.0, 2.0, 3.0, 4.0, 5.0.
How do we get the elements out of this dataset? The answer is to instantiate an Iterator from the Dataset and then iterate over that Iterator.
In non-Eager mode, the elements of the dataset above can be read as follows:
iterator = dataset.make_one_shot_iterator()
one_element = iterator.get_next()
with tf.Session() as sess:
    for i in range(5):
        print(sess.run(one_element))
The corresponding output should be 1.0 through 5.0. The statement iterator = dataset.make_one_shot_iterator() instantiates an Iterator from the dataset. This Iterator is a "one shot iterator": it can only be read through once, from start to end. one_element = iterator.get_next() takes one element out of the iterator. Because this is non-Eager mode, one_element is just a Tensor, not an actual value; a value is only produced once sess.run(one_element) is called.
If all elements of a dataset have been read and sess.run(one_element) is called again, a tf.errors.OutOfRangeError exception is raised; this behavior is consistent with queue-based data reading. In a real program, you can catch this exception outside the loop to tell when the data is exhausted, as in the following code:
dataset = tf.data.Dataset.from_tensor_slices(np.array([1.0, 2.0, 3.0, 4.0, 5.0]))
iterator = dataset.make_one_shot_iterator()
one_element = iterator.get_next()
with tf.Session() as sess:
    try:
        while True:
            print(sess.run(one_element))
    except tf.errors.OutOfRangeError:
        print("end!")
In Eager mode, Iterators are created differently: an Iterator is constructed directly in the form tfe.Iterator(dataset) and iterated over. Iteration yields actual values directly, with no need for sess.run():
import tensorflow as tf
import tensorflow.contrib.eager as tfe
import numpy as np

tfe.enable_eager_execution()

dataset = tf.data.Dataset.from_tensor_slices(np.array([1.0, 2.0, 3.0, 4.0, 5.0]))
for one_element in tfe.Iterator(dataset):
    print(one_element)
Creating More Complex Datasets from Memory
Earlier, we used tf.data.Dataset.from_tensor_slices to create the simplest possible Dataset:
dataset = tf.data.Dataset.from_tensor_slices(np.array([1.0, 2.0, 3.0, 4.0, 5.0]))
In fact, tf.data.Dataset.from_tensor_slices can do more than that: its real job is to slice the incoming Tensor along its first dimension and generate the corresponding dataset.
For example:
dataset = tf.data.Dataset.from_tensor_slices(np.random.uniform(size=(5, 2)))
The value passed in is a matrix of shape (5, 2). tf.data.Dataset.from_tensor_slices slices along the first dimension of this shape, so the resulting dataset contains 5 elements, each of shape (2,); that is, each element is one row of the matrix.
In practice, we often want each element of a Dataset to have a more complex form, such as a Python tuple or dict. For example, in an image recognition problem, an element might take the form {"image": image_tensor, "label": label_tensor}, which is more convenient to work with.
dataset = tf.data.Dataset.from_tensor_slices(
    {
        "a": np.array([1.0, 2.0, 3.0, 4.0, 5.0]),
        "b": np.random.uniform(size=(5, 2))
    }
)
In this case, the function slices the values under "a" and the values under "b" separately, and each element of the resulting dataset has a form like {"a": 1.0, "b": [0.9, 0.1]}.
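When the elements are dicts, iterator.get_next() returns a dict of Tensors, so each field can be fetched by key. A minimal sketch in non-Eager mode, continuing from the snippet above:
iterator = dataset.make_one_shot_iterator()
one_element = iterator.get_next()  # a dict: {"a": <Tensor>, "b": <Tensor>}
with tf.Session() as sess:
    first = sess.run(one_element)
    print(first["a"], first["b"])  # e.g. 1.0 and one length-2 row of "b"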
It is also possible to use tf.data.Dataset.from_tensor_slices to create a dataset whose elements are tuples:
dataset = tf.data.Dataset.from_tensor_slices(
    (np.array([1.0, 2.0, 3.0, 4.0, 5.0]), np.random.uniform(size=(5, 2)))
)
Transforming the Elements of a Dataset
Dataset supports a special class of operations: Transformations. A Transformation turns one Dataset into a new Dataset. Transformations are typically used to transform data, shuffle it, group it into batches, generate epochs, and so on.
The most commonly used Transformations are:
1. map
2. batch
3. shuffle
4. repeat
Each is introduced below.
(1) map
map takes a function; every element of the Dataset is used as input to this function, and the function's return values make up the new Dataset. For example, we can add 1 to the value of every element in the dataset:
dataset = tf.data.Dataset.from_tensor_slices(np.array([1.0, 2.0, 3.0, 4.0, 5.0]))
dataset = dataset.map(lambda x: x + 1)  # 2.0, 3.0, 4.0, 5.0, 6.0
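Note that the function passed to map receives Tensors, so it should be written with TensorFlow ops. A small illustrative sketch using a named function, continuing from the snippet above (where the elements are now 2.0 through 6.0):
def square(x):
    # x is a Tensor here, so use TensorFlow ops inside map functions
    return tf.square(x)

dataset = dataset.map(square)  # 4.0, 9.0, 16.0, 25.0, 36.0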
(2) batch
batch combines multiple elements into batches. The following program groups the elements of the dataset into batches of size 32:
dataset = dataset.batch(32)
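To see what batching does to shapes, here is a small self-contained sketch (the numbers are chosen for illustration): ten scalars batched four at a time come out as two batches of shape (4,) plus a final partial batch of shape (2,).
import numpy as np
import tensorflow as tf

dataset = tf.data.Dataset.from_tensor_slices(np.arange(10))
dataset = dataset.batch(4)

iterator = dataset.make_one_shot_iterator()
next_batch = iterator.get_next()
with tf.Session() as sess:
    try:
        while True:
            print(sess.run(next_batch))  # [0 1 2 3], then [4 5 6 7], then [8 9]
    except tf.errors.OutOfRangeError:
        print("end!")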
(3) shuffle
shuffle randomizes the order of the elements in the dataset. It has one parameter, buffer_size, which sets the size of the buffer used for shuffling:
dataset = dataset.shuffle(buffer_size=10000)
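shuffle maintains a buffer of buffer_size elements and draws each output element at random from that buffer, so a buffer at least as large as the dataset yields a full shuffle, while a small buffer only mixes elements locally. A quick sketch, assuming the imports from the first snippet:
dataset = tf.data.Dataset.from_tensor_slices(np.arange(10))
# buffer_size=10 covers the whole dataset, so the order is fully randomized;
# a much smaller buffer (e.g. 2) would only mix neighboring elements.
dataset = dataset.shuffle(buffer_size=10)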
(4) repeat
repeat repeats the whole sequence multiple times; it is mainly used to produce epochs in machine learning. If the original data constitutes one epoch, repeat(5) turns it into 5 epochs:
dataset = dataset.repeat(5)
If repeat() is called with no argument, the generated sequence repeats indefinitely, never ends, and therefore never raises tf.errors.OutOfRangeError:
dataset = dataset.repeat()
Reading Images and Their Labels from Disk
With this in hand, we can consider a simple but very common example: read images and their corresponding labels from disk, shuffle them, and assemble training samples with batch_size=32, repeating for 10 epochs during training.
The corresponding program (adapted from the official example) is:
# This function reads the image file named by filename
# and resizes it to a uniform size.
def _parse_function(filename, label):
    image_string = tf.read_file(filename)
    image_decoded = tf.image.decode_image(image_string)
    image_resized = tf.image.resize_images(image_decoded, [28, 28])
    return image_resized, label

# The list of image files
filenames = tf.constant(["/var/data/image1.jpg", "/var/data/image2.jpg", ...])
# labels[i] is the label of the image filenames[i]
labels = tf.constant([0, 37, ...])

# At this point, an element of the dataset is (filename, label)
dataset = tf.data.Dataset.from_tensor_slices((filenames, labels))

# Now an element of the dataset is (image_resized, label)
dataset = dataset.map(_parse_function)

# Now an element of the dataset is (image_resized_batch, label_batch)
dataset = dataset.shuffle(buffer_size=1000).batch(32).repeat(10)
During this process, the dataset goes through three transformations:
1. After dataset = tf.data.Dataset.from_tensor_slices((filenames, labels)), an element of the dataset is (filename, label), where filename is the image's file name and label is the image's label.
2. map then reads in the image corresponding to each filename and resizes it to 28x28. At this point an element of the dataset is (image_resized, label).
3. Finally, dataset.shuffle(buffer_size=1000).batch(32).repeat(10) shuffles the images within each epoch, groups them into batches of size 32, and repeats the whole thing 10 times. In the end, an element of the dataset is (image_resized_batch, label_batch), where image_resized_batch has shape (32, 28, 28, 3) and label_batch has shape (32,). We can now use these two Tensors to build a model.
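To actually obtain those two Tensors from the pipeline, one would instantiate an iterator as before; a minimal sketch (names chosen for illustration, model code omitted):
iterator = dataset.make_one_shot_iterator()
images, labels_batch = iterator.get_next()
# images has shape (32, 28, 28, 3) and labels_batch has shape (32,)
# per the walkthrough above; build the model on these two Tensors.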
Other Ways to Create a Dataset
Besides tf.data.Dataset.from_tensor_slices, the Dataset API currently provides three other ways to create a Dataset (a sketch of all three follows the list):
1. tf.data.TextLineDataset(): takes a list of files as input and produces a dataset in which each element corresponds to one line of a file. It can be used to read CSV files.
2. tf.data.FixedLengthRecordDataset(): takes a list of files and a record_bytes argument; each element of the dataset is a chunk of record_bytes bytes from the files. It is typically used to read files saved in binary form, such as the CIFAR-10 dataset.
3. tf.data.TFRecordDataset(): as the name suggests, this function reads TFRecord files; each element of the dataset is a TFExample.
Their detailed usage is described in the documentation: Module: tf.data
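As a rough illustration (the file paths are hypothetical, and the record size assumes the CIFAR-10 binary layout of 1 label byte plus 32*32*3 image bytes):
# Each element is one line of text from the listed files.
lines = tf.data.TextLineDataset(["/var/data/file1.csv", "/var/data/file2.csv"])

# Each element is a fixed-size byte string; 3073 = 1 + 32*32*3 bytes per record
# in the CIFAR-10 binary format.
records = tf.data.FixedLengthRecordDataset(["/var/data/cifar10/data_batch_1.bin"],
                                           record_bytes=3073)

# Each element is one serialized tf.train.Example from a TFRecord file.
examples = tf.data.TFRecordDataset(["/var/data/train.tfrecords"])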
More Types of Iterators
In non-Eager mode, the simplest way to create an Iterator is dataset.make_one_shot_iterator(), which produces a one shot iterator. Beyond the one shot iterator, there are three more sophisticated kinds of Iterator:
1. initializable iterator
2. reinitializable iterator
3. feedable iterator
An initializable iterator must be initialized with sess.run() before use. With an initializable iterator, a placeholder can be brought into the Iterator, which makes it easy to define new Iterators quickly via parameters. A simple example of using an initializable iterator:
limit = tf.placeholder(dtype=tf.int32, shape=[])
dataset = tf.data.Dataset.from_tensor_slices(tf.range(start=0, limit=limit))
iterator = dataset.make_initializable_iterator()
next_element = iterator.get_next()
with tf.Session() as sess:
    sess.run(iterator.initializer, feed_dict={limit: 10})
    for i in range(10):
        value = sess.run(next_element)
        assert i == value
Here limit acts as a "parameter" that sets the upper bound of the numbers in the Dataset.
The initializable iterator has another use: reading in large arrays.
When tf.data.Dataset.from_tensor_slices(array) is used, what actually happens is that array is saved into the computation graph as a tf.constant. When array is large, the graph becomes large too, which is inconvenient for transmission and saving. In that case, we can replace the array with a placeholder and use an initializable iterator, feeding the array in only when needed; this avoids storing the large array in the graph. Example code (from the official guide):
# Load two Numpy arrays from disk
with np.load("/var/data/training_data.npy") as data:
    features = data["features"]
    labels = data["labels"]

features_placeholder = tf.placeholder(features.dtype, features.shape)
labels_placeholder = tf.placeholder(labels.dtype, labels.shape)

dataset = tf.data.Dataset.from_tensor_slices((features_placeholder, labels_placeholder))
iterator = dataset.make_initializable_iterator()

sess.run(iterator.initializer, feed_dict={features_placeholder: features,
                                          labels_placeholder: labels})
The reinitializable iterator and the feedable iterator are more complex than the initializable iterator and are used less often; for their full functionality, refer to the official documentation. They are not covered in detail here, beyond the brief sketch below.
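For orientation only, here is a minimal sketch of a reinitializable iterator, which lets a single iterator switch between several datasets that share the same structure (the datasets here are illustrative; the pattern follows the official guide):
import numpy as np
import tensorflow as tf

train_dataset = tf.data.Dataset.from_tensor_slices(np.arange(100))
val_dataset = tf.data.Dataset.from_tensor_slices(np.arange(10))

# One iterator defined by structure (types/shapes), not tied to a particular dataset
iterator = tf.data.Iterator.from_structure(train_dataset.output_types,
                                           train_dataset.output_shapes)
next_element = iterator.get_next()

train_init_op = iterator.make_initializer(train_dataset)
val_init_op = iterator.make_initializer(val_dataset)

with tf.Session() as sess:
    sess.run(train_init_op)   # point the iterator at the training data
    print(sess.run(next_element))
    sess.run(val_init_op)     # re-initialize it on the validation data
    print(sess.run(next_element))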
Summary
This article introduced the basic architecture of the Dataset API, namely the Dataset class and the Iterator class, along with their basic usage.
In non-Eager mode, an element read from a Dataset generally corresponds to a batch of Tensors, and we can use these Tensors to build a model in the computation graph.
In Eager mode, the Dataset builds its Iterator differently; the data read out are Tensors that already hold values, which is convenient for debugging.
As an API compatible with both modes, the Dataset API should become the mainstream way of reading data in TensorFlow. For further information on the Dataset API, consult the following resources:
1. Importing Data: the official Guide
2. Module: tf.data: the API documentation
3. Introduction to TensorFlow Datasets and Estimators: how to use Dataset and Estimator together