2025-03-28 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)05/31 Report--
This article introduces how to use Python's tf.train.batch function. It demonstrates the operation through actual test cases; the method is simple, fast, and practical. I hope this article helps you solve your problem.
The tf.train.batch function has the following signature:

```python
tf.train.batch(
    tensors,
    batch_size,
    num_threads=1,
    capacity=32,
    enqueue_many=False,
    shapes=None,
    dynamic_pad=False,
    allow_smaller_final_batch=False,
    shared_name=None,
    name=None
)
```
Of which:

tensors: the list of tensors to batch, typically the output of tf.train.slice_input_producer.
batch_size: the number of elements dequeued from the queue per batch.
num_threads: the number of enqueuing threads; if it is greater than 1, batches may come out in a nondeterministic order because of thread scheduling.
capacity: an integer setting the maximum number of elements held in the queue.
allow_smaller_final_batch: when True, a final batch with fewer than batch_size samples is still dequeued when the queue runs out; when False, leftover samples fewer than batch_size are discarded.
name: an optional name for the operation.
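To make the batching semantics concrete before the TensorFlow examples below, here is a pure-Python sketch (not TensorFlow itself; `batch_stream` is a hypothetical helper written for this article) that mimics what the queue pipeline does: repeat the data for each epoch, cut the stream into fixed-size batches, and either keep or drop the incomplete final batch depending on allow_smaller_final_batch.

```python
# Pure-Python illustration of tf.train.batch's batching behavior.
# `batch_stream` is a made-up helper, not part of TensorFlow.
def batch_stream(data, num_epochs, batch_size, allow_smaller_final_batch):
    # slice_input_producer with num_epochs replays the data once per epoch
    stream = list(data) * num_epochs
    # cut the stream into consecutive batches of batch_size
    batches = [stream[i:i + batch_size] for i in range(0, len(stream), batch_size)]
    # when allow_smaller_final_batch=False, drop an incomplete final batch
    if not allow_smaller_final_batch and len(batches[-1]) < batch_size:
        batches.pop()
    return batches

print(batch_stream(range(18), 2, 5, True)[-1])   # [17]
print(batch_stream(range(18), 2, 5, False)[-1])  # [12, 13, 14, 15, 16]
```

With 18 labels and 2 epochs, 36 elements flow through; the True variant yields a final partial batch [17], while the False variant stops at the last full batch, matching the outputs of the two tests below.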
Test Code

1. allow_smaller_final_batch=True

```python
import numpy as np
import tensorflow as tf

# generate data
def generate_data():
    num = 18
    label = np.arange(num)
    return label

# get data
def get_batch_data():
    label = generate_data()
    input_queue = tf.train.slice_input_producer([label], shuffle=False, num_epochs=2)
    label_batch = tf.train.batch(input_queue, batch_size=5, num_threads=1,
                                 capacity=64, allow_smaller_final_batch=True)
    return label_batch

label = get_batch_data()
sess = tf.Session()
# initialize variables (local initializer is needed for num_epochs)
sess.run(tf.global_variables_initializer())
sess.run(tf.local_variables_initializer())
# start the queue runners
coord = tf.train.Coordinator()
threads = tf.train.start_queue_runners(sess, coord)
try:
    while not coord.should_stop():
        # fetch the next batch automatically
        l = sess.run(label)
        print(l)
except tf.errors.OutOfRangeError:
    print('Done training')
finally:
    coord.request_stop()
coord.join(threads)
sess.close()
```
The results of the run were:
[0 1 2 3 4]
[5 6 7 8 9]
[10 11 12 13 14]
[15 16 17 0 1]
[2 3 4 5 6]
[ 7 8 9 10 11]
[12 13 14 15 16]
[17]
Done training
2. allow_smaller_final_batch=False

The output is the same as with allow_smaller_final_batch=True, except the final partial batch [17] is missing.
```python
import numpy as np
import tensorflow as tf

# generate data
def generate_data():
    num = 18
    label = np.arange(num)
    return label

# get data
def get_batch_data():
    label = generate_data()
    input_queue = tf.train.slice_input_producer([label], shuffle=False, num_epochs=2)
    label_batch = tf.train.batch(input_queue, batch_size=5, num_threads=1,
                                 capacity=64, allow_smaller_final_batch=False)
    return label_batch

label = get_batch_data()
sess = tf.Session()
# initialize variables (local initializer is needed for num_epochs)
sess.run(tf.global_variables_initializer())
sess.run(tf.local_variables_initializer())
# start the queue runners
coord = tf.train.Coordinator()
threads = tf.train.start_queue_runners(sess, coord)
try:
    while not coord.should_stop():
        # fetch the next batch automatically
        l = sess.run(label)
        print(l)
except tf.errors.OutOfRangeError:
    print('Done training')
finally:
    coord.request_stop()
coord.join(threads)
sess.close()
```
The results of the run were:
[0 1 2 3 4]
[5 6 7 8 9]
[10 11 12 13 14]
[15 16 17 0 1]
[2 3 4 5 6]
[ 7 8 9 10 11]
[12 13 14 15 16]
Done training
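A quick arithmetic check (plain Python, using the sizes from the test code above) shows why exactly one batch, [17], disappears in the False case: 18 samples over 2 epochs is 36 elements, which is 7 full batches of 5 plus 1 leftover element.

```python
# Why allow_smaller_final_batch=False drops exactly the [17] batch:
num_samples, num_epochs, batch_size = 18, 2, 5
total = num_samples * num_epochs              # 36 elements pass through the queue
full_batches, leftover = divmod(total, batch_size)
print(full_batches, leftover)                 # 7 1
```

With allow_smaller_final_batch=True the single leftover element is emitted as an eighth, smaller batch; with False it is discarded.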
This concludes the introduction to how to use Python's tf.train.batch function. Thank you for reading.