2025-01-14 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/01 Report
This article explains how to use a Python generator to read images. The method is simple, fast, and practical, so let's walk through it step by step.
When training a deep learning model, it is not realistic to load all of the image data into memory at once, so we use a generator to read the data in batches.
With a list comprehension, we can create a list directly. However, because memory is limited, list capacity is necessarily bounded. Moreover, creating a list of one million elements takes up a lot of storage, and if we only ever access the first few elements, the space occupied by the remaining elements is wasted.

So, if the list elements can be computed by some algorithm, can we compute each subsequent element on the fly as the loop runs? That would eliminate the need to build the complete list and save a lot of space. In Python, this compute-as-you-loop mechanism is called a generator.
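To make the memory point concrete, here is a small sketch comparing the size of a fully built list with an equivalent generator (exact byte counts vary by Python version, but the gap is dramatic):

```python
import sys

# A list of one million squares is fully materialized in memory...
squares_list = [x * x for x in range(1_000_000)]

# ...while the generator only stores the recipe for producing them.
squares_gen = (x * x for x in range(1_000_000))

print(sys.getsizeof(squares_list))  # on the order of megabytes
print(sys.getsizeof(squares_gen))   # on the order of a hundred bytes
```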
There are several ways to create a generator. The first is very simple: change the square brackets `[]` of a list comprehension to parentheses `()`, and you have created a generator:
```python
>>> L = [x * x for x in range(10)]
>>> L
[0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
>>> g = (x * x for x in range(10))
>>> g
<generator object <genexpr> at 0x...>
```
The elements of a list can be printed directly, while a generator's elements must be produced one at a time. You can get the next value of a generator with the next() function:
```python
>>> next(g)
0
>>> next(g)
1
>>> next(g)
4
>>> next(g)
9
>>> next(g)
16
```
A generator stores the algorithm rather than the values: each call to next(g) computes the next element of g, until the last element has been produced. When there are no more elements, a StopIteration exception is raised.
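The StopIteration behavior can be seen directly. A minimal sketch, using a generator with only three elements so it exhausts quickly:

```python
g = (x * x for x in range(3))

print(next(g))  # 0
print(next(g))  # 1
print(next(g))  # 4

# The generator is now exhausted; one more next() raises StopIteration.
try:
    next(g)
except StopIteration:
    print("generator exhausted")
```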
Calling next(g) over and over like this is obviously impractical.
The correct way is to use a for loop, because a generator is also an iterable object:
```python
>>> g = (x * x for x in range(10))
>>> for n in g:
...     print(n)
```
Take the famous Fibonacci sequence: except for the first two numbers, every number is the sum of the two before it:
1, 1, 2, 3, 5, 8, 13, 21, 34,...
The Fibonacci sequence cannot be written as a list comprehension. However, it is easy to print it with a function:
```python
def fib(max):
    n, a, b = 0, 0, 1
    while n < max:
        print(b)
        a, b = b, a + b
        n = n + 1
    return 'done'
```

Looking closely, you can see that the fib function actually defines the rule for deriving the Fibonacci sequence: starting from the first element, it can derive any subsequent element. This logic is very similar to a generator.

In other words, the function above is only one step away from being a generator. To turn fib into a generator, just change print(b) to yield b:

```python
def fib(max):
    n, a, b = 0, 0, 1
    while n < max:
        yield b
        a, b = b, a + b
        n = n + 1
    return 'done'
```

This is another way to define a generator. If a function definition contains the yield keyword, it is no longer an ordinary function but a generator:

```python
>>> f = fib(6)
>>> f
<generator object fib at 0x...>
```
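Incidentally, the return 'done' in the generator version is not lost: when a generator function returns, the return value is attached to the StopIteration exception. A small sketch, restating the generator form of fib:

```python
def fib(max):
    n, a, b = 0, 0, 1
    while n < max:
        yield b
        a, b = b, a + b
        n = n + 1
    return 'done'

g = fib(3)
print(next(g), next(g), next(g))  # 1 1 2

# After the last element, StopIteration carries the return value.
try:
    next(g)
except StopIteration as e:
    print(e.value)  # done
```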
The hardest thing to understand here is that a generator and a function do not execute the same way. A function executes sequentially and returns when it hits a return statement or the last line of its body. A function turned into a generator, by contrast, executes only when next() is called, pauses and returns at each yield statement, and on the next call resumes execution from the yield statement where it last left off.
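This pause-and-resume behavior can be observed by putting print statements around the yields (a minimal sketch; steps is a hypothetical name used only for illustration):

```python
def steps():
    print("before first yield")
    yield 1
    print("between yields")
    yield 2

g = steps()     # nothing is printed yet: the body has not started running
print(next(g))  # runs up to the first yield: prints the message, then 1
print(next(g))  # resumes after the first yield: prints the message, then 2
```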
Because the function yields on every pass through the loop, execution is interrupted again and again. Of course, you must give the loop an exit condition, otherwise it will produce an infinite sequence.
Similarly, after turning the function into a generator, we basically never use next() to fetch values one by one; instead we iterate with a for loop directly:
```python
>>> for n in fib(6):
...     print(n)
```

Finally, the code used in practice to read images looks like this (read_data is assumed to return the lists of image paths and labels):

```python
def train_data(train_file, batch_size, resize_shape):
    datas, labels = read_data(train_file)
    num_batch = len(datas) // batch_size
    for i in range(num_batch):
        imgs = []
        train_datas = datas[batch_size * i : batch_size * (i + 1)]
        train_labels = labels[batch_size * i : batch_size * (i + 1)]
        for img_path in train_datas:
            img = cv2.imread(img_path)
            img = cv2.resize(img, resize_shape)
            img = img / 255  # normalize
            imgs.append(img)
        yield np.array(imgs), np.array(train_labels)
```
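The same batching pattern can be tried out without OpenCV or image files. A minimal sketch, assuming in-memory NumPy arrays in place of image paths (batch_generator is a hypothetical name, not from the original article):

```python
import numpy as np

def batch_generator(datas, labels, batch_size):
    # Yield one (inputs, labels) pair per batch, mirroring the slicing
    # logic of the image-reading generator but without the file I/O.
    num_batch = len(datas) // batch_size
    for i in range(num_batch):
        batch_x = datas[batch_size * i : batch_size * (i + 1)]
        batch_y = labels[batch_size * i : batch_size * (i + 1)]
        yield np.array(batch_x), np.array(batch_y)

datas = np.arange(10)        # stand-ins for decoded images
labels = np.arange(10) * 10  # stand-ins for class labels

for x, y in batch_generator(datas, labels, batch_size=4):
    print(x, y)
```

Note that the integer division drops any partial final batch: with 10 samples and a batch size of 4, only two full batches are produced and the last 2 samples are skipped.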
Supplement: using the fit_generator() function in deep learning
If we have a large amount of data, it is impossible to load it all into memory at once; doing so would cause an out-of-memory error.
In that case, we can use the fit_generator function for training. The body of the generator below is partly garbled in the original article; the batch-indexing lines are a reconstruction of the evident intent of sampling random batches from x_train and y_train:

```python
from keras.datasets import imdb
from keras.preprocessing.sequence import pad_sequences
from keras.models import Sequential
from keras import layers
import numpy as np
from sklearn.metrics import accuracy_score

max_features = 100
maxlen = 50
batch_size = 32

(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_features)
x_train = pad_sequences(x_train, maxlen=maxlen)
x_test = pad_sequences(x_test, maxlen=maxlen)

def generator():
    # Endlessly yield random batches sampled from the training set.
    while 1:
        row = np.random.randint(0, len(x_train), size=batch_size)
        x = x_train[row]
        y = y_train[row]
        yield x, y

model = Sequential()
model.add(layers.Embedding(max_features, 32, input_length=maxlen))
model.add(layers.GRU(64, return_sequences=True))
model.add(layers.GRU(32))
# model.add(layers.Flatten())
# model.add(layers.Dense(32))
model.add(layers.Dense(1, activation='sigmoid'))
model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['acc'])
print(model.summary())

# history = model.fit(x_train, y_train, epochs=1, batch_size=32, validation_split=0.2)
# Keras's fit() loads the x_train and y_train passed to it entirely into
# memory, which is convenient, but with a large amount of data it is
# impossible to load everything into memory, so we use fit_generator()
# instead. fit_generator() must be passed a generator, and the training
# data is produced batch by batch by that generator.
history = model.fit_generator(generator(), epochs=1,
                              steps_per_epoch=len(x_train) // batch_size)

print(model.evaluate(x_test, y_test))
y = model.predict_classes(x_test)
print(accuracy_score(y_test, y))
```

I hope you now have a deeper understanding of how to read images with a Python generator. Why not try it out in practice?