How to Implement a Creative RNN and a Deep RNN


This article mainly explains how to implement a creative RNN and a deep RNN. The content is simple and clear and easy to learn, so please follow the editor's train of thought and study how both models are built.

1. Creative RNN

After the training in the previous part, we have a model that can predict future values of a time series. We can also use it to generate some creative sequences of its own. All we need to do is provide a seed sequence containing n_steps values (for example, all zeros), use the model to predict the next value, append that prediction to the sequence, and then feed the last n_steps values back in to produce the next one, over and over. The implementation is as follows:

import numpy as np

sequence = [0.] * n_steps  # seed sequence: n_steps zeros
for iteration in range(300):
    # feed the last n_steps values, shaped [batch=1, n_steps, 1 feature]
    X_batch = np.array(sequence[-n_steps:]).reshape(1, n_steps, 1)
    y_pred = sess.run(outputs, feed_dict={X: X_batch})
    # append the prediction for the last time step and repeat
    sequence.append(y_pred[0, -1, 0])
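For context, this loop assumes the graph and session built in the previous part of the series (the placeholder X, the outputs tensor and the restored sess are not defined in this article). A minimal sketch of what that setup might look like, with an assumed value for n_steps and a hypothetical checkpoint path, is:

import tensorflow as tf  # TensorFlow 1.x API, as used throughout this article

n_steps = 20    # assumed window length
n_inputs = 1    # one value per time step
n_neurons = 100
n_outputs = 1

# batch of input windows: [batch size, time steps, features]
X = tf.placeholder(tf.float32, [None, n_steps, n_inputs])

# one recurrent layer whose per-step output is projected down to a single value
cell = tf.contrib.rnn.OutputProjectionWrapper(
    tf.contrib.rnn.BasicRNNCell(num_units=n_neurons),
    output_size=n_outputs)
outputs, states = tf.nn.dynamic_rnn(cell, X, dtype=tf.float32)

saver = tf.train.Saver()
with tf.Session() as sess:
    saver.restore(sess, "./my_time_series_model")  # hypothetical checkpoint path
    # ... run the generation loop shown above inside this session ...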

Now we could, for instance, feed Jay Chou's album lyrics into the RNN and see what kind of songs it produces. Sometimes, however, we need a deeper and more powerful RNN with more neurons, so next let's take a look at the deep RNN.

2. Deep RNN

A deep RNN is quite similar to a deep CNN: we simply stack several layers of recurrent cells on top of one another.

So how do we implement this in TensorFlow? We can create a cell and stack copies of it into a MultiRNNCell. In the following code we stack three identical cells (we could also stack cells of different types or with different numbers of neurons):

n_neurons = 100
n_layers = 3

# a basic RNN cell with n_neurons units
basic_cell = tf.contrib.rnn.BasicRNNCell(num_units=n_neurons)
# stack n_layers copies of it into a single deep RNN cell
multi_layer_cell = tf.contrib.rnn.MultiRNNCell([basic_cell] * n_layers)
outputs, states = tf.nn.dynamic_rnn(multi_layer_cell, X, dtype=tf.float32)
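One caveat worth noting: reusing the same basic_cell object for every layer works in early TensorFlow 1.x releases, but later 1.x versions complain when one cell object is reused across layers. A more defensive sketch of the same stack, with one fresh cell per layer, would be:

# one independent BasicRNNCell per layer, then stack them; with MultiRNNCell,
# `states` is returned as a tuple holding one final state per layer
layers = [tf.contrib.rnn.BasicRNNCell(num_units=n_neurons)
          for _ in range(n_layers)]
multi_layer_cell = tf.contrib.rnn.MultiRNNCell(layers)
outputs, states = tf.nn.dynamic_rnn(multi_layer_cell, X, dtype=tf.float32)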

Thank you for reading. That concludes "how to implement a creative RNN and a deep RNN". After studying this article, you should have a deeper understanding of the topic, though the details still need to be verified in practice. The editor will keep pushing more related articles for you, so welcome to follow!
