How a Basic RNN Is Implemented in TensorFlow

2025-01-16 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article explains how to implement a basic RNN in TensorFlow. The walkthrough is detailed; interested readers can follow along, and I hope it helps.

Today we'll look at how basic RNNs are implemented in TensorFlow.

First, to better understand how an RNN operates, we will implement a simple RNN model without using TensorFlow's built-in RNN operations. We build a single-layer RNN with 5 recurrent neurons and a tanh activation function, and assume the network runs for only two time steps, with an input vector of size 3 at each step. The code is as follows:

import tensorflow as tf

n_inputs = 3
n_neurons = 5

# One input placeholder per time step
X0 = tf.placeholder(tf.float32, [None, n_inputs])
X1 = tf.placeholder(tf.float32, [None, n_inputs])

# Weights and bias, shared by both time steps
Wx = tf.Variable(tf.random_normal(shape=[n_inputs, n_neurons], dtype=tf.float32))
Wy = tf.Variable(tf.random_normal(shape=[n_neurons, n_neurons], dtype=tf.float32))
b = tf.Variable(tf.zeros([1, n_neurons], dtype=tf.float32))

# Outputs at t = 0 and t = 1
Y0 = tf.tanh(tf.matmul(X0, Wx) + b)
Y1 = tf.tanh(tf.matmul(Y0, Wy) + tf.matmul(X1, Wx) + b)

init = tf.global_variables_initializer()

This network can be a bit confusing. At first glance it looks like a two-layer feed-forward network, but it is not. First, both time steps reuse the same weights and bias. Second, each time step receives its own input and produces its own output. To run this model, we need to feed it data at both time steps, as follows:
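To see concretely that a single set of weights drives both time steps, here is a small NumPy sketch of the same two-step recurrence. The seeded random weights are illustrative only, not the values TensorFlow would draw:

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_neurons = 3, 5

# Shared parameters, reused at every time step (illustrative values)
Wx = rng.standard_normal((n_inputs, n_neurons)).astype(np.float32)
Wy = rng.standard_normal((n_neurons, n_neurons)).astype(np.float32)
b = np.zeros((1, n_neurons), dtype=np.float32)

X0 = np.array([[0, 1, 2], [3, 4, 5], [6, 7, 8], [9, 0, 1]], dtype=np.float32)
X1 = np.array([[9, 8, 7], [0, 0, 0], [6, 5, 4], [3, 2, 1]], dtype=np.float32)

# Same recurrence as the TensorFlow graph above
Y0 = np.tanh(X0 @ Wx + b)
Y1 = np.tanh(Y0 @ Wy + X1 @ Wx + b)

print(Y0.shape, Y1.shape)  # (4, 5) (4, 5)
```

Both outputs have shape (batch, n_neurons): the weights never change between steps, only the inputs and the carried state do.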

import numpy as np

# Mini-batch: instance 0, instance 1, instance 2, instance 3
X0_batch = np.array([[0, 1, 2], [3, 4, 5], [6, 7, 8], [9, 0, 1]])  # t = 0
X1_batch = np.array([[9, 8, 7], [0, 0, 0], [6, 5, 4], [3, 2, 1]])  # t = 1

with tf.Session() as sess:
    init.run()
    Y0_val, Y1_val = sess.run([Y0, Y1], feed_dict={X0: X0_batch, X1: X1_batch})

The mini-batch above contains inputs for both time steps; each input has four instances, and each instance has three features. The resulting Y0_val and Y1_val contain the outputs of all five neurons for the whole mini-batch at both time steps. Here is the output:

>>> print(Y0_val) # output at t = 0

[[-0.2964572 0.82874775 -0.34216955 -0.75720584 0.19011548] # instance 0

[-0.12842922 0.99981797 0.84704727 -0.99570125 0.38665548] # instance 1

[ 0.04731077 0.99999976 0.99330056 -0.999933 0.55339795] # instance 2

[ 0.70323634 0.99309105 0.99909431 -0.85363263 0.7472108 ]] # instance 3

>>> print(Y1_val) # output at t = 1

[[ 0.51955646 1. 0.99999022 -0.99984968 -0.24616946] # instance 0

[-0.70553327 -0.11918639 0.48885304 0.08917919 -0.26579669] # instance 1

[-0.32477224 0.99996376 0.99933046 -0.99711186 0.10981458] # instance 2

[-0.43738723 0.91517633 0.97817528 -0.91763324 0.11047263]] # instance 3

That's all for the basic RNN implementation in TensorFlow. I hope the content above is helpful; if you found the article useful, feel free to share it so more people can see it.
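As a closing aside, the two-step computation above generalizes to any sequence length with a plain Python loop. Below is a NumPy sketch; the helper name rnn_forward and the seeded weights are mine, added for illustration:

```python
import numpy as np

def rnn_forward(X_seq, Wx, Wy, b):
    """Run the tanh recurrence over a sequence of per-step inputs.

    X_seq is a list of (batch, n_inputs) arrays; the same Wx, Wy, b
    are reused at every step, just like the two-step graph above.
    """
    batch = X_seq[0].shape[0]
    Y = np.zeros((batch, Wy.shape[0]), dtype=np.float32)  # initial state is zero
    outputs = []
    for X in X_seq:
        Y = np.tanh(X @ Wx + Y @ Wy + b)
        outputs.append(Y)
    return outputs

rng = np.random.default_rng(42)
Wx = rng.standard_normal((3, 5)).astype(np.float32)
Wy = rng.standard_normal((5, 5)).astype(np.float32)
b = np.zeros((1, 5), dtype=np.float32)

X0 = np.array([[0, 1, 2], [3, 4, 5], [6, 7, 8], [9, 0, 1]], dtype=np.float32)
X1 = np.array([[9, 8, 7], [0, 0, 0], [6, 5, 4], [3, 2, 1]], dtype=np.float32)

Y0, Y1 = rnn_forward([X0, X1], Wx, Wy, b)
```

With a zero initial state, the first step reduces to tanh(X0 @ Wx + b), exactly matching the unrolled two-step version.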

