
How to use activation functions (activation)

2025-02-27 Update From: SLTechnology News&Howtos shulou


Shulou(Shulou.com)06/01 Report--

Today I will talk to you about how to use activation functions. Many people may not know much about this topic, so to help you understand it better, the editor has summarized the following content for you. I hope you get something out of this article.

Let's introduce activation functions.

First, an overview of activation functions

Activation functions play a very important role in deep learning: they give the network its nonlinearity, which allows a neural network to fit arbitrarily complex functions.

Without an activation function, no matter how many layers the network has, it is equivalent to a single linear transformation and cannot fit nonlinear functions.
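To make this point concrete, here is a minimal numpy sketch (my own illustration, not from the original article): two stacked linear layers with no activation collapse into a single linear transformation, so depth alone adds no expressive power.

import numpy as np

# Two "layers" with no activation: y = W2 @ (W1 @ x + b1) + b2
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 16)), rng.normal(size=8)
W2, b2 = rng.normal(size=(4, 8)), rng.normal(size=4)
x = rng.normal(size=16)
two_layers = W2 @ (W1 @ x + b1) + b2

# The same mapping written as one linear transformation W @ x + b
W = W2 @ W1
b = W2 @ b1 + b2
one_layer = W @ x + b

print(np.allclose(two_layers, one_layer))  # True: the two-layer network is just one linear map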

At present, the most popular activation function in deep learning is relu, but some more recently introduced activation functions, such as swish and GELU, are said to perform better than relu.

An overview of activation functions can be found in the following two articles.

"an overview of activation functions in deep learning"

Https://zhuanlan.zhihu.com/p/98472075

"from ReLU to GELU, an overview of activation functions in neural networks"

Https://zhuanlan.zhihu.com/p/98863801

Second, commonly used activation functions

The commonly used activation functions are listed below; a short sketch of calling them follows the list.

1. tf.nn.sigmoid: compresses real numbers to between 0 and 1; generally used only in the output layer of binary classification. Its main drawbacks are vanishing gradients, high computational complexity, and output that is not zero-centered.

2. tf.nn.softmax: the multi-class extension of sigmoid; generally used only in the output layer of multi-class classification problems.

3. tf.nn.tanh: compresses real numbers to between -1 and 1, with an output expectation of 0. Its main drawbacks are vanishing gradients and high computational complexity.

4. tf.nn.relu: rectified linear unit, the most popular activation function, generally used in hidden layers. Its main drawbacks are output that is not zero-centered and vanishing gradients for inputs less than 0 (the dead relu problem).

5. tf.nn.leaky_relu: an improvement on the rectified linear unit that solves the dead relu problem by keeping a small slope for negative inputs.

6. tf.nn.elu: exponential linear unit, an improvement on relu that can alleviate the dead relu problem.

7. tf.nn.selu: scaled exponential linear unit. When the weights are initialized with tf.keras.initializers.lecun_normal, the neural network can self-normalize, avoiding exploding and vanishing gradients. It needs to be used together with AlphaDropout, a variant of Dropout.

8. tf.nn.swish: a self-gated activation function proposed by Google; related studies have shown that replacing relu with swish yields a slight improvement.

9. gelu: Gaussian error linear unit, the best-performing activation function in Transformer models. The tf.nn module has not implemented this function yet.
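To make the list above more concrete, here is a minimal sketch (my own addition, assuming TensorFlow 2.x, not taken from the original article) that applies several of these tf.nn functions to a small tensor. Because the article notes that gelu is not yet available in tf.nn, a hand-written version of the common tanh approximation is included purely as an illustration.

import numpy as np
import tensorflow as tf

x = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0])

print(tf.nn.sigmoid(x))     # squashed into (0, 1)
print(tf.nn.tanh(x))        # squashed into (-1, 1), zero-centered
print(tf.nn.relu(x))        # negative inputs clipped to 0
print(tf.nn.leaky_relu(x))  # small slope retained for negative inputs
print(tf.nn.elu(x))         # smooth exponential branch for negative inputs
print(tf.nn.selu(x))        # scaled elu for self-normalizing networks
print(tf.nn.swish(x))       # x * sigmoid(x)

# gelu is not provided by tf.nn here (per the article), so use the common tanh approximation:
def gelu(v):
    return 0.5 * v * (1.0 + tf.tanh(np.sqrt(2.0 / np.pi) * (v + 0.044715 * tf.pow(v, 3))))

print(gelu(x))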

Third, using activation functions in the model

There are generally two ways to use an activation function in a keras model: one is to specify it via the activation parameter of certain layers, and the other is to explicitly add a layers.Activation layer.

import numpy as np
import pandas as pd
import tensorflow as tf
from tensorflow.keras import layers, models

tf.keras.backend.clear_session()

model = models.Sequential()
model.add(layers.Dense(32, input_shape=(None, 16), activation=tf.nn.relu))  # specified via the activation parameter
model.add(layers.Dense(10))
model.add(layers.Activation(tf.nn.softmax))  # explicitly add a layers.Activation layer
model.summary()
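As a small supplementary sketch (my own addition, not part of the original article), Keras also accepts activations by their string names, which is equivalent to passing the tf.nn callables used above:

import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential()
model.add(layers.Dense(32, input_shape=(None, 16), activation="relu"))  # string name instead of tf.nn.relu
model.add(layers.Dense(10, activation="softmax"))  # activation folded directly into the layer
model.summary()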

After reading the above, do you have a better understanding of how to use activation functions? If you want to learn more or find related content, please follow the industry information channel. Thank you for your support.
