Shulou (Shulou.com), SLTechnology News & Howtos, 06/01 Report (updated 2025-02-28)
What is the principle behind MPNNs, the neural networks that pass messages on graphs, and how can they be implemented in code? Many newcomers find this confusing, so this article walks through the idea step by step.
Welcome to the world of graph neural networks, where we build deep learning models on graphs. You might think this is easy. After all, can't we just reuse the models that work on ordinary data?

Actually, no. In a graph, the data points (nodes) are connected to each other. This means the data points are no longer independent, which renders most standard machine learning models useless, since their derivations rely strongly on the independence assumption. To work around this, you can either extract numerical features from the graph, or use a model that operates on graph data directly.

Creating a model that works directly on the graph is more desirable, because it lets us exploit more information about the graph's structure and properties. In this article, we will study an architecture designed specifically for this kind of data: the message passing neural network (MPNN).
Various variants of the model
Several researchers independently published variants of this architecture before it was standardized into a single MPNN framework. This type of model is particularly popular in chemistry, where it helps predict the properties of molecules.

Duvenaud et al. published one of the first works on the subject in 2015 [1]. They used a message passing architecture to extract valuable information from molecular graphs and convert it into a single feature vector. The work was groundbreaking at the time: it is one of the earliest convolutional neural network architectures that can run directly on graphs.

Duvenaud et al. define their model as a stack of differentiable layers, each of which performs another round of message passing. (Figure adapted from [1].)

Li et al. made another attempt at this framework in 2016 [2]. They focus on sequential outputs from the graph, such as finding an optimal path [2]. To this end, they embed a GRU (Gated Recurrent Unit) in their algorithm.

Although these algorithms seem quite different, they share the same underlying idea: messages are passed between the nodes of the graph. We will soon see how these models can be combined into a single framework.
Unify the model into the MPNN framework
A very simple example of message passing for node v1. In this case, the message is the sum of the neighbours' hidden states, and the update function averages the message m with the node's previous hidden state.
At its core, the idea behind MPNNs is conceptually simple.
Each node in the graph has a hidden state (i.e., a feature vector). For each node v, we aggregate messages computed from the hidden states of all its neighbouring nodes, the edges connecting them, and the node v itself. We then update the hidden state of node v using the aggregated message and the node's previous hidden state.
Three main equations define the MPNN framework [3]. The message obtained from the neighbouring nodes is given by the following formula:
m_v^{t+1} = Σ_{w ∈ N(v)} M_t(h_v^t, h_w^t, e_vw)
The message m_v^{t+1} is the sum, over all neighbours w of node v, of the outputs of the message function M_t. M_t is an arbitrary function that can depend on the hidden states of both nodes and the edge between them. We can simplify it by dropping some of its inputs; in the example above, M_t simply returns the neighbour's hidden state h_w, so the message is the sum of the neighbours' hidden states.
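As a concrete illustration, the simplified message function from the example (summing the neighbours' hidden states) can be sketched in a few lines of Python. The toy graph, the hidden states, and the function name below are invented for this sketch, not taken from any particular library:

```python
import numpy as np

# Toy graph as an adjacency list: node -> list of neighbours
# (made up for this illustration).
neighbours = {0: [1, 2], 1: [0], 2: [0]}

# One hidden state (feature vector) per node.
h = {
    0: np.array([1.0, 0.0]),
    1: np.array([0.0, 1.0]),
    2: np.array([1.0, 1.0]),
}

def message(v):
    """Simplified M_t: ignore the edges and the receiving node,
    and just sum the neighbours' hidden states."""
    return sum(h[w] for w in neighbours[v])

m0 = message(0)  # h[1] + h[2] -> array([1., 2.])
```

In a full MPNN, `message` would instead apply a learned function to each triple (h_v, h_w, e_vw) before summing.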
Then, we update the hidden state of node v with a simple equation:
h_v^{t+1} = U_t(h_v^t, m_v^{t+1})
Simply put, the new hidden state of node v is obtained by updating the old hidden state with the newly computed message m_v^{t+1}. In the example above, the update function U_t is the average of the previous hidden state and the message.
We repeat this messaging algorithm a specified number of times. After that, we enter the final readout stage.
ŷ = R({h_v^T | v ∈ G})
In this step, we take all the final hidden states and map them to a single feature vector that describes the entire graph. This feature vector can then be used as the input to a standard machine learning model.
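Putting the three phases together, a complete forward pass over a toy graph might look like the sketch below. The graph, the number of rounds, and the choice of a simple sum as the readout function R are all assumptions made for illustration:

```python
import numpy as np

# Toy graph: entry v lists the neighbours of node v.
neighbours = {0: [1, 2], 1: [0], 2: [0]}
h = np.array([[1.0, 0.0],   # hidden state of node 0
              [0.0, 1.0],   # hidden state of node 1
              [1.0, 1.0]])  # hidden state of node 2

T = 2  # number of message passing rounds
for _ in range(T):
    # Message phase: sum the neighbours' hidden states.
    m = np.stack([h[neighbours[v]].sum(axis=0) for v in range(len(h))])
    # Update phase: average the old state with the message.
    h = 0.5 * (h + m)

# Readout phase: sum all final hidden states into one
# feature vector describing the whole graph.
graph_vector = h.sum(axis=0)
```

The resulting `graph_vector` could then be fed to any standard classifier or regressor, e.g. to predict a molecular property.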
That's it! These are the foundations of MPNNs. The framework is powerful because we can define different message and update functions depending on what we want the model to achieve. I recommend [3] for more information on the different variants of the MPNN model.
The MPNN framework unifies message passing models that were created independently by several researchers. Its main ingredients are the message, update, and readout functions, which operate on the nodes of the graph. Variants of the MPNN model share this structure but define these functions differently.
Hopefully the principle and code implementation of message passing neural networks are now clear. Thank you for reading!