This article shows how to write a robot that can compose poems in Python. The walkthrough is fairly detailed; interested readers can use it as a reference, and I hope it helps.
1. Introduction to the principle
First of all, making a machine compose poetry automatically requires natural language processing, so that the machine can learn to "understand" verse and then produce the verse we want. How do we get a machine to "understand" poetry? We use Long Short-Term Memory networks (LSTM) from deep learning. If that sounds dizzying, don't worry; we explain it in plain language below.
LSTM is a variant of the recurrent neural network (RNN). RNNs handle many natural language processing tasks well, but they struggle with long-range dependencies within a sentence. For example, in a sentence where the subject "dog" appears many words before the verb, choosing between "was" and "were" depends on whether the subject is singular or plural; because the verb sits so far from "dog," a plain RNN does not resolve this well.
LSTM was introduced to solve this problem. For a more intuitive explanation, here is a loose (and not entirely rigorous) analogy:
Suppose we are watching a movie. We can follow the story across shot changes, and as the plot develops we learn the characters' identities, ages, preferences and so on; we do not forget them the moment the camera cuts. These are long-term memories. When the story reaches a particular scene, say a happy one, we draw on our long-term memory of the film to recognise it as a happy moment. The part of long-term memory actually being used in the current scene is called "working memory."
2. LSTM in plain language
So how does LSTM work?
1). First, the LSTM has to learn to forget.
For example, when a shot ends, the LSTM should forget that shot's location, time and other details. But if something important happens, say an actor picks up a lunch box, the LSTM should remember it, just as we do when watching a movie: we let some memories go and keep the ones we need. So when a new shot comes in, the LSTM must know what to remember and what to forget.
2). Second, it needs a retention mechanism.
As new shots come in, the LSTM should learn which pieces of the new information are worth keeping. Putting the first two points together: when a new shot is input, the LSTM first forgets what is no longer needed in long-term memory, then decides which parts of the new input are worth saving into long-term memory.
3). Finally, it needs to know which parts of long-term memory to use right now.
For example, if we see someone writing in a movie, we may call up the long-term memory of their age (a schoolchild is probably doing homework, an adult is probably writing copy), while other long-term memories may be irrelevant to the current scene.
4). So the LSTM learns to focus only on what it needs, rather than using all of its memories at once. This is why LSTM handles the long-dependency problem above so well. (The original post includes a diagram of the LSTM cell at this point.)
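To connect the analogy to the actual computation, here is a minimal NumPy sketch of a single LSTM step (the weight names W and b are illustrative, not from the original post). The forget gate decides what to drop from long-term memory, the input gate decides which new information to keep, and the output gate picks which part of the memory to use right now:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step: x is the new input, h_prev the previous working memory,
    c_prev the previous long-term memory (cell state)."""
    z = np.concatenate([x, h_prev])
    f = sigmoid(W["f"] @ z + b["f"])   # forget gate: what to drop from long-term memory
    i = sigmoid(W["i"] @ z + b["i"])   # input gate: which new information is worth keeping
    g = np.tanh(W["g"] @ z + b["g"])   # candidate content to store
    o = sigmoid(W["o"] @ z + b["o"])   # output gate: which memories to use right now
    c = f * c_prev + i * g             # updated long-term memory
    h = o * np.tanh(c)                 # working memory for the current step
    return h, c

In practice we do not write this step by hand; the tensorflow library used later in the article provides ready-made LSTM cells.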
3. Hands-on: the poetry robot
Now for the hands-on part. Although LSTM works well, the data still has to be preprocessed: every verse must be brought to the same length, and Chinese characters must be converted to numbers. The preprocessing breaks down into 3 main steps:
Read in the data; we collected a large corpus of poetry.
Count how often each character appears, and use that frequency ordering to assign each Chinese character an integer id.
When generating batch data, unify the length of every verse in the batch; verses that are too short are padded with "*".
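As a rough sketch of that batching step (names such as make_batch and char_to_id are illustrative, not taken from the original code):

import numpy as np

PAD_CHAR = '*'

def make_batch(verses, char_to_id, batch_size=64):
    """Pad a batch of verses to one length and convert each character to its id."""
    batch = verses[:batch_size]
    max_len = max(len(v) for v in batch)
    data = np.full((len(batch), max_len), char_to_id[PAD_CHAR], dtype=np.int32)
    for row, verse in enumerate(batch):
        data[row, :len(verse)] = [char_to_id[ch] for ch in verse]
    return data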
As a result, the "*" character may sometimes appear in the final generated verse. The original post shows part of the data-preprocessing code as a screenshot; it mainly completes the following steps:
1). First, read in the data and cap each verse at 100 characters, dropping anything beyond the 100th character.
2). Then add '^' at the start and '$' at the end of each verse as sentence markers, and delete verses shorter than MIN_LENGTH.
3). Finally, count the characters in the processed verses: tally how often each character appears and derive each Chinese character's id from that frequency ordering.
I have commented the data-preprocessing code to make it easier to follow; it is also good practice for understanding both data handling and the Python statements involved.
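The screenshot itself is not reproduced here. The following is a minimal sketch of the three steps just described; the file format (one verse per line) and the MIN_LENGTH value are assumptions, not the author's actual code.

from collections import Counter

MAX_LENGTH = 100   # keep at most 100 characters per verse
MIN_LENGTH = 5     # assumed threshold for dropping short verses
PAD_CHAR = '*'

def load_poems(path):
    poems = []
    with open(path, encoding='utf-8') as f:
        for line in f:
            verse = line.strip()[:MAX_LENGTH]        # 1) truncate to 100 characters
            if len(verse) < MIN_LENGTH:              # 2) drop verses that are too short
                continue
            poems.append('^' + verse + '$')          # 2) mark the start and end of the verse
    return poems

def build_vocab(poems):
    # 3) count how often each character appears and assign ids by frequency rank
    counts = Counter(ch for poem in poems for ch in poem)
    counts.setdefault(PAD_CHAR, 0)                   # make sure the padding character has an id
    chars = [ch for ch, _ in counts.most_common()]
    return {ch: idx for idx, ch in enumerate(chars)}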
To train the model, make sure the tensorflow and numpy libraries are installed on your machine. Once training is finished, we can load the model directly and embed it in our chatbot program to power the chatbot (for an introduction to chatbots, see the earlier article referenced at the end of this post).
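The post does not show the full model code; as one plausible sketch of a character-level LSTM language model built with tensorflow (the layer sizes, optimizer and loss are assumptions):

import tensorflow as tf

def build_model(vocab_size, embed_dim=128, lstm_units=256):
    """Predict the id of the next character given the characters so far."""
    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(vocab_size, embed_dim),
        tf.keras.layers.LSTM(lstm_units, return_sequences=True),
        tf.keras.layers.Dense(vocab_size),
    ])
    model.compile(
        optimizer='adam',
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
    return model

# Training pairs come from each verse shifted by one position: x = ids[:-1], y = ids[1:]
# model.fit(x, y, batch_size=64, epochs=...)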
Here is a partial demo (shown as a screenshot in the original post).
4. Results
With all that said, let's look at some poems written by the trained robot.
Figure A shows the poetry robot in action. The robot prints "Please input the acrostic prompt:"; once we enter the head characters, it generates an acrostic poem (藏头诗) that matches them.
Figure B shows a case where the "*" character appears in the output. Also, because Chinese is vast and the training data is limited, if the acrostic prompt contains a character that never appears in the training data, the robot reports that the character is not in the dictionary.
The part marked in red in Figure C shows this abnormal case being handled: the robot reports that the character is not in the dictionary.
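As a rough illustration of this behaviour, here is a hedged sketch of an acrostic generation loop with the dictionary check; the function name, greedy sampling and fixed line length are assumptions rather than the author's actual code.

import numpy as np

def generate_acrostic(heads, model, char_to_id, id_to_char, line_length=7):
    """Generate one line per head character; reject characters outside the vocabulary."""
    lines = []
    for head in heads:
        if head not in char_to_id:
            return f"'{head}' is not in the dictionary!"     # the abnormal case in Figure C
        line = head
        while len(line) < line_length:
            ids = np.array([[char_to_id[ch] for ch in '^' + line]])
            logits = model.predict(ids, verbose=0)[0, -1]
            next_char = id_to_char[int(logits.argmax())]     # greedy choice for simplicity
            if next_char in ('$', '*'):                      # stop at end-of-verse or padding
                break
            line += next_char
        lines.append(line)
    return '，'.join(lines) + '。'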
That's all on how to write a poem-composing robot in Python. I hope the content above helps you learn something new. If you found the article useful, please share it so more people can see it.