Shulou (Shulou.com) 11/24 Report --
According to US media reports on April 26, artificial intelligence tools built on large language models, such as ChatGPT, are better at expressing empathy than at providing accurate factual information. That is because their designers prioritize fluency over accuracy, and because their training data contains a great deal of emotional content, which is why plugging large language models into search engines may make little sense.
(Image source: Pexels)
Earlier this year, Arvind Narayanan, a computer science professor at Princeton University, built a ChatGPT-backed voice interface for his four-year-old daughter. It was partly an experiment, and partly because Narayanan believes that AI assistants will one day be an important part of his daughter's life.
Narayanan's daughter is naturally curious and often asks him questions about animals, plants and the human body. He thought ChatGPT, developed by OpenAI, could provide useful answers. To his surprise, the chatbot showed impeccable empathy once he told the system that it was talking to a child.
Narayanan's daughter asked, "What happens when the lights go out?"
"When the lights go out, it gets dark, and it might be a little scary," ChatGPT replied in a synthetic voice. "But don't worry! There are many things you can do to make yourself feel safe and comfortable in the dark."
ChatGPT then offered a few suggestions, such as using a night light, and ended with a reminder: "It's normal to feel a little scared in the dark." The explanation clearly reassured Narayanan's daughter.
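The article does not say how Narayanan built his voice interface, but the key behavior it describes, a chatbot that softens its tone once told it is speaking with a child, corresponds to a system message in a chat API. What follows is a minimal sketch of such a setup, assuming the official OpenAI Python SDK; the model names, the prompt wording, and the helper functions are illustrative assumptions, not Narayanan's actual code.

# A minimal sketch, not Narayanan's implementation: a child-friendly
# question-and-answer loop built on the OpenAI Python SDK (pip install openai).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Telling the model up front that it is speaking with a young child is what
# shifts its replies toward the gentle, reassuring tone described above.
SYSTEM_PROMPT = (
    "You are talking to a curious four-year-old child. Answer in short, "
    "simple sentences, be warm and reassuring, and never be frightening."
)

def answer_child(question: str) -> str:
    # Model name is an assumption; any chat-capable model should behave similarly.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

def speak(text: str, path: str = "reply.mp3") -> None:
    # Turn the text reply into audio so it can be played back to the child.
    audio = client.audio.speech.create(model="tts-1", voice="alloy", input=text)
    audio.write_to_file(path)

if __name__ == "__main__":
    reply = answer_child("What happens when the lights go out?")
    print(reply)
    speak(reply)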
Microsoft and Google are racing to use large language model technology to enhance their search engines. But there is good reason to think the technology is better suited to being an emotional companion than to supplying factual information.
That may sound strange, but whether it is Google's Bard or Microsoft's new Bing, this kind of AI has made a series of embarrassing factual errors when used to search for information: Bard gave wrong information about the James Webb Space Telescope in its first public demonstration, and the new Bing got a raft of financial-statement figures wrong.
When AI chatbots are used for search, the cost of factual errors is high. Eugenia Kuyda, founder of Replika, an AI companion app that has been downloaded more than 5 million times, says the cost of a mistake is much lower when the AI is designed as a companion instead. "It doesn't break the user experience, unlike in search, where a small mistake can destroy users' trust in the product," she said.
Margaret Mitchell, a former AI researcher at Google, co-authored a paper on the risks of large language models. She says large language models are simply "not suitable" for use as search engines. They go wrong because the data they are trained on often contains misinformation, and the models have no underlying ground truth against which to verify what they generate. Moreover, their designers may prioritize the fluency of the output over its accuracy.
That is part of why these tools are so good at telling users what they want to hear. Today's large language models are trained on text crawled from the Internet, including emotional content posted on social media platforms such as Twitter and Facebook, and personal counseling threads on forums such as Reddit and Quora. Dialogue from films and TV dramas, conversations in novels, and research papers on emotional intelligence have all entered the training data, which is what makes these tools seem so empathetic.
Some people reportedly use ChatGPT as a robot therapist. "They do this to avoid becoming a burden to others," one of them said.
To test the AI's capacity for empathy, ChatGPT was put through an online emotional intelligence (EQ) test. It performed well, earning full marks in social awareness, relationship management and self-management, and falling only slightly short in self-awareness.
In some respects, ChatGPT scored better on the test than many people do.
It may feel a little unreal that a machine can offer empathy, but it makes sense. People are wired for social connection, and the human brain is able to mirror the feelings of others, which means we can come away with a sense of being understood even when the other party does not truly "feel" what we feel. Mirror neurons in the human brain are activated when we empathize with others, giving us a sense of connection to one another.
Of course, empathy is a multifaceted concept, and to truly experience it, people still need to connect with real human beings.
Thomas Ward, a clinical psychologist at King's College London who has studied the role of software in psychotherapy, warns against taking it for granted that AI can fully meet people's mental health needs, especially when psychological problems are serious. A chatbot may fail to grasp the full emotional complexity of a person, and ChatGPT rarely says "I don't know," because it is designed to answer with confidence rather than caution.
Nor should people treat chatbots as a habitual outlet for their feelings. "In a world where AI chatbots are seen as a way to eliminate loneliness, the subtle sides of relationships, such as holding hands or knowing when to speak and when to listen, may disappear," Ward said.
That could eventually create more problems. For now, though, the emotional skills of these AI tools are at least more reliable than their grasp of the facts.