
Can robots treat mental illness? The method is risky and should be used with caution.

2025-03-28 Update | From: SLTechnology News&Howtos | IT Information


Shulou (Shulou.com) 11/24 report

This article comes from the WeChat official account SF Chinese (ID: kexuejiaodian). Author: SF.

Artificial intelligence chatbots can make mental health services easier to access, but it is not yet clear how risky this approach is.

Photo by Kate Darling

Editor | Liu Jiayu

Visual Design | Xia Meng

It is not easy to get help from a psychiatrist. In recent years, the number of people with depression and anxiety has risen sharply, demand for mental health services is enormous, and professionals in the field are in serious short supply. At the same time, some people with mental health problems never seek a psychiatrist's help: they know too little about mental illness, cannot afford the cost of care, or are too ashamed to admit they need it. The gradual rise of therapeutic chatbots seems to promise a solution to all of these problems.

Are people more willing to talk to chatbots? Therapeutic chatbots such as Woebot (from a US psychotherapy startup), Wysa (from an Indian AI health company), and Youper (also from the US) are becoming increasingly popular. These bots have been trained on established psychotherapy techniques such as cognitive behavioral therapy, a common form of psychotherapy with a strong empirical basis. Human-computer interaction research shows that people can build rapport and trust with chatbots. Some people even trust chatbots more than they trust human psychiatrists: because the bots do not judge them, they prefer to confide in a machine.

However, although a chatbot may follow a standard treatment framework, the effectiveness of treatment depends on how users interact with it. Human psychiatrists, by contrast, communicate more fluidly and engage with patients far better.

Therapeutic chatbots still carry risks. Whether they are actually effective, and what dangers they pose, are questions that need further study.

In 2018, Woebot sparked controversy by taking an ill-judged political stance in a chat. That issue was resolved, but it will not be the last time a chatbot causes controversy. Generative AI methods can make a bot's answers more flexible, but a problem remains: AI-based text systems can reproduce sexism, racism, and misinformation, and no one can predict exactly what a bot might say. In a therapeutic setting, that unpredictability is especially dangerous.

Even with pre-written, rule-based answers, it is easy to hurt people seeking mental health advice, because their mental state is inherently fragile. For example, although chatbots can be designed to recognize language expressing suicidal thoughts and to try to intervene, in many cases the bot's responses can mislead the user or be misunderstood.

Good psychiatrists are skilled at observing and managing patients' emotions. They understand meaning, read body language, notice changes in tone, and respond professionally, adjusting the course of treatment to the patient's state. This is difficult, highly specialized work, and even human psychologists make mistakes.

Unreliable human psychiatrists are dangerous too. Some give unprofessional advice that worsens a patient's condition, and some even defraud patients of money. But professional psychiatrists generally adhere strictly to their ethical code and do their best to treat patients. Human psychiatrists may make mistakes, but the risk is comparatively small.

The approach carries risk, even though the biggest selling point of therapeutic chatbots is that they increase people's chances of getting treatment. Lowering the threshold for mental health services is undoubtedly valuable, but the risks need to be assessed before these tools are deployed at scale.

Some designers of therapeutic chatbots say their bots are only a supplementary tool for psychotherapy, not a substitute for a psychiatrist's intervention in solving psychological problems. They mainly offer an alternative for people who are unable or unwilling to be treated by a psychiatrist.

Therapeutic chatbots could well be of great help to people with mental health problems. But we should be wary of any product rushed to market without sufficient research, especially artificial intelligence applications that may carry a variety of known and unknown hazards.

References:

https://www.sciencefocus.com/news/therapy-chatbots-ai-mental-health/


© 2024 shulou.com SLNews company. All rights reserved.
