This article comes from the WeChat official account SF Chinese (ID: kexuejiaodian).
By Kate Darling, translated by Phil Newell
Compilation | Editorial Department of Science Focus
In 2018, a self-driving car struck a pedestrian in a traffic accident in the United States, and the human driver was charged with negligence. In February 2023, another accident occurred during a test drive of a Xilai car. So who should be held responsible for accidents caused by self-driving cars?
Traffic regulations in Europe and the United States
In August 2022, the British government announced plans to invest 100 million pounds (about 800 million yuan) to speed up the development and deployment of self-driving cars, while revising traffic laws and regulations so that car manufacturers are held responsible for self-driving car accidents. This means that when a self-driving car is driving itself, the human driver is not liable for any driving accidents.
This rule is very different from the one in the United States, where the human driver, as the "backup driver", is held responsible if a self-driving car has a traffic accident. Of course, Britain's new regulations can only work if carmakers do not evade responsibility for traffic accidents.
The limitations of autonomous driving
The concept of fully autonomous driving has been around for a long time, but it may take much longer than expected for the technology to be realized. Although car manufacturers have invested heavily in research and development, they are still struggling with the variety of accidents that can occur while driving.
Extreme weather is an anticipated problem, but there are many other conditions that are harder to predict. For example, British media have reported that one self-driving car mistook the sunset for a traffic light, and another crashed straight into a plane worth 2 million US dollars (about 14.62 million yuan). The large-scale deployment of self-driving cars that the UK is counting on is still a long way off.
Automakers are equipping cars with more and more driver-assistance functions, such as automatic steering, automatic acceleration, and automatic braking. Until reliable fully autonomous driving is put into use, humans and machines will share control of the steering wheel as a team, so the responsibilities of each party must be carefully defined.
Is the driver really the one at fault?
Robots and humans have different but complementary strengths. Robots are good at performing predictable tasks and can respond faster and more accurately. Humans are better at handling the unexpected, such as a traffic officer standing in the road ahead or a horse-drawn carriage suddenly appearing on the highway. In theory, the safest driving combines the strengths of humans and robots, but in practice this is fraught with difficulty.
For example, a self-driving Uber (the American ride-hailing company) hit a woman who was pushing a bicycle across the road. The car's autonomous driving system could not determine whether she was a pedestrian, a bicycle, or a vehicle, nor could it correctly predict her path, and the human driver also failed to react in time, which ultimately led to the accident. The driver was charged with manslaughter. An investigation by the US National Transportation Safety Board found evidence that the handover mechanism between the vehicle and the driver had failed, yet Uber was not held accountable. Why should only the human driver be held responsible?
In fact, the idea that humans should bear more of the responsibility is a prejudice. Research shows that when a car drives itself most of the time, it is unrealistic to expect the human driver to stay focused for long stretches. Britain's new rules for autonomous driving are therefore more reasonable: only when human drivers are exempt from liability will automakers do their utmost to ensure driving safety, rather than passing responsibility on to users through a "disclaimer".
Tesla's UK arm previously made it clear that Tesla's self-driving function "cannot allow the vehicle to drive completely independently" and that "when fully autonomous driving is enabled, human drivers must remain attentive and keep their hands on the steering wheel, ready to take over the driving task at any time". Besides issuing "disclaimers", automakers also fit cars with driving systems that fall short of full autonomy, so that the system and the human driver have to hand over control frequently; in the event of an accident, the driver then has to bear more of the responsibility.