2025-01-19 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/03 Report--
With the rapid development of autonomous-driving technology, the industry seems ready at last to harvest its fruits.
L2 driver-assistance technology has been deployed at scale, and with the low-hanging fruit being divided up, carmakers are climbing toward the higher stage of L3 autonomous driving. Yet just as many of them pressed eagerly ahead, Audi, Volkswagen's premium brand and a long-time leader in the field, became the first to withdraw from the L3 race.
Recently, Hans, head of Audi's technical development department, said that Audi has abandoned plans to bring L3 autonomous driving to the next-generation A8 flagship. Bear in mind that Audi began researching L3 as early as 2011, and in 2017 it was the first to ship an L3 feature, Traffic Jam Pilot (TJP), in the Audi A8.
Yet the technology has rarely been used. The reason is simple: Audi has been waiting for governments to enact L3 regulations. To date, international regulators have not even agreed on an approval process for the most basic L3 functions, and several of Audi's major markets have issued no L3 road rules at all. With the new A8 due on the market next year, Audi could not afford to keep waiting.
In contrast to Audi's hesitation, many peers are piling into L3 one after another. BMW and Mercedes-Benz, Audi's fellow members of the BBA camp, are stepping up research and development and plan to launch their own L3 models this year and next. And with Audi "standing still", several Chinese carmakers have recently begun vying for the title of "the world's first mass-produced L3".
You may wonder: do these carmakers misunderstand what L3 means?
No. They are doing it deliberately. Until national policies and regulations actually land, most carmakers' L3 technology can exist only in brochures or internal systems. That is why some marketing copy touts labels such as "L2.5" and "L2.99": they borrow the cachet of L3 without crossing the regulatory red line, a game played purely with words.
So why was Audi so forthright in dropping L3 from the new A8? There may be many answers, but they all point to one question: what problem with L3 autonomous driving remains unsolved?
The national standard is out, but L3 autonomous driving remains in doubt
On March 9 this year, the Ministry of Industry and Information Technology issued China's national standard, "Taxonomy of Driving Automation for Vehicles", scheduled for formal implementation on January 1, 2021. Like the standard established by SAE in the United States, it classifies driving automation into levels L0 through L5.
The national standard's definition of L3 matches SAE's: conditionally automated driving. Within the operating conditions specified by the automated-driving system, the vehicle itself completes steering, acceleration and deceleration, and road-condition detection and response; under those conditions the driver may hand control entirely to the vehicle, but must take over when necessary.
In other words, under L3 the driver need not supervise the vehicle at all times and may be distracted, work, or even close their eyes, but must be able to take over the driving task whenever it becomes "necessary".
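The division of duties across the levels described above can be sketched as a small lookup table. This is a simplified illustration based on the general SAE J3016 definitions that China's national standard mirrors, not the normative text of either standard:

```python
# Rough summary of the L0-L5 taxonomy: who performs the driving task,
# who monitors the road, and who serves as the fallback.
# Simplified for illustration; consult the standards for normative wording.
LEVELS = {
    "L0": {"drives": "human",  "monitors": "human",  "fallback": "human"},
    "L1": {"drives": "shared", "monitors": "human",  "fallback": "human"},
    "L2": {"drives": "system", "monitors": "human",  "fallback": "human"},
    "L3": {"drives": "system", "monitors": "system", "fallback": "human, on request"},
    "L4": {"drives": "system", "monitors": "system", "fallback": "system, within its design domain"},
    "L5": {"drives": "system", "monitors": "system", "fallback": "system, everywhere"},
}

def driver_must_supervise(level: str) -> bool:
    """Below L3, the human must monitor the road at all times."""
    return LEVELS[level]["monitors"] == "human"
```

The table makes the watershed visible: L3 is the first level at which road monitoring shifts from human to system, which is exactly why the "necessary takeover" moment becomes a problem.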
The problem lies precisely in that "necessary takeover". It is an awkward state. As the driver, should I relax, or sit poised waiting for the system's summons?
This system-defined "necessary" moment is like the second boot in the classic sketch "Throwing Boots": drivers can only wait for the critical moment to drop. And they are waiting inside a vehicle moving at tens of kilometers per hour, with their personal safety on the line.
This runs counter to the original intent of L3. A feature meant to relieve the driver instead makes them more cautious, because they must constantly worry about being asked to take over. The loss clearly outweighs the gain.
Of course, this "necessary takeover" moment may not be as perilous as laypeople imagine; carmakers naturally have technical preparations and solutions. So in L3 autonomous driving, how do carmakers protect the safety of drivers and passengers, and how do they design this takeover moment?
Can double redundancy and strong reminders prop up L3 autonomous driving?
If carmakers want consumers to entrust their personal safety to an L3 car, they must make the autonomous-driving system safe and reliable. The solution is no mystery: do enough work on the vehicle itself and give the system enough "raw material".
A common industry practice is a double-redundancy design spanning perception, decision-making, and control: every key link of the pipeline (sensing, decision, and execution) is equipped with two sets of hardware and software, so that the system keeps operating normally when one of them fails.
Waymo, for example, has implemented double redundancy across every part of its system, including power supply, positioning, sensing, controllers, and actuators. As a Tier 1 supplier, Bosch builds redundancy into the four technical areas of its L3 solution: perception, positioning, decision-making, and execution.
The scope of redundant design varies widely. Some aggressive carmakers apply double redundancy to the entire vehicle system, but costs can easily spiral out of control. Balancing system stability against cost, not every player equips the whole system with redundancy: "stacking" hardware increases redundancy, but it also raises software and hardware costs and challenges the overall vehicle architecture.
At present, most carmakers implement double redundancy only in the key links, such as sensors and computing chips, to ensure at minimum that a single failure does not lead to serious consequences.
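The failover idea behind dual redundancy can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation; the class and function names here are hypothetical:

```python
# Minimal sketch of dual redundancy: a reading is normally served by the
# primary unit, and the backup takes over transparently if the primary fails.
class SensorFault(Exception):
    """Raised when a sensing unit cannot deliver a reading."""

class RedundantSensor:
    def __init__(self, primary, backup):
        self.primary = primary
        self.backup = backup

    def read(self):
        try:
            return self.primary()   # normal path
        except SensorFault:
            return self.backup()    # fail over to the redundant unit

def failing_primary():
    raise SensorFault("primary radar offline")

def healthy_backup():
    return {"obstacle_distance_m": 42.0}

sensor = RedundantSensor(failing_primary, healthy_backup)
reading = sensor.read()  # transparently served by the backup unit
```

A real vehicle system extends this pattern across power, compute, and actuation, and adds cross-checking between the two units rather than waiting for an outright fault, but the principle is the same: no single failure may leave the pipeline without an answer.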
Double redundancy, however, is only a form of after-the-fact protection: as long as the primary system works, drivers never perceive it. More than hardware failure, drivers worry that the autonomous system will suddenly prove unable to cope with complex road conditions; if they fail to respond in time, an accident is likely.
That makes a strong reminder system the more essential design. Such a system includes intelligent alert tones, warning lights, in-cabin camera monitoring, seat-belt warnings, and more. In Audi's L3 system, for example, once the car asks you to take over, it pauses the movie or call you are engaged in, sounds a takeover alert, and automatically tightens the seat belt. If you are asleep, the car gives you up to 15 seconds' advance notice to regain control of the steering wheel; if you remain unresponsive, it brakes automatically, places an emergency call on your behalf, and checks your condition through the in-cabin camera.
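The escalation just described is essentially a timed sequence of increasingly forceful interventions. A sketch of that flow, with step names and timings that are illustrative rather than Audi's actual calibration (the source mentions only the roughly 15-second takeover window):

```python
# Sketch of an escalating takeover-reminder flow: immediate alerts first,
# then hard interventions once the grace window for the driver expires.
# Timings and step names are illustrative assumptions.
TAKEOVER_ESCALATION = [
    (0,  "pause media and sound takeover alert"),
    (0,  "tighten seat belt"),
    (15, "driver still unresponsive: brake automatically"),
    (15, "place emergency call and check driver via cabin camera"),
]

def actions_by(seconds_elapsed):
    """Return every escalation step that has fired by the given time."""
    return [action for t, action in TAKEOVER_ESCALATION if t <= seconds_elapsed]
```

At five seconds in, only the soft reminders have fired; past the 15-second window, the hard interventions follow. The design goal is that the system never simply "hands back" control without first exhausting a graded ladder of warnings.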
Although double redundancy and strong reminders can, in terms of technical safety, largely exceed the level of human driving, L3 still faces the "Schrödinger's takeover" question: who is responsible when a safety accident happens?
Who is responsible? The "Schrödinger" problem of L3 deployment
In the classification standard, L3 is the watershed of autonomous driving. From L0 to L2, the human driver is always the subject of action and responsibility, with the system merely assisting; even in a traffic accident, the human driver is liable. At L3, however, the system becomes the main actor controlling the vehicle, while the human remains responsible for the driving state and its consequences after taking over. This creates two subjects of responsibility and injects considerable uncertainty into the whole driving process.
In theory, the question is simple: if an accident happens while the vehicle is in L3 autonomous mode, the carmaker is responsible; if a human is driving, the human is responsible.
But what about the cases in between? What if the system fails to detect an unexpected road condition and never issues a takeover reminder? What if the system is in reminder status, the driver does not respond in time, and an accident occurs? And if the driver spots a misjudgment by the system, forcibly intervenes, yet still cannot prevent the crash, is that the driver's responsibility or the autopilot's?
In reality, this problem has already arisen. In many of Tesla's fatal accidents, the owners were using Tesla's Autopilot and were distracted at the time. But because Tesla states in advance that Autopilot only plays an assisting role, drivers bear final responsibility for the car's driving behavior.
Once a system reaches L3, however, carmakers can no longer hide behind "driver assistance". If you clearly label your system L3, you must take responsibility for the vehicle's safety accidents.
Owners using L3, meanwhile, harbor the mirror-image worry: your autopilot system obviously malfunctioned, so how can you blame me after the accident?
This is what most concerns international carmakers like Audi. An accident with unclear responsibility would inflict incalculable damage on the entire brand. According to Audi, its corporate legal advisers basically oppose the L3 system: if an accident occurs during autonomous driving, even with a 99.9% safety factor at delivery, the carmaker still bears the corresponding responsibility. Even a failure by the owner to take over in time may ultimately be attributed to the system.
Dividing and defining responsibility between human and vehicle is not only hard for policymakers to weigh; it is also the fundamental reason Audi ultimately chose to give up.
For the many carmakers on the road to L3, the question is not only the technology premium that new features bring to new models, but also the enormous risks posed by unclear policies and regulations and by the assignment of responsibility after accidents.
If you want to wear a crown, you must bear its weight.
Rather than waiting in L3's gray zone, carmakers would do better either to perfect L2 driver assistance, maximizing the intelligent-driving experience and mastering "core technology" in autonomous-system integration and electronic/electrical architecture, or to skip L3 altogether and pour resources directly into L4, competing with Waymo, Uber, Baidu, and other leaders in highly automated driving.
At present, traditional carmakers such as Ford and Volvo, EV leader Tesla, and China's new carmaking forces have all chosen this "harder" road.
But at least the vexing question of responsibility disappears. A carmaker that takes full responsibility for driving leaves itself no room for wishful thinking or hesitation, and owners can truly relax and enjoy the ride.