By Aohu
Exactly ten years ago, on July 6, 2013, an Asiana Airlines Boeing 777 touched down short of the runway and broke apart while landing at San Francisco airport, killing three Chinese passengers.
The 777 has long been one of the aircraft with the best safety records, and the crash of Asiana Flight 214 had little to do with Boeing: it was caused mainly by crew error.
The investigation's findings were striking: although the captain had nearly 10,000 hours of flight experience, only a handful of those hours involved manual landings. Asiana even had policies encouraging pilots to rely on automated systems as much as possible, and offered little training in manual landings.
"if you subtract the total length of his flight from the time of autopilot, the remaining manual flight experience may be only a few hundred hours."
Modern aviation already has fully automated landing guidance: ground equipment transmits signals that can guide an aircraft all the way down to an automatic landing. But the automatic landing guidance at San Francisco happened to be temporarily out of service at the time of the crash.
Flight 214 therefore had to land manually. Lacking hands-on experience, the crew made a series of confused inputs, inadvertently disengaged the aircraft's autothrottle, and lost the protection of the automated systems entirely. By the time they realized the aircraft was fully in their own hands, it was too late.
Asiana 214 is a classic "automation dependence" accident, and it reveals at least three important issues:
1. Modern automation has improved safety, but as the degree of automation rises and the number of automated systems grows, the logical relationships among those systems become harder for operators to understand and remember, and harder still to monitor (for example, judging whether each system is working normally).
2. However small the probability, once some inconspicuous component of an automated system meets an unusual condition, the operator's manual skills may be the last line of defense between life and death.
3. Automation systems and design philosophies can differ greatly between manufacturers, so even a senior operator of company A's product must adapt to company B's (the captain of Asiana 214 was more familiar with Airbus's autopilot and had only recently transitioned to Boeing).
Just as the emblematic name "Autopilot" comes from aviation, the development of intelligent driver assistance may face similar hidden dangers.
The vanishingly improbable is becoming possible. Led by the new EV startups and independent Chinese brands, intelligent driver assistance is entering the era of NOA (Navigate on Autopilot), and NOA has become the new battleground. These navigation-assist systems have now moved from highways into cities, and from relying on high-precision maps to going "map-free".
Coverage keeps widening, and the need for human intervention keeps shrinking. Every company makes its "low takeover rate" a selling point, and the media duly highlight claims of "zero takeovers over XXX kilometers". That a lower takeover rate means greater safety has become the default consensus.
The best NOA systems today already average roughly one takeover per 100 kilometers. In practice, stretches of several hundred highway kilometers without a single takeover request are not uncommon, and even tens of kilometers in more complex urban areas can pass without human intervention.
As NOA coverage extends into cities, as takeover rates under favorable road conditions fall very low, and as "customized" commuter NOA appears (trained, enabled, and optimized specifically for a user's high-frequency commuting route), a low-probability but now-real possibility emerges:
Imagine a driver who has just obtained a license and commutes every day along a simple, fully NOA-covered route. He can switch on NOA the moment he gets in the car and leave it on until he arrives. Barring incident, he could repeat this routine for years; in theory, he might never once drive the car manually.
So the question: if the above holds, is this legal driver a veteran with years of driving experience, or a "paper driver" who has long since lost basic driving skills?
(Of course, some will point out that plenty of people already hold valid licenses despite not having driven for years and having lost the ability. But one existing absurdity does not justify another, and the notion that passing a driving test once confers lifelong competence is itself doubtful.)
Furthermore, consider such a driver as an individual: one day he suddenly needs to drive into an uncovered area; some part of the assistance system suddenly fails (even if it is just a faulty switch); the weather abruptly deteriorates beyond what the system can handle; or an accident occurs, however minor and harmless. Can we assume he still has the judgment, and the skill, to take over manually and safely?
Of course, the probability of all these extremes stacking up is very small. It requires a peculiar travel pattern (commuting only along a fixed, ideal route), objective luck, and an NOA genuinely mature enough to run long stretches unaided; in real life, the conditions for a driver to go years without manual driving or a manual takeover are hard to meet.
But whether a specific case is possible, how likely it is, or whether it has already happened is beside the point. What matters is that as maturity improves and coverage expands, urban NOA has shifted this extreme scenario from "completely impossible" to "highly unlikely, and growing steadily more likely."
In the era of highway-only NOA, no matter how low the takeover rate, no driver could switch on navigation assistance the moment he got in the car, unless he lived his whole life on the highway.
But once cities are covered, even only partially, the extreme case of "always driving within the coverage area" becomes possible. As urban NOA expands and commuter NOA emerges, this extreme (or, if you like, ideal) situation grows ever more attainable.
Most road environments will never consistently offer an assistance system a clean, simple, incident-free ideal. But it is not unthinkable that a very small number of individuals travel only in such ideal traffic environments, dodging every corner case.
Today the concrete danger may still be tiny. But looking forward, as NOA coverage grows and takeover rates fall, the vanishingly improbable events that seem far-fetched now will no longer be vanishingly improbable.
Beyond the absolute extreme of "years of zero takeovers and no manual driving at all" (already possible in theory), there is the relative trend: as assisted-driving technology advances, opportunities for manual driving shrink dramatically, and drivers' ability to take over correctly and safely will gradually come into question.
However low an assistance system's takeover rate, as long as it is not truly zero, the system is not genuine full autonomy, and it ultimately requires the driver to retain the ability to take over, that is, to drive manually.
A driver's own ability must "cover" the operating envelope of the assistance system. It is hard to trust someone who has never manually driven on a highway to take over in time and keep the vehicle safe when NOA fails without warning at 120 km/h.
We are all probability-blind. People today prize a low takeover rate, and as a gauge of technological maturity that is perfectly reasonable; but it is not the same thing as "safer" in actual use. A higher degree of automation means technical superiority, yet an individual who makes no preparation may never convert that superiority into real benefit.
Expanding the scope of what the system "can" do naturally lays the groundwork for better safety; but if, as "can" expands, users start to ignore the small probability of "cannot", the result may backfire.
Given that full autonomy remains far off, advanced assistance features such as NOA still require drivers to be ready to take over. So although a low takeover rate objectively provides a sounder safety foundation, whether that translates into actual safety depends on the user's awareness; it may not show up as "safer" at all.
An example makes this easy to see: if an NOA function demands a takeover once every 10 kilometers on average, the driver will almost certainly stay focused, ready to react at any moment. When the average stretches to once every 100 kilometers, will the driver remain vigilant throughout? What about once every 500 kilometers?
Even if the average rate falls to once per thousand kilometers, if the driver is not ready for this particular takeover, the risk is not reduced one bit by the low average.
It must be stressed that the so-called average takeover rate does not mean "roughly one takeover request every xx km": average ≠ evenly spaced. An NOA request for manual takeover is a random event, and the on-road extremes that exceed NOA's capability are more random still, following no rule at all.
Only with an infinite sample, infinite mileage, could the probability of a takeover be considered equal at every moment, which is obviously impossible. An average of "one per xx km" is therefore only an after-the-fact statistical summary; it says nothing about any actual drive.
A system averaging one takeover per 100 kilometers may run for hundreds of kilometers without one, or it may sound the alarm in the next second. That depends not only on objective differences in road conditions but also on unforeseeable disturbances; it is entirely unpredictable.
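To make "average ≠ evenly spaced" concrete, here is a minimal simulation. It assumes, purely for illustration, that takeover requests arrive as a memoryless Poisson process; real requests cluster around road and traffic conditions, so reality is lumpier still:

```python
import random

# Modeling assumption: takeover requests arrive as a Poisson process,
# so gaps between requests are exponentially distributed.
MEAN_KM_BETWEEN_TAKEOVERS = 100  # the advertised "once per 100 km" average

random.seed(42)
gaps_km = [random.expovariate(1 / MEAN_KM_BETWEEN_TAKEOVERS) for _ in range(8)]
for i, gap in enumerate(gaps_km, start=1):
    print(f"takeover {i}: {gap:6.1f} km after the previous one")
# The gaps swing from a few km to several hundred km: an average of
# "once per 100 km" says nothing about when the *next* request arrives.
```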
So a counterintuitive phenomenon becomes imaginable: a highly mature, low-takeover-rate NOA may increase safety, but it may also reduce it. A more mature automated system raises objective safety while simultaneously giving people room to lower their subjective guard.
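One way to see both directions at once is to decompose per-kilometer risk into request frequency multiplied by the chance the driver misses the request. A back-of-the-envelope sketch, with purely illustrative miss probabilities rather than measured data:

```python
# Per-km risk ≈ (takeover requests per km) × P(driver misses the takeover).
# The miss probabilities below are illustrative assumptions only.
vigilant_risk   = (1 / 10)   * 0.001  # request every 10 km, alert driver
complacent_risk = (1 / 1000) * 0.2    # request every 1000 km, disengaged driver

print(f"vigilant:   {vigilant_risk:.4%} missed takeovers per km")
print(f"complacent: {complacent_risk:.4%} missed takeovers per km")
# 0.0100% vs 0.0200%: a 100x lower request rate can still yield *higher*
# risk if complacency raises the miss probability enough.
```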
Videos of NOA suddenly "dropping out" or making dangerous maneuvers are not hard to find online today. If even now, with NOA unreliable and drivers vigilant, takeovers sometimes come too late, then as NOA matures and requests intervention ever less often, won't more drivers simply hand it their trust?
Until NOA evolves into, and is certified as, a fully autonomous system, no individual can ignore the possibility of manual takeover merely because it is infrequent: when the consequences are severe enough, even a population-level probability of 1 in 10,000 becomes 100% for the individual it lands on.
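A quick calculation shows how fast a "1 in 10,000" per-kilometer figure compounds for a single person, assuming a hypothetical 20,000 km of annual driving (both numbers are illustrative):

```python
# P(at least one critical event) over a year of accumulated mileage.
P_PER_KM = 1 / 10_000
KM_PER_YEAR = 20_000

p_at_least_one = 1 - (1 - P_PER_KM) ** KM_PER_YEAR
print(f"P(at least one critical event in a year) = {p_at_least_one:.0%}")
# ~86%: an event that is "1 in 10,000" per kilometer is anything but
# rare once an individual accumulates a year of driving.
```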
"Human beings have never been good at probability animals". In popular narration, people often understand "risk" as "danger", but in fact, risk is more accurately defined as "unknown and unpredictable", and predictable danger is often not risk.
Even a minimal risk deserves redoubled attention when the consequences are too severe to bear. Marketing built around low takeover rates makes it easy to fixate on the "minimal probability" while ignoring whether the severity of the consequences has been reduced (it has not).
This explains why, even if an NOA's average takeover rate falls to once per 1,000 or 10,000 km, automakers will not simply call it an autonomous driving system: a single serious accident would be enough to destroy any carmaker.
The carmakers that have actually hung out the L3 shingle, such as Mercedes-Benz, are very clear and very firm about defining their L3 systems' operational scope and the division of rights and responsibilities. It is not that they fail to see how capable domestic NOA has become, or what the public loves to hear; it is that under L3 the carmaker bears the liability, and they know exactly what that weighs.
First, the system is permitted only on certain highways and under a speed cap (currently below 60 km/h), so even if an accident occurs it is unlikely to exceed the vehicle's passive-safety envelope and cause casualties; whatever can be settled with money is not a real problem.
Second, once L3 is engaged the driver can issue no driving commands at all and is fully isolated from the system, guaranteeing that no erroneous driver intervention can affect safety, and preventing the responsibility of human and vehicle from blurring into something impossible to apportion.
The spread and maturation of navigation assistance, on the one hand, genuinely improves driving comfort and safety, provided risks are flagged and usage is regulated; on the other hand, as coverage expands and takeover requests dwindle, it makes possible many low-probability situations that simply could not arise before.
In the past, when automation was only moderate, any driver naturally kept his basic skills intact. The more sophisticated, mature, and capable NOA of the future may cause drivers to unwittingly lose, or sharply degrade, their ability to drive manually and handle the unexpected.
Carmakers whose NOA really is excellent might therefore consider a kind of "anti-addiction" measure: if it can be confirmed that a driver has used assisted driving continuously for a long stretch (say, tens of thousands of kilometers) without a single manual takeover, require a certain period or mileage of manual driving.
This is not a proposal about concrete details: it is probably too early to discuss implementation, and there are plenty of practical obstacles, such as distinguishing between different drivers without raising privacy issues, or knowing whether a driver has been driving other vehicles.
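Purely as an illustration of the kind of mechanism meant, and under all the caveats above, such a counter might look like the sketch below; every name and threshold is invented, not drawn from any real vehicle software:

```python
from dataclasses import dataclass

@dataclass
class SkillRetentionMonitor:
    """Hypothetical 'anti-addiction' counter for assisted-driving mileage."""
    threshold_km: float = 20_000.0     # assisted km allowed before manual driving is required
    min_manual_stint_km: float = 10.0  # how much manual driving counts as a real stint
    assisted_since_manual_km: float = 0.0

    def log_assisted(self, km: float) -> None:
        # Accumulate assisted-driving mileage since the last manual stint.
        self.assisted_since_manual_km += km

    def log_manual(self, km: float) -> None:
        # A meaningful manual stint resets the counter.
        if km >= self.min_manual_stint_km:
            self.assisted_since_manual_km = 0.0

    def manual_driving_required(self) -> bool:
        # True once the driver has gone too long without driving manually.
        return self.assisted_since_manual_km >= self.threshold_km
```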
Perfectly preventing every driver from "forgetting" how to drive manually as assistance systems' takeover rates keep falling may sound alarmist today, and is thorny in its operational details. But we can at least be certain of the direction and sharpen our awareness:
Until full autonomy truly arrives, we should not sit by and ignore the possibility that intelligent driver assistance erodes human drivers' ability to take over manually. Automated systems bring safety, but they can also bring new dangers through over-dependence, a hazard that has accompanied humanity for a long time and deserves our attention.
This article comes from the WeChat official account autocarweekly (ID: autocarweekly).