Why emergency braking systems sometimes hit parked cars and lane dividers: Recent Tesla autopilot crashes hold a lesson for the whole industry
- Authors: Timothy B. Lee
- Published: Jun 8, 2018
- Word count: 1983 words
This article brings up a point I've noticed for a little while. The public doesn't seem to understand the vast difference between the sort of self-driving technology developed by Google and the far less sophisticated systems used by Tesla, Uber, and others to provide "driver assistance" features. I believe these features are actually incredibly irresponsible, because their limitations aren't clearly stated and only become apparent when accidents happen and people die. They also ignore what we know about human psychology around systems like this: if a system performs optimally 98% of the time, people will not be ready for the failure event. It's very unfortunate; people will die, and the reputation of self-driving cars as safe will suffer.
It doesn't help that Tesla calls their system "Autopilot". It's wildly irresponsible to give it a name like that if it really is only a driving assistance system.
And, as Rocket_Man mentioned, they ignore human nature when it comes to not following clearly stated rules. All these cars come with "keep your hands on the wheel at all times", but that gets ignored. Until...
This article provides important information about driver assistance that everyone should read. It also makes me realize that true self-driving cars won't be here for a while. When human error like this leads to a crash, who is at fault? The driver, the designers (and which ones), or the corporation? How would a court decide? How are insurance companies going to cover such oversights? Even if, statistically, these systems could be whittled down to a small source of fatal accidents, how would the law rule on fault?
I'm thrilled that so much progress has been made, but unfortunately it seems that many corrections are going to have to be made in hindsight, like this one. Driver-assist cars aren't designed to brake for stationary objects when the speed is over 50 mph. Really?
This article was a great behind-the-scenes look of sorts at how automatic driving systems actually work. I for one was surprised to find out that most systems operate as a collection of independent components!
This is something that worries me about the inevitable adoption of self-driving cars. Most people who are good at driving are good because they have driven so much that most actions they take are simply reflex. What's going to happen when people no longer have that practice and suddenly need to take control of the car in an emergency situation?
It's not just reflexes; when you're driving, you're aware of the road conditions and can respond to changes in them. Switching your attention back to the task takes a long time: the car has to identify a problem, you have to stop what you were doing (put down the book or whatever), then start paying attention to what's happening on the road, and finally actually avoid the problem. All of that has to occur before whatever is causing the problem happens.
Those delays are why Level 3 systems (where humans take over in emergencies) are so dangerous. The transition is incredibly difficult to pull off safely.
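The handover chain described above can be sketched as a back-of-the-envelope sum. The stage names follow the comment; the durations and the highway speed are purely illustrative assumptions of mine, not measured values, just to show how quickly the delays add up:

```python
# Illustrative takeover timeline. Stage durations are hypothetical
# assumptions, not measurements from any real system or study.
takeover_stages_s = {
    "system detects problem and alerts driver": 1.0,
    "driver disengages from other task": 1.5,
    "driver rebuilds awareness of the road situation": 3.0,
    "driver plans and executes an avoidance maneuver": 1.5,
}

total_s = sum(takeover_stages_s.values())
print(f"Total handover time: {total_s:.1f} s")

# At an assumed highway speed of 30 m/s (roughly 65 mph), the distance
# the car covers before the human is actually back in control:
speed_m_s = 30.0
print(f"Distance travelled during handover: {total_s * speed_m_s:.0f} m")
```

Even with these generous made-up numbers, the car travels the length of two football fields before the driver can act, which is the core of the Level 3 problem.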
Edit: Full disclosure, I'm an automotive engineer, but this is not my area of specialty.
The Air France flight that crashed over the Atlantic (AF447) is a very good example of this, and in a much less immediate situation than the split-second decisions behind the wheel require.
I'm not familiar with that specific example, but as I understand it there are major differences between autopilot on a plane and 'autopilot' in a car: 1) there is a lot less stuff in the sky, and 2) pilots are specifically trained in the use of autopilot and its limitations.
With cars you have a much more chaotic situation with a much less trained operator. This just compounds the risk.
Exactly. And that was my point (which I failed to make, due to not explaining anything).
If autopilot on planes can cause trouble once the pilots retake control and aren't really aware of what's going on, imagine that scenario with a truck coming your way. The chance of it working out is less than good, I'd say...