13 votes

Why emergency braking systems sometimes hit parked cars and lane dividers: Recent Tesla autopilot crashes hold a lesson for the whole industry

9 comments

  1. [3]
    Rocket_Man
    Link

    This article brings up a point that I've noticed for a little while. The public doesn't seem to understand the vast difference between the sort of self-driving technology developed by Google and the low sophistication of the systems used by Tesla, Uber, and others providing "driving assistance" features. I believe these features are actually incredibly irresponsible because their limitations aren't clearly stated and only become apparent when accidents happen and people die. They also ignore what human psychology tells us about how people interact with systems like this: if a system performs optimally 98% of the time, people will not be ready for the failure event. It's very unfortunate; people will die, and the reputation of self-driving cars as safe will suffer.

    8 votes
    1. [2]
      Flashynuff
      Link Parent

      the low-sophistication of the systems used by Tesla, Uber, and others providing "Driving Assistance" features

      It doesn't help that Tesla calls their system "Autopilot". It's wildly irresponsible to give it a name like that if it really is only a driving assistance system.

      8 votes
      1. flip
        Link Parent

        And, as Rocket_Man mentioned, ignoring human nature when it comes to not following clearly stated rules. All these cars come with "keep your hands on the wheel at all times", but that gets ignored. Until...

        2 votes
  2. DonQuixote
    Link

    This article provides important information about driver assist that everyone should read. It also makes me realize that true self-driving cars won't be here for a while. In a practical demonstration of human error like this one, who is at fault? The driver, the designers (and which ones), or the corporation? How would a court decide? How are insurance companies going to cover such oversights? Even if statistically these failures could be whittled down to a small source of fatal accidents, how would the law rule on fault?

    I'm thrilled that so much progress has been made, but unfortunately it seems that many corrections are going to have to be made in hindsight, like this one. Driver-assist cars aren't designed to brake for stationary objects when the speed is over 50 mph. Really?

    5 votes
  3. [5]
    Flashynuff
    Link

    This article was a great behind-the-scenes look of sorts at how automatic driving systems actually work. I for one was surprised to find out that most systems operate as a collection of independent components!

    Early driver assistance systems assumed that the driver could monitor the car and intervene if the car made a mistake. But a driver's ability to monitor a car's progress depends crucially on reflexes built up over years of driving. Those reflexes depend on cars behaving in consistent and predictable ways: for example, that if you take your eyes off the road for a couple of seconds, the car will continue traveling in the same direction.

    Once a driver-assistance system reaches a certain level of complexity, the assumption that it's safest for the system to do nothing no longer makes sense. Complex driver assistance systems can behave in ways that surprise and confuse drivers, leading to deadly accidents if the driver's attention wavers for just a few seconds. At the same time, by handling most situations competently, these systems can lull drivers into a false sense of security and cause them to pay less careful attention to the road.

    This is something that worries me about the inevitable adoption of self-driving cars. Most people who are good at driving are good because they have driven so much that most actions they take are simply reflex. What's going to happen when people no longer have that practice and suddenly need to take control of the car in an emergency situation?

    4 votes
    1. [4]
      PapaNachos
      (edited )
      Link Parent

      It's not just reflexes, when you're driving you're aware of the road conditions and can respond to changes in them. Switching your attention back to the task takes a long time. The car has to identify a problem, you have to stop what you were doing (put down the book or whatever), then start paying attention to what's happening on the road and finally, actually avoid the problem. All of that has to occur before whatever is actually causing the problem happens.

      Those delays are why Level 3 (Humans take over in emergencies) systems are so dangerous. The transition is incredibly difficult to pull off safely.
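      A rough back-of-the-envelope sketch of why those stacked delays are so dangerous (every number here is my own illustrative assumption, not a figure from the article or any study):

      ```python
      # Toy model: how far a car travels while a distracted driver takes back
      # control. The individual delay values are hypothetical, chosen only to
      # illustrate how quickly the stages add up.

      def takeover_distance(speed_mph, delays_s):
          """Distance in meters covered during the summed takeover delays."""
          speed_ms = speed_mph * 0.44704  # convert mph to m/s
          return speed_ms * sum(delays_s)

      # Assumed stages: notice the alert, put down the book, rebuild
      # situational awareness, then actually steer or brake.
      delays = [1.0, 1.5, 4.0, 1.5]  # seconds each, illustrative only

      distance = takeover_distance(70, delays)
      print(f"{distance:.0f} m")  # roughly 250 m at 70 mph over 8 s
      ```

      Even with these fairly generous assumptions, the car covers a couple of football fields before the driver is actually responding, which is the whole window the Level 3 handoff has to fit inside.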

      Edit: Full disclosure, I'm an automotive engineer, but this is not my area of specialty.

      3 votes
      1. [3]
        flip
        Link Parent

        The Air France flight that crashed over the Atlantic being a very good example of this, and in a much less immediate situation than a split second decision behind the wheel requires.

        1. [2]
          PapaNachos
          Link Parent

          I'm not familiar with that specific example, but as I understand it there are major differences between autopilot on a plane and 'autopilot' in a car: 1) there is a lot less stuff in the sky, and 2) pilots are specifically trained in the use of autopilot and its limitations.

          With cars you have a much more chaotic situation with a much less trained operator. This just compounds the risk.

          3 votes
          1. flip
            Link Parent

            Exactly. And that was my point (which I failed to make, due to not explaining anything).

            If autopilot on planes can be trouble once the pilots retake control without really being aware of what's going on, imagine that scenario but with a truck coming your way. The chance of it working out is less than good, I'd say...

            1 vote