35 votes

Tesla braces for its first trial involving Autopilot US fatality

10 comments

  1. [8]
    Eji1700

    I'd like to point out the main "how the fuck did we get here" part of all this:

    > Tesla denied liability for both accidents, blamed driver error and said Autopilot is safe when monitored by humans. Tesla said in court documents that drivers must pay attention to the road and keep their hands on the steering wheel.

    This is what's going to be constantly argued about. First off, I think even if this were a sane position, Tesla should eat shit on this case because they market Autopilot in a way that encourages dangerous behavior.

    That said, this is, and always will be, a failed concept. Even ignoring whether there was enough time to react (as I suspect in many cases we'll find there wasn't), the whole model assumes the driver is paying attention. There are 2 pilots in most planes because it's very, very, very well documented that just one is going to get bored, zone out, and not pay close enough attention, and that's in situations where you can have 10+ seconds to react.

    Car accidents give you less time to react, yet the gaps between incidents are still long enough to slip into that "you're not really paying attention" dead zone. Humans are terrible with infrequent events and are easily lulled into complacency when they shouldn't be, and THOSE ARE THE GOOD ONES WHO TRAIN FOR IT. Not the everyday slobs you're already pissed at for not paying attention.

    I sincerely believe that we're eventually going to find fudged numbers and bad data on this stuff, and it's going to turn out to be on average more dangerous than a normal driver.

    23 votes
    1. [6]
      devilized

      I couldn't find a reliable statistic on this (other than Tesla's own claims). But what are the actual accident rates of cars on Autopilot compared to cars controlled by humans? Tesla apparently claims that Autopilot has 0.18 accidents per million miles, compared to 1.53 on average (source). But given reports that Tesla tries to keep itself out of accident statistics, I'm curious what the real number is.

      I'm not convinced that Autopilot is more dangerous than a normal driver, but I suspect that the overall numbers aren't as rosy as they're claiming right now. Reaction time for humans is much worse than a computer's.
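
      For a sense of scale, here is the arithmetic on those claimed figures, plus a toy adjustment for road mix. The 0.18 and 1.53 are the claims quoted above; the highway/city split is invented purely to illustrate why the raw comparison can mislead (Autopilot miles skew toward easy highway driving):

      ```python
      # Claimed rates (accidents per million miles), from the comment above.
      autopilot_rate = 0.18  # Tesla's claim for Autopilot
      human_rate = 1.53      # Tesla's claim for the average driver

      print(f"Raw ratio: humans look {human_rate / autopilot_rate:.1f}x worse")

      # Hypothetical split of the human rate by road type (invented numbers):
      human_highway_rate = 0.5
      human_city_rate = 2.5

      # A fairer comparison is against humans on the same easy roads:
      print(f"Like-for-like: humans look {human_highway_rate / autopilot_rate:.1f}x worse")
      ```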

      10 votes
      1. [3]
        Eji1700

        Tesla's claims are highly suspect, not just from being a biased source, but because it's not comparing apples to apples.

        Their argument in this case, "oh, it's safe when a human is paying attention," implies that they are likely throwing out any data where they deemed the human "didn't pay attention." If their system increases the number of humans not paying attention, that's the sort of thing that will ramp up the more of them you have on the road, since right now other drivers are paying attention and "adjusting" for any autopilot failures when possible.

        If you hit the saturation point where it's autopilot next to autopilot, I think that's when we'll see a lot more accidents.

        Further, last time I looked into this, the stats Tesla cites for human drivers include accidents in situations where Autopilot literally cannot be used: parking lots, construction zones, even some residential streets.

        As for where to find good stats... well, that's the concerning part. I've never found any. I haven't seen a government study handled by third parties with actual controls and mass data. Basically everything I've seen falls under "suspect," if not straight-up marketing.
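
        To make the selection-bias worry concrete, here's a toy calculation (every number invented; only the mechanism matters). If crashes written off as "driver inattention" are dropped from the Autopilot column while all crashes count against human drivers, the headline rate improves without the system getting any safer:

        ```python
        # All numbers invented for illustration.
        autopilot_miles = 100.0   # millions of miles driven on Autopilot
        autopilot_crashes = 60    # every crash that occurred on Autopilot
        blamed_on_driver = 45     # of those, excluded as "driver inattention"

        true_rate = autopilot_crashes / autopilot_miles
        reported_rate = (autopilot_crashes - blamed_on_driver) / autopilot_miles

        print(f"True rate:     {true_rate:.2f} crashes per million miles")     # 0.60
        print(f"Reported rate: {reported_rate:.2f} crashes per million miles") # 0.15
        # Same system, same crashes, a 4x rosier headline number.
        ```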

        20 votes
        1. [2]
          devilized

          > If you hit the saturation point where it's autopilot next to autopilot, I think that's when we'll see a lot more accidents.

          I'm curious about this view. Part of what makes autonomous driving such a difficult problem to solve is that you have to program for a huge number of potential actions that a human might take in any given situation. If the car is driven by a computer, its possible actions are reduced to exactly whatever it is programmed to do. Wouldn't that make it easier for computers to anticipate what another vehicle will do in a specific situation? It would be even better if these cars could communicate with each other or have some kind of central command/control.

          5 votes
          1. Eji1700

            > It would be even better if these cars could communicate with each other or have some kind of central command/control.

            I'm starting with this because yes, if they were all networked it would be vastly safer. It would also take tremendous infrastructure to do right, and you're going to need Ford's autopilot to talk to Tesla's autopilot and so on.

            As for the rest of it, you are vastly underestimating the complexity and how these systems are designed. Because they CAN'T account for everything, they don't; instead they "react" to everything. The problem is when they don't react because they don't see a problem even though there is one, or can't react because by the time the problem occurs it's too late.

            So in short, the kind of system you would build if EVERYONE was on it (which, uh... we're dangerously close to trains and light rail again at that point) is different from the ground up and not relevant. The kind of system they actually have sees almost no benefit from having other cars like it on the road and, worse, might be more dangerous: if one autopilot system doesn't understand the threat, it's likely the other doesn't either.
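
            A crude sketch of that "react, don't anticipate" structure (everything here is invented for illustration; real stacks are enormously more complex). The point is just that the control loop can only act on hazards perception actually reports, so two cars running the same stack share the same blind spots:

            ```python
            KNOWN_HAZARDS = {"car", "pedestrian", "cyclist"}  # the design envelope

            def perceive(scene):
                """Report only the hazard types the system was designed to detect."""
                return scene & KNOWN_HAZARDS

            def react(hazards):
                return "brake" if hazards else "continue"

            print(react(perceive({"car", "debris"})))  # "brake", but only for the car
            print(react(perceive({"debris"})))         # "continue": unseen hazard, no reaction,
                                                       # and an identical car behind it does the same
            ```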

            6 votes
      2. [2]
        Comment deleted by author
        1. devilized

          I agree, context is very important here. There are a lot of factors (severity of the accidents, injuries, the party at fault, etc.) that are needed to compare the overall safety metrics at the various levels of computer-assisted driving.

          4 votes
      3. first-must-burn

        > Reaction time for humans is much worse than a computer's.

        This is true, but misleading. Humans are very good at recognizing that a situation is out of the ordinary and adapting to it. A computer will (very quickly) do something catastrophically bad simply because the situation departs from anything it was designed to handle and it has no way to tell that the situation is anomalous.
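
        Back-of-envelope on the reaction-time point, using typical textbook figures rather than measurements of any real system:

        ```python
        speed_mps = 30.0           # roughly 67 mph
        human_reaction_s = 1.5     # common traffic-engineering planning value
        computer_reaction_s = 0.1  # hypothetical sensing-to-braking latency

        for who, t in [("human", human_reaction_s), ("computer", computer_reaction_s)]:
            print(f"{who}: travels {speed_mps * t:.0f} m before braking even starts")

        # The computer wins, but only when it recognizes the hazard at all.
        # Outside its design envelope, its effective reaction time is infinite,
        # while the human's stays around 1.5 s even for novel situations.
        ```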

        As for Tesla Autopilot, saying "the human driver is responsible for safety" is a dodge that lets them beta test unproven software on public roads while pushing the risk onto people who get no benefit (financial or otherwise) from what Tesla develops, as well as shifting the liability risk to the driver.

        3 votes
    2. itdepends

      The whole "safe if monitored by humans" spiel is so ridiculous, in my opinion, that I don't understand why people don't immediately call Tesla out on it.

      Monitoring an autopilot as suggested, ready to take action, hands on the wheel, is likely much more mentally draining than simply driving the damn car. So what even is the point of the "autopilot"? We already have lane assist, adaptive cruise control, and automated emergency braking if we just want a little extra help and comfort on long journeys. So what is Tesla's autopilot offering if you have to sit at attention, constantly trying to figure out whether it's changing lanes normally or about to careen into a semi?

      5 votes
  2. first-must-burn

    If anyone wants a good expert take on the safety of autonomous vehicles, Phil Koopman's work is excellent at describing the nuances of the challenges to autonomous vehicles in an approachable way. You can start with the publicly available resources and commentary on his LinkedIn or his YouTube channel. His book, How Safe Is Safe Enough?, is also a great read.

    Full disclosure: I am a friend and longtime colleague of Dr. Koopman's, but it is my firm belief that he is out there fighting the good fight when it comes to safety for autonomous vehicles.

    6 votes
  3. riQQ

    > Tesla Inc (TSLA.O) is set to defend itself for the first time at trial against allegations that failure of its Autopilot driver assistant feature led to death, in what will likely be a major test of Chief Executive Elon Musk's assertions about the technology.

    > Self-driving capability is central to Tesla’s financial future, according to Musk, whose own reputation as an engineering leader is being challenged with allegations by plaintiffs in one of two lawsuits that he personally leads the group behind technology that failed. Wins by Tesla could raise confidence and sales for the software, which costs up to $15,000 per vehicle.

    > Tesla faces two trials in quick succession, with more to follow.

    5 votes