35 votes

Human operator pleads guilty in first ever US self-driving pedestrian fatality case

22 comments

  1. [13]
    MimicSquid
    Link

    Prosecutors alleged Vasquez, who was an operator monitoring Uber’s self-driving capabilities, was on her phone watching the talent show The Voice just moments before striking the pedestrian in Tempe, Arizona in 2018. Vasquez’s attorneys took issue with that characterization, though, and claimed she was merely listening to the show and checking company Slack messages on another device.

    Talk about damning with faint praise. Also, the fact that the driver is only getting three years of probation is pretty concerning here.

    33 votes
    1. [12]
      raze2012
      Link Parent

      Also, the fact that the driver is only getting three years of probation is pretty concerning here.

      Doesn't 3 years seem like plenty? They pleaded guilty, there was clearly no malice, and at some point past that probation you are simply throwing someone in the slammer to make an example. There is also the fact that the self-driving functionality should have been able to sense this (it's like, THE primary case you want a self-driving car to address). It of course isn't legal to fall asleep or lapse into neglect at the wheel of a "self driving" vehicle at this stage, but it still bears some thought.

      Also keep in mind that this wasn't some quick trial. They already spent 5 years deliberating and I'm sure this person wasn't exactly allowed to roam free.

      50 votes
      1. [9]
        ackables
        Link Parent

        Well, the whole reason a human is behind the wheel monitoring the vehicle is that someone has decided the self-driving AI needs a human to be liable for its actions. Since that human took responsibility for making sure the AI does not hurt other people or cause property damage, they need to ensure they can prevent that to the best of their abilities.

        Being on a phone is not how they will be at peak performance. It’s similar to how drunk drivers can be charged for the deaths of people they hit: even if the accident would have happened had they been sober, by impairing themselves they made the best-case scenario less likely.

        19 votes
        1. [4]
          Beenrak
          Link Parent

          Yes but why are they in prison? Is it for them to learn or for others to learn

          14 votes
          1. [3]
            ackables
            (edited )
            Link Parent

            Well, probably a bit of both, but that gets more into the philosophy of the criminal justice system.

            Do you think a drunk driver that runs over a pedestrian should go to jail? This article suggests cognitively demanding activities, such as texting, cause drivers to perform similarly to drivers between 0.07 and 0.10 BAC on a simulator testing braking performance and speed regulation. This older article comes to a similar conclusion.

            No matter what the punishment should be, it should at least be comparable to how we treat drunk drivers who kill people or cause property damage.

            Edit: Also they are not even in prison. They only got 3 years of probation which is a pretty light sentence for killing someone while impaired.

            11 votes
            1. [2]
              raze2012
              (edited )
              Link Parent

              I think the difference is 1) knowledge and 2) intent. Simply put, we have more than enough literature and observation to know how a drunk driver behaves, and we know how people get drunk. There is an active choice in both of those actions, and their consequences are well known by society.

              Meanwhile, self-driving cars are still brand new and this was in fact the first such case. And the intent eventually should be that a self-driving car can make proper judgments to not harm pedestrians (in the general case; we don't need to get into all the various AI trolley problems out there). Depending on how it was advertised (I think "self driving" as a whole is false advertising), the car manufacturer should be fined as well. As should Uber, though based on this article there was at least an attempt to hold Uber accountable.

              At the very least I think the punishment should be less severe than vehicular manslaughter, but not completely guilt-free, the way it would be if a suicidal pedestrian jumped in front of your car.

              9 votes
              1. carnage431
                Link Parent

                This IMHO is worse than drunk driving. As far as we know, the decision to be negligent was made sober. As for punishment, I couldn't say. They clearly feel guilty. I don't think I could forgive myself if I were responsible for such a thing.

                8 votes
        2. asciipip
          Link Parent

          But here's my issue with all of this:

          Humans are bad at being vigilant all the time. When they have to be, it can lead to disorders like PTSD that affect their ability to function in society. People are even worse at staying vigilant all the time when most of the time there's nothing that's actively engaging them in the task.

          When driving a car, you're engaged with the world around you.⁰ You have to be looking at where you are; adjusting the car's speed to allow for other vehicles; steering to follow the road, make turns, and avoid obstacles; and so on. When the car is self-driving, you're not as engaged with the driving process and it's harder to maintain focus.

          There are all of these studies about how bad people are at driving that are used as evidence for how much we need self-driving cars. And then the companies developing the self-driving cars want to make humans responsible for those cars, but under circumstances that are far less workable than normal driving.

          I really think the whole practice of making humans responsible for the self-driving cars is largely a gambit on the part of the car manufacturers to avoid liability that has the effect of making the streets less safe and expects more from the humans "at the wheel" than is reasonable.


          ⁰At least nominally. Just being in a car gives the driver some sense of separation from the world outside the vehicle.

          5 votes
        3. [3]
          Good_Apollo
          Link Parent

          I just don’t understand why a car needs to have a sole human operator responsible for liability; we don’t have elevator operators anymore. It’s odd to me that autonomous cars are considered such unprecedented technology that the law is incapable of dealing with them reasonably, and that we feel constrained in their application because of it.

          1. [2]
            ackables
            Link Parent

            Elevators operate on a set path with controlled entry and a suite of sensors monitoring their activity. They used to lack sensors to keep people from getting hurt, and did not even stop automatically at the right floors, so a human operator made sure the riders were safe.

            Self driving cars are not yet at the level of reliable operation that elevators are at, so a human is supposed to monitor and make sure people don’t get hurt.

            5 votes
            1. Good_Apollo
              Link Parent

              That's not really what I was getting at, I'm talking about the widespread panic about self-driving and liability. The pervasive question of "who is liable?" with self-driving cars when we don't seem to have a problem figuring that out with other automated equipment. I mean there are still court battles over it but some people act like it will upend society.

              As always though, most errors are human. The idea that a human observer makes the vehicle safer is sort of laughable to me.

              How many planes have gone down because the pilots thought they knew better than the equipment? Happens all the damn time.

              1 vote
      2. [3]
        Comment deleted by author
        Link Parent
        1. [2]
          godzilla_lives
          Link Parent

          They are avoiding prison time by entering a guilty plea and will serve three years probation.

          1 vote
          1. bioemerl
            Link Parent

            Oh, that's way more reasonable and understandable.

            1 vote
  2. [5]
    reborn
    Link

    I wonder what is the point of self-driving cars if they need to be constantly monitored by a human being?

    15 votes
    1. SleepyGary
      (edited )
      Link Parent

      By industry definitions we are not yet at the point where the vehicles are driving themselves and, despite the marketing, even Tesla's "Full Self-Driving" is considered Level 2.

      23 votes
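      For reference, the SAE J3016 levels the comment alludes to can be sketched roughly like this (the descriptions are paraphrased, not the standard's official wording):

      ```python
      # Paraphrase of the SAE J3016 driving-automation levels (not official wording).
      SAE_LEVELS = {
          0: "No automation: the human does all of the driving",
          1: "Driver assistance: steering OR speed support (e.g. adaptive cruise)",
          2: "Partial automation: steering AND speed support; human supervises at all times",
          3: "Conditional automation: system drives in limited conditions; human takes over on request",
          4: "High automation: no human fallback needed, but only within a defined domain",
          5: "Full automation: can drive anywhere a human could",
      }

      def requires_constant_supervision(level: int) -> bool:
          """At Level 2 and below, a human must monitor the road continuously."""
          return level <= 2
      ```

      By that cut-off, a Level 2 system like the one described above still legally needs an attentive human, which is the crux of this whole case.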
    2. skybrian
      Link Parent

      There are different kinds. Some are more like super cruise control and others (like Waymo) really do drive without a safety driver. But only in certain places, so far.

      In this case, though, it was a test. The safety driver was there because the technology was still experimental. I don’t think it’s any different than a train engineer not paying attention, even though there’s little to do most of the time.

      9 votes
    3. Maxi
      Link Parent

      The common reason - so rich people can earn more money by paying less and offloading risk and responsibilities.

      8 votes
    4. blivet
      Link Parent

      Yeah, if I have to maintain constant awareness of what is going on around my vehicle in case I might have to take over, I’d rather just drive it myself.

      2 votes
  3. zazowoo
    Link

    This is only somewhat related, but I recently read this write-up about the current state of the self-driving car landscape and found it well-written and pretty interesting/enlightening: https://www.understandingai.org/p/the-death-of-self-driving-cars-is

    2 votes
  4. [4]
    Comment removed by site admin
    Link
    1. [3]
      ackables
      Link Parent

      Actually, Uber hired this woman to prevent the car from running over pedestrians. Unless they were requiring her to use Slack while the vehicle was moving, or knew about her watching TV while the vehicle was moving, they shouldn’t be at fault.

      20 votes
      1. [2]
        Comment deleted by author
        Link Parent
        1. ackables
          Link Parent

          I just don’t think you can say this is Uber’s fault. The NTSB is right to release a report evaluating what went wrong with the software so the car can improve, but Uber never claimed their system was complete. The human is there to make up for the shortcomings of the software. Uber did the right thing by placing a human in the vehicle who should have been ready to stop it as soon as it entered a dangerous situation.

          If Uber released their self driving car with no human in the driver’s seat to monitor the vehicle that couldn’t detect unexpected pedestrians, then I would be upset. This is the fault of the driver who did not monitor the vehicle properly.

          5 votes
      2. [2]
        Comment removed by site admin
        Link Parent
        1. Pistos
          Link Parent

          Perhaps Uber should have been held liable for not doing more to prevent what happened. Yes, an operator was put in the vehicle, but maybe they could have added technology to watch for the driver not paying attention, like gaze tracking or eyelid-closure detection. Perhaps prohibit operators from using a phone at all while the vehicle is in motion; perhaps phone-blocking tech, or an app on the phone to do that blocking; perhaps no phones at all while on duty (except perhaps hands-free calling).

          5 votes
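          Eyelid-closure monitoring of the kind suggested here is often built on the PERCLOS metric: the fraction of time, over a rolling window, that the eyes are mostly closed. A minimal sketch, assuming some upstream vision system supplies a per-frame eye-openness score (the class name and thresholds are illustrative, not taken from any real product):

          ```python
          from collections import deque

          class PerclosMonitor:
              """Rolling PERCLOS estimate: fraction of recent frames with eyes mostly closed.
              All thresholds here are illustrative, not calibrated values."""

              def __init__(self, window_frames=900, closed_threshold=0.2, alert_level=0.15):
                  self.window = deque(maxlen=window_frames)   # e.g. 30 s at 30 fps
                  self.closed_threshold = closed_threshold    # openness below this counts as "closed"
                  self.alert_level = alert_level              # PERCLOS above this triggers an alert

              def update(self, eye_openness: float) -> bool:
                  """Feed one frame's eye-openness (0.0 = shut, 1.0 = wide open);
                  return True if the driver should be alerted."""
                  self.window.append(eye_openness < self.closed_threshold)
                  perclos = sum(self.window) / len(self.window)
                  return perclos > self.alert_level
          ```

          Systems like this only catch drowsiness or closed eyes, though; gaze tracking would still be needed to catch a driver who is wide awake but looking at a phone.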