28 votes

GM's Cruise recalling 950 driverless cars after pedestrian dragged in US crash

12 comments

  1. [8]
    mattw2121
    Link
    Yeah, I can see that having the Cruise pull over to the side after ANY crash is problematic. The car is not going to be able to assess every single thing that is happening after a crash. In particular, it may not be able to detect whether something broke in the crash. Obviously it can detect many components being broken or in an undesirable state, but something like a human or bicycle now lodged in its front grille may be undetectable.

    It should have a failure mode that says if (crash) then (stop moving). It may be an inconvenience to the flow of traffic, but it will be a lot safer.
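
    To make that concrete, here's a rough sketch (in Python) of the kind of latch I mean. The class and method names are purely hypothetical, not anything from Cruise's actual software; the point is just that any detected collision forces the car into a stopped state that only a human can clear:

        from enum import Enum, auto

        class DriveState(Enum):
            NORMAL = auto()
            EMERGENCY_STOP = auto()  # latched after any detected collision

        class PostCollisionFailsafe:
            """Hypothetical sketch: once a collision is detected, the vehicle
            stops and stays stopped until a human explicitly clears it."""

            def __init__(self) -> None:
                self.state = DriveState.NORMAL

            def on_collision_detected(self) -> None:
                # Any impact latches the stop: the car can't assume it knows
                # what (or who) it hit, or what is now lodged underneath it.
                self.state = DriveState.EMERGENCY_STOP

            def motion_allowed(self) -> bool:
                return self.state is DriveState.NORMAL

            def cleared_by_human(self) -> None:
                # Only a remote operator or a technician on scene re-enables motion.
                self.state = DriveState.NORMAL

    In other words, "pull over" would never be an automatic post-crash option; the default is to hold position and hand the situation to a person.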

    10 votes
    1. [7]
      GunnarRunnar
      Link Parent
      That sounds so obvious, but also, if that is the best solution, maybe these cars aren't ready for primetime.

      10 votes
      1. [2]
        scroll_lock
        Link Parent
        I'm sure this is a difficult problem to model and I cannot speak to the intricacies of software engineering, but as with most transportation technology, safety measures for automobiles are written in the blood of pedestrians. In this case, if the hit-and-run driver was going fast enough to fling a pedestrian into another lane of traffic, that probably speaks to an unnecessarily car-centric roadway design whose layout encourages higher speeds than are safe. Driverless car or not, this could possibly have been avoided with better infrastructure.

        Driverless trains have operated effectively for years on many transit systems around the world. This is possible because a train track is an extremely predictable space when fully grade-separated from auto traffic. Streets just aren't the same. They have so much more going on and so many Vulnerable Road Users (VRUs).

        It would not surprise me if statistically these cars were already safer than human-operated ones in well-defined spaces. It's not like they can become intoxicated. But the unpredictability of the streetscape is another matter, in particular its inherently dynamic nature: people, vehicles, obstructions, and surfaces are always changing, faster than a model incapable of understanding context can necessarily be trained to "see" them.

        Short of a general artificial intelligence, which could be centuries away (or impossible), I am hesitant to physically remove interior steering wheels, and human drivers, from vehicles that have the capacity to cause a lot of human casualties, even if they end up being automatically operated most of the time.

        11 votes
        1. redwall_hp
          (edited )
          Link Parent
          Speaking as a software engineer: entrusting lives to software is lunacy, unless it's used sparingly and built in an almost infuriatingly painstaking manner, as at NASA or in medical devices. If you knew what the sausage is and how it's made, you wouldn't want to be near two tons of mobile metal controlled by it. Beta testing deadly machines on an unconsenting public is supremely unethical and irresponsible, and at this point the software to handle it is a fairy tale they're hoping most people will be taken in by.

          It's rampant behavior like this that makes me think software engineering (outside of genuine hobby activity) should be professionally licensed, like civil engineering or medicine. There should be strict ethical requirements, malpractice insurance and the threat of being barred from practicing in egregious cases. In a scenario like that, there would not be any autonomous vehicle projects on public roads.

          14 votes
      2. [4]
        unkz
        Link Parent
        What would an alternative solution even be, though? After an accident, when its sensors are likely unreliable, can there be an even theoretically better response than stopping everything?

        2 votes
        1. DiggWasCool
          Link Parent
          That sometimes happens with cars operated by humans. Plenty of cars die on the road, or accidents happen and cars flip over and can't be moved, and then they have to be pushed or towed away. What do we do in those situations? We don't look for solutions that prevent cars from dying, and we don't make all cars impossible to flip over, do we?

          When I was much younger and driving an old beater, my car once died on the highway. I was lucky enough that the vehicle didn't come to a sudden stop; it kept going but slowed down, and I was even luckier that the people behind me realized something was off and slowed down too. My car ended up completely dead in the middle lane of a highway. There were cars going 60-80 on both sides of me, so I couldn't get out to push it out of the way. I had to wait for the police to come and block one of the lanes so I could move my car out of the way and eventually have it towed.

          Obviously this isn't comparable to someone being dragged to death, but it is comparable in the sense that my car, operated by me (and I am most certainly a human), stopped working in the middle of a busy highway and couldn't be moved without the help of other humans and the highway being blocked off by police cars.

          3 votes
        2. [2]
          GunnarRunnar
          Link Parent
          Don't things like airplanes and ships have backup systems upon backup systems?

          1. unkz
            (edited )
            Link Parent
            That doesn’t sound very comparable; they have a human who takes control in emergencies, don’t they?

  2. [3]
    BoomerTheMoose
    Link
    My sympathies to the person who was struck. I notice the article mentions that they were struck by a hit-and-run driver and thrown into the path of the driverless vehicle.

    Reading this makes me wonder:

    Robot car hits person = Halt production! Recall vehicles! Stop everything, back to the drawing board!

    Human driven car hits person = 🤷 too bad I guess!

    I'm not sure what to make of it. Shouldn't human drivers hitting people be similarly scrutinized?

    6 votes
    1. [2]
      unkz
      Link Parent
      We throw humans who do this kind of thing in jail, though. So if we are supposed to treat them similarly, shouldn’t we be prosecuting the companies? Sounds to me like the companies are getting much easier treatment.

      3 votes
      1. BoomerTheMoose
        Link Parent
        That's a fair point. Was the culprit of the initial hit-and-run apprehended?

        Please don't mistake my post for a defense of the self-driving car company; I'm just raising the question in general.

        Cars are extremely dangerous things; I feel slight anxiety every time I get behind the wheel. I feel like a lot of people in our society forget that.

        2 votes
  3. Eji1700
    Link
    The whole article is worth reading, but the main meat is probably this:

    The cars are being recalled because the collision detection subsystem of the Cruise Automated Driving Systems (ADS) software may respond improperly after a crash, according to a notice made public by the National Highway Traffic Safety Administration (NHTSA) Wednesday.

    The recall is the latest setback for GM's Cruise unit, which faces growing questions about its technology that GM says is key to its growth plans.

    GM Chief Executive Mary Barra reiterated in June a forecast Cruise could generate $50 billion in revenue by 2030. Cruise lost more than $700 million in the third quarter of this year.

    GM shares fell 1.6% to $27.95 on Wednesday.

    Last month, a pedestrian in San Francisco was struck by a hit-and-run driver, thrown into an adjacent lane, and hit a second time by a Cruise robotaxi that was not able to stop in time and then dragged the pedestrian.

    The recall addresses circumstances when the software may cause the Cruise AV to attempt to pull over out of traffic instead of remaining stationary "when a pullover is not the desired post-collision response," Cruise said.

    5 votes