17 votes

Hackers can trick a Tesla into accelerating by fifty miles per hour

10 comments

  1. [8]
    teaearlgraycold
    Link

    > The modified speed limit sign reads as 85 on the Tesla's heads-up display. A Mobileye spokesperson downplayed the research by suggesting this sign would fool a human into reading 85 as well.

    Even if this were true, which it isn't, almost no one would drive 85 mph in a zone that actually has a speed limit of 35 mph. You can pretty easily guess the intended speed limit of a road by context. Are you going down a sub-highway used to access commercial districts? You probably want to go less than 40 mph. One random 85 mph sign wouldn't get me to floor the gas pedal.

    Obviously Tesla needs its cars to consume information from multiple sources. What does Google Maps think the speed limit is? What speed are other cars driving? What do the road signs say? What kind of road is this? Basically, they need to replicate a human driver's intuition: does it make sense for this road to have the speed limit it reports?
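    The multi-source idea above can be sketched in a few lines. This is purely illustrative (not anything Tesla actually does); the function name, the 15 mph disagreement threshold, and the per-road-class cap are all invented for the example. The point is that a single tampered sign gets outvoted by the other sources:

    ```python
    # Hypothetical sketch: fuse several independent estimates of a road's
    # speed limit instead of trusting one sign alone. All names and
    # thresholds are invented for illustration.

    def plausible_speed_limit(sign_mph, map_mph, traffic_median_mph, road_class_max_mph):
        """Return a speed limit only if the sources roughly agree."""
        candidates = [c for c in (sign_mph, map_mph, traffic_median_mph) if c is not None]
        consensus = sorted(candidates)[len(candidates) // 2]  # median of available sources
        # Reject a sign reading that wildly disagrees with the other sources
        # or exceeds what this class of road could plausibly allow.
        if sign_mph is not None and (abs(sign_mph - consensus) > 15
                                     or sign_mph > road_class_max_mph):
            return min(consensus, road_class_max_mph)
        return min(sign_mph if sign_mph is not None else consensus, road_class_max_mph)

    # A tampered "85" sign on a 35 mph commercial road gets voted down:
    print(plausible_speed_limit(sign_mph=85, map_mph=35,
                                traffic_median_mph=38, road_class_max_mph=45))  # -> 38
    ```

    A real system would weight sources by reliability and handle stale map data, but even this crude median vote neutralizes a single spoofed input.
    
    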

    14 votes
    1. [7]
      vord
      Link Parent

      > Basically, they need to replicate a human driver's intuition

      And that's why self-driving cars are nowhere near ready for prime time. We might be able to brute force vision and mechanics, but not so much awareness, which is so much more.

      7 votes
      1. [4]
        teaearlgraycold
        Link Parent

        It depends on how you look at it. Is it wrong to green light self-driving cars if they make mistakes that no human would make, but they also prevent accidents enough to counteract those mistakes? As a responsible driver, I might not be comfortable driving a Tesla at the moment. But I'm not opposed to people opting into a car that can't drive drunk.

        16 votes
        1. [3]
          vord
          Link Parent

          I'd rather be able to hold a drunk driver accountable than Tesla.

          At least then I have a chance of getting it resolved within a year, instead of having to fight Tesla's army of lawyers in the arbitration that was a condition of the purchase.

          1 vote
          1. RNG
            (edited)
            Link Parent

            > I'd rather be able to hold a drunk driver accountable than Tesla.

            I think the accountability of self-driving car vendors in accidents is an interesting dilemma, but does the necessity of solving it outweigh the higher-order good of reducing vehicle fatalities?

            If it is sufficiently demonstrated that an AI driver is across the board less likely to kill its occupants and other drivers/pedestrians, it seems there's a moral duty on society to green light this tech as quickly as possible. Any alternative is something you pay for in lives.

            (To be clear, this is not intended to say that further work shouldn't be put in to secure these systems, just that the presence of such a life-saving technology creates ethical obligations.)

            Edit: Grammar

            4 votes
          2. teaearlgraycold
            Link Parent

            Has this not already been handled in court? I would expect the first few cases to set a precedent, and after that insurance claims won't be any worse than they are now.

            In America I would guess that Tesla will successfully lobby the courts into leaving all accountability on the car owner.

            2 votes
      2. [2]
        babypuncher
        Link Parent

        All that matters for the purpose of adoption is that the self driving cars have a better safety record than humans. It doesn't matter if they aren't as good as humans in some areas as long as they are still better than us in others.

        8 votes
        1. petrichor
          Link Parent

          Self-driving cars are going to need a much better safety record than humans, because the manufacturers take the blame for accidents. You can already see this happening with Tesla's Autopilot feature.

          8 votes
  2. [2]
    Autoxidation
    Link

    This article isn't really accurate. The cars don't try to drive the speed limit on their own; the driver sets an upper limit, and the car only honors a posted limit when it is below that setting. Currently, Tesla vehicles won't drive more than 5 MPH over the posted speed limit on a non-divided road. If the speed limit on a nearby road is 35 MPH, the car won't allow the driver to set the speed beyond 40 MPH.
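    The cap described above amounts to a simple clamp. A minimal sketch (not Tesla's actual code; the function name and the divided-road exemption flag are assumptions for illustration):

    ```python
    # Sketch of the described behavior: on a non-divided road, the driver's
    # requested set speed is clamped to at most 5 mph over the posted limit.
    # Not actual Tesla logic; names and the divided_road flag are invented.

    def allowed_set_speed(requested_mph, posted_limit_mph, divided_road=False):
        if divided_road:
            return requested_mph  # the 5 mph cap applies to non-divided roads
        return min(requested_mph, posted_limit_mph + 5)

    print(allowed_set_speed(55, 35))  # driver requests 55 in a 35 zone -> 40
    ```

    Under this behavior, a spoofed "85" sign raises the ceiling rather than the actual speed: the car still only goes as fast as the driver's own setting.
    
    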

    4 votes
    1. skybrian
      Link Parent

      Clicking through to McAfee's article, it seems the attack only worked with an earlier version of the software:

      > Of note is that all these findings were tested against earlier versions (Tesla hardware pack 1, mobilEye version EyeQ3) of the MobilEye camera platform. We did get access to a 2020 vehicle implementing the latest version of the MobilEye camera and were pleased to see it did not appear to be susceptible to this attack vector or misclassification, though our testing was very limited.

      Also, it's apparently not a very practical attack:

      > Is there a feasible scenario where an adversary could leverage this type of an attack to cause harm? Yes, but in reality, this work is highly academic at this time.

      4 votes