62 votes

Tesla influencers tried Elon Musk’s coast-to-coast self-driving, crashed before sixty miles

23 comments

  1. [10]
    brews_hairy_cats
    Link

    In 2016, Elon Musk infamously said that Tesla would complete a fully self-driving coast-to-coast drive between Los Angeles and New York by the end of 2017.

    The idea was to livestream or film a full unedited drive coast-to-coast with the vehicle driving itself at all times.

    It’s 2025, and Tesla still hasn’t made that drive.

    Two Tesla shareholders and online influencers attempted a coast-to-coast drive between San Diego, CA, and Jacksonville, FL, in a Tesla Model Y running the latest FSD software update.

    They didn’t make it out of California before crashing into easily avoidable road debris that badly damaged the Model Y.

    35 votes
    1. [8]
      Eji1700
      Link Parent
      I'd be mildly more interested to see this done with Waymo. I also expect it to fail the first time it hits any real weather, but it would set a rather realistic data point for all the automation fans that "It can be good, it's not there yet, the details are very much devilish".

      I'd be really curious to sit down with the actuaries looking at all this and see the tables they're building. I know Teslas supposedly have the most accidents or the worst drivers, but that's hardly rock-solid data. Someone somewhere is going over all the data with a fine-tooth comb to classify it, and that's probably the most real output we have so far.

      20 votes
      1. [5]
        redwall_hp
        (edited )
        Link Parent
        Waymo confidently handles the typical rain and fog in the Bay Area, and they are rapidly expanding into the northeast with plans to tackle snow. They're also coming up on 100 million miles driven without a human operator, and to date have had only one fatality (for which they were not at fault). And they, you know, actually talk about statistics and methodology.

        Waymo is playing chess while Tesla is chewing on Candyland pieces.

        45 votes
        1. [4]
          hobbes64
          Link Parent
          Here is a video I watched recently on Nebula (It’s probably also on YouTube).

          It’s a decent overview of the different strategies used by Tesla and Waymo. The big disadvantage for Waymo is that they don’t control the supply chain and have to augment another company’s car. So even though Waymo might be better, it’s going to be less profitable for the company for a long time.

          In the video, the Tesla takes a better route and is faster, but it also makes a mistake and gets stuck on a road hazard.

          The route is just in the Bay Area, not cross-country, of course.

          Maxinomics: I raced a Tesla against a Waymo

          8 votes
          1. [3]
            teaearlgraycold
            Link Parent
            Maxinomics: I raced a Tesla against a Waymo

            Kinda crazy how he didn't test Tesla's new Robotaxi service.

            2 votes
            1. [2]
              kari
              Link Parent
              Kinda crazy how he didn't test Tesla's new Robotaxi service.

              Aren't those just in Austin right now (and the video in San Francisco?)

              6 votes
              1. teaearlgraycold
                Link Parent
                It's a closed beta, but an easy one to get into. I haven't used it yet myself so maybe I'm mistaken.

      2. [2]
        Luna
        Link Parent
        I would also be interested in seeing Waymo attempt this, but I'm not sure it could (for now) since it seems to rely heavily on pre-mapping. From the Waymo website:

        Before our Waymo Driver begins operating in a new area, we first map the territory with incredible detail, from lane markers to stop signs to curbs and crosswalks. Then, instead of relying solely on external data such as GPS which can lose signal strength, the Waymo Driver uses these highly detailed custom maps, matched with real-time sensor data and artificial intelligence (AI) to determine its exact road location at all times.

        So it's not just relying on local sensors and publicly-accessible map data. I'm not sure how much Tesla also relies on maps, but given that they don't geo-fence it to specific cities, I imagine the reliance on map data (beyond GPS instructions to the self-driving software) is minimal.
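        The pre-mapping approach quoted above can be sketched in miniature: given surveyed landmark positions from a prior map and noisy range measurements from onboard sensors, the vehicle searches for the position that best explains what it measures. This is a toy illustration (made-up landmark data, naive grid search), not Waymo's actual pipeline:

```python
import math

# Toy map-matching localization: the prior map supplies surveyed landmark
# coordinates; the vehicle measures ranges to them and grid-searches for
# the position whose predicted ranges best fit the measurements.
LANDMARKS = [(0.0, 10.0), (10.0, 0.0), (10.0, 10.0)]  # from the prior map

def predicted_ranges(x, y):
    """Ranges the sensors should report if the vehicle sits at (x, y)."""
    return [math.hypot(lx - x, ly - y) for lx, ly in LANDMARKS]

def localize(measured, step=0.1, extent=20.0):
    """Grid-search for the (x, y) minimizing squared range error."""
    best, best_err = (0.0, 0.0), float("inf")
    steps = int(extent / step)
    for i in range(steps + 1):
        for j in range(steps + 1):
            x, y = i * step, j * step
            err = sum((m - p) ** 2
                      for m, p in zip(measured, predicted_ranges(x, y)))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# A vehicle actually at (3.0, 4.0) should recover its position from the
# ranges alone -- no GPS involved.
estimate = localize(predicted_ranges(3.0, 4.0))
```

        Real systems match dense lidar returns against the map with far more robust estimators, but the principle is the same: prior map plus live sensor data gives a position fix that doesn't depend on GPS signal strength.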

        16 votes
        1. moonwalker
          Link Parent
          Correct, it would stop automatically after leaving its geofence. This is a fundamental limitation of their current software approach, and likely will continue to be for 5-10 years.

    2. burkaman
      Link Parent
      a coast-to-coast drive between San Diego, CA, and Jacksonville, FL

      This is minor but I just want to point out that even this goal is the least ambitious drive that would still satisfy the "coast-to-coast" challenge. It's the shortest possible path and also has generally dry and easy driving conditions for almost the whole way. Something like Portland-to-Portland would be much longer, more difficult, and catchier-sounding.

      11 votes
  2. [11]
    beeef
    Link
    I was expecting the accident to take place in a city, with a traffic cone or a pothole. My assumption was that a (somewhat) clear, controlled-access highway would be the simplest possible situation for FSD to handle. I was surprised to see it was in actuality a large, easy-to-see piece of debris on a clear road during a sunny day. High risk, high speed, and completely avoidable.

    21 votes
    1. [10]
      TurtleCracker
      Link Parent
      I think optical-only is a mistake. This probably would’ve been caught more easily with other sensors.

      15 votes
      1. [5]
        IsildursBane
        (edited )
        Link Parent
        The argument Musk made for this was that humans only drive by sight so why can't autonomous cars? However, I feel there are a few things he missed:

        1. We do not drive by sight alone. If the car is sliding around on the road, we feel it. When ABS kicks in, we get tactile feedback on our brake pedal (although this could be coded in to notify the autonomous driving program). We also feel when the road is bumpy, and get feedback on the steering wheel when a pothole is moving the front wheels. Also, we hear what is happening around us to some extent.

        2. Humans are not great drivers, so why use them as the benchmark for a good autonomous driving system? Why not give the car sensors that humans lack and make it better than humans?

        3. Cameras can struggle in changing lighting conditions that human eyes can adjust to.

        4. Edit: I thought about the adaptability/past experiences of humans vs. machines, but felt I was not as informed on the topic to be worth commenting. However, papasquat made a good post further down that covers this area well.

        But the most likely reason on why they did it was not to emulate human drivers, but instead to cut costs and some hubris.
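        Point 1 is also quite tractable in software: the tactile cues humans feel are already on the car's data bus as numbers and could be surfaced to the driving stack. A minimal sketch of a wheel-slip check (hypothetical names and threshold, purely illustrative):

```python
def slip_ratio(wheel_speed, vehicle_speed):
    """Fractional deviation of one wheel from the vehicle's ground speed."""
    if vehicle_speed < 0.5:  # near standstill the ratio is meaningless
        return 0.0
    return (wheel_speed - vehicle_speed) / vehicle_speed

def traction_alert(wheel_speeds, vehicle_speed, threshold=0.15):
    """True when any wheel slips beyond the threshold -- roughly the
    condition under which ABS or traction control would intervene, and
    a signal a planner could weigh alongside camera input."""
    return any(abs(slip_ratio(w, vehicle_speed)) > threshold
               for w in wheel_speeds)

# One wheel locking up while braking from 20 m/s trips the alert:
alert = traction_alert([20.1, 19.8, 12.0, 20.0], 20.0)
```

        Speeds and the 15% threshold are made up for the example; the point is only that "the car is sliding" is available as a non-visual input, whether or not the planner consumes it.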

        19 votes
        1. PuddleOfKittens
          Link Parent
          The best response to this, IMO, is to talk about how we replaced blacksmithing: we didn't re-invent hands, but instead constrained the metal and meticulously planned to have power-hammers hammering from a single predefined angle.

          Humans can adjust course in response to errors. Robots are far less talented at this, so throwing sensors at the problem to reduce error rate is a better option.
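          Rough numbers make the redundancy argument concrete: if each of n independent sensing modalities misses an obstacle with probability p, all of them miss with probability p ** n. (Real failures are correlated, e.g. heavy rain degrades several sensors at once, so treat this as an idealized upper bound on the benefit, not a prediction.)

```python
def combined_miss(p, n):
    """Probability that all n sensors miss an obstacle, each missing
    independently with probability p. Idealized: assumes no correlated
    failure modes between the sensing modalities."""
    return p ** n

# A 1% per-sensor miss rate drops fast as modalities are added:
one = combined_miss(0.01, 1)    # e.g. camera only
two = combined_miss(0.01, 2)    # e.g. camera + radar
three = combined_miss(0.01, 3)  # e.g. camera + radar + lidar
```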

          13 votes
        2. [3]
          Fiachra
          Link Parent
          I think the reasoning for wanting it to be optical-only is that if they figure it out, it can be rolled out to every consumer Tesla vehicle (a lot of cars!) via a software patch, and suddenly you have a fleet of a million potential autonomous Tesla Ubers already on the road, dwarfing Waymo and others in numbers. Doesn't look to me like that's ever happening, but what do I know.

          4 votes
          1. [2]
            IsildursBane
            Link Parent
            I have tried typing a response to this a few times, but never got an argument fully formed. However, I do not buy that the reason for going camera-only was to enable a fleet of autonomous taxis. Early Teslas used radar (never lidar, as far as I know), but they ended up switching to their optical-only path. I feel like including even just a front-facing lidar sensor would not add much to the cost of each Tesla. Also, you could go the BMW route of having the hardware on all versions but only enabling the features the customer paid for (although BMW started this practice after Tesla went camera-only). Honestly, dropping the extra sensors feels like hubris from Musk, not some genius plan to enable a taxi fleet down the road. Especially when you consider all the failed promises Musk has made, I think he vastly underestimates the complexity of self-driving.

            4 votes
            1. CptBluebear
              Link Parent
              The official™️ explanation is that going camera-only would massively expand the fleet, and that focusing on one technology builds more expertise than trying to marry video and lidar together. So, a simpler software stack.

              While it's true that focusing on improving a single piece of tech is easier, it's probably not sufficient.

              At the time they built the first Teslas and made this argument, lidar was prohibitively expensive. Nowadays you can find it on cheap robot vacuums.

              3 votes
      2. [4]
        pete_the_paper_boat
        Link Parent
        I really like their approach; I think pushing optical to the limits is super interesting because of how cheap it is.

        1. Minori
          Link Parent
          Interesting, yes. Successful, no.

          There are limits to what you can do with cameras alone when driving a car. Replacing a human entirely with cameras is extraordinarily ambitious.

          Replacing depth-sensing sensors like ultrasonic with cameras doesn't make sense. We can't yet replicate a human's understanding of a video feed.

          18 votes
        2. [2]
          TurtleCracker
          Link Parent
          It’s interesting but foolish. I think the argument that if a human only needs optical then a car should only need optical is odd. If I could have thermal, ultrasonic, lidar, etc as a human I’d want them too!

          15 votes
          1. papasquat
            Link Parent
            This is an argument that I keep seeing Tesla fanboys tout, and it makes no sense.

            Human beings don't only need optical sensors. We also need minds, which AI systems do not have.

            We understand that it's okay to run over an inflated shopping bag, but it's not okay to run over a piece of scrap metal, because we've interacted with shopping bags and metal. We know that we shouldn't swerve off the road to avoid hitting a raccoon, but we should swerve off the road to avoid hitting a child, even if it's Halloween and that child is wearing a raccoon costume. We know that driving over a puddle of oil is more dangerous than driving over a puddle of water, and we can identify puddles of oil by the subtle shimmer they give off. We know that a dump truck carrying gravel is likely to damage our car if we drive behind it, but a cargo truck isn't.

            AI systems don't understand any of that. They might be trained on some of it, if the training data exists, or they might be explicitly coded to deal with some of it, but I only listed four examples. There are millions of unique scenarios, if not more, where a human being who's lived a full life of experiences can make good judgements, even if they've never been in that specific scenario before, and an AI system that's solely been trained with reinforcement methods cannot.

            So the argument that cars don't need lidar because humans don't makes about as much sense as saying that cars don't need brakes because humans don't have those either. They're completely separate problems with completely separate requirements.

            20 votes
  3. [2]
    ackables
    Link
    I don’t want to put too much blame on the driver because they were trying to test Tesla’s claim of coast to coast self driving, but he really didn’t do anything to try and prevent that crash.

    If you are trying to prove a car has self-driving capabilities, you should be prepared for the chance that it does not. The driver didn't see the debris until long after the passenger spotted it, and did nothing to try to avoid it once he had.

    The driver didn’t have their hands on the wheel, which definitely impacted their obstacle avoidance abilities, though I’m not sure whether that is still against the Tesla FSD guidelines. At the very least, stepping on the brakes should have been possible almost instantly if the driver was paying attention.

    This is one of those weird situations where it’s ambiguous whether the driver or the car is supposed to be ultimately responsible for operating the vehicle. You can put the blame on either the car or the driver for the crash, but I would put the blame on Tesla for creating a situation where there’s ambiguity about how to safely operate their cars.

    2 votes
    1. infpossibilityspace
      Link Parent
      Not sure if you watched the video rather than the gif, but you can hear both of them discussing it well in advance of the collision.

      Reads to me like they wanted to "prove" the car would take action rather than just ploughing into it, and he only took the wheel at the last second so it wouldn't suddenly change direction.

      2 votes