If Musk fails to deliver or shows off some obvious vaporware, his reputation — and Tesla’s stock price — could take a real hit.
What reputation? The one where it's obvious that he's almost as bad a liar as the presidential candidate he's stumping for now? I don't think it's possible for him to take any reputation hits at this point. If you've been paying any attention to Musk over the years, you either already know that he is constantly full of shit, or no amount of evidence could ever convince you otherwise.
Tesla’s approach to the hardware of driverless vehicles also diverges from the rest of the industry. Musk infamously relies on a camera-only approach, in contrast to the widely used practice of relying on a “fusion” of different sensors, including radar, ultrasonic, and lidar, to power autonomous driving. Musk calls lidar, in particular, a “crutch” and claims any company that relies on the laser sensor is “doomed.” Waymo’s robotaxis are adorned with large, obvious sensors, a style expressly at odds with the sleekness of Musk’s vehicles.
I remember reading an article some years back; I'm not entirely certain where. (Probably Car and Driver, but maybe not.) It was about the removal of radar from Teslas. Elon Musk was convinced that, because humans can drive with vision alone, Tesla's cars should be able to as well. He wanted, he said, to "go back to first principles" on self-driving cars. Now, of course, this couldn't possibly have anything to do with wanting to cut costs by removing sensors and just hoping that software could make up the gap. But even if you take Musk at face value (and I don't know why anyone would), isn't the goal for self-driving cars to drive better than humans? Particularly in situations where we don't drive terribly well? If so, giving self-driving vehicles the ability to see better than we can would be an obvious benefit, particularly while the technology is still nascent.
Sadly, despite what this article's title suggests, it seems like the bill never comes due for this blowhard.
If this does work, it would be interesting to see a class action lawsuit on behalf of all of us who were endangered throughout his "testing" phase. It's wild to me that they were legally able to release those Autopilot/Full Self-Driving features onto our roadways with so little testing.
The article even mentions a motorcyclist who was killed by a Tesla. Musk is too similar to Stockton Rush in his belief that all these regulations just restrict innovation. But unlike Rush, Musk is endangering people who never agreed to his testing of experimental craft.
Rush endangered the unwilling and uninvolved by requiring the search efforts that resulted from his negligence.
Partially. Those involved in the search efforts either volunteered or worked in jobs that require them to perform search operations. They also didn't have to use his non-compliant sub while searching. That's not the same level of involvement as a pedestrian walking down the street in their own town who is put at risk by Tesla's testing of autonomous vehicles.
Obviously, yes, that's correct; it's not a perfect match in liability and danger. But it's similar enough that saying "unlike Rush..." is inappropriate, IMO.
I expect a massive class action suit on behalf of every current Tesla owner who purchased FSD under the lie that their car could become a "robotaxi" and drive unattended/unsupervised.
I've been working towards getting my license recently (you don't truly need one in much of NL), and one thing I've steadily had to work on is how to respond when something unexpected happens.
Sometimes another participant in traffic is to blame, sometimes maintenance on roads or utilities makes things complicated. But whatever the reason, things can get messy quickly in traffic.
And for anyone participating in traffic, from pedestrians to cyclists to car drivers to anyone else, this requires a lot of instant choices, judgement calls, and more. It's really not easy. Automating all that is not just linear algebra and statistics in the way most ML programs work. Nor is it a lot of if-else statements. It takes a lot of experience: guessing what is safest, reading social cues from other participants, coordinating with whatever information you have at the time, and constantly adjusting for safety.
It's actually pretty amazing when you think about what the human brain is capable of. (And yet how many people still won't use their lights to indicate the direction they're taking... grmbl.)
Automating all of that safely is just... well, if we were capable of that, we could build a utopia. I'd love for it to be possible, because so many issues could be solved both in traffic and in, say, factories, but we're not even close.
I know most people here know about this already but it's been something that's on my mind recently as I've been taking driving lessons. The human brain is pretty amazing.
The strongest argument against self-driving trucks goes as follows:
If self-driving vehicles are so easy, why aren't self-driving trains universal? It's clearly a far easier problem than self-driving automobiles.
Because the incentives to invest in solving a problem are very different when the driver is a majority of the cost of the transport (truck) vs an extreme minority (train).
There are many examples of self-driving metros, where the driver is a higher proportion of the cost than on mainline trains.
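To put rough numbers on that incentive gap, here's a back-of-the-envelope sketch; every figure is hypothetical and only meant to show the shape of the argument, not actual industry costs:

```python
# Hypothetical back-of-the-envelope comparison: how much of the per-mile
# operating cost the driver/crew represents for a truck vs. a freight train.
# None of these numbers are sourced; they only illustrate the incentive gap.

truck_cost_per_mile = 2.00         # total truck operating cost, $/mile (hypothetical)
truck_driver_cost_per_mile = 0.80  # driver wages and benefits, $/mile (hypothetical)

train_cost_per_mile = 50.00        # total freight-train operating cost, $/mile (hypothetical)
train_crew_cost_per_mile = 2.00    # crew wages and benefits, $/mile (hypothetical)

truck_share = truck_driver_cost_per_mile / truck_cost_per_mile
train_share = train_crew_cost_per_mile / train_cost_per_mile

print(f"Driver share of truck cost: {truck_share:.0%}")  # ~40%
print(f"Crew share of train cost:   {train_share:.0%}")  # ~4%
```

Even if the exact numbers are off, automating the truck removes a large slice of its per-mile cost while automating the train removes a sliver, so the money a company can justify spending on the problem is very different.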
Railroad workers are also better unionized than truckers.
As we are noticing with the autonomous systems that have already been developed or are in development, in many cases they may be able to automate a large share of the work, but the edge cases are where it gets tricky, and those often still require a human to deal with them. Waymo is dealing with this too, with remote interventions and, in some cases, having to dispatch a person to go out and physically drive the vehicle to address a problem.
That's especially true when, as you mentioned, the train operator is a minority of the costs: any system they could come up with would likely still have edge cases that require a human to intervene. It just doesn't make sense to remove the relatively low cost of the human and still end up with a system where you'll likely need a human for the edge cases. Considering the amount of distance covered by rails, it would also be much harder to introduce a solution like Waymo's of dispatching a human when needed; so far Waymo is mostly using stock vehicles that millions of people are trained to operate. Compare that to trains, and it's not as feasible to have train operators/engineers available all over the country, ready to step in for an edge case in the middle of nowhere.
To be fair, that last circumstance is quite likely to be a big problem for autonomous semi-trucks too, but it goes back to driver labor being a larger share of costs, which increases the pool of resources available to spend on solving that problem.
I think self-driving cars could/will be safer than humans once humans are no longer allowed to drive and the whole network of cars can communicate with each other. But if you are talking about letting them drive while the human element is still there, humans are just not predictable enough for a self-driving car to be able to figure out all the different scenarios.
I mean, I don't know enough to say I'm 100 percent certain, but I am sure enough to say that self-driving cars on roads where humans are also driving are orders of magnitude more difficult to implement than a system where every car is self-driving and there are standards for them to communicate with each other.
I think if we truly want safe self-driving vehicles, in the end we need to not allow humans to drive. And as someone who likes cars, that is a sad scenario :(. Then again, I'm getting old enough that I wouldn't mind being driven around lol.
I don't care if it works or doesn't. I don't even have anything against autonomous vehicles in general. But I will never willingly get in a Tesla for any reason.