Virginia sheriff’s office says Tesla was running on Autopilot moments before tractor-trailer crash::Virginia authorities have determined that a Tesla was operating on its Autopilot system and was speeding in the moments leading to a crash with a crossing tractor-trailer last July that killed the Tesla driver. The death of Pablo Teodoro III, 57, is the third since 2016 in which a Tesla that was using Autopilot ran underneath a crossing tractor-trailer, raising questions about the partially automated system’s safety and where it should be allowed to operate. The crash south of Washington remains under investigation by the U.S. National Highway Traffic Safety Administration, which sent investigators to Virginia last summer and began a broader probe of Autopilot more than two years ago.
The reason it wasn’t engaged at the moment of the crash is that it’s programmed to disengage if a crash can’t be avoided, so that Tesla can skirt liability.
This would be like a drunk driver letting go of the wheel the moment before a crash and telling the judge, ‘Your honor, I wasn’t driving when the car wrecked.’
I hope Tesla gets held accountable.
More like a drunk Uber driver jumping out while you’re in the backseat and yelling “you got this” on the way out.
Lol, yeah. Like that.
Except he was in the driver’s seat, with access to everything needed to bring it to a safe stop.
It turns off so that it doesn’t do anything stupid after the crash, as the sensors or systems might be damaged. It still gets counted as a crash where Autopilot was active.
https://www.tesla.com/VehicleSafetyReport
We also receive a crash alert anytime a crash is reported to us from the fleet, which may include data about whether Autopilot was active at the time of impact. To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed. <…> In practice, this correlates to nearly any crash at about 12 mph (20 kph) or above, depending on the crash forces generated. We do not differentiate based on the type of crash or fault
And even if Autopilot disengaged more than 5 seconds before the crash, I think “your honor, I wasn’t driving when the car wrecked, it was the Autopilot” doesn’t work either, especially as you are still always responsible for what the car does, Autopilot or not.
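To make the quoted counting rule concrete, here’s a minimal sketch of one plausible reading of it in Python. The field names and data structure are made up for illustration; this is not Tesla’s actual telemetry format.

```python
from dataclasses import dataclass
from typing import Optional

DEACTIVATION_WINDOW_S = 5.0  # "deactivated within 5 seconds before impact"

@dataclass
class CrashAlert:
    restraint_deployed: bool                     # airbag or other active restraint fired
    autopilot_active_at_impact: bool
    seconds_since_deactivation: Optional[float]  # None if Autopilot was never engaged

def counts_in_autopilot_stats(alert: CrashAlert) -> bool:
    """Tally a crash as an Autopilot crash under the quoted (conservative) rule."""
    if not alert.restraint_deployed:
        return False  # below the ~12 mph (20 kph) severity threshold
    if alert.autopilot_active_at_impact:
        return True
    # Deactivating shortly before impact still counts against Autopilot.
    return (alert.seconds_since_deactivation is not None
            and alert.seconds_since_deactivation <= DEACTIVATION_WINDOW_S)
```

The point of the rule, as quoted, is that briefly disengaging right before impact doesn’t move a crash out of the Autopilot column.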
Don’t get me wrong though, Musk is still an idiot, and Tesla lies about the capability of the system in many ways, which is partly why people think they can stop paying attention and let it do things on its own.
I hope Tesla gets held accountable.
Why would this not be the driver’s fault? There are no completely autonomous driving cars (I know there technically are, but they’re very limited), so the driver is responsible for paying attention. Why didn’t the driver brake or take any kind of evasive maneuver?
The Tesla was traveling 70 mph (112.7 kilometers per hour) on four-lane U.S. 29 near Opal, and was 25 mph over the 45 mph speed limit in that area,
I guess autopilot made the choice to speed too?
Because Musk overstates what Teslas are capable of. It’s why Tesla is being sued by shareholders (Lamontagne v. Tesla).
Anecdotally, five years ago, my Uber driver was driving a Tesla and insisted it was fully autonomous on the highway and that he could text without issue.
Honestly, I think it’s people trusting these systems way too much. They think it’s safe because of the marketing.
No coincidence Tesla is attacking those who are trying to hold them accountable for false advertising. They claim it infringes upon their 1A rights. Fucking capitalist pigs.
Well, people wouldn’t trust it if there weren’t misleading marketing in the first place.
There’s a huge warning when you turn on Autopilot saying you need to always be paying attention and be ready to take over at any time, and you need to acknowledge this before it turns on. It also checks periodically that you’re in control of the wheel, or else it will disengage.
The reason these people crash is that they’re morons.
It also checks periodically that you’re in control of the wheel, or else it will disengage.
Disengaging autopilot when it detects you are not in control of the wheel sounds a bit dangerous.
Honda has lane assist and adaptive cruise control. It also disengages both of them if it doesn’t detect driver input after a while.
For anyone wondering, I tested it out of curiosity when alone on a long, straight, empty freeway. I had my hands hovering over the wheel, ready to take control just in case, but otherwise didn’t touch it after I engaged both systems.
Did a couple of tests, and it disengaged them every time after a little while.
It alerts you. You’d have to be asleep to not get the alerts.
In cars with adaptive cruise, lane keeping, etc, it’ll beep and flash on the dash if you’re not steering enough (I have a car that complains all the time because I don’t grip the wheel like an ape).
Some will shake the wheel too. Or slow the car down if you don’t respond.
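As a rough sketch, that escalation pattern looks something like the following. The thresholds, method names, and the `car` object are all made up for illustration; real systems vary by manufacturer.

```python
def monitor_driver(seconds_without_input: float, car) -> None:
    """Escalate as described above: beep/flash, shake the wheel, slow down, disengage."""
    if seconds_without_input < 10:
        pass                          # recent driver input, nothing to do
    elif seconds_without_input < 20:
        car.beep_and_flash_dash()     # first nag: chime plus dash warning
    elif seconds_without_input < 30:
        car.vibrate_steering_wheel()  # stronger, harder-to-ignore warning
    else:
        car.slow_down_gradually()     # assume the driver is unresponsive
        car.disengage_assistance()    # hand back control / bring the car to a stop
```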
It doesn’t just disengage: https://www.youtube.com/watch?v=oBIKikBmdN8
Tesla’s biggest downfall is probably going to be the fact that they called this system Autopilot. It still requires drivers to pay attention, but these drivers treat it as a driver replacement, and then everyone wonders why they crash.
And they make you pay for “Full Self-Driving”, another name that alludes to a fully self-driving vehicle.
This has to be a massive class action waiting to happen, IMO. It’s not autopilot, and it sure as shit isn’t full self-driving. Tesla needs to be held accountable for its lies.
Agreed. They are deliberately taking advantage of the fact that people don’t understand how autopilot is actually used in aircraft.
Sure, the most pedantic of us will point out that, with autopilot engaged, the pilot flying is still in command of the aircraft and still responsible for the safe conduct of the flight. Pilots don’t** engage the autopilot and then leave the cockpit unattended. They prepare for the next phase of flight, monitor their surroundings, get ready for top-of-descent, and stay mentally ahead of the rapid-fire events and requirements of a safe approach and landing. Good pilots let the autopilot free them up for other tasks, while always preparing for the very real possibility that it will malfunction in the most lethal way possible at the worst possible moment.
Do non-pilots understand that? No. The parent poster is absolutely correct: Tesla is taking advantage of people’s misunderstanding, and then hiding behind the pedantic truth about what a real autopilot is actually for.
** Occasionally pilots do, and sometimes something goes horribly wrong unexpectedly and they die. Smart, responsible pilots don’t. Further, some pilots fail to manage their autopilot correctly, or use it without understanding how it can behave when something goes wrong. (RIP to aviation YouTuber TNFlygirl, who had a fatal accident six days ago, suspected to be due to mismanagement of an unfamiliar autopilot system.)
Pilots, at least at the upper echelons, have it drilled into them that they are responsible for the aircraft, their actions in it, and those aboard it. I cannot stress enough the difference between the casual attitude with which most people view their actions behind the wheel and the attitude and responsibility required to operate a complex commercial aircraft.
Autopilot is a generally necessary convenience for operating aircraft on long flights, for efficiency, comfort, and preventing fatigue, but it gets turned off instantly should safety require it and conditions warrant it.
In a car? People use it for reading, watching video clips, dozing off if they can get away with it, and letting it drive them right into (or cause) a wreck.
The problem isn’t necessarily the system (though Tesla’s FSD is full of problems); it’s the fact that drivers are willful dumbasses with no real understanding of their car’s systems and their responsibilities regarding them.
Your disclaimer basically describes all these Tesla fatalities just the same. You just substituted “aircraft” for “Tesla”.
Plus, there are multiple levels of autopilot. The plane I flew had a half-assed single-axis unit that was usually not worth using, although when things were hectic it could help reduce workload slightly.
Is Tesla still training the Autopilot neural network in simulated 3D worlds, or are they now relying entirely on driver data?