Tesla's software is considered among the most secure in the industry, light years ahead of what other carmakers install in their vehicles. Still, researchers found out...
Jailbreaking automated equipment introduces a ton of risk. I'm generally a supporter of being able to do whatever you want with things you own, but for something like this, where tinkering with a heavy object on wheels means glitches can kill people, I'd be fine with some amount of regulation, probably from NHTSA. We really don't want some random person running a 1-click tool to hack their self-driving car and install buggy self-driving software that may or may not still have the safety overrides working. Imagine if what they think they're installing is FSD beta from Tesla, but what they're really installing is something infected with spyware that corrupts the safety overrides.
I don't know much about how driver takeover and brake-pedal takeover work on Teslas, so maybe these fears are unfounded. Just my 2 cents.
Likely won’t ever be able to give you FSD.
https://www.tomshardware.com/news/tesla-mcu-amd-asp-flaw-jailbreak
Also, brake application disconnects FSD and any other active driver assist. Steering-wheel takeover disconnects FSD but switches to adaptive cruise, unless you do it in conjunction with the brake pedal.
You don’t have to tug hard on the steering wheel to take over either.
You can also just pull the gear stalk slightly up and remain in full control without any jerk to any input.
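Those disengagement rules can be summed up as a tiny state machine. Here's a toy sketch of them as described above; the names and structure are my own illustration, not Tesla's actual implementation:

```python
# Toy model of the driver-assist disengagement rules described above.
# Illustrative only -- not Tesla's real logic or API.

from enum import Enum


class AssistMode(Enum):
    FSD = "fsd"
    ADAPTIVE_CRUISE = "adaptive_cruise"
    MANUAL = "manual"


def takeover(mode: AssistMode,
             brake_pressed: bool = False,
             steering_override: bool = False,
             stalk_pulled: bool = False) -> AssistMode:
    """Return the resulting assist mode after a driver input."""
    if brake_pressed or stalk_pulled:
        # Brake application (or pulling the gear stalk up) disconnects
        # FSD or any other active driver assist entirely.
        return AssistMode.MANUAL
    if steering_override and mode is AssistMode.FSD:
        # Steering takeover alone drops FSD back to adaptive cruise.
        return AssistMode.ADAPTIVE_CRUISE
    return mode
```

So per this model, steering alone only half-disengages, which is why combining it with the brake pedal matters if you want full manual control.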
But agreed that people messing around with their own car software isn’t necessarily a good idea.
That said, I've learned a lot about how Tesla's systems work by occasionally following this guy: https://twitter.com/greentheonly
Yeah I don’t know if I agree with this at all. You could make the same argument about nearly any component on any car (engine, brakes, etc).
We're talking about an autonomous system controlling the steering, acceleration, and braking. Hardly an apt comparison to "engine, brakes, etc." Those things are components of the overall system. The self-driving stuff sits on top of them and needs to be able to identify issues with the prime mover, brakes, and so on, and to disengage if it's unsafe. Allowing someone to jailbreak the self-driving system and override safety shutdowns is a recipe for disaster.
I’m saying that the power to tinker with a car and destroy its ability to be safely operated is not a new technology thing. Getting the government involved to prevent people from tinkering with items they own is a severe overreaction to a “threat” that has existed as long as people have owned wrenches and cars.
That’s a great point. I guess my concerns are more from my bad experiences with computers randomly doing weird stuff and trusting my life to a system like that.
I can understand that. I have an OBD-II module that I've used on my EV to unlock certain stupid locked features (including larger gas tank capacity — it's a PHEV), but I definitely didn't want to touch anything that could cause, say, a computer crash while driving down the road. That said, I've also had tires blow out, headlights fall out, transmissions break, and engines seize over the years. There are plenty of mechanical things anyone could do that would cause catastrophe on the roads. I just don't wanna go overboard on government involvement, since I think we should be able to actually repair/tinker with/jailbreak whatever we own, especially when it costs tens of thousands of dollars.