Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.
Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.
The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!
There’s a very simple solution to autonomous vehicles plowing into walls, cars, or people:
Congress will pass a law that makes NOBODY liable, as long as a human wasn’t involved in the decision-making process during the incident.
This will be backed by car makers, software providers, and insurance companies, who will lobby hard for it. After all, no SINGLE person or company made the decision to swerve into oncoming traffic. Surely they can’t be held liable. 🤷🏻‍♂️
Once that happens, Level 4 driving will come standard and likely be the default mode on most cars. Best of luck everyone else!
Kids already have experience playing hopscotch, so we can just have them jump between the roofs of moving cars in order to cross the street! It will be so much more efficient, and they can pretend that they are action heroes. The ones who survive will make for great athletes too.
There’s a reason Gen X trained on hopper. Too bad the newer generations don’t have something equivalent.
There is no way insurance companies would go for that. What is far more likely is that policies simply won’t cover accidents caused by autonomous systems. I’m honestly surprised they would cover them now.
If it was a feature of the car when you bought it, and the insurance company insured the car, then anything the car does by design must be covered. The only way an insurance company gets out of this is by making the insured sign a statement that using the feature voids their policy, the same way they can with rideshare apps if you don’t disclose that you’re driving for a rideshare. They can also refuse to insure the car unless the feature is disabled. I can see insurance companies in the future demanding that features be disabled before they’ll write a policy. They could demand that the giant screens go blank, or that the displayed content be simplified, while the car is in motion too.
If the risk is that insurance companies won’t pay for accidents and put people on the hook for hundreds of thousands of dollars in medical bills, then people won’t use autonomous systems.
This cannot go both ways. Either car makers are legally responsible for their AI systems, or insurance companies are legally responsible to pay for those damages. Somebody has to foot the bill, and if it’s the general public, they will avoid the risk.
Not sure how it plays for Tesla, but for Waymo, their accidents per mile driven are WAY below human-driven rates. Insurance companies would LOVE to charge a surcharge for automated driving coverage while paying out on fewer incidents.
Uhhhh absolutely not. They would abandon it first.
If no one is liable, then it’s tempting to deliberately confuse them into crashing.
Won’t the people doing that be committing attempted murder?
Self driving cars don’t need to have anyone on board
Ask the Kia Boys how much they care about murder charges.