• @burliman@lemm.ee
    13
    11 months ago

    That’s the bar automated driving has to clear. It messes up once, you never trust it again, and the news spins the failure far and wide.

    Your uncle doing the same thing just triggers you to yell at him, the guy behind him flips him off, he apologizes, you’re nervous for a while, and you continue your road trip. Even if he killed someone, we would blame that one uncle, or at worst some might blame his whole demographic. But we would not say that no human should drive again until the problem is fixed, the way we do with automated cars.

    I do get the difference between the two, and I do think they should try to make automated drivers better, but we can at least agree on the premise: automated cars are held to a seriously unreasonable bar. Maybe that’s fair, and we will never accept anything but perfect, but then we may never have automated cars. And as someone who drives among humans every day, that makes me very sad.

    • @maynarkh@feddit.nl
      15
      11 months ago

      There is a big difference between Autopilot and that hypothetical uncle. If the uncle causes an accident or breaks shit, he or his insurance pays. Autopilot doesn’t.

      By your analogy, it’s like putting a ton of learner drivers on the road with unqualified instructors, without telling the instructors that they are supposed to be instructing; instead, they’re told they’re just taking a taxi. Except somehow it’s their responsibility. And the company, of course, pockets both the instruction fees and the taxi fares.

      The bar for self-driving cars to be accepted is not incredibly high. The only requirement is that they take the blame when they mess up, like every other driver.

      • @burliman@lemm.ee
        2
        11 months ago

        Yeah, for sure. Like I said, I get the difference. But ultimately we are talking about injury prevention. If automated cars prevented one fewer death per mile than human drivers do, we would think they are terrible, even though they still saved lives.

        And even if they caused only one death per year, we’d hear about it, and we might still think they are terrible.

    • Neato
      5
      11 months ago

      The difference is that Tesla called it Autopilot when it really isn’t one. It’s also clearly not ready for prime time. And auto regulators have pretty strict requirements for reliability and safety.

      While it’s true that autonomous cars kill FAR fewer people than human drivers, every human is different. If an autonomous driver is subpar and that AI is rolled out to millions of cars, we’ve vastly lowered the overall safety of cars. We need autonomous cars to be better than the best human driver because, frankly, humans are shit drivers.

      I’m 100% for autonomous cars taking over entirely. But Tesla isn’t really trying to get there. They are trying to sell cars and lying about their capabilities. And because of that, Tesla should be liable for the deaths. We already hold them partially liable: this case caused a recall of the feature.

      • @Staiden@lemmy.dbzer0.com
        4
        11 months ago

        But the vaporware salesman said fully autonomous driving was one year away! In 2018, 2019, 2020, 2021… He should be held responsible. The guy once said that to further technology, some people will die, and that’s just the price we pay. It was in a comment about going to Mars, but we should take that into account for everything he does. If I owned a business and one of my workers died or killed someone because of gross negligence, I’d be held responsible. Why does he get away with it?

    • @SlopppyEngineer@discuss.tchncs.de
      1
      11 months ago

      Except Tesla’s uncle has brain damage and doesn’t really learn from the situation, so he will do it again, and there are clones of him driving thousands of other cars.