• Dr. Moose@lemmy.world · 25 points · 7 months ago (edited)

    Even with autopilot, I feel it's unlikely that the driver would not be liable. We haven't had a case yet, but once one happens and goes up through the courts, it'll immediately establish a liability precedent.

    Some interesting headlines:

    So I'm pretty sure that autopilot drivers would be found liable very quickly if this went further.

    • dejected_warp_core@lemmy.world · 10 points · 7 months ago

      I am not a lawyer.

      I think an argument can be made that a moving vehicle is no different than a lethal weapon, and the autopilot, nothing more than a safety mechanism on said weapon. Which is to say the person in the driver’s seat is responsible for the safe operation of that device at all times, in all but the most compromised of circumstances (e.g. unconscious, heart attack, taken hostage, etc.).

      Ruling otherwise would open up a transportation hellscape where violent acts are simply passed off to the insurer and manufacturer as a bill. No doubt those parties would rush to close that window, but it would be open for a time.

      Cynically, a corrupt government in bed with big moneyed interests would never allow the common man that much power to commit violence, especially at their expense, fiscal or otherwise.

      So just or unjust, I think we can expect the gavel to swing in favor of pushing all liability to the driver.

      • Hagdos@lemmy.world · 2 points · 7 months ago

        Making that argument completely closes the door on fully autonomous cars, though, which is sort of the holy grail of vehicle automation.

        Fully autonomous driving doesn't really exist yet, aside from some pilot projects, but give it a decade or two and it will get there. Truly being a passenger in your own vehicle is a huge selling point: you'd be able to do something else while moving, like reading, working, or sleeping.

        These systems can probably be better drivers than humans, because humans suck at multitasking and staying focused. But they will never be 100% perfect, because the world is sometimes wildly unpredictable and unavoidable accidents are a thing. That will raise some interesting questions about liability.

    • SkyezOpen@lemmy.world · 4 points · 7 months ago

      They're most likely liable. "FSD" is not full self-driving; it's still a test product, and I guarantee the conditions for using it include paying attention and keeping your hands on the wheel. The legal team at Tesla definitely made sure they weren't on the hook.

      Where there might be a case for liability is Elon and his stupid Twitter posts and false claims about FSD. Many people have been misled, and that has probably contributed to a few of the autopilot crashes.

    • stom@lemmy.dbzer0.com · 2 points · 7 months ago

      You're still in control of the vehicle, therefore you're still liable. It's like plopping a 5-year-old on your lap to drive while you nap: if they hit people, it's still your fault for handing over control to something incapable of driving safely while you were responsible for the vehicle.

      • Norodix@lemmy.world · 1 point · 7 months ago

        But a reasonable person would not consider a child capable of driving. An "extremely advanced algorithm that is better and safer than humans and everyone should use it" is very different in this case. After hearing all the stupid fluff, it is not unreasonable to think that self-driving is good.

        • stom@lemmy.dbzer0.com · 2 points · 7 months ago (edited)

          Tesla's own warnings and guidance assert that drivers should remain ready to take control when using the features. They do not claim it is infallible. Oversight and judgement still need to be exercised, which is why this argument wouldn't hold up at all.

          • LovesTha🥧@floss.social · 1 point · 7 months ago

            @stom @Norodix Pity Tesla hasn't taken reasonable precautions to ensure the driver is actually driving.

            It isn't unreasonable for customers to expect the thing they were sold to do the thing they were told it does.

    • SinJab0n@mujico.org · 2 points · 7 months ago (edited)

      It used to be possible to make Musk deal with his own mess, but after the latest false-advertising lawsuits they changed the wording from "fully automated" to "assisted driving", and now even the manual says:

      "dude, this is some fucky woocky shit, and it's gonna kill u and everyone involved if u put us in charge. So… pls always be on the edge of ur seat, ready to jump! We warned u (even if we did everything possible to be misleading), so u can't pass us the bill, nor sue us now.

      K, bye."

      So yeah, they ain’t liable anymore.