• abraham_linksys@sh.itjust.works · 1 year ago

    We need to build special roads so self driving cars can navigate properly.

    You could even connect self-driving cars together: by letting the front car pull the others, they could save their batteries.

    And with these “trains” of self driving cars pulling each other, you wouldn’t have to build the self driving car roads very wide, they could just run on narrow “tracks” for the wheels.

    Then we’d have more space for human stuff instead of car stuff like roads and parking lots everywhere.

    He’s done it again. Elon Musk is a god damn genius.

    • amanneedsamaid@sopuli.xyz · 1 year ago

      Bill the manufacturer, 100%, IMO. That's why I think self-driving cars pose an unanswerable legal question: when the car drives for you, why would you be at fault? And how will businesses survive if they have to take full accountability for accidents caused by self-driving cars?

      I think it's almost always pointless to hold back innovation, but in this case I think a full ban on self-driving cars would be a great move.

        • sin_free_for_00_days@sopuli.xyz · 1 year ago (edited)

          I’m pretty sure there are autonomous cars driving around San Francisco, and have been for some time.

          EDIT: Here’s an uplifting story about San Francisco-ians(?) interacting with the self-driving cars.

      • DauntingFlamingo@lemmy.ml · 1 year ago (edited)

        The most basic driving, like long stretches of highway, shouldn't be banned from using AI/automated driving. Fast-paced inner-city driving should be augmented but not fully automatic. The same goes for driving in inclement weather: augmented, with hard limits on speed and automated braking for anything that could result in a crash.

        Edit: I meant this statement as referring to the technology in its current consumer form (what is available to the public right at this moment). I fully expect that as the technology matures, the percentage of incidents will decline. We are likely to attain a largely driverless society one day in my lifetime.
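        The "hard limits on speed and automated braking" idea above can be sketched as a small supervisory layer sitting on top of the driving stack. This is a hypothetical illustration, not any vendor's actual system; the names and thresholds are made-up assumptions.

```python
# Hypothetical sketch of "augmented, not fully automatic" driving:
# a supervisory check that commands braking when time-to-collision
# drops below a threshold and otherwise enforces a hard speed cap.
# All names and numbers are illustrative, not a real vehicle API.

from dataclasses import dataclass

@dataclass
class SensorState:
    speed_mps: float    # current vehicle speed, m/s
    gap_m: float        # distance to the obstacle ahead, metres
    closing_mps: float  # closing speed toward that obstacle, m/s

def supervise(state: SensorState, cap_mps: float = 25.0,
              min_ttc_s: float = 3.0) -> str:
    """Return the supervisory command: 'brake', 'limit', or 'ok'."""
    # Time-to-collision: infinite if we are not closing on anything.
    ttc = state.gap_m / state.closing_mps if state.closing_mps > 0 else float("inf")
    if ttc < min_ttc_s:
        return "brake"   # automated braking overrides everything else
    if state.speed_mps > cap_mps:
        return "limit"   # hard cap on speed, e.g. in bad weather
    return "ok"

print(supervise(SensorState(speed_mps=30.0, gap_m=200.0, closing_mps=0.0)))  # limit
print(supervise(SensorState(speed_mps=20.0, gap_m=10.0, closing_mps=5.0)))   # brake
```

        The point of the sketch is the ordering: the braking check runs before the speed check, so a collision risk always wins over mere speeding.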

        • Dudewitbow@lemmy.ml · 1 year ago

          It's why I'm all for automated trucking. Truck drivers are a dwindling resource, and the cross-country trucker lifestyle isn't a highly sought-after job. The self-driving system should do the long trip from hub to hub, and drivers at each hub should do the last few miles. That keeps drivers local and fixes a problem that is only going to get worse.

          • DauntingFlamingo@lemmy.ml · 1 year ago (edited)

            That would be the augmented part, and the AI. ANYTHING that presents a potential hazard already takes a vehicle out of automated driving in most models, because after a few Teslas didn't stop, people started suing.

        • snooggums@kbin.social · 1 year ago

          “Self-driving with driver assist”, or whatever they call it when it isn’t 100% automated, is basically super fancy cruise control and should be treated as such. The main problem with the term “autopilot” is that for airplanes it means 100% control, which makes it very misleading when used for fancy cruise control in cars.

          I agree that it should be limited in use to highways and other open roads, like where cruise control should be used. People using cruise control in the city without being ready to brake is the same basic issue.

          100% fully automated driving with no expectation of driver involvement should only be allowed when it has surpassed regular drivers. To be honest, we might even be there already, with how terrible human drivers are…

          • Amju Wolf@pawb.social · 1 year ago

            Autopilot in aircraft is actually kinda comparable, it still needs a skilled human operator to set it up and monitor it (and other flight controls) all of the time. And in most modes it’s not even really all that autonomous - at most it follows a pre-programmed route.

              • Amju Wolf@pawb.social · 1 year ago

                They can, but the setup is still non-trivial and full auto landing capability isn’t used all that much even if technically available. It also isn’t just the capability of the aircraft, it requires a shitton of supporting infrastructure on the ground (airport) and many airports don’t support this.

                That would be equivalent to installing new intersections that also broadcast the current signals for each lane, which would help self-driving cars immensely (and eventually regular cars too, with assistive technologies to help drivers drive more safely), but that’s simply not a thing yet.
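                The broadcast idea above can be sketched very simply: the intersection publishes the current phase per approach lane, and the car looks up its own lane instead of guessing from cameras. The message fields here are made up for illustration (real V2X deployments use the SAE J2735 SPaT message set, which is far richer).

```python
# Hypothetical sketch of an intersection broadcasting its signal
# state per lane. Field names are illustrative assumptions, not a
# real V2X schema.

broadcast = {
    "intersection_id": "5th-and-main",
    "phases": {
        "northbound-through": "green",
        "northbound-left": "red",
        "southbound-through": "green",
        "eastbound-through": "red",
    },
}

def signal_for_lane(msg: dict, lane: str) -> str:
    # Fail safe: treat an unknown or unlisted lane as red
    # rather than guessing from what the cameras can see.
    return msg["phases"].get(lane, "red")

print(signal_for_lane(broadcast, "northbound-left"))    # red
print(signal_for_lane(broadcast, "westbound-through"))  # red (unknown lane)
```

                The lookup removes exactly the ambiguity cameras struggle with: which of several visible lights applies to the lane the car is actually in.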

      • sugar_in_your_tea@sh.itjust.works · 1 year ago (edited)

        The responsible party should be the owner of the vehicle, not the manufacturer or passenger. If a company runs an automated ride-share service, for example, that company should be liable. Likewise, if you own a car and use the self-driving feature, you are at fault if it goes wrong, so you should use it at your own risk.

        That said, for the owner to be truly responsible, they need ownership of the self-driving code, as well as diagnostics for them to be able to monitor it. If they don’t have that, do they truly own the car?

        Of course, there’s nothing stopping a manufacturer or dealer from making a deal to cover self-driving fines.

        • amanneedsamaid@sopuli.xyz · 1 year ago (edited)

          Well exactly. I see no way all the self-driving source code will be FOSS (I don’t think corporations would ever willingly sign onto this). So the responsible party in the case of a malfunction should be the company, because in a full self-driving setup the occupant is not controlling the vehicle and has no reasonable way to ensure the safety of the code.

          • sugar_in_your_tea@sh.itjust.works · 1 year ago

            Which is why it should be dual responsibility. The owner of the vehicle chose to use the feature, so they have responsibility. If it malfunctions when the driver was following the instructions, the manufacturer has responsibility. Both are culpable, so they should share responsibility.

    • schroedingershat@lemmy.world · 1 year ago

      Nah. Give Tesla the same number of points everyone else gets on their license. If the company runs out, no more cars controlled by Tesla on the roads…

      • MeshPotato@lemmy.world · 1 year ago

        We already had that in the 70s and 80s. Those were RoRo trains.

        You put your car on via a drive-on ramp. Go into the comfy cabin, maybe even a sleeper cabin for overnight journeys. Get out at the other end, drive your car down the carrier, and explore the area you’ve journeyed to with the vehicle you own. Look up the ’80s ABC film about the Ghan railway closing down.

        I live in Australia and love seeing the centre of the country, distant from my home. Unfortunately, long-distance trains here have become a lifestyle luxury experience rather than transportation. The same goes for taking bicycles and motorcycles along.

    • Piecemakers@lemmy.world · 1 year ago (edited)

      I dunno. I could go for one about him launching himself to Mars in a carbon-fiber & titanium capsule, piloting via gamepad, ya know? Especially if he brought Bezos, the Koch bros. & Gates along. 🤷🏼‍♂️ It’d save time at least on setting up the ol’ woodchipper down the road, ya know?

          • max@feddit.nl · 1 year ago

            I mean, yeah, that’s not great. But he’s not hellbent on election fraud, not reinventing the concept of a train a million times over just to sell cars, and not in some evil plot that I know of, just to name a few of the things most billionaires do. I’m sure there are bad sides to him, human after all, but in my perception he seems to do more good than bad with his foundation, and he seems to care about global health and well-being.

  • rusticus1773@lemmy.ml · 1 year ago

    So sick of shit like this getting posted. Of course the software is not perfect; there are so many warnings about it not being independent of driver intervention it’s crazy. Yet here we are, with the entire internet hating on Musk so much that we have to tear down the evolution of self-driving cars, which is arguably the most complicated computing and programming problem in history. Bring on the downvotes, but for the record: I think Musk is a douchebag, but I can separately appreciate the effort involved in the herculean task of programming cars that drive themselves.

    • Addv4@kbin.social · 1 year ago

      I think it’s not a case of the software not being perfect, but of these cars actively failing in live environments, where it is absolutely critical that they not fail. If that is an issue, then they need to reach a level of confidence where they don’t need to worry about failing, which Tesla apparently has not done yet.

  • PenguinJuice@kbin.social · 1 year ago

    He was prolly fired bc he couldn’t program the thing to stop at a red light.

    Also, who knows when this footage was taken, or if it was just test footage of an issue that has since been ironed out.

  • Arotrios@kbin.social · 1 year ago

    JFC that’s frightening. It blew that red at about 30mph, didn’t even really slow down except for the curve.

    • killall-q@kbin.social · 1 year ago

      Because the car didn’t recognize it as a red light, probably due to all the green lights that were facing a similar direction.

      The issue is not the speed at which it took the turn, but that it cannot distinguish which traffic lights are for the lane the car is in.
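      The failure mode described above can be sketched in a few lines: if the car accepts any visible light, it can latch onto a green meant for another road. Associating each detected light with the bearing of its own lane, and defaulting to red on no match, avoids that. This is a toy illustration with made-up numbers, not how any production perception stack actually works.

```python
# Hypothetical sketch of associating detected traffic lights with
# the car's lane by comparing bearings. Names, angles, and the
# tolerance are illustrative assumptions.

def pick_signal(detections: list, lane_bearing_deg: float,
                tolerance_deg: float = 15.0) -> str:
    """Return the state of the light whose facing best matches our lane."""
    best_state, best_diff = "red", float("inf")  # fail safe: default to red
    for d in detections:
        # Smallest angular difference, wrapped into [-180, 180).
        diff = abs((d["bearing_deg"] - lane_bearing_deg + 180) % 360 - 180)
        if diff <= tolerance_deg and diff < best_diff:
            best_state, best_diff = d["state"], diff
    return best_state

detections = [
    {"state": "green", "bearing_deg": 40.0},  # green for the cross street
    {"state": "red", "bearing_deg": 2.0},     # red for our lane
]
print(pick_signal(detections, lane_bearing_deg=0.0))  # red
```

      A naive "any green ahead" rule would run this intersection; the lane-matching rule stops, because the only light within tolerance of our heading is red.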

  • AwkwardPenguin@lemmy.world · 1 year ago

    To be fair, it’s a messy intersection with lots of traffic lights; I’m struggling to understand which one is the one to look at. However, I’m finding it hard to believe Tesla actually has the skills to unbeta this shit hole.

    • galaxies_collide@lemmy.world · 1 year ago

      That’s the thing, if FSD isn’t advanced enough to handle tricky intersections no matter the circumstance, then it’s not ready for deployment.