• Ocelot

    FSD is not without issues, but yes, many in this thread are implying that FSD is unsafe and causes tons of accidents, and there is absolutely no evidence to back that up. It's just a "feeling" they have. They believe it is irresponsible for anyone to use it, and that doing so puts others at unnecessary risk. People genuinely believe that I am putting my family and neighbors at serious risk of harm because I'm irresponsible enough to own and use FSD. All I have asked for this entire time is some kind of evidence that it is dangerous. Anything. Help me understand your view. Please.

    The reason I defend it so much at this point is that it has already been demonstrated to be far safer than the average human driver, and it is getting better with every release. With the new V12 and its full neural net, it is expected to get far smoother and drive in an even more human-like way, with less code and lower power consumption. We have seen massive improvements in the tech just in the past year, and the rate at which it improves continues to accelerate. It is impossible to count how many lives it has already saved through accident avoidance. We don't need misinformed people bashing this technology, casting doubt on it, and holding it back just because they "feel" a certain way about it. You should absolutely criticize valid concerns, but FFS, please bring some facts and evidence to the table.

    The reason I am confident that FSD is safe, despite "feelings," is how it's programmed. For several years prior to even the earliest public beta, the camera and AI system learned how to correctly identify everything on the road: other cars, pedestrians, dogs, cats, babies, telephone poles, traffic cones, whatever. It is now about as accurate as it can be, and any remaining issues involve mislabeling one thing as something else (a car as a truck, etc.). That doesn't actually matter for self-driving, because conceptually the first step in FSD is something like: "This is a car, this is a truck, this is a pedestrian, this is a dog; here is where it is, where it is going, and how far away it is." OK? Don't hit those. And it doesn't. Everything else comes second.

    It drives like a robot. It obeys traffic laws to a T, which pisses off other drivers, or it freaks out whoever is behind the wheel because the car didn't do exactly what they would have done in that situation, so they conclude it is "wrong" and take over. Oftentimes the act of taking over actually puts them in more danger than if they had just let the car finish the maneuver. This shouldn't be surprising, because humans as a whole SUCK at driving and at making decisions like that. It is sometimes unnecessarily cautious around pedestrians (but honestly, how would you want it to behave?). It might suddenly detect a hazard and swerve to avoid it, possibly moving the car into another unoccupied space; it is fully aware of the space it is occupying and of the space it is about to occupy. And it doesn't hit anything. There are plenty of YouTube channels that demonstrate this: they upload regularly, stress-test FSD, and try to get it into trickier and trickier situations, and it never hits anything. It sometimes acts indecisively and waits for overly large gaps out of an abundance of caution, but those are the issues that keep improving over time. At no point does it do anything "unsafe," especially since the wide release of the public beta.

    Imagine, if you would, a world where all cars are like this. The most dangerous part of driving right now, FSD or not, is other drivers. The more people we have using it who understand it and are comfortable with it, the better it gets, and the safer our roads become. I really don't care how you feel about Elon; he deserves every bit of hate that is sent his way. But FFS, please look at FSD for what it is and what it is becoming. If it helps you feel any better, he was not personally responsible for writing a single line of code or designing any component of the system.
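
    To make that "detect it, then don't hit it" idea concrete, here is a minimal, purely illustrative Python sketch of the mental model I'm describing: a perception step that labels obstacles with position and motion, and a planning step whose first responsibility is keeping the car out of any space an obstacle occupies or is about to occupy. The types, thresholds, and numbers are all mine, invented for illustration; the real system is a neural-network stack, not hand-written rules like this.

    ```python
    from dataclasses import dataclass

    # Hypothetical, simplified types -- not Tesla's actual code.
    @dataclass
    class Obstacle:
        label: str        # "car", "truck", "pedestrian", "dog", ...
        x: float          # meters ahead of our bumper
        lateral: float    # meters left (-) or right (+) of our path
        speed: float      # closing speed toward us, m/s

    def time_to_collision(ob: Obstacle) -> float:
        """Seconds until we'd occupy the obstacle's space, if nothing changes."""
        if ob.speed <= 0:
            return float("inf")  # not closing on us
        return ob.x / ob.speed

    def plan(obstacles: list[Obstacle], follow_gap_s: float = 3.0) -> str:
        """First rule: don't hit anything. Everything else is secondary.

        Note that the exact label barely matters here: mislabeling a car
        as a truck changes nothing, because position and motion drive
        the decision.
        """
        in_path = [ob for ob in obstacles if abs(ob.lateral) < 1.5]
        if not in_path:
            return "proceed"
        ttc = min(time_to_collision(ob) for ob in in_path)
        if ttc < 1.0:
            return "swerve to unoccupied space"   # evasive maneuver
        if ttc < follow_gap_s:
            return "brake"                        # cautious, robot-like gap
        return "proceed"

    # Pedestrian 20 m ahead closing at 2 m/s -> TTC 10 s -> proceed.
    print(plan([Obstacle("pedestrian", 20.0, 0.2, 2.0)]))
    # Car 10 m ahead closing at 12 m/s -> TTC ~0.83 s -> swerve.
    print(plan([Obstacle("car", 10.0, 0.0, 12.0)]))
    ```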

    All I've gotten to "back up" the claims that it is dangerous here is three different articles referencing the exact same incident (the Bay Bridge pile-up). The video clearly shows the car coasting (regen) to a stop and then just sitting there. Had emergency braking engaged, the hazards would have turned on and the car would have stopped much more quickly. FSD has no history of any incident of coming to a complete stop in a travel lane. Complaints about "phantom braking" usually describe the car slowing down for a detected hazard that may or may not actually be present. There is no evidence of this ever happening anywhere else: 500k of these cars on the road and no other similar reports. Is that a fault of the software, or is it more likely some kind of user error? From my standpoint, having actually used FSD for several years, I can tell you with complete certainty that the car would never behave like that, and there are far too many red flags in that video to reasonably blame the software. Of course, we will see what plays out once the court case is completed, but in my professional opinion, the driver disengaged FSD, allowed the car to coast to a complete stop on its own, and did nothing to move it out of the way; it had absolutely nothing to do with the software. I'm 100% open to disagreement on that, and I'm curious what a civilized discussion of it would sound like and what someone else thinks happened here, but so far it just turns into a flame war and I get called a deluded fanboy, even a liar, and other names. No evidence, no discussion, only anger.

    Again, here is my point. If FSD were as dangerous as others are implying, we should see tons of accidents. Given that every single one of these cars has a constantly running 360-degree dashcam, we should see some evidence, right? Maybe not from this specific case; maybe there's a valid reason the footage couldn't be uploaded. But surely, with half a million cars on the road, many millions of miles traveled collectively, and more Teslas hitting the road every day, we should at least see something by now, right? There are tons of videos of Teslas avoiding accidents, but nobody wants to mention or talk about those.

    People are focusing all of their energy on one highly suspect negative with nothing to back it up, holding back technology and safety, and refusing to have any sort of civilized discussion about it. Their entire perception of how this technology works comes from a few clickbait headlines whose articles they didn't even bother to read. They come here and confidently proclaim that they know for a fact it doesn't work and will never work, and how dare you even try to bring any facts to the table. It's as if the discussion is being led by children. It's not productive, it's not based on facts, it goes nowhere, and we're all more confused and less informed as a result. For example, someone posts an accident that occurred in 2018 as evidence that Full Self-Driving doesn't work, when FSD didn't even exist until 2021. If you point that out, there's no concession and no rebuttal, only anger. Pointing out simple, easily verifiable facts makes you a Tesla fanboy, and therefore any opinion or input you may have on the matter is invalid. "Your mind is already made up and you will never see our point of view!" No, I don't currently see your point of view, because you don't have even the most basic facts straight.
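
    To put a rough number on "we should see something by now," here is a back-of-envelope Python calculation. Every input is an assumption I'm plugging in for illustration (fleet size from my earlier 500k figure, an assumed annual FSD mileage per car, and a hypothetical "elevated" crash rate), not measured data; the point is only that a genuinely dangerous system at fleet scale would produce a large, hard-to-hide number of incidents.

    ```python
    # Back-of-envelope estimate; all inputs are illustrative assumptions.
    fleet_size = 500_000              # assumed FSD-capable cars (my estimate above)
    miles_per_car_per_year = 2_000    # assumed miles driven *on FSD* per car per year

    # The US average is roughly 1-2 police-reported crashes per million
    # vehicle miles; suppose FSD were twice as bad (purely hypothetical).
    crashes_per_million_miles = 2 * 1.5

    fsd_miles = fleet_size * miles_per_car_per_year
    expected_crashes = fsd_miles / 1_000_000 * crashes_per_million_miles

    print(f"Assumed annual FSD miles: {fsd_miles:,}")
    print(f"Expected crashes per year at the assumed rate: {expected_crashes:,.0f}")
    # -> 1,000,000,000 miles and ~3,000 crashes/year under these assumptions.
    ```

    Under those made-up inputs you'd expect thousands of dashcam-documented incidents a year. That is the scale of evidence I keep asking for, and it isn't there.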

    • @Zummy@lemmy.world

      It's clear from what you wrote that you want FSD to be as good as it can be, and I think we can get there, but we aren't there yet. You say there haven't been any reports of accidents with FSD save for one, but I don't know if that's true, and it would require some serious research on my behalf to evaluate it. First, I don't know the number of cars capable of FSD driving; you said 500k on the road, but provided no evidence, so I can't accept that without independent verification. Second, I have no knowledge of how many of those cars actually use FSD. It may be a lot, but it may not. You don't say, and I don't know. Now, there may be far fewer accidents with FSD, but if the number of vehicles on the road in Q1 was 286 million in the US alone (https://www.statista.com/statistics/859950/vehicles-in-operation-by-quarter-united-states/ ), and the number of vehicles using FSD every single day, all the time, for every single drive is a tiny fraction of that, it would stand to reason that there are far fewer accidents simply because there are far fewer cars. You also mention that it has become good at detecting objects, and I think it has, but being able to detect objects and being able to avoid accidents when there are 286 million cars on the road using FSD exclusively every single time they are driven are two different things.
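
      That base-rate point can be made concrete with a tiny sketch: raw accident counts can't be compared across fleets of very different sizes; you have to normalize by miles driven. The numbers below are placeholders I made up purely to show the arithmetic, not real statistics for either fleet.

      ```python
      # Illustrative only: made-up counts and mileages, to show why raw
      # accident totals can't be compared across fleets of different sizes.
      def crashes_per_million_miles(crashes: int, miles: float) -> float:
          return crashes / (miles / 1_000_000)

      # Hypothetical small FSD fleet vs. the whole human-driven fleet.
      fsd_rate = crashes_per_million_miles(crashes=50, miles=20_000_000)
      human_rate = crashes_per_million_miles(crashes=5_000_000, miles=3_000_000_000_000)

      print(f"FSD:   {fsd_rate:.2f} crashes per million miles")   # 2.50
      print(f"Human: {human_rate:.2f} crashes per million miles")  # 1.67
      # A fleet with 100,000x fewer total crashes can still have the
      # *higher* per-mile rate -- raw counts alone prove very little.
      ```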

      The fact is, I do want FSD to be a thing, but when I see an article whose author says he twice had to take over so the car didn't kill him or others, I start to worry that FSD isn't ready. And frankly, although there are YouTube channels about electric vehicles that have never brought up accidents, I wonder whether they have a reason not to. I'm not sure. Also, I can't say the big YouTube channels have never talked about this, because I haven't watched every video they've ever posted, and I would have to do that to know whether you're correct.

      I see that you are passionate about FSD, and I think your passion makes you overlook the real discussion going on. People (certainly not all people) generally want FSD to be a thing for the reasons you stated, but they want to make sure the cars are safe first. And I get that you take a risk every time you drive a car, but from reading this article I get the sense that FSD isn't ready to be put in the hands of every person with a driver's license. It sounds like the author knew what to do because he had been driving for some time. If he hadn't, I think the situation could have turned out very differently.

      You talk about the car not doing exactly what a human driver would have done, but in the article's case it was about to crash. I don't think anyone would have done that. If the car was able to detect the object, why was it going to crash into it? That is something that would need to be investigated. You argue that people call for FSD to be removed or cancelled based on a feeling that it isn't good, but I haven't seen that in droves. I've seen several people say they think FSD needs more testing and a more limited rollout.

      I know I didn't hit all your points, but they were quite numerous. I want full self-driving, but I want it to be reliable, and I think as long as articles like this are being written, we just aren't there yet. Yes, keep it coming, but be real about its current limitations.