• Arigion@feddit.org · 6 days ago

    Actually, you will probably die in a Tesla whose doors do not open: https://www.msn.com/en-gb/news/world/friends-trapped-in-tesla-burned-to-death-when-electronic-doors-failed-to-open-after-crash/ar-AA1tWPYH

    Or which just hits you: https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/

    I mean, especially now that Teslas are probably about to become the mandatory car for state officials. And after that, some “think tank” will be pretty sure that all the safety fuss about autopilots is bad for the economy.

    • captain_oni@lemmy.blahaj.zone · 6 days ago

      I’m afraid that in the future they will try to require self-driving in all cars, citing some bs about how self-driving is “safer” (in theory, and under perfect conditions).

      • skulblaka@sh.itjust.works · 6 days ago

        Actually, legislating that FSD be required in all cars would go a long way towards shoring up all the problems with it. The most dangerous thing on the road is another living human, and computers can’t adjust for the inherent chaos that we living humans bring to the road. Remove the humans and you have a much smaller problem to solve.

        Teslas especially can’t cope, and keep running over kids and cop cars, because Musk has forbidden his engineers from installing the proper technology to solve the problem. LIDAR isn’t a magic bullet for this, but it’s pretty damn close, and it’s way, way better than relying on visual sensors alone. Elon is just so high on his own farts that he won’t allow Tesla to use LIDAR on their cars, despite every other FSD-attempting manufacturer seeing great returns from it and rapidly eroding Tesla’s first-mover advantage.

        But they’re trying to solve the wrong problem: making a car see and drive like a person. People are terrible at driving. They need to make the car drive like a computer. If all cars were required to have FSD and humans weren’t allowed to drive anymore, we’d have a functioning auto-pool within 10 years, maybe less if measures were put in place to ensure cross-car communication on the road. If every car within 100 meters of you were sharing sensor data and broadcasting its speed, heading, and braking, you’d never see another car-to-car accident in your life.
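
        To make that cross-car communication idea concrete, here’s a minimal toy sketch in Python of the kind of broadcast message I’m imagining. It’s not any real V2V standard (DSRC, C-V2X, or anything else); the V2VStatus name, the message fields, and the 100-meter radius check are made up purely for illustration.

        ```python
        # Toy sketch: cars within ~100 m sharing position, speed, heading, and braking state.
        # Not a real V2V protocol; all names and fields here are hypothetical.
        from dataclasses import dataclass
        import math

        @dataclass
        class V2VStatus:
            car_id: str
            x_m: float           # position in a shared local frame, metres
            y_m: float
            speed_mps: float     # current speed, metres per second
            heading_deg: float   # compass heading, degrees
            braking: bool        # hard-brake flag

        def nearby_braking(me: V2VStatus, others: list[V2VStatus], radius_m: float = 100.0) -> list[str]:
            """Return the IDs of cars within radius_m that are reporting a hard brake."""
            hits = []
            for o in others:
                if o.car_id == me.car_id:
                    continue
                if math.hypot(o.x_m - me.x_m, o.y_m - me.y_m) <= radius_m and o.braking:
                    hits.append(o.car_id)
            return hits

        # Example: car B is 60 m ahead and braking; car C is braking but 300 m away.
        me = V2VStatus("A", 0.0, 0.0, 30.0, 90.0, False)
        peers = [V2VStatus("B", 60.0, 5.0, 25.0, 90.0, True),
                 V2VStatus("C", 300.0, 0.0, 20.0, 270.0, True)]
        print(nearby_braking(me, peers))  # prints ['B']: only the car inside the radius counts
        ```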

        Point being, we already have the technology to solve this problem; what we don’t have is the technology to solve it on the same roads that people are already driving on, weaving around their old manually controlled cars. Building FSD-only roads could solve this. Changing motor vehicle laws could solve this. But neither of those is going to solve it without significant headache or pushback from the public. The real problem is that we’re trying to solve both the “cars can’t drive themselves” problem and the “people drive like rabid orangutans” problem at the same time. Remove one of them, say by restricting or removing manual driving, and suddenly a proper solution is easily within grasp.

        Please note that I am NOT saying this is a good idea, not least because most people I know (myself included) drive a 20-to-40-year-old car and would be unable to switch to a new self-driving car unless the government literally provided one and bought the old one. And if we DID try to enact that across the country, we’d run out of g-men in the first week, because they’d all be shot dead by farmers and truck bros when they showed up to repossess the vehicles. Not to mention that a government-provided car is inherently untrustworthy to anyone, anywhere.

        The logistics of this would just be insane, and the best-case way to tackle it would be to phase it in like they did with backup cameras. But there’s a problem here: every new car may be FSD-enabled, but the FSD won’t actually be usable, because the road is still full of live dumb-ass primates. So 25 years down the line, maybe 94% of cars still on the road are FSD-enabled, but nobody has ever used the damn thing, and most people probably aren’t going to want to start then.

        So at that point you’ve only sort of solved the distribution problem (although not really, even then, because a single human-operated car can fuck up everything for all the automated cars by acting unpredictably and potentially not being linked in, and there will always, always be at least one poor bastard driving a 50-year-old beat-to-fuck Toyota no matter when or where you are), and we run right back into the same societal adaptation problem we had before, just with the can kicked down the road.

        So like, tl;dr, this wouldn’t be an awful idea if we weren’t all human beings. I guess that’s true of a lot of things. If we could just do it in one big flush, tell people “this is the way things are now,” and have them listen and cooperate, it’d be a great solution. But as with many things, it’s the “getting people on board with the plan” part of the plan that makes it, in my opinion, soundly impossible.

        Aaand I’ve just spent the better part of half an hour talking myself into a circle to basically agree with you. Lovely. Cheers 🍻

        • angrystego@lemmy.world · 6 days ago

          The idea of FSD cars being safe if there are no human drivers is true only if you abandon the nice thought of having cyclists and pedestrians. Having the option to walk or ride a bike is amazing and worth aspiring to.

        • rwtwm@feddit.uk · 6 days ago

          There’s another issue too. In perfect conditions, self-driving cars are a lot safer, but they aren’t 100% safe. So when an incident occurs it’s newsworthy. (In the same way that we hear about plane crashes anywhere in the world, but won’t necessarily hear about someone getting run over in the next city).

          My hypothesis is that adoption would be throttled even in near-perfect conditions, simply because we’ve internalised the risks of driving but haven’t internalised the risks of being driven by a computer.

      • Malfeasant@lemm.ee · 6 days ago

        Self-driving cars are safer, in general, overall - it’s just that the specific cases in which they fail are different from anything a (halfway competent and alert) person would fail at.

        • leftzero@lemmynsfw.com · 5 days ago

          > Self-driving cars are safer, in general, overall

          Self-driving cars in general, maybe. Teslas in particular, probably not so much, especially with Musk refusing to use LIDAR…

        • Rekorse@sh.itjust.works · 6 days ago

          People assess risks differently based on what the outcome could be. Burning to death trapped in a car is pretty high on the “avoid this” list, and Americans are likely asking, “If I get hit by a Tesla, who do I sue?” If the answer is Tesla, then that’s a much higher risk than if it’s the person in the driver’s seat.

          It’s about what types of mistakes can happen and how they are handled. Many prefer gas automobiles because they consider them less risky.