• Ocelot@lemmies.world

    Humans did not evolve to drive cars. ML did. It drives consistently with no distractions. It is never tired or drunk, and it never experiences road rage. It has superhuman reaction time and can see a full 360 degrees. This is not about being a lazy fatass, it is about safety. Hundreds of people in the US were killed in car accidents just today, and none of them were caused by self-driving cars.

    Also, please provide an example of a life-threatening accident caused by FSD.

    • Zummy@lemmy.world

      The article listed two life-threatening near-accidents that were prevented only because the person behind the wheel took over and disengaged FSD. Read the article and then comment.

      • Ocelot@lemmies.world

        Teslas have 360-degree dashcams that record all the time. Why didn’t they upload the video? I promise you they have it.

        Such a video would go viral pretty easily. It would light a fire under Tesla engineering to fix such a dangerous and life-threatening situation. Where is it? Why is there never any footage attached to these articles? Why can’t I find a video of such a thing ANYWHERE? Why can nobody in this thread bashing the tech over and over produce any justification for their fear?

        If I were Tesla and wanted to cover up the dangers of FSD trying to kill people, I wouldn’t give everyone a constantly running dashcam. It would really make the company look bad.

        Could it possibly be, just maybe, that the video disagrees with the “journalist’s” opinion that the car was performing dangerously? Could it be that an article saying “Tesla FSD performs admirably, swerves to avoid obstacle that would have caused a blowout” might not get nearly as many clicks and ad revenue? Maybe?

        FSD is aware of where barriers and medians are. If it needs to swerve to avoid an obstacle, it will go in whatever direction is safest. Sometimes that means toward a barrier. Sometimes a panicking driver disengaging and taking over interrupts the maneuver and creates danger that wasn’t otherwise present. We will never know what actually happened because there is no evidence, evidence that I promise you exists but was, for whatever reason, omitted.

        If a cop said something outrageous and dangerous happened to them, and they claimed to be completely clear of fault and wrongdoing, would it not be reasonable to want to see the bodycam footage? If the police department then said “we don’t have it,” “it’s corrupted,” or gave some other excuse, would that not raise eyebrows? The same situation applies here.

        There are plenty of YouTube channels out there, like dirtytesla, whole mars catalog, AI Driver, Chuck Cook, and many others, that show and even livestream FSD. None of them have been in an accident, even on very early releases of the beta software. These people are comfortable with the beta and often don’t take over control of the vehicle under any circumstances, even in their torture-test scenarios.

        Is it at all possible, just maybe, that FSD isn’t as dangerous as you might think? Fear is often a result of ignorance.

        I am extremely open to changing my mind here; just show me some convincing evidence. Every Tesla is recording all the time, so it should be really easy to find some, no?

        I’m sure I’m just a Tesla shill or fanboy, whatever. The truth is I’m just looking for facts. I would like to know why people feel this way and are so afraid of new technology despite overwhelming evidence that it is saving lives.

        • wizardbeard@lemmy.dbzer0.com

          Wow, that’s sure a lot of text for someone who didn’t read the article.

          The author states that despite having storage plugged in, he was not given the option to save a recording.

          • Ocelot@lemmies.world

            That’s because it’s a rolling recording. If you explicitly want to save a clip long-term, you honk the horn. This is clearly laid out in the manual, and the setting is right on the screen where the dashcam is enabled. This line is a pure cop-out. They had the footage; they just refused to upload it. Possibly they never bothered to check for it, but that would be incredibly irresponsible for anything resembling “journalism.”

      • CmdrShepard@lemmy.one

        Hilarious telling them to read the article first when you couldn’t even be bothered to read their question before replying.

        • Zummy@lemmy.world

          I read it just fine. He asked for an example of a life-threatening accident caused by Full Self Driving. I noted that two examples were listed in the article. The ONLY difference was that the driver prevented the accidents by being aware. FSD was going to cause accidents without intervention. I guess in your world people are supposed to do nothing to avoid a major accident. Hilarious that you love FSD so much that you’re willing to defend a billionaire who wouldn’t piss on you if you were on fire. Billionaires are not your friends. FSD is a BETA feature that doesn’t work properly. Take your love somewhere else and away from my comment, because you read it, didn’t understand it, and fired off a reply claiming I didn’t do something I did, because you couldn’t understand me. The next time you want to have a discussion, come prepared, or don’t come at all!

          • CmdrShepard@lemmy.one

            Ah, the “only difference” in your two examples of life-threatening accidents is that no accident actually occurred in either one? That’s quite the difference if you ask me… This isn’t a Level 4 or 5 system, so driver intervention is required. These systems can’t improve without real-world testing, and meanwhile a hundred people die on the road every single day. I guess you’d prefer more people die on the road from drunk or distracted drivers than have manufacturers roll out solutions that aren’t absolutely 100% perfect, even if they’re better than human drivers most of the time.

            Your obsession with Musk is clouding your judgment. I made no mention of him, nor do I like or defend him. This tech wasn’t built by Musk, so who gives a shit about him in this discussion?

            • Zummy@lemmy.world

              I am not obsessed with Musk in any form, but the fact of the matter is that when you have FSD systems that fail to do the thing they are supposed to do, then maybe it’s not the best idea to roll them out to the entire world. Maybe it’s better to continue with more limited testing. You act as if all drunk and distracted driving will stop when FSD is used, and that simply isn’t the case. Many people still drive gasoline-powered cars, and people still drink and drive even though it’s dangerous to do so. Furthermore, FSD will lead to more distracted driving, because people will assume that self-driving means the car will take care of everything and there is no need to be vigilant.

              The plain truth is that while FSD may be the future, rolling it out despite knowing it isn’t ready is not the solution; it’s irresponsible and will cause harm. The near-accidents that you aren’t concerned with would most likely have killed the driver and probably other people too. Our difference of opinion here is that you believe it’s okay if people die as long as the testing shows there is a chance they won’t die in the future, and I think if anyone dies it’s too much. The feature clearly isn’t ready for prime time and needs more limited real-world testing, but the fact of the matter is that testing doesn’t bring in money.

              Your inability to even consider that a worldwide rollout might not be the best idea right now, given that the testing shows the car isn’t ready, shows that you really aren’t arguing in good faith. You have chosen the position that FSD is good and ready even when confronted with articles like the one above showing it isn’t. I would wager that a lot of people want the era of FSD; they just want it when it works. Keep the rollout more limited and do further testing. When mistakes happen, take the time to figure out why and how they can be prevented in the future. You argue that testing is needed, but you are in favor of a full rollout now even though we need a lot more limited real-world testing. Both can’t be true. Time to think about what you really want, because I don’t think you know… And accusing anyone who doesn’t want a complete rollout of FSD today of having a bias against Musk shows that.

    • Chocrates@lemmy.world

      Self-driving is not there yet, and it may never get there, but you are right. We could save so many lives if we get this right.

      I don’t know if Musk is responsible enough to be the one to get us there, though.