• jet@hackertalks.com

    In the event of a collision, Autopilot will disconnect moments before impact, making the collision entirely the fault of the driver in charge. This is a feature, not a bug.

    • Aurenkin@sh.itjust.works

      Source: trust me bro.

      On a serious note though, why would they do that? Pretty sure they're legally covered by all the warnings that you are responsible for your vehicle.

      • jet@hackertalks.com

        Serious reply: the computer navigates until it's unable to navigate, and then it hands over to the driver. Unfortunately, that just means the computer has navigated you into a very difficult position that you don't have much time to recover from.

        Funny reply: if you can always claim collisions are the responsibility of the driver, you don't have to answer difficult questions about the ethics of artificial drivers, or their efficacy…

        Tesla has done some bold things recently, like removing radar. That's going to make it harder to defend the robot driver when it makes mistakes. Removing information from the command and control system is difficult to justify.
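
        To make the serious reply concrete, here's a minimal sketch of what a last-moment handover policy could look like. This is purely illustrative and assumed, not Tesla's actual logic; the function names and the one-second threshold are my own inventions:

        ```python
        # Hypothetical sketch of a time-to-collision (TTC) based handover policy.
        # Nothing here is Tesla's real code; names and thresholds are assumptions.

        HANDOVER_TTC_S = 1.0  # assumed cutoff: hand control back ~1 s before impact

        def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
            """Naive TTC: distance to the obstacle divided by closing speed."""
            if closing_speed_mps <= 0:
                return float("inf")  # not closing on the obstacle
            return distance_m / closing_speed_mps

        def should_hand_over(distance_m: float, closing_speed_mps: float) -> bool:
            """Disengage automation once a collision looks imminent."""
            return time_to_collision(distance_m, closing_speed_mps) < HANDOVER_TTC_S

        # 20 m from a stopped truck while closing at 25 m/s (90 km/h):
        # TTC = 0.8 s, so the system hands over with under a second to react.
        print(should_hand_over(20.0, 25.0))  # True
        ```

        The point of the sketch: any policy shaped like this, whatever its real trigger, by construction gives the human the least possible time to recover.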

        • Aurenkin@sh.itjust.works

          Source: trust me bro again.

          You are meant to keep your hands on the wheel at all times and pay attention when using the system. There are multiple layers of warnings if you don't, and the system will eventually refuse to activate if you keep ignoring them. If you sit there and watch as Autopilot drives you off a cliff, it's your fault.

          Yes, Elon has been dodgy as fuck with his timelines, taking people's money and making grand claims about the future capabilities of the system, and he's just an all-around asshole, but can we try to ground our criticisms in facts?

          There are plenty of things we can and should be critical of when it comes to Tesla, and making things up just makes it easier for genuine criticisms to be dismissed.

          Apologies if you actually are making well-backed claims; it's just frustrating to see so much noise around Tesla, with people often throwing random BS out there.

          • jet@hackertalks.com

            This is a discussion forum, so I'm discussing. I'm not citing sources, as you have twice noted.

            Like it or not, people are going to associate any Tesla crash with a failure of Elon Musk's assisted driving system. Even if we look at a very sensible market participant like Waymo, any Waymo vehicle incident will be associated with the self-driving nature of the car. This is normal for any novel technology: all the downsides get attributed to the novelty.

            It's certainly my hope that, statistically, issues arising from automated driving will prove less likely than issues arising from human driving, especially intoxicated driving… but until we reach the point where everyone knows that, we're going to have media coverage that fixates on the downsides.

            • Aurenkin@sh.itjust.works

              On that we can absolutely agree, and I think scrutiny is definitely warranted with any new technology, especially one with such a huge profit motive. My issue in this case was with the original claim that the system intentionally disengages at the last minute for the purpose of avoiding liability for any crash. Big call.

              Anyway, I was probably overly sarcastic and flippant, which doesn't help my point, so sorry for venting my frustrations like that. Hopefully these technologies get the scrutiny they deserve without hysteria every time there's a crash that 'possibly' involved Autopilot.

              • jet@hackertalks.com

                I don't think that's the main reason Autopilot hands over when it's about to crash, but I do think it was a factor in the design.

                I think a lawyer was definitely consulted during the design of the assisted-driving-to-human-driver handoff. Can I cite sources? No. It's just sensible: if you were designing a system that involved life-and-death decisions, you would have lawyers involved, and any good lawyer would help you limit your liability by moving the decision-making to the human when something was about to go wrong.

                https://www.youtube.com/watch?v=ZBvIWFq-fGc Are drivers like this ready to take over in an emergency in less than a second? No. Elon Musk does his system no favors by calling it 'Full Self-Driving', which is misleading. Driver assistance should be assistance: the more you take the driver out of the loop, the more they get distracted and the less they are in the right context to jump in. That's human nature. So there's a balance to be found between automated hands-off driving and humans staying responsible, and I don't think Tesla has found it.

                And I 100% believe lawyers are involved to limit liability, at least so that statements can be made that the self-driving system was not at fault for the crash because it was not engaged at the time. I 100% believe that was a factor in their handover logic. I can't prove it, but given the preponderance of evidence, the public behavior of certain market leaders, and my history with corporations, it's not a big leap of faith.

    • Moonrise2473@feddit.it

      They know, they just can’t tell the media while the investigation is ongoing.

      But if a vehicle crashed into a turning tractor-trailer (they start to slow down and signal hundreds and hundreds of meters before turning), hitting it on the side with no sign of braking, it means the driver was sleeping while the cruise control was engaged (I won't use their misleading marketing term for what is actually an advanced cruise control).

  • andyburke@kbin.social

    This same crash mode was behind one of the first Autopilot crashes.

    When your Autopilot will drive right into the side of a truck, people are right to question its safety.

    • Sentrovasi@kbin.social

      I guess the difference is we expect humans to fuck up, but autonomous driving is meant to eventually be the thing that replaces that and stops us fucking up.

        • Sentrovasi@kbin.social

          The scary thing to me is that humans are predictable, or at least predictable in their unpredictability.

          With AI, it’s a black box I don’t understand. When it suddenly crashes, I literally will have no idea why.

  • maporita@unilem.org

    Citing individual incidents is not helpful. The more important question is whether Autopilot makes driving safer on average. According to Tesla themselves…

    "the accident rate with Autopilot engaged is just a fraction of the industry average in the US.

    From their report:

    " Tesla vehicles with Autopilot engaged (mostly highway miles) had just 0.18 accidents per million miles driven, compared to the US vehicle average of 1.53 accidents per million miles.

    Teslas on non-highways with Full Self Driving (FSD) engaged had just 0.31 accidents per million miles representing an 80% reduction in accidents compared with the average vehicle. Tesla vehicles with no active safety triggered – neither Autopilot of FSD – had an accident rate of 0.68, less than half the total US vehicle fleet".
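
    As a quick sanity check on the arithmetic (this verifies only that the quoted figures are internally consistent, not that the underlying data is trustworthy), a few lines of Python reproduce the claimed percentages:

    ```python
    # Check the internal consistency of Tesla's quoted accident rates.
    us_avg = 1.53     # US average accidents per million miles (as quoted)
    fsd = 0.31        # Teslas with FSD engaged, non-highway
    no_assist = 0.68  # Teslas with neither Autopilot nor FSD engaged

    print(f"FSD vs US average: {1 - fsd / us_avg:.0%} fewer accidents")    # ~80%, as claimed
    print(f"No-assist Teslas: {no_assist / us_avg:.0%} of the US average") # ~44%, i.e. under half
    ```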

    I would really like to see independent verification of this.