• Aurenkin@sh.itjust.works
    1 year ago

    Source: trust me bro.

    On a serious note though why would they do that? Pretty sure they are legally covered with all the warnings that you are responsible for your vehicle.

    • jet@hackertalks.com
      1 year ago

      Serious reply: The computer navigates until it’s unable to navigate, then hands over to the driver. Unfortunately, that just means the computer has navigated you into a very difficult position that you don’t have much time to recover from.

      Funny reply: if you can always claim collisions are the responsibility of the driver, you don’t have to answer difficult questions about the ethics of artificial drivers, and their efficacy…

      Tesla’s done some bold things recently, like removing radar. That’s going to make it harder to defend the robot driver when it makes mistakes. Removing information from the command and control system is difficult to justify.

      • Aurenkin@sh.itjust.works
        1 year ago

        Source: trust me bro again.

        You are meant to keep your hands on the wheel at all times and pay attention when using the system. There are multiple layers of warnings if you don’t and the system will eventually not allow you to activate it if you ignore the warnings. If you sit there and watch as autopilot drives you off a cliff, it’s your fault.
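The escalation described above (warnings, then lockout if you keep ignoring them) could be sketched roughly like this. This is a hypothetical illustration, not Tesla's actual logic; the strike threshold and messages are made up:

```python
# Hypothetical sketch of an escalating driver-attention monitor:
# ignored warnings accumulate, and past a threshold the assist
# feature refuses to engage. Not any vendor's real implementation.

class AttentionMonitor:
    MAX_STRIKES = 3  # assumed threshold, for illustration only

    def __init__(self):
        self.strikes = 0
        self.locked_out = False

    def check(self, hands_on_wheel: bool) -> str:
        """One monitoring tick: returns the system's response."""
        if self.locked_out:
            return "autopilot unavailable"
        if hands_on_wheel:
            return "ok"
        self.strikes += 1
        if self.strikes >= self.MAX_STRIKES:
            self.locked_out = True
            return "autopilot unavailable"
        return f"warning {self.strikes}"
```

The point of the design is that the lockout is sticky: once triggered, putting your hands back on the wheel no longer re-enables the feature for that drive.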

        Yes, Elon has been dodgy as fuck with his timelines, taking people’s money and making grand claims about the future capabilities of the system, and he’s just an all-around asshole, but can we try to ground our criticisms in facts?

        There are plenty of things we can and should be critical of when it comes to Tesla, and making things up just makes it easier for genuine criticisms to be dismissed.

        Apologies to you if you actually are making well-backed claims; it’s just frustrating to see so much noise when it comes to Tesla, with people often just throwing random bs out there.

        • jet@hackertalks.com
          1 year ago

          This is a discussion forum, so I’m discussing. I’m not citing sources, as you have twice noted.

          Like it or not, people are going to associate any Tesla crash with a failure of Elon Musk’s assisted driving system. Even if we look at a very sensible market participant like Waymo, any Waymo vehicle incident will be associated with the self-driving nature of the car. This is normal for any novel technology: all the downsides get associated with the novelty.

          It’s certainly my hope that, statistically, issues arising from automated driving will turn out to be less likely than issues arising from human driving, especially intoxicated driving… Until we get to the point where everyone knows that, we’re going to have media coverage focused on the downsides.

          • Aurenkin@sh.itjust.works
            1 year ago

            On that we can absolutely agree, and I think scrutiny is definitely warranted with any new technology, especially one with such a huge profit motive. My issue in this case was with the original claim that the system intentionally disengages at the last minute for the purpose of avoiding liability for any crash. Big call.

            Anyway, I was probably overly sarcastic and flippant, which doesn’t help my point, so sorry for venting my frustrations like that. Hopefully these technologies get the scrutiny they deserve without hysteria any time there’s a crash that ‘possibly’ involved Autopilot.

            • jet@hackertalks.com
              1 year ago

              I don’t think that’s the main reason Autopilot hands over when it’s about to crash, but I think it is a factor that was part of the design.

              I think a lawyer was definitely consulted during the design of the assisted-driving-to-human-driver handoff. Can I cite sources? No. It’s just sensible: if you were designing a system that involved life-and-death decisions, you would have lawyers involved, and any good lawyer would help you limit your liability by moving the decision making to the human when something was about to go wrong.
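The handover behaviour being debated here could be sketched like this. Purely hypothetical (the threshold, function name, and alert text are invented for illustration): the system disengages and alerts the driver once its planning confidence drops, so the human is nominally in control by the time anything goes wrong.

```python
# Hypothetical sketch of last-moment handover logic, not any real
# vendor's implementation: below a confidence threshold, control
# reverts to the driver with an alert.

HANDOVER_THRESHOLD = 0.5  # assumed value, illustration only

def control_step(confidence: float, autopilot_engaged: bool):
    """One control-loop tick: returns (who_controls, alert)."""
    if autopilot_engaged and confidence < HANDOVER_THRESHOLD:
        # Disengage and demand the driver take over immediately.
        return ("driver", "TAKE OVER NOW")
    if autopilot_engaged:
        return ("autopilot", None)
    return ("driver", None)
```

The criticism in this thread is exactly that such a rule can fire with under a second of warning, which is far less time than a distracted human needs to rebuild context.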

              https://www.youtube.com/watch?v=ZBvIWFq-fGc Are drivers like this ready to take over in an emergency in less than a second? No. Elon Musk does no favors to his system by calling it Full Self-Driving, or whatever the term is, which is misleading. Driver assistance should be assistance; the more you take the driver out of the loop, the more they get distracted and the less they are in the right context to jump in. That’s human nature. So there’s a balance we have to find between automated hands-off driving and humans being responsible, and I don’t think Tesla has found that balance.

              And I 100% believe lawyers are involved to limit liability, at least so that statements can be made that the self-driving system was not at fault for the crash because it was not engaged at the time of the crash. I 100% believe that was a factor in their handover logic. I can’t prove it, but the preponderance of evidence, the public behavior of certain market leaders, and my history with corporations don’t make this a big leap of faith.