SAN FRANCISCO, Sept 28 (Reuters) - Opening statements are set to begin on Thursday in the first U.S. trial over allegations that Tesla’s (TSLA.O) Autopilot driver assistant feature led to a death, and its results could help shape similar cases across the country.

The trial, in a California state court, stems from a civil lawsuit alleging the Autopilot system caused owner Micah Lee’s Model 3 to suddenly veer off a highway east of Los Angeles at 65 miles per hour (105 kph), strike a palm tree and burst into flames, all in the span of seconds.

The 2019 crash killed Lee and seriously injured his two passengers, including a then-8-year-old boy who was disemboweled, according to court documents. The lawsuit, filed against Tesla by the passengers and Lee’s estate, accuses Tesla of knowing that Autopilot and other safety systems were defective when it sold the car.

Tesla has denied liability, saying Lee consumed alcohol before getting behind the wheel. The electric-vehicle maker also claims it was not clear whether Autopilot was engaged at the time of the crash.

Tesla has been testing and rolling out its Autopilot and more advanced Full Self-Driving (FSD) system, which Chief Executive Elon Musk has touted as crucial to his company’s future but which has drawn regulatory and legal scrutiny.

Tesla won a bellwether trial in Los Angeles in April with a strategy of saying that it tells drivers that its technology requires human monitoring, despite the “Autopilot” name. A Model S swerved into a curb in 2019 and injured its driver, and jurors told Reuters after the verdict that they believed Tesla warned drivers about its system and that driver distraction was to blame.

The stakes are higher in the trial this week, and in other cases, because people died. Tesla and the plaintiffs’ attorneys jousted in the run-up about what evidence and arguments each side could make.

Tesla, for instance, won a bid to exclude some of Musk’s public statements about Autopilot. However, attorneys for the crash victims can argue that Lee’s blood alcohol content was below the legal limit, according to court filings.

The trial, in Riverside County Superior Court, is expected to last a few weeks.

  • skip0110@lemm.ee · 1 year ago

    I don’t believe the claim that their ADAS was not enabled at the time of the crash. Even if that is technically true, if it disengaged a few seconds before impact, the crash is still the fault of Tesla’s software.

    • Dasnap@lemmy.world · 1 year ago

      “Plane crash is not the fault of pilot as he jumped out the plane during nosedive.”

    • Haui@discuss.tchncs.de · 1 year ago

      Exactly. Most importantly, “it is unclear if Autopilot was active” is an admission of guilt in my book. If it is unclear whether a highly critical system (one that steers, accelerates, and brakes a 2+ ton vehicle with multiple people on board) was active, then they are completely clueless about their own product’s status and should be fully liable. You cannot sell something like this at all. I feel like the people in charge should go to jail for negligent manslaughter.

      • cerevant@lemm.ee · 1 year ago

        Yes, so much. It drove me crazy when a car company argued “but our logs say it was the driver’s fault”. We’re arguing that your most critical software failed, and you want us to trust the logging subsystem?

        The NHTSA needs to be qualifying car software the same way the FAA qualifies aircraft software. We need to stop trusting the manufacturers to self-police.

          • cerevant@lemm.ee · 1 year ago

            It is a nice break from my day job, where I am certifying software for critical systems.

            sigh

            • Haui@discuss.tchncs.de · 1 year ago

              Oof. Seeing this must frustrate you. It bothers me too seeing sleazebags getting their way while honest people get fucked.

    • TheYang@lemmy.world · 1 year ago

      if it disengaged a few seconds before, the crash is still the fault of Tesla’s software.

      I actually disagree, because it’s not self-driving; Tesla doesn’t actually claim it has any autonomous features, and the driver has to stay attentive the entire time.
      The way all of this is worded to the public is… horrible, that’s true. But since the warnings inside the car all seem to be there, I’d say this is more a false-advertising issue than an “is your product actually safe?” issue.

      Really interested in how this comes out; even more data would be interesting to me.

      • CaptObvious@literature.cafe · 1 year ago

        Except that Tesla does claim that they’re autonomous self-driving. They’re even among the group pushing to be allowed to sell cars with no driver controls.

        Not only should Tesla’s executives be held personally liable, I’d probably also jail whichever regulator let them get away with it.

        • TheYang@lemmy.world · 1 year ago

          Except that Tesla does claim that they’re autonomous self-driving. They’re even among the group pushing to be allowed to sell cars with no driver controls.

          https://www.tesla.com/en_eu/support/autopilot

          They really don’t say that. Sure, they advertise it that way, but whenever it actually comes down to it, Tesla admits it’s an assistive feature that requires constant attention.

          Autopilot, Enhanced Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.

          And you get warnings (source, or here) when you first sign up, as well as reminders when the car detects that you aren’t following the requirements.

          • cerevant@lemm.ee · 1 year ago

            So the software doesn’t actually do anything, it just gives the illusion that it does. That sounds safe.

            If you are relying on T&C as a get out of jail free card for your safety system, then it isn’t a safety system.

            • TheYang@lemmy.world · 1 year ago

              If you are relying on T&C as a get out of jail free card for your safety system, then it isn’t a safety system.

              That’s how every safety system works.
              You define the necessary conditions in which it works, and guarantee (with testing and validation) that in those conditions it does its job.
              Nothing works unconditionally.

              The conditions in this case are, in fact, that it is an assistance system and not a safety system, because everybody knows it can’t be relied upon. It probably works more than 99% of the time, which just isn’t nearly enough for driving.
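
              To illustrate the “defined conditions” idea, here is a toy sketch (hypothetical thresholds, nothing from Tesla’s actual spec) of a system that only engages inside the envelope it was validated for:

```python
# Toy operational-envelope check (hypothetical thresholds, for illustration only):
# the assistance feature is allowed to engage only when the conditions it was
# validated for actually hold.
from dataclasses import dataclass

@dataclass
class Conditions:
    speed_kph: float
    lane_markings_visible: bool
    visibility_m: float

def may_engage(c: Conditions) -> bool:
    """Return True only if every validated condition is satisfied."""
    return (
        c.speed_kph <= 130            # validated only up to this speed
        and c.lane_markings_visible   # needs clearly visible lane lines
        and c.visibility_m >= 100     # needs at least this much visibility
    )

# Inside the envelope -> may engage; outside it -> must refuse and warn the driver.
print(may_engage(Conditions(speed_kph=105, lane_markings_visible=True, visibility_m=300)))   # True
print(may_engage(Conditions(speed_kph=105, lane_markings_visible=False, visibility_m=300)))  # False
```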

              • cerevant@lemm.ee · 1 year ago

                Yeah, I’ve been working in aerospace, automotive, industrial and rail safety for over 20 years. You don’t get to say “this software does thing” and then in the safety manual say “you don’t get to trust that the software will actually do thing”.

                Further, when you claim the operator as a layer of protection in your safety system, the probability of dangerous failure is a function of the time between the fault (the software doing something stupid) and the failure (crash). The shorter that time, the less safe the system is.

                Here’s a clue: Musk doesn’t know anything about software safety. Their lead in autonomous technology has less to do with technical innovation and more to do with cutting corners where they can get away with it.
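
                As a rough back-of-the-envelope illustration of that point (hypothetical reaction-time numbers, not anything from the article or from Tesla): the shorter the window between the fault and the crash, the more likely the human “layer of protection” is simply too slow.

```python
# Rough illustration with hypothetical numbers: probability that the driver
# fails to take over before impact, as a function of the fault-to-crash window.
import math

def p_driver_too_slow(window_s: float, mean_takeover_s: float = 2.3, sd_takeover_s: float = 1.0) -> float:
    """P(takeover time > window), assuming normally distributed takeover times."""
    z = (window_s - mean_takeover_s) / sd_takeover_s
    return 0.5 * math.erfc(z / math.sqrt(2))  # survival function of the normal distribution

for window in (1.0, 2.0, 5.0, 10.0):
    print(f"{window:4.1f} s window -> {p_driver_too_slow(window):6.1%} chance of a missed takeover")
```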

              • pup_atlas@pawb.social · 1 year ago

                My guy, the feature is literally named “Autopilot”. By definition they are advertising it as a system that can PILOT the car AUTOMATICALLY. It doesn’t matter what they put in the fine print, they are blatantly describing the system as autonomous.

  • NutWrench@lemmy.zip · 1 year ago

    Tesla won a bellwether trial in Los Angeles in April with a strategy of saying that it tells drivers that its technology requires human monitoring, despite the “Autopilot” name

    Then it should not be called “Autopilot.” The AI required to make real autopilot work does NOT exist now and probably won’t exist for decades.

    Tesla autopilot is a marketing gimmick that is going to cost a lot of lives because the people who shill for Musk have a child-like worship of billionaires.

    • TheYang@lemmy.world · 1 year ago

      Then it should not be called “Autopilot.” The AI required to make real autopilot work does NOT exist now and probably won’t exist for decades.

      Well, in aviation, where I believe the term “autopilot” was most commonly used, at least before Tesla, an autopilot is actually exactly what Tesla offers: when everything is fine, it can keep the plane going, and if issues come up, it disengages and the pilot has to be ready to take full control.

      /e: Also, Waymo and Cruise already have completely autonomous cars, which generally work.

        • TheYang@lemmy.world · 1 year ago

          I mean, Tesla’s marketing here is totally predatory, and IMHO this is where they should be severely punished.

          Once it became kinda publicly known that Tesla’s “Autopilot” doesn’t mean hands-off driving, they changed the branding to “Full Self-Driving capability.”

  • HughJanus@lemmy.ml · 1 year ago

    Anybody who crashes while using Autopilot is their own damn fault, and this is what the court will find.

    Autopilot is nothing more than an advanced driver-assistance system. Every other manufacturer has a similar system, but Tesla is the only one being sued and plastered all over the news.

    • Dr Cog@mander.xyz · 1 year ago

      The way it is marketed is not in line with its functionality. I expect the prosecution will claim the term “Full Self Driving” is confusing to consumers.

        • Dr Cog@mander.xyz · 1 year ago

          Sigh…

          Jonathan Michaels, an attorney for the plaintiffs, in his opening statement at the trial in Riverside, California, said that when the 37-year-old Lee bought Tesla’s “full self-driving capability package” for $6,000 for his Model 3 in 2019, the system was in “beta,” meaning it was not yet ready for release.

          RTFA

          • HughJanus@lemmy.ml · 1 year ago

            🤦‍♂️ you’re the one who needs to read the article. There’s nothing there to indicate it was in use at the time.

            • Dr Cog@mander.xyz · 1 year ago

              That’s irrelevant. The plaintiff bought the FSD package, and his attorney (not prosecutor, I missed that this was a civil suit, not a criminal trial) will likely argue that it introduced confusion on the part of his client. It doesn’t matter that the FSD package wasn’t actually in use if the plaintiff believed it was (or, more importantly, believed it could do things that it could not, due to the confusing terminology).

              • HughJanus@lemmy.ml · 1 year ago

                Of course it matters. The plaintiff not knowing how to use their car is not a valid defense.