TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

  • captainastronaut@seattlelunarsociety.org · 9 months ago

    Tesla self-driving is never going to work well enough without more sensors - cameras alone are not enough. It’s fundamentally dangerous and should not be driving unsupervised (or maybe at all).

    • KayLeadfoot@fedia.io (OP) · 9 months ago

      Accurate.

      Each fatality I found where a Tesla killed a motorcyclist involved a cascade of three failures.

      1. The car’s cameras don’t detect the biker, or the car just doesn’t stop for some reason.
      2. The driver isn’t paying attention to detect the system failure.
      3. The Tesla’s driver alertness tech fails to detect that the driver isn’t paying attention.

      Taking out the driver will make this already-unacceptably-lethal system even more lethal.
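
      To make that concrete, here’s a back-of-the-envelope sketch. The failure rates are invented purely for illustration, but the structure holds: a fatality needs all three safeguards to fail at once, so the rates multiply, and removing the supervising driver deletes two of the three layers.

      ```python
      # Hypothetical failure rates, invented purely for illustration.
      p_vision_miss  = 0.01  # 1: camera stack fails to detect the bike or stop
      p_driver_miss  = 0.10  # 2: supervising driver isn't paying attention
      p_monitor_miss = 0.50  # 3: driver-alertness tech fails to catch that

      # A fatality requires all three independent failures at once.
      with_driver = p_vision_miss * p_driver_miss * p_monitor_miss

      # Robotaxi: layers 2 and 3 no longer exist.
      without_driver = p_vision_miss

      print(f"Supervised: {with_driver:.4%} per encounter")    # 0.0500%
      print(f"Driverless: {without_driver:.2%} per encounter") # 1.00%
      print(f"Risk multiplier: {without_driver / with_driver:.0f}x")  # 20x
      ```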

      • jonne@infosec.pub · 9 months ago

        4. Self-driving turns itself off seconds before a crash, giving the driver an impossibly short window to rectify the situation.
        • KayLeadfoot@fedia.io (OP) · 9 months ago

          … Also accurate.

          God, it really is a nut punch. The system detects the crash is imminent.

          Rather than automatically trying to evade… the self-driving tech turns off. I assume it’s to reduce liability or make the stats look better. God.

          • jonne@infosec.pub · 9 months ago

            Yep, that one was purely about hitting a certain KPI of ‘miles driven on Autopilot without incident’. If it turns off before the accident, technically the driver was in control and to blame, so it won’t show up in the stats and probably also won’t be investigated by the NTSB.
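
            A toy sketch of the alleged accounting (the crash records are made up): if the headline stat only counts crashes where the system was engaged at the exact moment of impact, disengaging one second early scrubs the crash from the books. NHTSA’s Standing General Order instead counts any crash where the system was active within 30 seconds of impact, which closes exactly that loophole.

            ```python
            # Toy crash records: seconds before impact at which the system
            # disengaged (0.0 means still engaged at the moment of impact).
            disengage_times_s = [0.0, 1.0, 2.5, 45.0]

            # Bar 1: only crashes where the system was engaged at impact "count".
            engaged_at_impact = sum(t == 0.0 for t in disengage_times_s)

            # Bar 2: NHTSA's rule, system active within 30 seconds of impact.
            within_30s = sum(t <= 30.0 for t in disengage_times_s)

            print(engaged_at_impact)  # 1 crash counts
            print(within_30s)         # 3 crashes count
            ```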

      • br3d@lemmy.world · 9 months ago

        There are at least two steps before those three:

        -1. Society has been built around the needs of the auto industry, locking people into car dependency

        0. A legal system exists in which the people who build, sell, and drive cars are not meaningfully liable when the car hurts somebody
    • scarabic@lemmy.world · 9 months ago

      These fatalities are a Tesla business advantage. Every one is a data point they can use to program their self-driving intelligence. No one has killed as many as Tesla, so no one knows more about what kills people than Tesla. We don’t have to turn this into a bad thing just because they’re killing people /s

    • Ledericas@lemm.ee · 9 months ago

      They originally had lidar, or radar, but Musk had them disabled in the older models.

      • NotMyOldRedditName@lemmy.world · 9 months ago

        They had radar. Tesla has never had lidar, but they do use lidar on test vehicles to ground-truth their camera depth/velocity calculations.
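
        If “ground truthing” is unfamiliar, here’s a minimal sketch of the idea, with made-up numbers: run the camera stack’s depth estimates against lidar ranges for the same targets and score the error.

        ```python
        import statistics

        # Hypothetical paired measurements for the same three targets.
        camera_depth_m = [48.0, 22.5, 9.0]   # camera stack's estimates
        lidar_depth_m  = [50.0, 20.0, 10.0]  # lidar "ground truth"

        rel_err = [abs(c - t) / t for c, t in zip(camera_depth_m, lidar_depth_m)]
        print(f"Mean absolute relative error: {statistics.mean(rel_err):.1%}")  # 8.8%
        ```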

  • keesrif@lemmy.world · 9 months ago

    On a quick read, I didn’t see the struck motorcycles listed. Last I heard, a few years ago, this mainly affected motorcycles with two rear lights that are spaced apart and fairly low to the ground. I believe that’s mostly true of Harleys.

    The theory I recall was that this rear-light configuration made the Tesla assume it was looking (remember: only cameras, no depth data) at a car farther down the road - and that accelerating was safe as a result. It miscategorised the motorcycle so badly that it misjudged its position entirely.
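
    A minimal sketch of that theory (the focal length and light spacings are assumed numbers): a single camera can only infer distance from the pixel gap between the lights plus an assumed real-world gap, so treating a bike’s narrow light pair as car-width tail lights inflates the distance estimate five-fold.

    ```python
    FOCAL_LENGTH_PX = 1000.0  # assumed camera focal length, in pixels

    def distance_m(pixel_gap: float, assumed_gap_m: float) -> float:
        """Pinhole model: distance = real_width * focal_length / pixel_width."""
        return assumed_gap_m * FOCAL_LENGTH_PX / pixel_gap

    gap_px = 30.0  # observed pixel gap between the two rear lights

    print(distance_m(gap_px, assumed_gap_m=1.5))  # read as a car: 50.0 m away
    print(distance_m(gap_px, assumed_gap_m=0.3))  # actual bike:   10.0 m away
    ```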

    • jonne@infosec.pub · 9 months ago

      Whatever it is, it’s unacceptable and they should really ban Tesla’s implementation until they fix some fundamental issues.

    • KayLeadfoot@fedia.io (OP) · 9 months ago

      I also saw that theory! That’s in the first link in the article.

      The only problem with the theory: many of the crashes were in broad daylight, with no lights on at all.

      I didn’t include the motorcycle makes and models, but I did find them. Because I do journalism, and sometimes I even do good journalism!

      The models I found are: Kawasaki Vulcan (a cruiser bike, just like the Harleys you describe), Yamaha YZF-R6 (a racing-style sport bike with high-mount lights), and a Yamaha V-Star (a “standard” bike, fairly low lights, and generally a low-slung bike). Weirdly, the bike models run the full gamut of the motorcycles people ride on highways; every type is (sadly) represented in the fatalities.

      I think you’re onto something with the faulty depth perception. Sensing distance is difficult with optical sensors alone. That would explain why Tesla is alone in the motorcycle fatality bracket, and why the crashes are always the Tesla rear-ending the bike.

    • treadful@lemmy.zip · 9 months ago

      Still probably a good idea to keep an eye on that Tesla behind you. Or just let them past.

  • Gork@lemm.ee · 9 months ago

    Lidar needs to be mandated for these systems.

  • SkunkWorkz@lemmy.world · 9 months ago

    It’s because the system has to rely on visual cues, since Teslas have no radar. When it’s dark, the system looks at tail lights to gauge distance to the vehicle ahead. And since some bikes have two lights, the system thinks it’s a far-away car when in reality it’s a bike up close. Also remember the AI is trained on human driving behavior that Tesla records from its customers - and we all know how well the average human drives around two-wheeled vehicles.

  • PastafARRian@lemmy.world · 9 months ago

    I’m wondering how that stacks up against human drivers. Since the data is redacted, I’m guessing: not well at all.

  • Ledericas@lemm.ee · 9 months ago

    The Cybertruck is sharp enough to cut a deer in half; surely a biker is just as vulnerable.

  • AnimalsDream@slrpnk.net · 9 months ago

    I imagine bicyclists must be affected as well if they’re on the road (as we should be, technically). As somebody who has already come literally within inches of being rear-ended, I never want to bike in the US again.

    Time to move to the Netherlands.

  • Redex@lemmy.world · 9 months ago

    Cuz other self-driving cars use lidar, so it’s basically impossible for them not to realise that a bike is there.
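
    A tiny sketch of why lidar sidesteps the guessing entirely: it measures range directly from photon time of flight, with no assumptions about how wide the object or its lights are.

    ```python
    C = 299_792_458  # speed of light, m/s

    def lidar_range_m(round_trip_s: float) -> float:
        """Time-of-flight ranging: distance = c * t / 2."""
        return C * round_trip_s / 2

    # A return pulse ~66.7 ns after firing puts the object ~10 m away,
    # whether it's a truck, a motorcycle, or a pedestrian.
    print(f"{lidar_range_m(66.7e-9):.1f} m")  # 10.0 m
    ```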

    • Not_mikey@lemmy.dbzer0.com · 9 months ago

      Robots don’t get drunk, or distracted, or text, or speed…

      Anecdotally, I think the Waymos are more courteous than human drivers. Though Waymo seems to be the best one out so far; idk about the other services.

        • dogslayeggs@lemmy.world · 9 months ago

          They have remote drivers that CAN take control in rare corner-case situations that the software can’t handle. The vast majority of driving is done without humans in the loop.

          • NotMyOldRedditName@lemmy.world · 9 months ago

            They don’t even do that, according to Waymo’s claims.

            They can suggest what the car should do, but they aren’t actually doing it. The car is in complete control.

            It’s a nuanced difference, but it is a difference. A Waymo employee never takes control of or operates the vehicle.