Self-driving cars are often marketed as safer than human drivers, but new data suggests that may not always be the case.

Citing data from the National Highway Traffic Safety Administration (NHTSA), Electrek reports that Tesla disclosed five new crashes involving its robotaxi fleet in Austin. The new data raises concerns about how safe Tesla’s systems really are compared to the average driver.

The incidents included a collision with a fixed object at 17 miles per hour, a crash with a bus while the Tesla vehicle was stopped, a crash with a truck at four miles per hour, and two cases where Tesla vehicles backed into fixed objects at low speeds.

  • HarneyToker@lemmy.world

    Got this saved for the next time someone tells me that a robot can drive better than a human. They almost had me there, but data doesn’t lie.

    • greygore@lemmy.world

      This is more specific to Tesla than to self-driving in general, since Musk decided that additional sensors (like the LiDAR and RADAR on other self-driving vehicles) are a problem. Publicly he’s said it’s because of sensor contention - that if the RADAR and cameras disagree, the car gets confused.

      Of course that raises the opposite problem: when the camera or image recognition is wrong, there’s nothing to tell the car otherwise - see the Tesla drivers decapitated by trailers that the car didn’t see. Additionally, I assume Teslas still have accelerometers, so either the self-driving model is ignoring that input when judging potential collisions, or it’s still doing some sensor fusion after all (toy illustration at the end of this comment).

      Not to mention we humans have multiple senses that we use when driving; this is one reason why steering wheels still mostly use mechanical linkages - we can “feel” the road, we can detect when the wheels lose traction, we can feel inertia as we go around a corner too fast. On a related tangent, the Tesla Cybertruck uses steer-by-wire instead of a mechanical linkage.

      This is why many (including myself) believe Tesla has a much worse safety record than Waymo. I’ve seen enough drunk and distracted drivers to doubt that humans will always drive better than a robot. Don’t get me wrong, I still have concerns about the technology, but Musk and Tesla have a history of ignoring safety concerns - see the number of deaths related to his insistence on non-mechanical door handles with the mechanical backups hidden away.
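
      Re: the sensor fusion point above, here’s a toy sketch in Python - the function, the 5 m threshold, and the numbers are made up for illustration and have nothing to do with any real vehicle stack:

      def fused_distance(camera_m, radar_m, max_disagreement_m=5.0):
          """Combine two independent range estimates to the nearest obstacle."""
          if camera_m is None and radar_m is None:
              return None                    # no information at all
          if radar_m is None:
              return camera_m                # camera-only: nothing to catch its mistakes
          if camera_m is None:
              return radar_m                 # radar catches what the camera missed
          if abs(camera_m - radar_m) > max_disagreement_m:
              # "sensor contention": the readings disagree, so be conservative
              return min(camera_m, radar_m)  # trust the closer (more dangerous) reading
          return (camera_m + radar_m) / 2    # agreement: blend the two estimates

      # Camera misreads a white trailer as open sky and reports nothing ahead:
      print(fused_distance(camera_m=None, radar_m=30.0))  # 30.0 - radar still sees it
      print(fused_distance(camera_m=None, radar_m=None))  # None - camera-only, no backup

      The point isn’t the exact logic, just that a second independent range estimate gives the car something to fall back on when the camera is confidently wrong; drop the RADAR and that whole branch disappears.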

    • Buddahriffic@lemmy.world

      A robot can theoretically drive better than a human, because emotions and boredom don’t have to be involved. But we aren’t there yet, and Tesla is trying to solve the hard-mode version of the problem: pure vision, with no range finding.

      Also, I suspect that the ones we have are set up as pure neural networks, where everything is determined by the training. That likely means there’s some random-ass behaviour in rare edge cases, where the model “thinks” slamming on the accelerator is as good an option as anything else. And since it’s a black box no one really understands, there’s no way to tell until someone ends up in that position (see the toy experiment at the end of this comment).

      The tech still belongs in universities, not on public roads as a commercial product or service. And it certainly doesn’t belong in the hands of the kind of people who will at some point say, “fuck it, good enough, ship it like that”, which seems to be most of the tech industry these days.
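
      Re: the edge-case point above, a toy experiment (made-up data, nothing to do with Tesla’s actual models): train two identical networks on the same data with different random seeds and compare them inside and outside the training range.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      X = rng.uniform(0.0, 6.0, size=(500, 1))   # the situations covered in training
      y = np.sin(X).ravel()                      # the behaviour we want learned

      # Two identical models, differing only in random initialisation.
      a = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=1).fit(X, y)
      b = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=2).fit(X, y)

      X_in = np.array([[1.0], [3.0], [5.0]])     # inside the training distribution
      X_out = np.array([[15.0], [40.0]])         # rare situations never seen in training

      print(a.predict(X_in), b.predict(X_in))    # close to each other: both fit the training data
      print(a.predict(X_out), b.predict(X_out))  # typically diverge: behaviour out here is arbitrary

      In-distribution the two models look interchangeable; far outside it they can give completely different answers, and nothing in the model warns you ahead of time.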

    • w3dd1e@lemmy.zip

      Other robots might be able to, but I wouldn’t trust a Tesla RoboTaxi to get me safely across a single street.