In 2019, a Tesla Model S exited a freeway in Gardena, ran a red light, and slammed into a Honda Civic

  • @[email protected]
    5 • 2 years ago

    Tesla should be held liable. Their Autopilot mode is terrifyingly bad. One of my best friends owns a Tesla Model 3 and showed me Autopilot; the whole time he was saying "just wait, it’ll fuck up somehow," and sure enough it inexplicably tried to take a right exit off the highway, jamming the brakes and veering to the right until my friend took manual control again.

    I honestly can’t believe Tesla’s Autopilot mode is allowed on roads. It’s so clearly technology still in its infancy and not at all ready for real-world application. The company misleads Tesla owners into a false sense of safety and has hordes of lawyers who’ve quite clearly done everything they can to shield Tesla from any liability. Lawmakers won’t adapt because the whole system relies on not stifling the almighty growth of corporations like Tesla.

    • @[email protected]
      4 • 2 years ago

      Doesn’t Autopilot require the driver to pay attention and keep their hands on the wheel at all times? I’d guess Tesla could be held liable if it could be proven that the driver tried to correct the car but faulty software didn’t allow him/her to take back control. 🤷

      • @[email protected]
        1 • 2 years ago

        This is like giving a kid a cake to hold and getting mad at them when they eat it. We know that humans can’t pay attention to mundane tasks. Maybe a few people can all of the time, and most people can some of the time, but as a rule it just doesn’t happen. It is utterly irresponsible to pretend this isn’t true and ship driver-assist systems that are good enough that people stop paying attention.

        I think autonomy Levels 2-4 should be outright illegal. They are basically guaranteed to have people not paying full attention, which results in crashes. Level 5 should be legal, but the manufacturer should be responsible for any accidents, illegal actions, or other malfunctions.

        • @[email protected]
          1 • 2 years ago

          This is like giving a kid a cake to hold and getting mad at them when they eat it.

          If holding cakes is something that should only ever be entrusted to adults, then it stands to reason that this person should never be allowed to hold a cake again, doesn’t it?

          It’s not like someone who made the mistake of fucking around with Tesla auto-drive is going to improve in ways that we should ever trust them to drive a car again. They should lose their license for life, if nothing else.

    • mekhos
      2 • 2 years ago

      Sounds like it should be renamed after one of the most dangerous jobs in the world - “Test Pilot”

    • scrollbars
      1 • 2 years ago

      I think it comes down to the rate of Autopilot fuck-ups. If it’s close to or worse than that of human drivers, Tesla should definitely be held to account. Or if there are traffic scenarios where Autopilot is shown to commonly put people in danger, I think that also qualifies. Of course, getting objective/non-tampered data is the hard part…

  • @[email protected]
    4 • 2 years ago

    Yes. Unambiguously yes. The current laws on self-driving cars are very clear: the driver must always be watching the road, must always be assessing whether the car is driving safely, and must be ready to take control of the car at any time.

    I’m not defending Tesla here; I hate them as much as the next guy. But to say that the driver was not responsible for this is ridiculous.

    • @[email protected]
      1 • 2 years ago

      The law says yes. However, I think the law is wrong: it should put the company that designed the system at fault. They are creating a system that encourages people not to focus on the road, and if they are doing that, they should be responsible for what the car does.

      I also think that Tesla is super irresponsible for living in the “uncanny valley” of self-driving, where they advertise it as full self-driving (“Autopilot”) but then say the driver needs to pay attention. I think this should be illegal, because it is well known that humans can’t reliably pay attention to mundane tasks. I think Google did this right: they ran an early self-driving test with their employees, noticed that a lot of them weren’t paying attention, and pulled the plug. They stopped testing until they had full self-driving.

  • @[email protected]
    3 • 2 years ago

    Even a bad autopilot is better than a lot of human drivers. The question is not whether an autopilot is perfect; accidents do happen, but they are much less frequent than with human drivers. The driver is to blame for this accident, because having an autopilot does not mean he can go to sleep on the trip.

  • @[email protected]
    1 • 2 years ago

    While the driver is certainly to blame for not paying attention, one must ask why he blindly trusted the system in the first place. Tesla could still be found guilty of false advertising.

    There’s an ongoing review to check if Tesla is violating the DMV regulation that bars companies from marketing their cars as autonomous when they are not.

    Tesla’s real crime here would be lying to people, caring not about their lives but only about its own profit.

    • alex
      1 • 2 years ago

      Tesla tells you to pay attention, so the driver is always at fault. Tesla’s system is law-abiding, and the law says the driver of the car is responsible, even if they technically aren’t driving.