Elon Musk’s FSD v12 demo includes a near miss at a red light and doxxing Mark Zuckerberg — 45-minute video was meant to demonstrate v12 of Tesla’s Full Self-Driving but ended up being a list of thi…

Elon Musk posted a 45-minute live demonstration of v12 of Tesla’s Full Self-Driving feature. During the video, Musk has to take control of the vehicle after it nearly runs a red light. He also doxxes Mark Zuckerberg.

  • Thorny_Thicket@sopuli.xyz · 1 year ago

    That depends on what you value.

    If you want self-driving cars that follow traffic rules to the letter, even if that means more people are going to die, then that’s fine. I don’t agree, but I can see why someone would think that. Personally, I would prioritize human life, so if it turns out this is one of the cases where bending the rules does in fact lead to fewer accidents, then that’s what I’m voting for.

    I’m not claiming either is true. I’m just asking you to consider that the right thing to do is not always intuitive.

      • Thorny_Thicket@sopuli.xyz · 1 year ago · edited

        That third option is the first option in my view.

        For the sake of argument, let’s imagine that most people drive 10 km/h over the speed limit on highways, and that statistically a significant number of accidents happen when people are overtaking someone driving slower.

        Now, by driving faster, these dangerous overtakes happen far less often, which results in an overall increase in safety, but it’s also against the rules. So how does your “third option” solve this issue?
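
        To make that trade-off concrete, here’s a rough back-of-the-envelope sketch in Python. Every number in it is an invented placeholder chosen purely to illustrate the hypothetical, not a real crash statistic:

            # Toy model of the hypothetical above; all figures are made up.
            OVERTAKES_IF_AT_LIMIT = 20      # overtakes others make per 100 km around a slower car
            OVERTAKES_IF_AT_FLOW = 2        # overtakes per 100 km when moving with traffic
            P_ACCIDENT_PER_OVERTAKE = 1e-4  # assumed risk added by each overtake
            BASE_RISK_PER_100KM = 5e-4      # assumed background accident risk
            SPEED_RISK_FACTOR = 1.1         # assumed extra risk from driving +10 km/h

            def expected_accidents(overtakes: int, speed_factor: float) -> float:
                """Expected accidents per 100 km under one driving policy."""
                return BASE_RISK_PER_100KM * speed_factor + overtakes * P_ACCIDENT_PER_OVERTAKE

            at_limit = expected_accidents(OVERTAKES_IF_AT_LIMIT, 1.0)
            at_flow = expected_accidents(OVERTAKES_IF_AT_FLOW, SPEED_RISK_FACTOR)
            print(f"follow the limit: {at_limit:.2e} expected accidents per 100 km")
            print(f"match the flow:   {at_flow:.2e} expected accidents per 100 km")
            # With these particular numbers the rule-breaking policy comes out
            # safer; change the assumptions and the conclusion flips with them.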

    • batmaniam@lemmy.world · 1 year ago

      When someone is driving, if they misjudge and bend the rules at the wrong time and kill someone, they go to court. They can potentially be convicted of all sorts of things.

      Who’s going to court when a car does it? Who serves the jail time?

      • Thorny_Thicket@sopuli.xyz · 1 year ago

        With the current systems, the driver, obviously. These systems are not yet advanced enough to be blindly relied on.

        • batmaniam@lemmy.world · 1 year ago

          I should have been more clear: I meant an AI trained to break the rules the way we’re talking about. Having the ability to make a judgement also means responsibility for that judgement. If I cross a double yellow to get around farm equipment on a back country road, and I misjudge and kill someone, it’s on me. It doesn’t matter if 999 times out of 1000 I could have broken the rules responsibly.

          So who goes to jail when a car does it?

          • Thorny_Thicket@sopuli.xyz · 1 year ago

            Well, it’s an ongoing discussion with no definite answer, but here’s how I see it:

            Suppose a car manufacturer comes up with a self-driving vehicle that is proven to be, let’s say, 3 times better than a skilled human driver. It is then objectively true to say that everyone would be safer in one of these cars. You could even argue it’s the responsible thing to do, especially compared to driving by yourself, right?

            Well, maybe as a society we don’t prohibit people from driving, but you must then acknowledge that if you cause an accident, you will also suffer the consequences. However, even these self-driving vehicles aren’t foolproof. Despite being 3 times safer, they will still end up in accidents. Who do we blame for those, then? That’s what I take you to be asking.

            No one, really, I guess. Assigning blame might not be the most productive thing to do; it could be more reasonable to think of these accidents as a collective risk that users willingly accept when using these products. You’re already accepting that risk now, so taking a risk three times smaller shouldn’t be an issue. It’s also conceivable that the vehicle manufacturer pays some compensation to the victim or their family, not because it’s their fault per se, but because they can afford it and it seems like the fair thing to do.
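
            The arithmetic behind “accepting a three times smaller risk” is simple to sketch. Here’s a minimal Python illustration; the crash rate and distance are assumed placeholders, not real statistics:

                # "3x safer" in per-year terms; all rates are illustrative assumptions.
                HUMAN_CRASH_RATE = 1 / 500_000   # assumed crashes per km for a human driver
                SAFETY_FACTOR = 3                # "3 times better than a skilled human"
                AV_CRASH_RATE = HUMAN_CRASH_RATE / SAFETY_FACTOR
                KM_PER_YEAR = 15_000             # assumed annual distance driven

                def yearly_crash_probability(rate_per_km: float, km: int) -> float:
                    """P(at least one crash in a year), treating each km as independent."""
                    return 1 - (1 - rate_per_km) ** km

                human = yearly_crash_probability(HUMAN_CRASH_RATE, KM_PER_YEAR)
                av = yearly_crash_probability(AV_CRASH_RATE, KM_PER_YEAR)
                print(f"human driver:    {human:.2%} chance of a crash this year")
                print(f"autonomous (3x): {av:.2%} chance of a crash this year")
                # The autonomous risk is smaller but never zero, which is exactly
                # why the blame question above still has to be answered.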

            • batmaniam@lemmy.world · 1 year ago

              Fun conversation.

              I don’t think the statistics resolve the issue, though. At the end of the day, you can’t give something agency without accountability. I guess it’s similar to a well-behaved dog at a park that loses it and eats an old man or something. The statistics only matter so much: the owner introduced an unpredictable element with its own agency, and since you can’t hold a dog accountable, the owner inherits that responsibility.

              When I drive, I do accept a risk, but I do so knowing there is a set of rules everyone is following to minimize that risk, and that there’s accountability should someone choose not to follow them. I guess what I’m saying is that an autonomous vehicle reducing my risk by 3x, 100x, or 1000x doesn’t change the accountability for a single instance in which it got it wrong. Not when we’re talking about it knowingly and intentionally violating established traffic laws. That’s like saying a highly trained race car driver gets off the hook for hitting someone while driving way too fast in public because, statistically, they’re much less of a risk to the public than most drivers.

              This is all assuming, by the way, that we’re talking about a well-tested, well-understood system. I think having vehicles on the road right now that are advertised as “full self-driving” when there are known issues makes a whole group of people directly responsible for any deaths that occur.

              • Thorny_Thicket@sopuli.xyz · 1 year ago

                Moral questions about autonomous vehicles are an interesting subject, and there are a lot of difficult questions like this that we have to come up with answers to. For example, there’s the issue of whether, in the case of an unavoidable accident, the car should prioritize the lives of its passengers over everyone else’s, meaning that given the choice it would rather drive over a pedestrian than hit a brick wall. A human doesn’t have time to think about this and react in time, but an AI does.
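
                One way to see why the question is unavoidable: any such priority has to be written down explicitly, ahead of time. A deliberately oversimplified Python sketch, where the options, harm estimates, and weight are all hypothetical; choosing the weight is the ethical question itself:

                    from dataclasses import dataclass

                    @dataclass
                    class Option:
                        name: str
                        passenger_harm: float  # expected harm to occupants (0..1, assumed)
                        bystander_harm: float  # expected harm to others (0..1, assumed)

                    # How much the occupants' safety counts relative to everyone else's.
                    # 1.0 treats all lives equally; values above 1 favour the passengers.
                    PASSENGER_WEIGHT = 1.0

                    def cost(o: Option) -> float:
                        return PASSENGER_WEIGHT * o.passenger_harm + o.bystander_harm

                    options = [
                        Option("swerve into wall", passenger_harm=0.8, bystander_harm=0.0),
                        Option("hit pedestrian", passenger_harm=0.1, bystander_harm=0.9),
                    ]
                    choice = min(options, key=cost)
                    print(f"weight {PASSENGER_WEIGHT}: choose '{choice.name}'")
                    # With equal weighting the car takes the wall (0.8 < 1.0); raise
                    # PASSENGER_WEIGHT above ~1.3 and the decision flips.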