Elon Musk’s FSD v12 demo includes a near miss at a red light and doxxing Mark Zuckerberg

Elon Musk posted a 45-minute live demonstration of v12 of Tesla’s Full Self-Driving feature. During the video, Musk has to take control of the vehicle after it nearly runs a red light. He also doxxes Mark Zuckerberg.

  • meseek #2982 · 10 months ago

    That’s what we were all clamoring for: a self-driving machine that operates like a mouth breather late for work.

    Elon is a masterclass of stupid.

    • EpsilonVonVehron@lemmy.world · 10 months ago

      Musk doesn’t care about laws. As mentioned in another article, he appears to be operating the phone by hand in the driver’s seat, which is both a driving violation and against Tesla’s own driver manual.

      • meseek #2982 · 10 months ago

        Same guy who parades around in his private jet calling everyone who doesn’t return to the office amoral and selfish.

        So yeah. All that tracks. The entire “it’s different because it’s me” stench wafting in.

    • morriscox@lemmy.world · 10 months ago

      WTH is wrong with mouth breathers? What ass grasped for some new insult and came up with that? It’s a lame, stupid insult.

    • Thorny_Thicket@sopuli.xyz · 10 months ago

      Perhaps you should put your hatred towards Elon aside for a while and objectively consider what actually is the better solution here.

      One could argue that strictly following the rules is the right approach, and perhaps it would be if everyone actually drove that way. In reality, though, that’s not usually the case. What truly increases traffic safety is predictability. If most drivers are rolling through stop signs and you’re the only one stopping completely, you might technically be in the right, but your behaviour could cause accidents through sheer unpredictability. The same applies to speeding: driving significantly slower than the surrounding traffic disrupts its flow, leading to unsafe overtaking and such. While you might be legally correct here too, in practice a slight increase in speed could lead to increased road safety.

      These are complex issues. A dose of humility might go a long way instead of acting like the answer is obvious.

      • Honytawk@lemmy.zip · 10 months ago

        What better predictability is there than actually following the law?

        Self-driving cars should be better than us, not just like us.

        • Thorny_Thicket@sopuli.xyz · 10 months ago

          Even if a self-driving car behaves like a human driver, it still exceeds humans a thousandfold in processing and reaction speed. For a truly advanced self-driving system, plowing through stop signs and speeding should be a non-issue, because unlike humans it can pay 100% attention to its surroundings 100% of the time and react instantly when needed.

      • SatanicNotMessianic@lemmy.ml · 10 months ago

        I do this kind of thing for a living, and have done so for going on 30 years. I study complex systems and how they use learning and adaptation.

        Musk’s approach to these systems is idiotic and shows no understanding of or appreciation for how complex systems - animals, in particular - actually work. He wanted to avoid giving his vehicles lidar, for instance, because animals can navigate the world without it. Yet he didn’t give them either the perceptual or cognitive capabilities that animals have, nor did he take into account that the problems evolution solved for animal locomotion are very different from the problems people solve when driving vehicles. It, of course, didn’t work, and now Tesla is trailing the pack on self-driving capabilities, with the big three German carmakers and others prepping Level 3 vehicles for shipping.

        If he is trying to ChatGPT his way out of the corner he’s painted himself into, he’s just going to make it worse - and, amusingly, for the same reasons. Vision is just one dimension of sensation, and cars are not people, or antelopes, or fish, or whatever his current analogy is.

        This is just Elon Eloning again. No one predicts that a car coming towards them is going to do a California stop at a stop sign. If I’m pulling into an intersection and I see someone rolling through a stop sign, I’m hitting the brakes, because obviously a) they didn’t see me and b) they don’t know the rules of the road. Elon’s cars have a problem with cross traffic and emergency vehicles as it is; making the logic fuzzier is not going to improve the situation. If he thinks throwing video and telemetry data at a large model is going to overcome his under-engineered autonomous system, I suspect he’s in for a rude discovery.

        If there’s anything kids today can learn from Elon (or from Trump, for that matter), it’s how to be so confidently wrong that people throw money at you. The problem is that if you’re not already born into wealth and privilege, you’re likely to merely become the owner of the most successful line of car dealerships in a suburban Pennsylvania county, or else end up in prison for fraud.

        • Thorny_Thicket@sopuli.xyz · 10 months ago

          If FSD is trained on billions of hours of video data, then by definition it drives like an average driver and is thus highly predictable.

          • SatanicNotMessianic@lemmy.ml · 10 months ago

            That’s not how it works, unfortunately. That’s how people want it to work, but it’s not how it works.

            This is just more of Elon’s pie in the sky.

            • Thorny_Thicket@sopuli.xyz · 10 months ago

              If you’ve done this kind of stuff for a living for the past 30 years, then I’m sure you can give me a better explanation than “that’s not how it works”.

      • meseek #2982 · 10 months ago

        The better solution is to not program your machine to act like a clown behind the wheel, committing all manner of traffic offences because ThAt’s HoW ReGulAr PeoPlE DrIve!

        We aren’t trying to make autopilot act like a real, bona fide driver; we’re just removing the inconvenience of having to do the driving.

        • Thorny_Thicket@sopuli.xyz · 10 months ago

          That depends on what you value.

          If you want self-driving cars that follow traffic rules to the letter even if that means more people are going to die, then that’s fine. I don’t agree, but I can see why someone would think that. Personally, I would prioritize human life, so if it turns out this is one of the cases where bending the rules does in fact lead to fewer accidents, then that’s what I’m voting for.

          I’m not claiming either is true. I’m just asking you to consider that the right thing to do is not always intuitive.

            • meseek #2982 · 10 months ago

            Oh we all know what Elon values 🤪

            Let’s pluck out this false-choice fallacy first off. I’m going to opt for c) I want self-driving cars that obey the rules of the road “to the letter” and keep people safe. If not, why do they even make traffic rules?

            You and Elon want the cool self-driving car that cruises 60 in a 50 with traffic and occasionally doesn’t check its blind spot but quickly recovers and gives a quick wave like, sorry bro, my bad.

            I mean, okay, Jerry.

            • Thorny_Thicket@sopuli.xyz · 10 months ago

              That third option is the first option in my view.

              For the sake of argument, let’s imagine that most people drive 10 km/h over the speed limit on highways, and that statistically a significant number of accidents happen when people overtake someone driving slower.

              Now, driving faster means these dangerous overtakes happen far less often, resulting in an overall increase in safety - but it’s also against the rules. So how does your “third option” solve this?

            • batmaniam@lemmy.world · 10 months ago

            When someone is driving, if they misjudge and bend the rules at the wrong time and kill someone, they go to court. They can potentially be convicted of all sorts of things.

            Who’s going to court when a car does it? Who serves the jail time?

              • Thorny_Thicket@sopuli.xyz · 10 months ago

              With the current systems, the driver, obviously. These systems are not yet advanced enough to be blindly relied on.

              • batmaniam@lemmy.world · 10 months ago

                I should have been more clear: I meant an AI trained to break the rules the way we’re talking about. Having the ability to make a judgement also means bearing responsibility for that judgement. If I cross a double yellow to get around farm equipment on a back-country road, and I misjudge and kill someone, it’s on me. It doesn’t matter if 999 times out of 1000 I could have broken the rules responsibly.

                So who goes to jail when a car does it?

                • Thorny_Thicket@sopuli.xyz · 10 months ago

                  Well, it’s an ongoing discussion with no definite answer, but here’s how I see it:

                  Let’s say a car manufacturer comes up with a self-driving vehicle that is proven to be, say, three times safer than a skilled human driver. It is then objectively true that everyone would be safer in one of these cars. You could even argue it’s the responsible thing to do, especially compared to driving yourself, right?

                  Well, maybe as a society we don’t prohibit people from driving, but you must then acknowledge that if you cause an accident, you also suffer the consequences. However, even these self-driving vehicles aren’t foolproof. Despite being three times safer, they will still end up in accidents. Who do we blame then? That’s what I take you to be asking?

                  No one, really, I guess. Assigning blame might not be the most productive thing to do; it may be more reasonable to think of these accidents as a collective risk that users willingly accept when using these products. You’re already accepting that risk now, so taking on a risk three times smaller shouldn’t be an issue. Perhaps the manufacturer pays some compensation to the victim or their family too - not because it’s their fault per se, but because they can afford it and it seems like the fair thing to do.

                    • batmaniam@lemmy.world · 10 months ago

                    Fun conversation.

                    I don’t think the statistics resolve the issue, though. At the end of the day, you can’t give something agency without accountability. I guess it’s similar to a well-behaved dog at a park that loses it and mauls an old man or something. The statistics only matter so much: the owner introduced an unpredictable element with its own agency, and since you can’t hold a dog accountable, the owner inherits that responsibility.

                    When I drive, I do accept a risk, but I do so knowing there is a set of rules everyone is following to minimize that risk, and that there’s accountability should someone choose not to follow them. I guess what I’m saying is that an autonomous vehicle reducing my risk by 3x, 100x, or 1000x doesn’t change the accountability for the single instance in which it got it wrong - not when we’re talking about it knowingly and intentionally violating established traffic laws. That’s like saying a highly trained race car driver gets off the hook for hitting someone while driving way too fast in public because, statistically, they’re actually much less of a risk to the public than most drivers.

                    This is all assuming, by the way, that we’re talking about a well-tested, well-understood system. I think having vehicles on the road right now that are advertised as “full self-driving”, when there are known issues, makes a whole group of people directly responsible for any deaths that occur.

      • OnionQuest@lemmy.ml · 10 months ago

        It’s simply solved by the fact that I, as a human driver, can now recognize when a robo-taxi is driving and adjust my expectations of the car’s behavior. Right now it’s clearly evident what an autonomous car looks like, and a reasonable person will expect it to follow the letter of the law.

        I interact with these vehicles on a daily basis in San Francisco and it would be weird if they weren’t driving perfectly.

      • NotYourSocialWorker@feddit.nu · 10 months ago

        If most drivers are rolling through stop signs and you’re the only one stopping completely, you might technically be in the right, but your behaviour could cause accidents through sheer unpredictability.

        Simply, no. If you as a driver aren’t prepared for the car in front of you to actually stop when there’s a sign that says stop, and if you aren’t keeping enough distance to be able to brake, then it isn’t the car in front that is the problem or the one causing the accident; it’s you and only you.

        The same applies to speeding: driving significantly slower than the surrounding traffic disrupts its flow, leading to unsafe overtaking and such.

        Again, no. If they are driving at the signed speed, keeping that speed, and driving predictably, then the ones driving “significantly” faster are the ones decreasing road safety. No one is forcing them to perform “unsafe overtaking and such”. Also, just because you, from your vantage point, can’t see a reason for the car in front of you to be driving slowly doesn’t mean there isn’t one.

        While a dose of humility is good, a dose of personal responsibility is also great.

        • Thorny_Thicket@sopuli.xyz · 10 months ago

          then the ones driving “significantly” faster are the ones decreasing road safety. No one is forcing them to perform “unsafe overtaking and such”.

          I’m not claiming it is so, but it’s conceivable that if an autonomous vehicle drives slightly over the speed limit, with the flow of traffic, it may actually lead to a statistically significant drop in accidents compared to the scenario where it follows the limit exactly. Yes, no one is forcing other drivers to behave that way, but they do, and because of that, people die. In that case, forcing self-driving cars to follow traffic rules to the letter would paradoxically mean choosing to kill and injure more people.

          I don’t think the answer to this kind of moral question is obvious. Traffic is such a complex system, and there are probably many other examples where the actually safer thing to do is not what you’d intuitively think.

      • trashgirlfriend@lemmy.world · 10 months ago

        The answer is clear and easy.

        Don’t let computers have full control over freely moving, several-ton death machines.

        • The King@lemmy.world · 10 months ago

          This is such a cop-out. “No computers!” But it’s okay to let someone drive who isn’t paying attention because they’re deep in their phone? I drive a motorcycle, and I’ve had people stare me straight in the eye, only to pull out in front of me and nearly kill me.

          People are notoriously bad at driving. The computer doesn’t have to be perfect, just better than the soccer moms or distracted dummies.

        • Honytawk@lemmy.zip · 10 months ago

          After a while, the human will be the bottleneck in preventing accidents.

          Computers are a lot better at following the law.