During covid times I heard many interesting conspiracy predictions, such as: the value of money will fall to zero, the whole of society will collapse, the vaccine will kill 99% of the population, etc. None of those things have happened yet, but can you add some other predictions to the list?

Actually, long before covid hit, there were all sorts of predictions floating around. You know, things like the 2008 recession will cause the whole economy to collapse and then we’ll go straight to a Mad Max-style post-apocalyptic nightmare, or that 9/11 was supposed to start WW3. I can’t even remember all the predictions I’ve heard over the years, but I’m sure you can help me out. Oh, I just remembered that someone said paper and metal money would disappear completely by year xyz. At the time that date was like only a few years away, but now it’s more like 10 years ago or something. Still waiting for that one to come true…

  • corsicanguppy · 1 year ago

    It doesn’t have to be perfect. It just has to be better than humans.

    And it is.

    • tagliatelle@lemmy.world · 1 year ago

      Who’s liable when it crashes? And it’s “better” than human drivers only in very limited situations, with a human driver behind the wheel to take control.

      • shrugal@lemm.ee · edited · 1 year ago

        I’d say if the human is supposed to observe and take control, then the human is liable, unless something about the autopilot made it impossible to intervene (e.g. no time to react). If it’s a completely autonomous autopilot, then ofc the manufacturer is liable; who else could it be?! But autopilots would probably have to pass some safety tests before being allowed on the road, and you’d have to prove negligence or malicious intent by the manufacturer (e.g. faking test results). This would be similar to things like medicine, where the manufacturer just can’t guarantee 100% safety.

        Regarding “better”, afaik it’s on average. So if you let 1000 humans and 1000 autopilots drive 1000 miles each, the autopilots will produce fewer accidents overall. Idk if autopilots get better or worse by allowing human intervention; a human could also take control at the wrong moment, after all.
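        To put rough numbers on that “on average” idea, here’s a minimal sketch of comparing accident rates per mile between the two groups. The accident counts are invented for illustration, not real crash statistics.

        ```python
        # Hypothetical illustration only: the accident counts below are made up,
        # not real data about human or autopilot drivers.

        def accidents_per_million_miles(accidents: int, miles: float) -> float:
            """Normalize an accident count to a rate per one million miles driven."""
            return accidents / miles * 1_000_000

        # 1000 drivers x 1000 miles each = 1,000,000 miles per group.
        human_rate = accidents_per_million_miles(accidents=4, miles=1_000_000)
        autopilot_rate = accidents_per_million_miles(accidents=2, miles=1_000_000)

        print(f"humans:     {human_rate:.2f} accidents / million miles")
        print(f"autopilots: {autopilot_rate:.2f} accidents / million miles")
        print("autopilot better on average:", autopilot_rate < human_rate)
        ```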

      • yiliu@informis.land · 1 year ago

        Who’s liable when it crashes?

        This is potentially the killer app of self-driving. If it gets safe enough, the company offering self-driving cars can take responsibility for insurance (so long as you use the self-driving feature).