The White House wants to ‘cryptographically verify’ videos of Joe Biden so viewers don’t mistake them for AI deepfakes

Biden’s AI advisor Ben Buchanan said a method of clearly verifying White House releases is “in the works.”

  • DrCake@lemmy.world · +50/−3 · 10 months ago

    Yeah, good luck getting the general public to understand what “cryptographically verified” videos mean.

    • patatahooligan@lemmy.world · +22/−1 · 10 months ago

      The general public doesn’t have to understand anything about how it works as long as they get a clear “verified by …” statement in the UI.

      • kandoh@reddthat.com · +4 · 10 months ago

        The problem is that even if you reveal the video as fake, the feeling it reinforces in the viewer stays with them.

        “Sure, that was fake, but the fact that it seems believable tells you everything you need to know.”

        • go_go_gadget@lemmy.world · +4/−1 · edited 10 months ago

          “Herd immunity” comes into play here. If those people keep getting dismissed by most other people because the video isn’t signed, they’ll give up and follow the crowd. Culture is incredibly powerful.

    • BradleyUffner@lemmy.world · +18/−1 · 10 months ago

      It could work the same way the padlock icon worked for SSL sites in browsers back in the day. The video player checks the signature and displays the trusted icon.
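The padlock analogy boils down to a sign-then-verify flow. A minimal sketch with deliberately tiny textbook-RSA numbers (illustrative only; a real deployment would use Ed25519 or ECDSA through a vetted library, and the key values and message here are made up for the example):

```python
import hashlib

# Toy RSA key pair: n = 61 * 53, e = 17, d = e^-1 mod phi(n).
# Insecure on purpose -- real keys are thousands of bits.
N, E, D = 3233, 17, 2753

def digest(data: bytes) -> int:
    # Reduce the SHA-256 digest into the tiny modulus.
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % N

def sign(data: bytes) -> int:
    # The publisher signs with the private exponent d.
    return pow(digest(data), D, N)

def verify(data: bytes, sig: int) -> bool:
    # The video player checks with the public exponent e.
    return pow(sig, E, N) == digest(data)

video = b"official press briefing footage"
sig = sign(video)
print(verify(video, sig))            # True -> show the trusted icon
print(verify(video, (sig + 1) % N))  # False -> tampered signature, warn the viewer
```

The UI question is then exactly the padlock question: the player turns a cryptographic check into an icon the viewer can understand.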

    • Funderpants · +17/−3 · 10 months ago

      Democrats will want cryptographically verified videos; Republicans will be happy with a stamp that has Trump’s face on it.

      • wizardbeard@lemmy.dbzer0.com · +4/−2 · edited 10 months ago

        I mean, how is anyone going to cryptographically verify a video? Either you put an icon in the video itself or the site displays one near it, which means nothing, since fakers will just copy it into theirs. Alternatively, you have to sign or publish file hashes for each permutation of the video file sent out. At that point, how are normal people actually going to verify anything? At best they’re trusting the video player of whatever site they’re on to be truthful when it says a video is verified.

        Saying they want to do this is one thing, but as far as I’m aware, we don’t have a solution that accounts for the rampant re-use of presidential videos in news and secondary reporting either.

        I have a terrible feeling that this would just be wasted effort beyond basic signing of the video file uploaded on the official government website, which really doesn’t solve the problem for anyone who can’t or won’t verify the hash on their end.
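The “verify the hash on their end” step is trivial in code; as the comment says, the problem is that almost nobody does it, and any re-encode by a news outlet changes every byte. A sketch, with hypothetical function names and the published digest assumed to come from the official site:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    # Stream the file so large videos don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            h.update(block)
    return h.hexdigest()

def matches_published(path: str, published_hex: str) -> bool:
    # Compare against a digest published alongside the official release.
    return sha256_of(path) == published_hex.lower()
```

This vouches only for the exact released file, which is why it does nothing for clips reused in secondary reporting.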


        Maybe some sort of visual- and audio-based hash, like MusicBrainz IDs for songs, which depend not on the file itself but on the sound of it. Then the government runs a server, kind of like a PGP key server, and websites could integrate functionality to verify against it. But at the end of the day it still works out to an “I swear we’re legit, guys” stamp for anyone not technical enough to verify independently themselves.
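A toy version of that content-based hash idea: fingerprint the coarse shape of the audio rather than the file bytes, so identical-sounding re-encodes hash the same. Real systems (e.g. the Chromaprint fingerprints behind AcoustID/MusicBrainz) are far more robust; this only shows the principle, and the sample data is synthetic:

```python
import hashlib

def fingerprint(samples: list[float], buckets: int = 32) -> str:
    # Average the signal into coarse buckets, then keep only whether each
    # bucket is louder than the previous one. Small re-encoding noise
    # usually leaves this up/down pattern intact.
    size = max(1, len(samples) // buckets)
    means = [sum(samples[i:i + size]) / size
             for i in range(0, size * buckets, size)]
    bits = "".join("1" if b > a else "0" for a, b in zip(means, means[1:]))
    return hashlib.sha256(bits.encode()).hexdigest()[:16]

clean = [float(i % 100) for i in range(8000)]
noisy = [s + 0.01 for s in clean]                # tiny re-encoding artifact
print(fingerprint(clean) == fingerprint(noisy))  # True: same up/down shape
```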


        I guess your post just seemed silly when the end result of this, for anyone, is effectively the equivalent of your “signed by Trump” image, unless the public magically gets serious about downloading and verifying everything themselves independently.

        Fuck Trump, but there are much better ways to shit on King Cheeto than pretending the average populace is anything but average based purely on political alignment.

        You have to realize that to the average user, any site serving videos seems as trustworthy as youtube. Average internet literacy is absolutely fucking abysmal.

        • technojamin@lemmy.world · +4 · 10 months ago

          People aren’t going to do it themselves; the platforms that 95% of people use (Facebook, TikTok, YouTube, Instagram) will have to add the functionality to their video players/posts. That’s the only way anything like this could be implemented by the 2024 US election.

        • beefontoast@lemmy.world · +2 · 10 months ago

          In the end, people will realise they cannot trust any media served to them. But it’s just going to take time for people to realise... and while they are still blindly consuming it, they will be taken advantage of.

          If it goes down this road... social media could be completely undermined. It could become the downfall of these platforms and do everyone a favour by giving people their lives back after years of endless doomscrolling.

        • Strykker@programming.dev · +1 · 10 months ago

          Do it basically the same way TLS verification works. Sure, the browsers would have to add something to the UI to support it, but claiming you can’t trust that is dumb, because we already rely on it to trust that the site you’re on is your bank and not some scammer.

          Sure, not everyone is going to care to check, but the check being there allows people who do care to reply saying the video is fake because of X.
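For reference, the two checks the TLS analogy leans on are chain validation against a trust store and hostname matching, both enabled by default in Python’s ssl module. A signed-video scheme would need an equivalent trust store of publisher keys:

```python
import ssl

# The browsers' padlock machinery: reject untrusted certificate chains
# and require the certificate to match the host you asked for.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```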

    • makeasnek@lemmy.ml · +5 · 10 months ago

      “Not everybody will use it and it’s not 100% perfect so let’s not try”

      • NateNate60@lemmy.world · +1 · 10 months ago

        That’s not the point. It’s that malicious actors could easily exploit that lack of knowledge to trick users into giving fake videos more credibility.

        If I were a malicious actor, I’d put the words “✅ Verified cryptographically by the White House” at the bottom of my posts and you can probably understand that the people most vulnerable to misinformation would probably believe it.

    • maynarkh@feddit.nl · +7/−7 · 10 months ago

      Just make it a law that if as a social media company you allow unverified videos to be posted, you don’t get safe harbour protections from libel suits for that. It would clear right up. As long as the source of trust is independent of the government or even big business, it would work and be trustworthy.

      • General_Effort@lemmy.world · +16/−1 · 10 months ago

        Back in the day, many rulers allowed only licensed individuals to operate printing presses. It was sometimes even required that an official read and sign off on any text before it was allowed to be printed.

        Freedom of the press originally meant exactly that this is not done.

        • Funderpants · +7/−1 · 10 months ago

          Jesus, how did I get so old only to just now understand that press is not journalism, but literally the printing press in ‘Freedom of the press’.

        • vithigar · +3/−2 · edited 10 months ago

          You understand that there is a difference between being not permitted to produce/distribute material and being accountable for libel, yes?

          “Freedom of the press” doesn’t mean they should be able to print damaging falsehood without repercussion.

          • General_Effort@lemmy.world · +10 · 10 months ago

            What makes the original comment legally problematic (IMHO), is that it is expected and intended to have a chilling effect pre-publication. Effectively, it would end internet anonymity.

            It’s not necessarily unconstitutional. I would have made the argument if I thought so. The point is rather that history teaches us that close control of publications is a terrible mistake.

            The original comment wants to make sure that there is always someone who can be sued/punished, with obvious consequences for regime critics, whistleblowers, and the like.

            • Dark Arc@social.packetloss.gg · +1 · edited 10 months ago

              We need to take history into account but I think we’d be foolish to not acknowledge the world has indeed changed.

              Freedom of the press never meant that any old person could just spawn a million print shops and peddle whatever they wanted. At best the rich could, and nobody stayed anonymous for long at that kind of scale.

              Personally I’m for publishing via proxy (i.e. an anonymous tip that a known publisher/person is responsible for) … I’m not crazy about “anybody can write anything on any political topic and nobody can hold them accountable offline.”

            • vithigar · +1/−3 · 10 months ago

              So your suggestion is that libel, defamation, harassment, et al are just automatically dismissed when using online anonymous platforms? We can’t hold the platform responsible, and we can’t identify the actual offender, so whoops, no culpability?

              I strongly disagree.

                • vithigar · +1 · 10 months ago

                  I am not. And if that’s not what’s implied by their comments then I legitimately have no idea what they’re suggesting and would appreciate an explanation.

      • bionicjoey · +3 · 10 months ago

        As long as the source of trust is independent of the government or even big business, it would work and be trustworthy

        That sounds like wishful thinking