A judge in Washington state has blocked video evidence that’s been “AI-enhanced” from being submitted in a triple murder trial. And that’s a good thing, given that too many people seem to think applying an AI filter gives them access to secret visual data that was never actually recorded.

  • Richard@lemmy.world · 9 months ago

    That’s wrong. With some degree of certainty, you will always be able to say that the data was likely there. And because it all comes down to probabilities, you can expect certain interpolations to be accurate reconstructions of the missing data. We do it all the time with resolution upscaling, for example. But of course, beyond a certain loss of information, the predictions become less and less reliable.
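
A minimal sketch of that trade-off, using plain NumPy and simple linear interpolation rather than any particular AI upscaler (the signal and the factors are made up for illustration): the reconstruction is built entirely from the samples that survived, and the average error grows as the upscaling factor grows.

```python
# Toy illustration (NumPy only; no specific "AI enhancement" product is meant):
# upscaling reconstructs samples that were never recorded, and the average
# reconstruction error grows as the upscaling factor grows.
import numpy as np

def linear_upscale(samples: np.ndarray, factor: int) -> np.ndarray:
    """Guess the missing points by straight-line interpolation between kept ones."""
    coarse_x = np.arange(samples.size)
    fine_x = np.linspace(0, samples.size - 1, (samples.size - 1) * factor + 1)
    return np.interp(fine_x, coarse_x, samples)

# Ground truth: a finely sampled 1-D signal standing in for image detail.
truth = np.sin(np.linspace(0, 8 * np.pi, 961)) ** 2

for factor in (2, 4, 8, 16):
    coarse = truth[::factor]                # what the low-res recording actually kept
    guess = linear_upscale(coarse, factor)  # what the upscaler fills back in
    err = np.abs(guess - truth).mean()      # how far the guess is from reality
    print(f"x{factor:>2} upscale: mean reconstruction error = {err:.4f}")
```

The exact numbers depend on the signal, but the direction is the point: the less of the original data survives, the more the “enhanced” result is prediction rather than recording.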

    • Hacksaw · 9 months ago

      It’s probability, like you said. When you upscale, you have to decide whether a missing pixel is a smooth interpolation between its neighboring pixels or a sharp transition. That decision will always be a guess with a real chance of error. For TV or gaming that’s usually fine: upscaled video looks nice, and most of the time the errors are small and not noticeable.

      For police purposes, if the difference wasn’t noticeable, they wouldn’t need upscaling in the first place. They’re trying to zoom in further than the video data will go. Nobody should face judicial consequences because an algorithm guessed at the interpolation between pixels and now it looks like your face instead of another person’s.
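
As a concrete illustration of that ambiguity, here is a toy 1-D example in NumPy (a hypothetical “pixel row”, not any real enhancement pipeline): a gradual shading change and a hard edge can leave exactly the same low-resolution samples behind, so the upscaler’s choice between them is a guess, and whichever scene it didn’t pick gets reconstructed wrong.

```python
# Toy illustration (NumPy only, hypothetical 1-D "pixel row"): a gradual ramp and
# a hard edge leave identical low-resolution samples, so no upscaler can recover
# which one was actually in front of the camera.
import numpy as np

fine = np.arange(9)                                     # 9 true fine-grained pixels
smooth_truth = np.linspace(0.0, 1.0, 9)                 # scene A: gradual shading
sharp_truth = np.array([0, 0, 0, 0, .5, 1, 1, 1, 1.])   # scene B: sharp edge

kept = fine[::4]                                        # the sensor only keeps pixels 0, 4, 8
assert np.allclose(smooth_truth[kept], sharp_truth[kept])  # both scenes record identically

# One common guess: smooth (linear) interpolation between the kept pixels.
guess = np.interp(fine, kept, smooth_truth[kept])

print("recorded low-res samples :", smooth_truth[kept])
print("upscaler's reconstruction:", guess.round(3))
print("max error if scene was A :", np.abs(guess - smooth_truth).max())  # 0.0
print("max error if scene was B :", np.abs(guess - sharp_truth).max())   # 0.375
```

Both scenes are equally consistent with the recorded pixels; whichever one the algorithm picks, the other is reconstructed wrong, which is exactly why an “enhanced” face is a guess rather than evidence.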