There are uses of AI that are proving to be more than black and white. While voice actors have protested their performances being fed into AI against their will, we are now seeing an example of this being done, with permission, in a singularly unusual case.

  • theneverfox@pawb.social · 1 year ago

    Why not?

    Your likeness is basically IP: if it’s worth anything you can put it in your estate, if it’s worth a lot you can set up a trust to manage it, and I’m sure there’s some sort of legal shenanigans you can do to make it thorny to use.

    I mean, you’re dead. If your family sucks and you’re worried they’ll use your voice or face for something evil, you could make it public domain to trash the value. If you care about your legacy, well… look upon my works and despair, and all that. You can burn your estate to protect it for a lifetime or two, or set up a trust that funds itself by selling use of the license according to certain standards… Eventually it’ll either warp into something very different from your body of work (for better or worse), or you’ll fade into obscurity before the lawyer money runs out - so it’ll just stop.

    A lot of people say “AI is bad” when what they really mean is “AI is powerful; corporations are bad; I don’t want the evil artificial intelligence made by lawyers to misuse the artificial intelligence made by math and human media”

    And kind of like AI, corporations are a tool. They suffer an alignment problem far worse than AI’s, so trusting them with digital technology like networking has sometimes been disastrous, sometimes quite good, but mostly neutral.

    This use of AI to recreate a dead person’s likeness isn’t good or bad… it’s just neutral. There’s no greater issue here than the media industry getting alternatives to human talent - the people are dead, some legacies might corrode faster, but there’s no legal hack or big moral peril here.

    There are people who lived in the small window when recording and storage quality became good enough for this tech, who died before it was inevitable, but who are still recognizable before it “disrupts” media entirely in a year.

    Within another year, consumer-grade abilities will go from the current “uncannily similar voices from a short sample” to “indistinguishable from the original voice”… We’re very close to the point where the likeness debate becomes moot, because hobbyists will be able to deepfake 4K video for shitposts.

    • Mongostein · 1 year ago

      Because, as you said, some people don’t get along with their families and it could be used maliciously.

      I suppose that could be solved in a will, though.