• Szymon
    5 months ago

    You can train an AI voice model with just a single voice clip, and you can do it on your desktop. Microsoft doesn’t need to sell shit; you put that clip on TikTok yourself.

    • brrt@sh.itjust.works
      5 months ago

      You don’t even need to upload anything. They can call you, have a short convo and then just say “oh sorry wrong number” or something. You’d never know.

      • SomeGuy69@lemmy.world
        5 months ago

        Yup. You need like 5 to 15 seconds of talking, that’s it. I’ve done this myself to confirm it, and it actually works quite well.

    • unexposedhazard@discuss.tchncs.de
      5 months ago

      Well, they said they don’t share their voice anywhere; if that’s true, it would be concerning. I for one just don’t use any centralized, unencrypted services that could scrape my voice, but I would assume most people think that if they don’t publish anything, they are safe…

      • Overzeetop@lemmy.world
        5 months ago

        You don’t talk to anyone on the phone through a PBX? Never call your bank? Your doctor? Your credit card company? Any of your insurance companies? Even on private systems, all of those calls are recorded for legal reasons. And all of them will eventually be compromised.

        • unexposedhazard@discuss.tchncs.de
          5 months ago

          I make regular phone calls maybe twice a year; everything can be done by email or web forms in Germany. But generally the people who have access to all the phone lines are the feds of whichever country you are in. And they, unlike big tech, aren’t super interested in selling that data.

      • tiramichu@lemm.ee
        5 months ago

        The ‘old’ way of faking someone’s voice, like you saw in 90s spy movies, was to collect enough sample data to capture each possible speech sound someone could make, such that those sounds could be combined to form all possible words.

        With AI training you only need enough data to know what someone sounds like ‘in general’ to extrapolate a reasonable model.

        One possible source of voice data is spam-calls.

        You get a call, say “Hello?”, and then someone launches into trying to sell you insurance or some rubbish. You say “Sorry, I’m not interested, take me off your list please. Okay, bye” and hang up.

        And that is already enough data to replicate your voice.

        When scammers make the call using your fake voice, they usually use a crappy quality line, or background noise, or other techniques to cover up any imperfections in the voice replica. And of course they make it really emotional, urgent and high-stakes to override your family member’s logical thinking.

        Educating your family to be prepared for this stuff is really important.

      • Szymon
        5 months ago

        Yeah I’m gonna go ahead and not give that knowledge out.

          • Szymon
            5 months ago

            Oh gee, someone on the Internet thinks I’ll say it if they tell me they think I’m bluffing. My ego is so hurt! I’d better spill the beans on these unregulated technologies!

              • Szymon
                5 months ago

                If they’re too stupid to figure it out, they’re too stupid to consider its implications and consequences when using it. I’m not going to give a weapon to a toddler for the same reason.

                • thanks_shakey_snake
                  5 months ago

                  I wouldn’t describe myself as “too stupid to figure it out,” more like “interested in hearing more about your contribution to the conversation.”

                  Bit rude to call me stupid for that, IMO.

                  • Szymon
                    5 months ago

                    Again, I’m not interested in teaching people how to fake shit. I’m already sick of fake shit, and it’s only just started.