• Showroom7561
    3 months ago

    Perhaps we’re misunderstanding something.

    It’s a fact that plenty of devices have assistant software running 24/7 with an open mic. We can agree that the key phrase is detected locally by a dedicated low-power chip or similar hardware.

    I’m saying that these virtual assistants are capturing and saving recordings even when no explicit command has been given. Those recordings can then be used to further profile a user.

    Mozilla even says that Amazon claims it can delete recordings, but will continue to use the data it collected from those recordings regardless. This is a problem, IMO, and it could certainly explain many of the coincidences people are witnessing.

    “There has been zero proof about illegal recording, even though it would be easy to find.”

    Except that Amazon has had to pay a $25 million settlement for keeping kids’ recordings.

    And the State of Texas has sued Google for illegally collecting voice data.

    California has also certified several class-action lawsuits against Google for illegally recording and using conversations without consent.

    Or that Apple was caught secretly recording voice conversations even when the user opted out. Apple claimed this was a “bug”. LOL

    There are so many cases like this that we know of. I can’t imagine how many of these privacy nightmares we haven’t been made aware of.

    • Vlyn@lemmy.zip
      3 months ago

      One is a bug, one is a lawsuit that went nowhere, and one is just an accusation (Google did pay a fine, but for geolocation tracking, not voice). The Amazon one is pretty bad, but again, it’s not about a phone!!!

      Yes, if your phone assistant accidentally activates, your voice might be uploaded without you knowing. That’s a fact. But you agreed to that by enabling the voice assistant (it even warns you about this).

      If you switch your voice assistant off (I have) then you don’t have this issue. What is so difficult to understand here?

      The low-power chips really just listen for a few syllables, so they can easily produce false positives. That’s just a technical aspect of it.
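      To illustrate the false-positive point, here is a toy sketch (not any vendor’s actual algorithm) of threshold-based wake-word spotting: the detector only scores how closely incoming syllables match a fixed template, so a similar-sounding phrase can cross the threshold too. The template, the scoring, and the threshold value are all invented for illustration.

```python
# Toy sketch of threshold-based wake-word spotting (illustrative only).
# A real chip scores acoustic features, not text tokens; the idea is the same:
# anything scoring above the threshold fires, including near-misses.

def match_score(template, window):
    """Fraction of syllable slots in the window that match the template."""
    return sum(t == w for t, w in zip(template, window)) / len(template)

def detect(template, syllables, threshold=0.75):
    """Slide over the syllable stream; fire when any window crosses the threshold."""
    n = len(template)
    return any(
        match_score(template, syllables[i:i + n]) >= threshold
        for i in range(len(syllables) - n + 1)
    )

TEMPLATE = ["oh", "kay", "goo", "gle"]

print(detect(TEMPLATE, ["so", "oh", "kay", "goo", "gle", "then"]))  # True: real trigger
print(detect(TEMPLATE, ["oh", "kay", "coo", "gle"]))                # True: a false positive (3/4 match)
print(detect(TEMPLATE, ["hello", "there", "friend"]))               # False
```

      Lowering the threshold catches sloppier pronunciations but admits more false positives, which is exactly the trade-off these chips have to make.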

      • LordKitsuna@lemmy.world
        3 months ago

        So, just a thought: if they’re looking for highly optimized keywords that can be detected locally, what’s to stop them from adding common advertising keywords?

        In the given anecdote about babies and diapers, you would literally just need a “baby” keyword. It gets triggered, the phone tells the mothership it heard about babies, and suddenly: diaper ads. It wasn’t listening to every single word, it wasn’t parsing the sentence, it was just looking for highly optimized ad keywords. You could even set a threshold for how often certain ad keywords are triggered, to avoid acting on false positives.
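        The threshold idea in this comment could be sketched like this (purely hypothetical — there is no evidence any vendor does this; the keyword list, class name, and cutoff are all invented):

```python
# Hypothetical sketch of the commenter's idea: count local ad-keyword hits
# and only report a keyword upstream once it has fired often enough to
# rule out one-off false positives. Illustrative only.
from collections import Counter

REPORT_THRESHOLD = 3  # assumed cutoff before a keyword is "worth" reporting

class KeywordProfiler:
    def __init__(self, keywords):
        self.keywords = set(keywords)
        self.counts = Counter()

    def hear(self, word):
        """Called whenever the low-power detector fires on a candidate keyword."""
        if word in self.keywords:
            self.counts[word] += 1

    def to_report(self):
        """Keywords heard often enough to send upstream."""
        return {w for w, c in self.counts.items() if c >= REPORT_THRESHOLD}

p = KeywordProfiler({"baby", "diaper", "vacation"})
for w in ["baby", "car", "baby", "diaper", "baby"]:
    p.hear(w)
print(p.to_report())  # {'baby'}: three hits; 'diaper' fired only once
```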

        • Vlyn@lemmy.zip
          3 months ago

          It’s not listening for actual words; that’s already too complex (you’d have to parse language for that, which those low-power chips can’t do). It’s listening for syllables, Oh-Kay-Goo-Gle or whatever. It depends on the chip and implementation, of course, which is also why you get false positives when someone says something similar.

          If you add more syllable templates to that, your phone would activate literally all the time, with tons of false positives.
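          A quick back-of-envelope calculation shows why: if each template independently misfires at some small rate, the expected number of false activations grows roughly linearly with the number of templates. The rate below is invented purely for illustration.

```python
# Back-of-envelope: more wake-word templates -> proportionally more false
# triggers, assuming each template misfires independently at rate p.
p = 0.5  # assumed false triggers per template per day (illustrative)

for n_templates in (1, 10, 100):
    expected = n_templates * p
    print(f"{n_templates:>3} templates -> ~{expected} false triggers/day")
```

          With one wake word a misfire every couple of days is tolerable; with a hundred ad keywords the device would be phoning home constantly on noise.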

          Seriously, if we had low-power always-on voice recording plus voice-to-text, you’d already have instant conversation subtitles on your phone, instant translation, and so on. We simply don’t have that yet. Those features do exist, but they are power-hungry (so if you do use them, say goodbye to your battery life).