University vending machine error reveals use of secret facial recognition

Snack dispenser at University of Waterloo shows facial recognition message on screen despite no prior indication

A malfunctioning vending machine at a Canadian university has inadvertently revealed that a number of them have been usin…

  • Greg Clarke · ↑7 / ↓97 · 10 months ago

    This seems like an overreaction by people who don’t understand the technology or the associated risks. Focus on the implementation, not the tech. There is no indication that the vending machine is inappropriately storing or transmitting personally identifiable information, or that it’s making decisions based on biased data.

      • Greg Clarke · ↑3 / ↓20 · 10 months ago

        Likely for general marketing feedback, so not targeting individuals like Facebook, Google, etc. If the vending machine is GDPR compliant then it’s not storing individuals’ PII on the machine (that would be physically insecure) or transmitting PII without consent. And anyway, the marketing team wouldn’t care about individuals; they’re looking for aggregate trends. I think we should have stricter anti-marketing laws, but this is not a dangerous anti-privacy vector. Online marketing is far, far worse, so if we’re concerned with privacy, let’s implement laws and policies that protect privacy instead of these BS distractions that don’t actually affect people’s privacy.

    • BearOfaTime@lemm.ee · ↑42 / ↓4 · 10 months ago

      Hahahahahahahahahahahahajaja

      Total “trust me bro” take.

      I have the keys to your house, but there’s no evidence I’m using them inappropriately.

      I never say this, but go lick some more boot.

      • Greg Clarke · ↑3 / ↓32 · 10 months ago

        You obviously don’t work in tech in Canada. Do a tiny bit of research before forming strong opinions.

    • the_tab_key@lemmy.world · ↑29 · 10 months ago

      This is a pretty “generous” take. I ask you then: if the company isn’t communicating any of the scans/recordings, what is the purpose of installing the technology in the first place?

      • conciselyverbose@sh.itjust.works · ↑8 / ↓1 · 10 months ago

        Cameras are one thing.

        But if you can actually process it, that’s a meaningful cost per unit. The only reason you do that is if you’re planning to use it.

        • Greg Clarke · ↑3 / ↓7 · 10 months ago

          This type of analysis is cheap nowadays. You could easily fit a model to extract demographics from an image on a Jetson Nano (basically a Raspberry Pi with a GPU). Models have gotten more efficient while hardware has also gotten cheaper.
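
          For a sense of scale, here’s a minimal sketch of what single-frame inference looks like with OpenCV’s DNN module; the model file and its output layout are hypothetical, but any small age/gender ONNX classifier in that mould would fit on Jetson-class hardware:

          ```python
          import cv2
          import numpy as np

          # Hypothetical pretrained classifier, assumed to output scores over a few coarse demographic bins.
          net = cv2.dnn.readNetFromONNX("age_gender_small.onnx")

          cap = cv2.VideoCapture(0)  # camera module behind the machine's fascia
          ok, frame = cap.read()
          if ok:
              blob = cv2.dnn.blobFromImage(frame, scalefactor=1 / 255.0, size=(224, 224))
              net.setInput(blob)
              scores = net.forward()  # one forward pass per purchase, well within an edge GPU's budget
              print("predicted bin:", int(np.argmax(scores)))
          cap.release()
          ```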

          • conciselyverbose@sh.itjust.works · ↑7 / ↓1 · 10 months ago

            MSRP is $100. Even assuming you can cut that to $50 in bulk, $50 per unit is something that manufacturers are going to take seriously as an added cost. They’re not going to pay it without an intent to use it.

            And that’s before software costs. Even leveraging open source it’s still going to take investment to tailor it to your deployment.

            • Greg Clarke · ↑1 / ↓3 · 10 months ago

              I doubt they would implement this on every vending machine. They can still derive some useful analytic data from a smaller sample size.

      • Greg Clarke · ↑2 / ↓4 · 10 months ago

        Marketing is often targeted, especially online (which is a huge privacy issue). I would guess they are using the data from these vending machines to measure the success of their marketing campaigns.

        • the_tab_key@lemmy.world · ↑1 / ↓1 · 10 months ago

          Like I said: generous. You are “guessing” that what they are doing with it is above board. I’m not that trusting of corporations.

          People trusted Boeing would put planes together with the utmost concern for safety… Then a fucking door fell off mid-flight.

          • Greg Clarke · ↑1 · 10 months ago

            The FAA failed to regulate Boeing. I’m pro regulation and laws that protect people’s privacy. And if this company and the individuals within it break the law they should receive appropriate punishments with fines tied to international revenue.

            My point is that the laws should relate to privacy independent of the technology. The “ban face recognition” narrative misses the point and doesn’t address the threats. Facial recognition technology can be used in ways that don’t threaten individuals’ privacy, and non-facial-recognition technologies can be a threat to individual privacy.

            It’s cynical to assume this company is violating privacy with no evidence. But it’s fair to say there need to be greater punishments and regulations.

      • Greg Clarke · ↑3 / ↓22 · 10 months ago

        There is no indication that the vending machine was collecting customer biometrics. In fact that would prevent it from being GDPR compliant.

          • Greg Clarke · ↑4 / ↓16 · 10 months ago

            That’s not true. They’re likely using a model that identifies some demographic attribute and associating that with a purchase. It’s 2024; this can all be done on the machine. The machine doesn’t need to store the individual’s data, etc. If the vending machine is storing enough data to identify individuals then it wouldn’t be GDPR compliant.

              • Greg Clarke · ↑2 / ↓8 · 10 months ago

                Consent is a requirement for GDPR compliance. They are likely taking an image from the camera, extracting semantic attributes from it, and then discarding the image. The individual is probably standing there making the purchase for longer than the image is held in memory while the attributes are extracted.
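
                A rough sketch of that flow; extract_attributes() is a stand-in for whatever on-device model is actually used, and the only thing kept is an anonymous tally:

                ```python
                from collections import Counter

                import cv2

                aggregate = Counter()  # e.g. (product_id, age_band) -> count

                def extract_attributes(frame):
                    # Placeholder for a small on-device classifier returning a coarse bucket.
                    return "adult"

                def on_purchase(cap: cv2.VideoCapture, product_id: str) -> None:
                    ok, frame = cap.read()  # the frame only lives for the duration of this call
                    if ok:
                        bucket = extract_attributes(frame)
                        aggregate[(product_id, bucket)] += 1  # aggregate trend, no image or identity stored
                ```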

                • uis@lemm.ee · ↑7 / ↓1 · 10 months ago

                  I bet there is no “consent to biometrics collection” button.

      • Greg Clarke · ↑3 / ↓24 · 10 months ago

        It’s because I understand the technology and the actual threats to our privacy.

          • Greg Clarke · ↑1 / ↓3 · 10 months ago

            I have in other sections of this thread. I don’t want to copy and paste but I’m happy to answer any specific questions.

    • Billiam@lemmy.world · ↑17 · 10 months ago

      There is no indication that the vending machine is inappropriately storing or transmitting personally identifiable information, or that it’s making decisions based on biased data.

      And until the machine malfunctioned, there was no indication that the vending machine was collecting any data at all. Businesses can say whatever they want in the court of public opinion, but until these same claims are made in a court of law they should be considered lies to placate the public.

      Furthermore, why even collect such data if it’s not meant to be utilized? They already know what the most popular products are (since they know what they restock the most) so for what reason do they need to collect demographics?

    • bionicjoey · ↑8 / ↓1 · 10 months ago

      Says a guy who doesn’t hide his real name, face, and location for his online persona. You have no concept of digital privacy.

      • Greg Clarke · ↑3 / ↓7 · 10 months ago

        Arguing that I have no concept of digital privacy because I choose to share my name and face is an ignorant statement and demonstrates how little you understand the concept of online privacy. For context, I work in tech in Canada; I deal with GDPR and other compliance regimes. I understand the technology, the risks, and the attack vectors. These vending machines are not a serious threat to individuals’ privacy. Facebook, Google, and Amazon are serious threats. Focus your energy on the actual risks instead of making uninformed comments.

        • bionicjoey · ↑6 / ↓3 · 10 months ago

          Did 2yo Marisol also make an informed choice to share her identity and location on the fediverse?

          This vending machine is taking biometrics off of everyone who walks past it and you don’t think that’s the least bit concerning?

          GDPR doesn’t apply in Canada unless you are trying to operate business in Europe.

          Compliance only matters if you can’t afford a fine. If you can make more money violating regulations than the cost of the fine, it’s just a business expense.

          • Greg Clarke · ↑2 / ↓6 · 10 months ago

            You pretend to care about consent and privacy and then mention my daughter by name here. You’ll notice I share photos and details about my daughter from accounts on servers I control. There is an implicit agreement in the fediverse to respect people’s privacy. I obviously don’t rely on that implicit agreement because some people do unethical things as demonstrated in your post. I protect my daughter from legitimate online privacy and security threats, I don’t play privacy and security theatre.

            This vending machine is taking biometrics off of everyone who walks past

            You have no evidence of this and there is no mention of this in the article. This also doesn’t make any sense from an implementation perspective.

            GDPR doesn’t apply in Canada unless you are trying to operate business in Europe.

            You’re correct that GDPR doesn’t apply in Canada; it’s just that GDPR is usually the strictest regime, so companies commonly treat its requirements as a minimum.

            Compliance only matters if you can’t afford a fine.

            GDPR fines can be tied to global revenue.

            When your beliefs don’t align with the facts, consider changing your beliefs instead of doubling down on your opinions, making things up, and doing unethical things. Please do better.

    • Luci · ↑7 / ↓2 · 10 months ago

      This is the first step to charging a different price based on demographic.

      • Greg Clarke · ↑3 / ↓13 · 10 months ago

        The Canadian Human Rights Act protects Canadians from discrimination based on race, national or ethnic origin, colour, religion, age, sex, sexual orientation, gender identity or expression, marital status, family status, genetic characteristics, disability, etc.

        • uis@lemm.ee · ↑3 / ↓1 · 10 months ago

          This is not the last step to charging a different price based on demographic.

    • min_fapper@iusearchlinux.fyi · ↑4 / ↓11 · 10 months ago

      Yikes, people here are brutal to people with differing viewpoints, heh.

      Doesn’t seem to matter how knowledgeable in the subject matter the person may be either.

      ¯\_(ツ)_/¯

      • Greg Clarke · ↑2 / ↓10 · 10 months ago

        Lol yeah, if the easily checked facts don’t align with beliefs, then groupthink types double down on their beliefs. Denying reality is easier than changing beliefs. It’s the same reasoning skills that Trump supporters use 😅

        • can@sh.itjust.works · ↑4 · 10 months ago

          Which easily checked facts are you referring to? It appears to be a matter of differing opinions.