• smoothbrain coldtakes · 8 months ago

    I don’t care what America is doing, ban it because it’s fucking awful for society. Algorithmic short-form content is literally destroying the attention span of an entire generation, regardless of whether it gathers data on you or not.

    • cygnus · 8 months ago

      ban it because it’s fucking awful for society.

      Lots of things are awful for society — fast food, conservatism, detached housing. I agree that TikTok is generally detrimental to society, but I think banning it is going too far, unless we can definitively establish that it’s being used by the CCP as a psyop.

      • smoothbrain coldtakes · 8 months ago

        We’ve mitigated some of that stuff though; with fast food we’ve required calorie counts to be displayed alongside the menu items, which allows you to make an informed decision, even if it’s a bad one.

        With TikTok, you’re just being force-fed without any knowledge of how this black-box algorithm bullshit works. It’s like going to McDonald’s, being able to have unlimited hamburgers of infinite uniqueness, and getting unhealthy in the brain.

        • cygnus · 8 months ago

          With TikTok, you’re just being force-fed without any knowledge of how this black-box algorithm bullshit works. It’s like going to McDonald’s, being able to have unlimited hamburgers of infinite uniqueness, and getting unhealthy in the brain.

          You can replace “TikTok” here with any closed-source algorithm-driven social media platform, including YouTube. You could probably even make that argument about search engines too. Hell, if you really want to stretch it, even newspapers and radio could get lumped in there, since we don’t see the decisions that lead to what gets published and how it’s presented.

          • nyan@lemmy.cafe · 8 months ago

            The thing is, none of those algorithms have to be black boxes. They could be published—it’s just that businesses don’t want to, and so far, no government has chosen to force it.

            • cygnus · 8 months ago (edited)

              Sure, but now this no longer has much to do with the original topic. What you’re suggesting is that any software algorithm has to have its source code shared, and any media company needs to explain why it publishes what it does, and not other things. That goes far, far beyond TikTok.

              • nyan@lemmy.cafe · 8 months ago

                That’s because banning TikTok alone is like trying to hold a gaping wound together with a band-aid. We need to force all social media companies to act like good citizens if they want to remain in the Canadian market. (Yeah, I know, not going to happen.)

    • Cruxifux@lemmy.world · 8 months ago

      I feel like you’re gonna get zero negative responses to this, because, after all

      We’re all on Lemmy

    • echo64@lemmy.world · 8 months ago

      Again with this. What was it last time? The mobile phones? Then the video games? Then the movies and the TV? The rock music? The radio?

      • smoothbrain coldtakes · 8 months ago

        It’s not a moral panic this time though, it’s a legitimate concern.

        Nothing has ever produced these billions of sub-30-second, algorithmically curated dopamine blasts.

        Video games were a moral panic because they “promoted violence” and TV was supposed to make you stupid, which it kind of does, depending on what you consume. They both required you to sit still and focus on something, though.

        There are kids who can’t sit through a movie without pulling out their phone because they’re just used to being onto the next thing in 10 seconds.

        So you can bitch about it being a moral panic, or do whatever else you want to make fun of people who think it’s a concern, but having personally seen how damaging excessive screen time is for young kids, it’s very clear to me that things like YouTube Shorts and TikTok are actually super damaging in a number of different ways. They were right about phones, because phones have facilitated the mass delivery of this content.

        • echo64@lemmy.world · 8 months ago

          I’m sure they thought it wasn’t a moral panic last time too. You sound the same as the adults did when I was a young person. Exactly the same. Everyone here does.

          You need to ask some questions of yourself. You can ignore those questions, of course, but that’s heading down the same road as before, except this time it’s resulting in actual censorship of the things young people use instead of just a panic.

        • Taleya@aussie.zone · 8 months ago

          Nah, it’s a moral panic.

          Are Reels being castigated? YouTube Shorts? They do the exact same dopamine microdosing and attention-span buttfucking. The practices aren’t being addressed, nor are their deleterious effects. In fact, it would be incredibly easy to ban what TikTok does rather than the app itself, but they’re not doing that. It’s a moral panic drumbeat that conveniently opens the door to fuck other social media.

      • djsoren19@yiffit.net · 8 months ago

        I mean, we have actual data this time proving that it’s limiting attention spans and is incredibly addictive. This isn’t the classic right-wing fearmongering over smoke and air; there are genuine psychological issues being caused by the app.

        • minibyte@sh.itjust.works · 8 months ago

          I hate TikTok as much as the next guy, but I think this could set a precedent for future censorship.

          • Avid Amoeba · 8 months ago

            We’re way past the point of censorship. TikTok is censorship. The private, black-box algorithm content feeds censor whatever isn’t profitable for their major shareholders under the guise that “it’s still there.” Except it almost never gets surfaced.

            • Nik282000 · 8 months ago

              That fucking self-censoring algo-speak is disgusting. But every single twit who uses “un-alive” instead of just saying “dead” still has the option to leave and use any other platform on the internet. They choose to be fucked by TikTok and Instagram; they aren’t being forced to use these platforms.

          • djsoren19@yiffit.net · 8 months ago (edited)

            I think there’s a pretty massive world of difference between a 1984 surveillance state and blocking access to a mass misinformation machine operated by a hostile power that has notoriously used apps it controls to locate and kidnap foreign dissidents.

            I don’t know how we got to this point where millions of Americans think China is their friend but hate their own government for being complicit in a genocide. I guess the Uyghurs just aren’t marketable enough for people to care? The brutal crackdown on Hong Kong has already slipped out of people’s goldfish memory? Or maybe, just maybe, the mass misinformation machine is doing its job.

        • Rodeo · 8 months ago

          We have actual data going back 30+ years that watching TV limits attention span in children and is also addictive.

          Nobody gives a shit about that though. What makes TikTok so much more important?

      • BCsven · 8 months ago

        Those didn’t have algorithms and psychology baked in to lure you in.

    • Nik282000 · 8 months ago

      Algorithmic short-form content is literally destroying the attention span of an entire generation

      I absolutely despise short vertical video as a format, but how is the lack of attention span caused by this type of media? Does the availability of books cause long attention spans? A short attention span is the default state of North Americans. Platforms like TikTok, Instagram, and Facebook show videos that users want to watch; it’s no different than Audible suggesting books or Netflix suggesting movies. I like the audiobooks I listen to; they like the 15-second videos they watch.

      It’s also not generational; there are just as many boomers, Gen X-ers, millennials, and Gen Z-ers burning hours on these platforms.