• Blackmist@feddit.uk · 1 year ago

    I think this is because they were getting lawsuits when their wonderful algorithms kept sending suicidal kids more info on suicide.

    Dropping negative topics down the rankings is not a terrible idea, but it does lead to bonkers workarounds when people want to talk about them anyway and worry about keeping their social media metrics up.

    Hiding all the engagement metrics would probably do wonders for a lot of teens' mental health, given how many of them become desperate to be influencers.

    • saigot · 1 year ago

      I think it’s more of a business decision. A lot of parents won’t let their kids on Reddit (for good reason lol), but TikTok actually does an alright job of keeping age-inappropriate content off the platform, much tighter than most other social media sites. Out of my own interest I went looking for NSFW stuff on TikTok; the most intense things I could find were some tame thirst traps and some National Geographic-style nudity (lots of discussion of adult themes too, but I don’t think those are necessarily a problem).

      On any other social media site I’ve ever used, even with child mode or whatever enabled, I have been able to find beheadings and hardcore sex. That’s a pretty big selling point if you want to primarily attract children.

      • IndiBrony@lemmy.world · 1 year ago

        I have been able to find beheadings and hardcore sex. That’s a pretty big selling point if you want to primarily attract children.

        ಠ_ಠ