Summary

A study found that TikTok’s recommendation algorithm favored Republican-leaning content during the 2024 U.S. presidential race.

TikTok, with over a billion active users worldwide, has become a key source of news, particularly for younger audiences.

Using 323 simulated accounts, researchers discovered that Republican-leaning users received 11.8% more aligned content than Democratic-leaning users, who were exposed to more opposing viewpoints.

The bias was largely driven by negative partisanship, with more anti-Democratic content recommended.

  • Glide · 13 hours ago

    Does this mean the algorithm was designed to push a Republican agenda? Or does the algorithm know that liberals are more likely to watch videos from the opposing side than conservatives?

    Both of these things can be true.

    A friend of mine likes to say: a system’s goal is what it does in practice, not its design intent.

    • danc4498@lemmy.world · 12 hours ago

      Sure, kinda like saying, if it looks like shit and it smells like shit, it’s probably shit. Apt metaphor.

      I guess I’m just wondering about the intent. Like, is it possible to prove that an algorithm was designed to have a bias, vs. the bias being a natural result of what people spend their time watching? I am sure it’s the former, but how does one prove that without leaks from the inside?

      • naught101@lemmy.world · 9 hours ago

        The intent on e.g. YouTube is to optimise views. Radicalisation is an emergent outcome, a result of more combative, controversial, and flashy content being more captivating in the medium term. This is documented to some extent in Johann Hari’s book Stolen Focus, where he interviews a couple of insiders.

        So no, the bias is not the stated intent (at least initially). The bias is a pathological outcome of optimising for ads.

        But looking at some of Meta’s intentional actions more recently, it seems like maybe it can become an intentional outcome after the fact?
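
        To make that concrete, here’s a toy sketch of the emergent-bias case (every number and name below is invented for illustration, not taken from the study): a recommender that only optimises watch rate, serving two groups that differ in just one assumed way, how often they’ll watch the other side’s content.

        ```python
        import random

        # Toy model: an engagement-only recommender with NO partisan term
        # anywhere in the code. All probabilities are made up.
        random.seed(0)

        CONTENT = ["pro_left", "pro_right"]

        # Hypothetical watch probabilities: both groups prefer aligned
        # content equally, but the "left" group is assumed to watch
        # cross-partisan content somewhat more often.
        WATCH_PROB = {
            ("left",  "pro_left"):  0.60,
            ("left",  "pro_right"): 0.40,
            ("right", "pro_right"): 0.60,
            ("right", "pro_left"):  0.25,
        }

        def simulate(group, rounds=50_000):
            """Serve each content type with probability proportional to
            its observed watch rate for this group (learned online)."""
            watches = {c: 1.0 for c in CONTENT}  # smoothed counters
            serves  = {c: 2.0 for c in CONTENT}
            shown   = {c: 0 for c in CONTENT}
            for _ in range(rounds):
                rates = [watches[c] / serves[c] for c in CONTENT]
                r = random.random() * sum(rates)
                choice = CONTENT[0] if r < rates[0] else CONTENT[1]
                shown[choice] += 1
                serves[choice] += 1
                if random.random() < WATCH_PROB[(group, choice)]:
                    watches[choice] += 1
            return shown

        for group in ["left", "right"]:
            shown = simulate(group)
            aligned = "pro_left" if group == "left" else "pro_right"
            share = shown[aligned] / sum(shown.values())
            print(f"{group}-leaning feed: {share:.1%} aligned content")
        ```

        Under these invented numbers the right-leaning feed ends up roughly ten points more aligned than the left-leaning one, purely because that group watches cross-partisan content less, which is the “both of these things can be true” case from upthread. It says nothing about what TikTok actually does, only that a skew like the one measured doesn’t require a designed-in agenda.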

      • Optional@lemmy.world · 12 hours ago

        I think it’s a matter of How Many Coincidences Does It Take

        If we’re assigning good faith to the TikTok algorithm.

        Which - reading that out loud just sounds absurd.