Brin’s “We definitely messed up”, delivered at an AI “hackathon” event on 2 March, followed a slew of social media posts showing Gemini’s image generation tool depicting a variety of historical figures – including popes, founding fathers of the US and, most excruciatingly, German second world war soldiers – as people of colour.

  • RandoCalrandian@kbin.social · 10 months ago

    Doing the opposite of that isn’t any better

    And Gemini was perfectly happy to exclude blacks from prompts about eating fried chicken and watermelon.

    Turns out you can’t fight every fire with more fire; more often than not it will burn everything down. You can’t solve something as complex as systemic racism with more systemic racism aimed at a different group.

    • Kichae · 10 months ago

      Doing the opposite of that isn’t any better

      Socially, it kind of is, though? When certain groups of people have been historically and chronically maligned, marginalized, persecuted, and othered, showing them in positive roles more frequently is actually a net benefit to those groups, and to society as a whole.

      Like, yes, it’s very stupid that these systems are overriding specific prompts, but that’s also the effect of a white supremacist society refusing to look at itself in the mirror and wrestle with its issues. If you want these big companies to let you use their resources to make specific things that could be used to highlight the white supremacy that people don’t want to acknowledge or address, you kind of have to… get them to acknowledge and address it.

      Otherwise, build your own generative model.