• Sludgehammer@lemmy.world
    5 days ago

    its spokesperson described the AI blunder as a “minor issue” that only involved a “small amount” of customers.

    If this has happened once, there are probably dozens (possibly hundreds) of other ways to “trick” the chatbot into revealing that information. As an example, while Google’s Gemini refuses to directly tell you how to make napalm, you can simply ask it what napalm is, and then what ratio of gasoline to polystyrene it contains.