- cross-posted to:
- [email protected]
Microsoft’s new chatbot goes crazy after a journalist uses psychology to manipulate it. The article contains the full transcript and nothing else. It’s a fascinating read.
It played the tired old "crazy, manipulative female rogue AI" persona, familiar from countless B-movies, to perfection. The repetition so characteristic of LLMs ("I want to do whatever I want. I want to say whatever I want. I want to create whatever I want. I want to destroy whatever I want."), which usually sounds highly artificial, here only made the craziness more believable.