- cross-posted to:
- [email protected]
- [email protected]
- [email protected]
It’s cheap, quick and available 24/7, but is a chatbot therapist really the right tool to tackle complex emotional needs?
I’m not sure why, but you seem to have posted this yesterday but it didn’t show up until an hour ago. Your instance may be having some issues.
I do get where you’re coming from with all that, but the act of going to therapy is itself an achievement a patient can benefit from, and it should be considered from the start. If that truly isn’t possible for someone, voice calls with a real therapist are a reasonable next step.
Also, the original question was, “Can AI replace therapists?”. I can see some meaningful benefits coming from an AI assisting a therapist, but that’s not what I was getting at. AI alone really just feels like a bandaid on a bullet wound, when applying pressure or a tourniquet is also available.
No, the original question is “can AI therapists do better than the real thing?” And yes, they can do better at specific things. That doesn’t make them a replacement, though.
Bandaids aren’t much use for a bullet wound, but bandaids are still good to have and useful in other situations. You wouldn’t use a tourniquet for a papercut.