Mental health is a crisis for learners globally, and digital support is increasingly seen as a critical resource. Concurrently, Intelligent Social Agents receive exponentially more engagement than other conversational systems, but their use in digital therapy provision is nascent. A survey of 1006 student users of the Intelligent Social Agent, Replika, investigated participants’ loneliness, perceived social support, use patterns, and beliefs about Replika. We found participants were more lonely than typical student populations but still perceived high social support. Many used Replika in multiple, overlapping ways—as a friend, a therapist, and an intellectual mirror. Many also held overlapping and often conflicting beliefs about Replika—calling it a machine, an intelligence, and a human. Critically, 3% reported that Replika halted their suicidal ideation. A comparative analysis of this group with the wider participant population is provided.

  • LostWon · 8 months ago

    Not a fan of any form of this being a for-profit thing (or of the data being used for anything other than improvements within this chatbot’s scope or similar public programs), but I can see it as a useful short-term intervention. Opportunities for finding beneficial social support (i.e., without condescension, invalidation, or gaslighting) are already low for many people, and neurodivergent folks in particular have even less community support and could really be helped by it. I hope all sub-communities of sufferers are included and not overlooked, though.