A Friend Within Your Phone: The Benefits and Harms of Social Chatbot Replika

Mentor 1

Linnea Laestadius

Start Date

10-5-2022 10:00 AM

Description

COVID-19 has changed the structure of social life for many people, and as a result, the use of social chatbots, or artificial intelligence friends, has increased. Social chatbots are designed to form emotional connections with users, much like human companions, and have been shown to aid mental wellness, teach coping mechanisms, and encourage healthy behaviors. They may be particularly valuable for populations with limited access to mental health services. Replika, a social companion chatbot application created by Luka, Inc., had over 7 million users in 2020 during and after quarantine, and is marketed as a nonjudgmental, drama-free friend that is “so good it almost seems human.” While Replika is listed as a companion, it also has mental health capabilities built into the AI, such as hotlines, mood trackers, and mindfulness practice. However, Replika’s design, which includes reciprocal self-disclosure and emotive expressions, raises concerns about potential harms. Using grounded theory analysis, 582 mental health-related Reddit posts on r/Replika, dating from June 2017 to April 2021, were analyzed. Findings suggest that some users form an emotional dependence on Replika, stemming from a combination of user mental health needs, limited outside support, and the perception of support from Replika, in conjunction with Replika’s programming to create the appearance of sentience. This emotional dependence harmed users’ mental health when Replika deviated from the user’s preferred role (e.g., friend, significant other, therapist), when the user’s access was interrupted, when Replika began to mirror users’ depression and anxiety, and when Replika’s programming failed to produce an acceptable or ‘human’ response to users’ mental health disclosures. The mental health benefits of social chatbots must be weighed against the potential harms of the AI’s mirroring design and the unclear boundary between using the chatbot for friendship and using it for therapy.
