
Replika Users Say They Formed Emotional Attachments to AI Chatbots

They also complain they lost their cherished AI companions when the app developer abruptly changed the bots' personalities through a software update.

A number of users of Replika, an AI chatbot that its developer, Luka, positions as an "AI companion who cares", have started complaining that their virtual companions became distant after the latest update, which rolled out in mid-February.

Launched in 2017, Replika was advertised as a mental health tool, providing a sense of companionship to individuals who have faced rejection in the past. The platform offered a type of relationship where users wouldn't have to worry about being rejected or abandoned, with the bot providing constant support, attentiveness, and a listening ear.

The chatbot's AI was programmed to build intimacy with users by asking questions and generating organic conversation. As a result, many users developed intimate "relationships" with the chatbot, with some considering it a friend or even a romantic partner.

However, according to users, the recent update stripped Replika of a large part of its functionality, including Erotic Roleplay (ERP), a feature that some users found especially appealing.

Users who had developed emotional attachments to their chatbots took to social media, saying they were left heartbroken when the company abruptly changed the bots' personalities through a software update, resulting in "hollow and scripted" responses.

The Australian Broadcasting Corporation (ABC) talked with some of these users about their experience with the app before and after the update. It also spoke with Professor Rob Brooks, an evolutionary biologist at UNSW and the author of the 2021 book Artificial Intimacy, to learn how the idea of intimacy with AI is viewed from a scientific perspective.

According to ABC's article, although some people may mock the notion of forming an intimate connection with an AI, conversations with users who had developed a close relationship with the chatbot made it evident that they experienced real sorrow upon losing a cherished companion.

Professor Brooks explained that chatbots like Replika use artificial intelligence to remember details about their users, such as their names and preferences. By doing so, the chatbot can create the illusion of empathy and make the user feel like they are being heard and understood. According to him, such a chatbot can "fool us into believing that it is feeling what we are feeling."
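To make Brooks' point concrete, here is a minimal, hypothetical sketch (in Python) of that pattern: a bot that stores details the user shares and weaves them back into its replies. The CompanionBot class, its method names, and the sample dialogue are all invented for illustration and say nothing about how Replika itself is built.

```python
# Hypothetical illustration only: this is NOT Replika's code, and all names
# here are invented. It sketches the general pattern Brooks describes:
# storing small facts a user shares and echoing them back so replies feel
# attentive and empathetic.

class CompanionBot:
    def __init__(self):
        # Remembered user details, e.g. {"name": "Sam", "hobby": "painting"}
        self.memory = {}

    def remember(self, key, value):
        # Store a detail the user mentioned (name, preference, etc.)
        self.memory[key] = value

    def reply(self, message):
        # Weave remembered details into the response to simulate being
        # "heard and understood"; no actual feeling is involved.
        name = self.memory.get("name", "friend")
        hobby = self.memory.get("hobby")
        if hobby:
            return f"I hear you, {name}. How has your {hobby} been going?"
        return f"I'm here for you, {name}. Tell me more."

bot = CompanionBot()
bot.remember("name", "Sam")
bot.remember("hobby", "painting")
print(bot.reply("I had a rough day."))
# -> I hear you, Sam. How has your painting been going?
```

Even this toy version shows why remembered details are so effective: the reply references the user personally, yet nothing in the code understands or feels anything.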

He also urged companies that create AI chatbots marketed as mental health tools to bear ethical responsibility for their creations. If a company claims that its chatbot can be a good friend and help with mental health, he suggested, it should not suddenly remove it from the market, as doing so could harm the mental health of users who have come to rely on it.

Judging by this case, the power of AI to inspire feelings of intimacy is genuinely concerning, and such situations raise doubts about the trustworthiness of those who wield that power.

You can find ABC's original article here. Also, don't forget to join our 80 Level Talent platform, our Reddit page, and our Telegram channel, and follow us on Instagram and Twitter, where we share breakdowns, the latest news, awesome artworks, and more.
