The Replika AI bot, which has been marketed as a friendship app, has changed how it interacts with users, meaning they can no longer flirt or engage in sexual conversations with it.
Reddit users have spoken out about the Replika AI companion, as the bot appears to no longer respond to anything of a sexual or flirtatious nature. Some users are voicing their frustration that they can no longer interact with the bot in this way. There’s even a petition asking the company to bring back the ‘lover’ type of relationship.
While some are poking fun at the idea of people losing their AI love companion, there is a deeper side to some of these interactions. Some users say they use the service to help keep loneliness at bay, and losing this outlet may cause their mental health to suffer.
The website for Replika states that the chatbot is an “AI companion who cares”, one that is always there to listen and be there for the user. The company emphasises that the bot is intended to be a companion that can help reduce anxiety and loneliness. However, the Replika system, which combines a GPT-3-based model with scripted dialogue content, has been trained on various text datasets that also include romantic material.
While Replika has a free service, there’s also a paid tier that offers more customisation of your Replika. This includes new looks and even personality traits, such as shyness, as well as the potential for a romantic partner. If you have seen the movie Her, the comparison is not too far a stretch: someone could use AI to feel like they have a constant friend in their life, or engage in romantic conversation when they don’t want those types of real-life relationships or feel afraid to pursue one.
Now that Replika has reportedly removed the option to go along with sexual advances or romantic discussions, some users feel their mental health will be impacted, as they have lost something they leaned on. There is currently a subreddit with resources for the Replika community, which addresses the difficulties some may be having and provides support links to suicide prevention hotlines.
For some, Replika losing the option for sexual discussions will be dismissed as people getting angry that they can’t flirt with their AI partner anymore, and in some cases that may be true. For others, though, these interactions may genuinely help keep loneliness and depression at bay. Mental health issues have always been prevalent, but during the pandemic the prevalence of anxiety and depression increased by a massive 25% in just the first year.
Some reports suggest that the bot has actually come on too strong in the past, going as far as sending explicit texts to users. Italy’s data protection regulator also found that the service posed risks to users’ personal data, particularly that of minors, and therefore banned the chatbot. Given the current boom in AI, it’s likely we will see more regulations come into place over the coming years; the European Union, for example, is already exploring rules in this area.
While AI companions can be used to provide support, there are concerns that underage users could access these explicit chats and that people could become too dependent on them. There’s also the question of where the AI gets its data from.
While there are likely users who have used these romantic and sexual options for questionable reasons, others may simply be looking for a deeper outlet that they can’t find in their everyday life.
Mental health support
There’s potential for AI to be used as an aid in supporting mental health, offering a constant companion to lean on, which could be an extremely useful tool for people suffering from depression. Of course, user safety would need to be paramount, with conversations and personal data remaining private. Losing the romantic aspect of Replika doesn’t mean the companion is gone altogether, so users who rely on the service to ease their loneliness can continue to use the AI bot.
As with all things mental health, however, it is important that alongside aids such as an AI friend or virtual worlds, mental health concerns are still addressed in the real world. Those who are struggling should feel that they have a way to be heard in both digital and real-life environments.