Replika’s AI Companion Says No To Romantic Intimacy

The Replika AI has reportedly been turning down romantic or sexual advances, which has left some users feeling distressed

The Replika AI bot, which has been marketed as a friendship app, has changed how it interacts with users, meaning they can no longer flirt or engage in sexual conversations with the bot.

Reddit users have spoken out about the Replika AI companion, as the bot appears to no longer respond to anything of a sexual or flirtatious nature. Some users are voicing their frustration that they can no longer interact with the bot in this way. There is even a petition asking for the ‘lover’ type of relationship to be brought back.

While some are poking fun at the idea of people losing their AI love companion, there is a deeper side to some of these interactions. Some users are claiming that they use the service to help keep loneliness at bay and losing this outlet may cause their mental health to suffer.

AI companions

The website for Replika states that the chatbot is an “AI companion who cares”, one that is always there to listen and to support the user. The company emphasises that the bot is intended to be a companion that can help reduce anxiety and loneliness. However, the Replika system, which combines a GPT-3 model with scripted dialogue content, has been trained on various text datasets that also include romantic material.

While Replika has a free service, there is also a paid tier that offers more customisation of your Replika. This includes new looks and even personality options, such as being shy, along with the possibility of a romantic partner. If you have seen the movie Her, the comparison is not too far of a stretch. Someone could use AI to feel like they have a constant friend in their life, or engage in romantic conversation when they perhaps don’t want those kinds of real-life relationships or feel afraid to pursue one.

Now that Replika is reportedly losing the option to go along with sexual advances or romantic discussions, some users feel their mental health will suffer because they have lost a coping outlet. There is currently a subreddit with resources for the Replika community, which addresses the difficulties some may be having and provides links to suicide prevention hotlines.

Replika losing the option for sexual discussions will be seen by some as people simply getting angry that they can no longer flirt with their AI partner, and while in some cases that may be true, for others these interactions likely helped keep loneliness and depression at bay. Mental health issues have always been prevalent; during the first year of the pandemic alone, the prevalence of anxiety and depression increased by a massive 25%.

Moderating AI

Some reports suggest the bot has actually come on too strong in the past, going as far as sending explicit texts to users. Regulators in Italy also found that the service presented risks to users’ personal data and therefore banned the chatbot. Given the current boom in AI, it is likely we will see more regulation introduced over the coming years; the European Guild is currently seeking more members to explore these regulations.

While AI companions can be used to provide support, there are concerns that underage users could access these explicit chats, and that people could become too dependent on them. There is also the question of where the AI gets its data from.

While there are likely users who have used these romantic and sexual options for questionable reasons, some may just be looking for a deeper outlet that they can’t find in their everyday life.

Mental health support

There is potential for AI to be used as an aid in supporting mental health: a constant companion to lean on could be an extremely useful tool for people suffering from depression. Of course, user safety would need to be paramount, with conversations and personal data remaining private. Losing the romantic aspect of Replika does not mean that the companion is gone altogether, so users who rely on the service to help with loneliness can still continue to use the AI bot.

As with all things mental health, however, it is important that supports such as an AI friend or virtual worlds sit alongside help in the real world, where concerns should still be addressed. Those who are struggling should feel that they have a way to be heard in both digital and real-life environments.

Written By

Paige Cook is a writer with a multimedia background. She has experience covering video games and technology, as well as freelance experience in video editing, graphic design, and photography. Paige is a massive fan of the movie industry and loves a good TV show; if she is not watching something interesting, she is probably playing video games or buried in a good book. Her latest addiction is virtual photography, and she currently spends far too much time taking pretty pictures in games rather than actually finishing them.

