Voice Chat Is The Main Source Of Toxic Behaviour In Video Games

While gamers say voice chat makes their gaming experience better, many are also being harassed online

Voice chat moderation company Speechly surveyed over 1,000 gamers and found that nearly 70% have used voice chat at least once. However, 72% also stated they had experienced a toxic incident.

Before party chat and apps such as Discord let players choose who they spoke to in multiplayer games, joining a lobby meant bracing for the public game chat. Now players can lock their chats and tailor their online experience by controlling who they communicate with.

However, many still opt for public game chat to communicate with more people and make new friends. While this can be a fun experience, the report shows that gamers rated voice chat as the most problematic engagement channel.

Toxic behaviour

Participants said that issues were also present in in-game play and text chat. Yet despite the high number of players who have experienced some kind of negative interaction through voice chat, nearly half of gamers believe that voice chat improves the gaming experience.

Being called offensive names is the most reported incident in voice chat, with 39.9% experiencing this. Trolling online has been experienced by 38.4% of participants, with a further 29.9% stating that they had been bullied in voice chat. Being sexually harassed is also an issue, with 15.9% of the participants saying they had experienced this type of behaviour. The problem spans other areas of the gaming industry as even new and emerging spaces such as the metaverse are already reporting incidents of sexual harassment and bullying.

Moderating voice chat can be difficult: it isn't always clear who said what, and not all games have a way to prove what happened. While most games and services do offer a system for reporting a player, this often amounts to choosing what type of issue occurred and submitting the report; sometimes the offending player is banned, and other times nothing comes of it.

Solutions

Of those who participated in the survey, 43.9% said they would like to see a one-click complaint system that shares voice chat audio with moderators. This would let players report abusive voice chat with evidence of it happening, and it might also make players more aware of the repercussions of what they say online. A similar number, 41.9%, said they would be happy with a simple way to file a complaint that does not include voice audio, and 29.5% opted for a three-strike system for verified toxicity.
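To illustrate what such a one-click report could carry, the sketch below bundles a reference to a short buffered audio clip with basic match context. It is purely hypothetical: the field names, endpoint, and submitVoiceReport helper are assumptions made for illustration, not Speechly's or any game's actual API.

    // Hypothetical shape of a one-click voice chat report (illustrative only).
    interface VoiceChatReport {
      reporterId: string;          // player filing the complaint
      reportedPlayerId: string;    // player the complaint is about
      matchId: string;             // session where the incident occurred
      category: "harassment" | "hate-speech" | "bullying" | "other";
      audioClipId?: string;        // reference to the buffered voice clip, if shared
      submittedAt: string;         // ISO 8601 timestamp
    }

    // Minimal sketch of the "one click": send the report, optionally with the clip reference attached.
    async function submitVoiceReport(report: VoiceChatReport): Promise<void> {
      await fetch("https://moderation.example.com/reports", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(report),
      });
    }

The appeal of the audio reference is that moderators can review what was actually said rather than relying on one player's word against another's, which is the evidence gap the survey respondents pointed to.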

There’s a balance to be found with moderating online spaces. While it’s important to ensure that players have a sense of privacy and aren’t constantly monitored, there is also more that could be done with reporting systems to ensure that players who harass others are identified.

Written By

Paige Cook is a writer with a multi-media background. She has experience covering video games and technology, along with freelance experience in video editing, graphic design, and photography. Paige is a massive fan of the movie industry and loves a good TV show; if she is not watching something interesting, she's probably playing video games or buried in a good book. Her latest addiction is virtual photography, and she currently spends far too much time taking pretty pictures in games rather than actually finishing them.
