
Unity Launches New AI Tool To Combat Toxicity In Games

New AI-powered Safe Voice moderation could help games developers reduce toxic behaviour

Video game software development company Unity Technologies has revealed a new tool for its developer suite that utilises AI to help developers detect toxic behaviour in online games.

The Safe Voice tool is launching in closed beta and is designed to enable studios to isolate and review toxicity reports faster. Hi-Rez’s Rogue Company took part in early testing and has continued to utilise the tool in beta.

Unity’s Safe Voice analyses vocal cues such as tone, emotion, intonation, pitch and loudness, alongside conversational context, to identify toxic interactions. The system activates after a player flags problematic behaviour, monitors the exchange and delivers a report to human moderators. An overview dashboard lets moderators review each incident and track trends over time to inform their moderation plans. Unity says a larger suite of toxicity solutions is on the way.
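Unity has not published Safe Voice’s internals, but the reported flow (a player flag triggers analysis of vocal features, which produces a report for a human moderator) can be sketched in outline. The following Python illustration is purely hypothetical: the names (VoiceFeatures, IncidentReport, analyse_flagged_clip), the feature fields and the weighting are placeholders, not Unity’s actual interface.

```python
from dataclasses import dataclass

# Hypothetical feature set echoing the signals the article says Safe Voice
# considers: tone, emotion, intonation, pitch and loudness.
@dataclass
class VoiceFeatures:
    pitch_variance: float    # 0..1, erratic intonation
    loudness: float          # 0..1, sustained shouting
    negative_emotion: float  # 0..1, classifier score for anger/hostility

@dataclass
class IncidentReport:
    player_id: str
    toxicity_score: float
    needs_human_review: bool

def analyse_flagged_clip(player_id: str, feats: VoiceFeatures,
                         threshold: float = 0.6) -> IncidentReport:
    """Runs only after a player flags the interaction, mirroring the
    flag-then-report flow described in the article. The weights and
    threshold are arbitrary placeholders, not Unity's model."""
    score = (0.2 * feats.pitch_variance
             + 0.3 * feats.loudness
             + 0.5 * feats.negative_emotion)
    return IncidentReport(player_id, score, needs_human_review=score >= threshold)

if __name__ == "__main__":
    report = analyse_flagged_clip(
        "player_42",
        VoiceFeatures(pitch_variance=0.7, loudness=0.9, negative_emotion=0.8),
    )
    # The system only screens and scores; a human moderator reviewing the
    # dashboard makes the final call on the incident.
    print(report)
```

The important structural point, which the sketch preserves, is that the AI screens and scores while the final decision stays with a human reviewer, matching Whitten’s framing below of an efficiency gain rather than a replacement.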

Mark Whitten, Unity’s president of Create Solutions, told GameSpot: “It’s one of the number one reasons that people leave a game and stop playing because there’s some sort of bad situation around toxicity and other elements of abuse.”

Hi-Rez Studios partnered with Unity in February to create a new voice chat recording system for its game Rogue Company. The tool helps identify and mitigate toxic behaviours before they escalate. Whitten said the system proved very successful at accurately flagging problems and shortening the time humans needed to be involved, which allowed Hi-Rez to address toxic behaviour in its games quickly and effectively.

Developers can create their own moderation policies

The new voice chat recording system could become a valuable tool for game developers looking to reduce toxic behaviour in their games: it lets them identify and address problems quickly and efficiently while taking some of the burden off human moderators.

“I think this is an efficiency gain and not a replacement scenario,” Whitten continued. “That said – and I ran Xbox Live for many years – any day that I could replace a human who has to deal with looking at inappropriate things, I would happily do it. It’s not fun work, that’s not work that you really want people doing, putting them in the midst of looking at a bunch of bad behaviour all day.

“You’d be much better off having screens that caught some of that and then allow them to take actions based on the screens instead of having to be the screener itself,” said Whitten.

Ultimately, it’s up to each game developer to create moderation policies that are effective and fair. The new voice chat recording system is simply a tool that can help them identify and address toxic behaviour. However, it’s also worth remembering that the system is only as effective as the policies that are put in place to govern it.

Written By

Isa Muhammad is a writer and video game journalist covering many aspects of entertainment media including the film industry. He's steadily writing his way to the sharp end of journalism and enjoys staying informed. If he's not reading, playing video games or catching up on his favourite TV series, then he's probably writing about them.

