Microsoft has published its first Xbox Transparency Report, detailing the actions it has taken to ensure that Xbox is a safe and fair place to play.
The report sheds light on what the Xbox team is doing to offer a better experience for its players. So let’s dive into some of its findings.
Firstly, the Xbox community itself is playing a part, having submitted over 33 million reports from January to June of this year. However, that figure is down 36% year-on-year, and with fewer players reporting issues, it becomes more difficult for Xbox to take action. Part of the driving force behind the new report is to show players what is going on behind the scenes to make the Xbox space the best it can be.
Of the 33 million reports the community did submit, communications accounted for 46% and conduct for 42%, making these the two most highly reported issues.
Enforcing the rules
Xbox has issued more than 4.33 million proactive enforcements against inauthentic accounts, representing 57% of total enforcements. These accounts are typically automated or bot accounts that harm the Xbox ecosystem. Proactive enforcements are those in which the Xbox team finds and manages a situation before it has been reported by a player, as opposed to reactive moderation, where a player makes a report and Xbox then reviews and investigates it.
Looking at total enforcements by policy area, cheating/inauthentic accounts top the list, as previously mentioned, followed by a further 1.05 million enforcements for profanity and 814,000 for adult sexual content. Harassment/bullying received 759,000 enforcements, with hate speech next at 211,000.
The enforcement data also shows what happens when community standards are violated, with actions falling into three categories. The first, accounting for 63% of enforcements, is the suspension of a player’s account. Content-only actions, in which the offending content is simply removed, make up 3%, and the remaining 34% are a combination of the two.
Dave McCarthy, general manager for Xbox Product Services, spoke with Gamesindustry.biz about the report, saying: “Being open and transparent can be difficult, but it’s important that players know more about what we do to keep them safe. We see incredible value in being introspective, assessing what our processes are today, the data and insights across our technology, and listening to feedback. All of this is with the intention of learning and improving”.
It’s always important to know what tools are at players’ disposal to keep these spaces safe; Xbox offers various settings to impose parental controls, filter messages, or even restrict voice communications. Xbox intends to continue its work and has pledged to release a new Transparency Report every six months.