Xbox says it acted on 7.3M community violations in first six months of this year

Xbox has released its first-ever Digital Transparency Report, offering insight into the work its community and moderation teams have done over the last six months in the name of player safety – including the news it’s taken action on 7.3M community violations during that time frame.

Microsoft says it’s publishing its inaugural Digital Transparency Report as part of its “long-standing commitment to online safety”, with the aim being to release an updated report every six months so it can address key learnings and do more to “help people understand how to play a positive role in the Xbox community.”

Much of the document is spent reiterating the various tools and processes Microsoft and Xbox players have at their disposal to ensure users remain safe and community guidelines are adhered to, ranging from parental controls to reporting.

However, the latter half of Microsoft’s report goes into more specific details about its actions over the last six months, breaking down the number of player reports versus the number of enforcements (that is, instances where content is removed, accounts are suspended, or both) made on policy areas including cheating, inauthentic accounts, adult sexual content, fraud, harassment and bullying, profanity, and phishing during that time.

In total, Microsoft says it received 33M reports from players between 1st January and 30th June this year, with 14.16M (43%) related to player conduct, 15.23M (46%) focused on communications, and 3.68M (11%) based on user-generated content.

Of those reports, 2.53M resulted in what Microsoft calls reactive enforcements, and a further 4.78M enforcements were made proactively (that is, via “protective technologies and processes” before an issue was raised by a player), bringing the total number of enforcements to 7.31M. Notably, 4.33M (57% of all enforcements) were related to cheating and inauthentic accounts, based on activity detected prior to reports from players.

The full transparency report provides an interesting glimpse at the moderation challenges a platform like Xbox faces, as well as the steps Microsoft has so far taken to combat them.