CALL OF DUTY IS 50% LESS TOXIC THANKS TO MODERATION SYSTEMS

In an all-new blog post, Activision Blizzard and its associated studios have spoken at length about Call of Duty’s code of conduct, drawing specific attention to findings from a recent rollout of fresh moderation mechanics. Most importantly, the team discussed the results of deploying an automated voice chat monitoring system that reprimands and punishes players for speaking aggressively or offensively in-game.


That’s a Lot of Punishment

In the blog post published on callofduty.com, it was claimed that more than two million accounts have ‘seen in-game enforcement for disruptive voice chat’. That’s a lot of players being offensive and abusive in Call of Duty, but some will argue that it’s simply part of the ecosystem. What’s Call of Duty without a little trash talk, the thinking goes.

Activision also stressed that a roughly 50% reduction in ‘players exposed to severe instances of disruptive voice chat’ has been recorded since the launch of Modern Warfare III.

With a promise to combat toxicity as far as possible in the Call of Duty world, Activision explained that more moderation systems, and more punishments, are coming in future updates. The post also reminded players to report any malicious activity in any Call of Duty game, but that’s a tougher line to walk thanks to recent updates. For instance, a new rule says you’ll be punished for making false reports, yet no thresholds were clarified.

So, you could report an entire team for toxicity and the automated system could read that as ‘spam reporting’. Who really knows?
