The video game industry has – particularly in the past – been notorious for its toxicity when it comes to online gaming and voice chats. As one of the biggest franchises in gaming, Call of Duty has always struggled to get some of its users to rein in such behaviour. The latest initiative by Activision sees the anti-toxicity team implement a new real-time voice chat moderation feature using AI to detect negative behaviour.
Making the announcement on their blog, the ‘Call of Duty staff’ said “Call of Duty’s new voice chat moderation system utilizes ToxMod, the AI-Powered voice chat moderation technology from Modulate, to identify in real-time and enforce against toxic speech—including hate speech, discriminatory language, harassment and more.”
They continued, “This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which includes text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system.”
While the global rollout of this new system will coincide with the launch of Call of Duty Modern Warfare III on the 10th of November, a beta is being tested right now in the US within Modern Warfare II and Warzone.
While using AI to detect toxicity could do a lot of good for the online space, there is a fear that it could flag a number of false positives. It will be interesting to see how this beta period goes – and whether this new tool ends up doing more good than harm. We will have to wait and see.
KitGuru says: What do you think of this new AI-powered moderator? Do you use voice chat when gaming online? Have you ever been suspended or banned? Let us know down below.