Activision’s Anti-Hate Speech Software For Call of Duty

Call of Duty is taking the fight to “toxic and disruptive” behaviour with a brand-new voice chat moderation system known as ToxMod. The technology is powered by AI from Modulate, a team that specialises in identifying and enforcing against toxic behaviours such as hate speech, discriminatory language and harassment.

“This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which includes text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system,” a blog post by the developers reads. 

The voice chat tech rolled out for an initial beta test across North America yesterday (August 30, the same day the Season 5: Reloaded update dropped). The beta is being trialled in the current Call of Duty titles, Modern Warfare II and Warzone, with a worldwide release (excluding Asia) to follow alongside the debut of Modern Warfare III.

Following the launch of Modern Warfare II, the existing moderation tech caught out more than 1 million accounts across voice and text chat. Activision added that 20% of those did not reoffend after a first warning, while repeat offenders faced stricter penalties, including “voice and text chat bans” and “temporary account restrictions”.

Call of Duty: Modern Warfare III launches on Friday, November 10, 2023 on PS5, PS4, Xbox Series X|S, Xbox One, and PC.

What else can Activision do to help tackle the issues covered in this report? Let us know what you think, and drop us a message in the comments below! 
