Call of Duty Taking Steps to Control Toxic Speech

To address the ongoing issue of toxic speech in the game, the Call of Duty franchise is integrating AI-powered voice chat moderation technology alongside the launch of Call of Duty: Modern Warfare 3. The AI voice chat moderation feature, which has been in North American beta testing since August, is now being rolled out globally.

The well-known shooter series has long struggled with toxic speech, and developer Infinity Ward has previously taken action against abusive players in both text and voice chat. With so many players logging in every day, however, toxicity remains a problem despite efforts to ban offenders.

In response, the Call of Duty team has implemented new technology to curb offensive language and improve the overall gaming experience. With the release of Modern Warfare 3, Call of Duty's voice chat moderation system, known as "ToxMod," has been formally introduced.

This Modulate-developed AI technology was first made available in a North American beta for Warzone and Modern Warfare 2 in late summer. According to the official Call of Duty blog, the technology is now being deployed internationally across all three titles.

The moderation system, ToxMod, has officially launched with the global release of Call of Duty: Modern Warfare 3, with the exception of the Asia-Pacific region. Alongside ToxMod's voice moderation, in-game text chat is filtered across fourteen languages. In the near future, the moderation team intends to broaden in-game voice chat coverage to Spanish and Portuguese.
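For readers curious about how such a pipeline might hang together, below is a minimal, purely illustrative sketch in Python of a voice-chat moderation flow: a transcribed clip is scored for toxicity and, above an assumed threshold, queued for human review. The keyword weights, threshold, and function names are hypothetical stand-ins and do not reflect Modulate's or Activision's actual implementation.

```python
# Purely illustrative sketch of a voice-chat moderation flow:
# transcribe a clip, score it for toxicity, and queue flagged clips
# for human review. All names, weights, and thresholds are hypothetical;
# this is not Modulate's or Activision's actual implementation.

from dataclasses import dataclass

# Hypothetical keyword weights standing in for a real toxicity model.
TOXIC_TERMS = {"slur_a": 1.0, "threat_b": 0.8, "harassment_c": 0.6}
REVIEW_THRESHOLD = 0.7  # assumed cutoff for escalating to human moderators


@dataclass
class VoiceClip:
    player_id: str
    transcript: str  # assume speech-to-text has already run on the audio


def toxicity_score(transcript: str) -> float:
    """Return a crude 0-1 score based on flagged terms (placeholder model)."""
    words = transcript.lower().split()
    hits = [TOXIC_TERMS[w] for w in words if w in TOXIC_TERMS]
    return min(1.0, sum(hits))


def moderate(clips: list[VoiceClip]) -> list[VoiceClip]:
    """Return clips that exceed the review threshold for human follow-up."""
    return [c for c in clips if toxicity_score(c.transcript) >= REVIEW_THRESHOLD]


if __name__ == "__main__":
    queue = [
        VoiceClip("player_1", "good game everyone"),
        VoiceClip("player_2", "slur_a threat_b"),
    ]
    for clip in moderate(queue):
        print(f"Escalating clip from {clip.player_id} for human review")
```

In a real system the placeholder keyword scoring would be replaced by a trained speech and language model, but the overall shape, score first, then escalate to human moderators, is a common design for this kind of tooling.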

Toxic behavior of this kind is common in online chat spaces both inside and outside the gaming industry, and it is not limited to any one franchise. Nonetheless, the recent actions taken by the Call of Duty team show a proactive stance toward resolving these issues and creating a more welcoming atmosphere for all players.

Call of Duty introduced a new code of conduct a year ago, which has since resulted in the suspension of 500,000 accounts. The fight against toxic behavior persists despite these efforts, and the latest rollout of AI-powered voice chat moderation technology may spark debate: even with the best of intentions, the idea of AI listening in on conversations may concern some players.

Abdul Wahab is a Software Engineer by profession and a Tech geek by nature. Having been associated with the tech industry for the last five years, he has covered a wide range of Tech topics and produced well-researched and engaging content. You will mostly find him reviewing tech products and writing blog posts. Binge-watching tech reviews and endlessly reading tech blogs are his favorite hobbies.