Call of Duty spies on its users with AI in search of 'white supremacists'
Activision has partnered with tech firm Modulate on an artificial intelligence tool that scans players' conversations for "toxicity."
The video game industry has joined others in using artificial intelligence to monitor its users. Activision, the company behind the Call of Duty saga, has partnered with technology firm Modulate, which uses artificial intelligence to monitor players' conversations in the popular game.
As revealed by PC Gamer, Activision and Modulate developed ToxMod. The tool, which began testing last week on North American servers, has the ability to "identify in real-time and enforce against toxic speech—including hate speech, discriminatory language, harassment and more."
To do this, the company explains, the program analyzes both text messages and voice chats in video games such as "Rec Room" and now the Call of Duty saga. Specifically, ToxMod will be used in Call of Duty: Warzone and Call of Duty: Modern Warfare II. It is also expected to arrive with Call of Duty: Modern Warfare III, the new installment going on sale in November, with a worldwide rollout that excludes Asia.
Call of Duty restricts more than 1 million accounts
The tool has been running for almost a week. Activision says in a blog post that more than a million accounts have already been restricted by Call of Duty's "anti-toxicity moderation."
In addition, ToxMod is able to make complex distinctions thanks to its use of AI. According to the tool's developer, the program can "listen to conversational cues to determine how others in the conversation are reacting to the use of [certain] terms."
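To make that idea concrete, the sketch below shows, in very simplified form, how a moderation system of this kind might weight a flagged term by how other players react to it. This is a toy illustration under our own assumptions (the term lists, score values and function names are invented for the example), not Modulate's actual model, which works on voice audio and far richer signals.

```python
# Toy sketch of context-aware moderation (illustrative only; not Modulate's code).
# Idea: a flagged term alone does not decide the outcome; the reactions of other
# speakers in the same session adjust the final severity score.

FLAGGED_TERMS = {"slur_a", "slur_b"}                    # hypothetical placeholder terms
NEGATIVE_REACTIONS = {"stop", "not cool", "reported"}   # hypothetical reaction cues

def severity(utterance: str, later_replies: list[str]) -> float:
    """Return a 0-1 severity score for one utterance given listeners' replies."""
    words = utterance.lower().split()
    if not any(term in words for term in FLAGGED_TERMS):
        return 0.0
    # Base score for any flagged term.
    score = 0.5
    # Raise the score if other players react negatively: a crude stand-in
    # for the "conversational cues" described above.
    reactions = " ".join(reply.lower() for reply in later_replies)
    if any(cue in reactions for cue in NEGATIVE_REACTIONS):
        score += 0.4
    return min(score, 1.0)

if __name__ == "__main__":
    print(severity("gg everyone", []))                      # 0.0 -> ignored
    print(severity("you are a slur_a", ["wtf, reported"]))  # 0.9 -> escalate for review
```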
Collaboration with the Anti-Defamation League
Determining the context in which the n-word and other slurs are used is not easy. That is why Modulate partnered with the Anti-Defamation League (ADL) so that ToxMod can recognize "white supremacists" and "alt-right extremists," who, according to the company's website, fall into its "violent radicalization" category.