
Call of Duty now has an “Anti-toxicity” AI monitoring everything you say

Do you remember all those dystopian stories where everything people say and do is closely monitored? Call of Duty now has a new AI monitoring everything you say. Called ToxMod, the system has already been implemented in Call of Duty: Modern Warfare 2 and Warzone in America, and will roll out globally when Call of Duty: Modern Warfare 3 launches in November.

The new system is born of a partnership between Activision and Modulate and aims to “identify in real-time and enforce against toxic speech—including hate speech, discriminatory language, harassment and more.” Currently, only English is supported, but more languages will be added.

Had this system been in place back when I was playing the original Call of Duty 4: Modern Warfare, pretty much everyone would have been banned immediately, and possibly put on some sort of watchlist. The smack talk was… quite devastating back then.

Presumably the AI is not in charge of actually handing out suspensions and will instead flag up what it believes to be “toxic” speech for actual humans to examine. Toxic speech is, of course, a somewhat nebulous concept at best.

It’ll be interesting to see exactly what the system flags up and whether it has any ability to handle the way words can mean massively different things in different countries. Here in Scotland, for example, calling someone a total bastard can be very friendly, as can calling someone a c***. We’re a strange bunch, and swearing is basically part of our everyday lexicon. Mind you, there’s a difference between swearing and swearing at someone and their mam.

The point is, it’s very easy for an out-of-context snippet of a sentence to be grabbed by the AI and flagged up. While the surge in AI progress is amazing, the technology is still in its infancy and prone to being inconsistent and sometimes downright bloody stupid.

On the other hand, perhaps the new system will work great. According to Activision, their prior efforts have been working quite well: “In examining the data focused on previously announced enforcement, 20% of players did not reoffend after receiving a first warning. Those who did reoffend were met with account penalties, which include but are not limited to feature restrictions (such as voice and text chat bans) and temporary account restrictions. This positive impact aligns with our strategy to work with players in providing clear feedback for their behavior.”

Mistakes have certainly been made, too, like when the game’s anti-cheat software started flagging players as cheaters because it didn’t like a piece of background software they were using.

But what if Modulate’s plan is actually to develop the ultimate douchebag AI, trained to deliver hateful, angry tirades? Surely there would be no better training ground than Call of Duty.



Originally posted by wolfsgamingblog.com
