Can Artificial Intelligence Improve Online Gaming Communities?

Online gaming communities are well known for toxic behavior and abusive language, and the popular shooter franchise Call of Duty is no exception. A 2022 study reinforced the reputation of Call of Duty players as particularly toxic. Game developer Activision, however, is taking steps to address the issue by bringing artificial intelligence (AI) into the player community.

Activision has introduced a new feature called ToxMod, built by AI startup Modulate, which uses AI to moderate in-game voice chat. Currently available in North America for Call of Duty: Modern Warfare II and Call of Duty: Warzone, ToxMod scans voice chats in real time to identify and flag toxic speech, including harmful language, hate speech, discriminatory language, and harassment.

The introduction of ToxMod significantly strengthens the moderation systems already employed by the Call of Duty anti-toxicity team. Activision's current measures include text-based filtering in 14 languages and a robust in-game player reporting system. According to Activision, these measures have already had a positive impact: a reported 20% of players did not repeat their offensive behavior after receiving a warning.

While AI shows promise in combating toxic behavior in online gaming communities, Activision stresses that player reports remain important and encourages the community to play an active role in identifying and flagging disruptive behavior.

As the rollout of ToxMod continues, players are hopeful that the new AI feature will make the game safer and more enjoyable for everyone. Only time will tell whether artificial intelligence can truly transform online gaming communities by curbing toxic behavior and fostering a more inclusive, respectful environment.

FAQs

Q: What is ToxMod?
ToxMod is an AI-powered voice-chat moderation feature in Call of Duty, built by the startup Modulate. It identifies and flags toxic speech, including harmful language, hate speech, discriminatory language, and harassment.

Q: How does ToxMod work?
ToxMod scans voice chats in real time, utilizing AI technology to detect and address toxic behavior.

Q: What other measures does Activision have in place to combat toxic behavior?
Activision employs text-based filtering across 14 languages for in-game text and usernames, as well as a robust in-game player reporting system.

Q: Is the use of AI effective in reducing toxic behavior?
Activision reports that 20% of players who received a warning did not repeat their offensive behavior, suggesting the existing measures have had a positive impact. ToxMod is expected to further strengthen these anti-toxicity efforts.