Combating toxicity is important in any game. In Call of Duty, toxicity can take many forms, including text chat and offensive usernames. Recently, Activision published an anti-toxicity progress report, in which it outlines that it will not tolerate any form of toxic behavior, including hate speech or harassment that takes place in-game. The Call of Duty anti-toxicity report explains what has been done so far and what will be the focus going forward.
Over the last 12 months, more than 350,000 accounts have been banned for racist language or toxic behavior. These bans originated from player reports and a review of the player-name database, and they span multiple Call of Duty titles, including Call of Duty: Warzone, Black Ops Cold War, and Modern Warfare 2019. Moreover, new in-game filters have been introduced to catch offensive usernames, clan tags, and profiles, and new technology has been added to filter offensive text chat. These filters are designed to flag toxicity in over 11 languages.
Improving anti-toxicity in Call of Duty
In the anti-toxicity report, Activision has noted some areas where it plans to increase its efforts. It will dedicate more resources to detecting toxic behavior and enforcing its policies, backed by additional monitoring. The other focus will be carrying out consistent and fair reviews of those enforcement policies.
The report also states that Activision will increase communication with the community, so players can expect to be informed about future progress. Fans are encouraged to follow the Call of Duty Twitter page for updates. It is always a positive sign to see developers and publishers working to eliminate toxic behavior from gaming. Although there is still a lot of work to be done, the anti-toxicity report shows the progress that has been made in Call of Duty.