If you’ve ever played a competitive game of CS:GO, you’re likely familiar with how it usually goes. A few bad rounds go by and soon enough someone on your team is screaming at you to do better. Well, Valve has stepped in and introduced a new auto-mute system to try to combat this problem.
Going forward, any player who receives too many abuse reports will be given a warning. If they fail to improve their behavior, the warning turns into a penalty: the game automatically mutes them for every other player in their matches. The only way to remove this penalty is to earn enough XP by playing.
How auto-mute works
As with any muted player, everyone has the choice to unmute a teammate with a penalty. However, the penalized player has no control over who unmutes them or whom they can talk to. This new auto-mute system should help combat CS:GO’s ongoing toxicity problem.
The most toxic players who consistently irritate their teammates will have their freedom to communicate taken away. Each report is weighted differently based on an account’s playtime and how often it has been reported, likely as a counter-measure to prevent someone from being unfairly auto-muted by a few bad teammates. Above all else, the auto-mute system will prioritize penalizing repeat offenders.
But auto-mute isn’t the only anti-toxicity system to arrive in CS:GO. Last October, FACEIT unveiled its Minerva AI, a chat reviewer that hands out warnings and bans within seconds of a message being sent. It was an overwhelming success, recording 20,000 bans in its first 45 days of activity.
Those bans were based on 200 million analyzed messages, seven million of which were deemed toxic. FACEIT’s blog post also claims that the introduction of Minerva reduced the number of toxic messages in-game by just over 20%. It’ll be interesting to see if systems like Minerva can continue to remove toxic players from gaming communities.