Overwatch Might Get New Ways Of Dealing With Toxicity

Toxic behavior has always been a problem in multiplayer games. Most people just want to play and enjoy themselves, since it is, after all, a video game, but others take it far too seriously, and that's when toxicity starts to show, something no one likes. Blizzard has been experimenting with a new way of dealing with toxic players: letting machines handle these problems instead of people.


It would be really nice if a player could be punished for bad behavior without anyone having to report them, wouldn't it? Well, that's what Blizzard wants to do with Overwatch. In an interview with Kotaku, the studio said that abusive chat has gone down by 17 percent thanks to the game's reporting and punishment systems, which is solid progress, but still not enough. The problem with reporting systems is that players can abuse them in all sorts of ways, and that's where the idea of experimenting with something else comes in.

Enter machine learning. The company is already teaching its system to handle non-English languages like Korean, and the long-term hope is that it can learn to recognize what toxic gameplay looks like.

But just as humans make mistakes, so can machines. How would a machine figure out whether someone is deliberately using an ability to hinder their own team? We don't know yet how any of this will work, but we do know that Blizzard also wants players to be able to compliment their Overwatch teammates when they play well.
