Online toxicity is now an urgent problem for game companies

Online and multiplayer games are notoriously toxic places. As various lawmakers look for ways to crack down on toxic behavior, game companies are beginning to understand the urgent need to get their house in order.

This week, Microsoft released its Xbox Transparency Report, detailing the company’s efforts to mitigate toxic behavior, often through the use of content moderation tools.

The report indicates that the company saw a 16.5-fold increase in the number of “proactive enforcements,” defined as “when we use our portfolio of protection technologies and processes to find and address an issue before a player reports it to us.”

The vast majority of those enforcements were applied to accounts that were either inauthentic or involved in cheating. Others were hit with bans or suspensions for posting adult sexual content, vulgarity, profanity, harassment, and bullying.

At this week’s GamesBeat Summit, a panel discussed the topic “How to build trust and safety right before you’re forced to.” One panelist, David Hoppe, is a partner at Gamma Law and specializes in the gaming and technology sectors. He pointed to the many efforts being made to legislate against companies that tolerate, or fail to address, toxic gaming spaces.

California’s Age-Appropriate Design Code Act will go into effect this summer, but, as Hoppe said, “there are also musings at the federal level, and there are seven additional states besides California that are considering similar laws.” He added: “Without a doubt, if we look five years ahead, it will be a completely different environment for the regulation of content and communications between users.”

Toxic players

Companies like Microsoft are working on artificial intelligence tools that will do the job of detecting toxic accounts before they have a chance to poison other players’ experiences. Xbox’s 2022 report indicates that player reports are down 34 percent from the same period in 2021, which the company says is due to its tools intercepting toxicity at an early stage.

The report states that “player reports…are often first evaluated by content moderation technologies to see if a violation can be determined, with the remainder reviewed by human content moderation agents for decision making.”
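The triage flow that quote describes — automated classifiers evaluate reports first, and only the ambiguous remainder goes to human moderators — can be sketched roughly as follows. This is a minimal illustration, not Microsoft’s actual system: the keyword classifier, thresholds, and queue names are all hypothetical stand-ins.

```python
from dataclasses import dataclass

@dataclass
class Report:
    report_id: int
    text: str

# Hypothetical keyword list standing in for Microsoft's proprietary
# moderation models; real systems use trained classifiers, not keywords.
FLAGGED_TERMS = {"slur", "threat"}

def classify(report: Report) -> float:
    """Return a rough toxicity score in [0, 1] based on flagged terms."""
    words = report.text.lower().split()
    hits = sum(1 for w in words if w in FLAGGED_TERMS)
    return min(1.0, hits / max(1, len(words)) * 5)

def triage(reports: list[Report], auto_threshold: float = 0.8,
           clear_threshold: float = 0.1) -> dict[str, list[Report]]:
    """Split reports into auto-actioned, auto-cleared, and human-review queues."""
    queues = {"auto_action": [], "auto_clear": [], "human_review": []}
    for r in reports:
        score = classify(r)
        if score >= auto_threshold:
            queues["auto_action"].append(r)   # violation determined automatically
        elif score <= clear_threshold:
            queues["auto_clear"].append(r)    # clearly benign
        else:
            queues["human_review"].append(r)  # remainder goes to human moderators
    return queues
```

The design point in the quote is the middle branch: automation handles the clear-cut cases at both ends, which is what lets human moderators concentrate on the reports where “a violation can be determined” only with judgment.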

In the past year, Microsoft says it has expanded its definition of toxic behavior. “We broadened our definition of vulgar content to include offensive gestures, sexualized content, and crude humor… This policy change, along with improvements to our image classifiers, has resulted in a 450 percent increase in enforcement against vulgar content.”

carrot and stick

Eve Crevoshay, CEO of Take This, spoke at the GamesBeat panel, calling on companies to take a “carrot and stick” approach to cracking down on toxicity, particularly white supremacist rhetoric, which is not uncommon in gaming spaces. Take This is a non-profit mental health advocacy, research, and training organization focused on gaming.

Crevoshay said: “There are norms, behaviors, and ideologies [in gaming spaces] which have become very common. I don’t mean they are ubiquitous; I mean they are a small but very loud problem. And that volume means it has become normalized.”

She noted that these norms include “misogynistic, white supremacist, neo-Nazi, and other xenophobic language along with bullying and mean behavior.” She said the game industry has yet to fully accept the problem. “We haven’t really realized what it means to design for positive results. We’re starting to do that… But right now, we’re seeing really high incidences.”

A report last year by the Anti-Defamation League found that “nearly one in ten gamers between the ages of 13 and 17 had been exposed to white supremacist ideology and themes in online multiplayer games. An estimated 2.3 million teens were exposed to white supremacist ideology in multiplayer gaming.”

Richard Warren, another GamesBeat panelist, is a partner at Windwalk, a consultancy dedicated to building gaming communities, often on Discord. He said it’s hard for game companies to manage communities on third-party apps, where young gamers often congregate, but there are ways to influence the conversation. He said companies should “establish a culture around self-restraint within communities, promoting people who are doing good deeds within the community.”

For Crevoshay, the real danger is that toxic behavior becomes learned behavior, especially among young people. As this spreads, it acts to exclude newcomers from gaming spaces, which is bad for gaming companies.

“Kids are learning toxic behaviors because the environment is full of them,” she said. “It is a real concern. And people freeze out of games because they don’t feel comfortable in them. So we know it’s a limiting factor in the business.”

She added: “It is pervasive and harmful to people. It spreads through communities, which are not effectively moderated.”

While organizations like Take This and some game companies are working to fix the problem, time is running out. “We are armed with the ability to say, ‘OK, we have the tools within this industry to address this. We can change this.’ But in the meantime, the regulatory landscape is really heating up. People are starting to say, it’s time to put the hammer down.”

Photo credit: Andre Hunter on Unsplash