
A Gaming Company Tamed Toxic Trolling – What They Learned Could Help Your Community

Toxic behavior creates a very poor user experience. Learn how to approach toxic trolling in your online community.

Note: Riot Games is not a Higher Logic client, nor did Higher Logic work with Riot Games around this issue. We just found this to be an inspiring story and a great example of fine community management at work. We’re excited to share the story with you.

How do you change the behavior of millions of community members? That’s the problem Riot Games had to solve with their game, League of Legends, when they realized that antisocial behavior – like gamers using racial and homophobic slurs with each other – had reached toxic levels.

By using social science and tons (and TONS!) of data, Riot Games uncovered tactics to reduce – if not end – toxic trolling within League of Legends. The game has 67 million players (27 million a day) and grossed an estimated $1.25 billion in revenue in 2015, but its reputation for toxic gamer behavior – racist and homophobic remarks were common – made it hard to attract and retain players. Just because their gaming community had created certain norms didn’t mean that Riot Games had to maintain them – especially when those norms were hurtful to fellow gamers and deterred new people from playing. Toxic behavior created a very poor user experience for most players, so Riot set out to change their gamers’ behavior.

How Did Riot Games Approach the Problem?

Riot Games hired Jeffrey Lin, who had just received his PhD in cognitive neuroscience. Jeffrey was to act as a game designer with access to the company’s seemingly endless supply of user data. The first thing he did was assemble a team to go over thousands of chat logs (conversations between players) to get a clear picture of what was actually happening in the game.

From his team’s analysis, they discovered that only about 1% of players were consistently toxic – and those players accounted for just 5% of all toxic activity. This was surprising: the usual assumption is that a small group of trolls is responsible for most bad behavior. Instead, the bulk of the toxicity came from otherwise ordinary players – the culture at large allowed everyone to participate in toxic behavior at some level. That gave Jeffrey an important clue: he needed to change the entire culture – no small feat, given the vast number of players.

As any community builder would know, changing a widespread cultural problem is extremely difficult. But Jeffrey and his team had two incredible advantages working for them – autonomy and data.

Change Community Behavior With Psychology 101

Although Jeffrey has a PhD in cognitive neuroscience, he turned to psychology 101 basics. He decided to test ‘priming’ – exposing a person to a stimulus to subtly influence their later behavior, then measuring the effect. Priming sounds more complicated than it is.

Jeffrey and his team ‘primed’ gamers with color-coded tips throughout the game. Here’s an example tip: “Teammates perform worse if you harass them.” They discovered the tips performed best when displayed in red boxes. Overall, the tips reduced verbal abuse by 6.2% and offensive language by 11%.

They were on the right track. But if Jeffrey’s team wanted to revamp a toxic culture spanning millions of gamers, they needed better results.

Include the Community

That’s when Jeffrey introduced a brand new aspect of the game – “The Tribunal.” Gamers needed to be directly involved to effect lasting change and help transform the culture from the inside out.

Now, whenever a gamer reports another gamer, the misbehaving gamer ends up in The Tribunal – a jury of their peers.

Results showed most people didn’t know what rules they had broken. If they were never informed, they’d often fall back into their old habits and be suspended from the game again and again. But when players were told why they were in The Tribunal, 50% didn’t misbehave again for at least three months.

Equally interesting, when those people received written feedback from members of The Tribunal, the reform rate soared to 70%. And when offenders received feedback within 5-10 minutes of their infractions, the reform rate reached 92%.

How Does This Help Your Community?

Although Jeffrey’s work for Riot Games is impressive, most people don’t have access to the amount of data League of Legends created. Even if they did, it takes time, money and expertise to run the types of studies Jeffrey did.

But there are a few nuggets in this study that anyone could learn from and apply in their own community, no matter the size or toxicity level.

1. Toxicity is bad for business

Even if your members seem to tolerate (or even moderate) toxic behavior, don’t be fooled – it’s bad for business. One reason Riot Games decided to really tackle the issue is because it had a negative effect on gamer recruitment and retention.

It makes sense – you can’t expect members to voice their discomfort if they’re not invested in the community. And if toxic behavior drives people away, then they’ll never become invested.

2. People need guidance

Unfortunately, you can’t assume everyone will know how to interact in your community. Just as it takes work to learn how to be a community manager or builder, it can take work to learn how to become a productive community member.

That’s why it’s so important to create comprehensive guidelines for your members. They need a document they can look at and you can reference, outlining what good behavior looks like and what your strategy is for dealing with toxic behavior.

3. Don’t just ban members – explain

When disciplining members, be sure to spell out exactly what they did wrong. Although it probably seems obvious to you, you can’t assume they know what they did wrong. This goes back to giving members guidance – take this as an opportunity to educate. You may be surprised by the outcome.

4. People care about their peers

As Riot Games found, the reform rate for toxic behavior climbs when offenders get feedback from their peers. This doesn’t mean you should create your own Tribunal, but it does show how much people care about what their peers think. If possible, let members self-moderate and call out toxic behavior themselves.

5. Be fast with feedback

Speed is key – the faster you deal with toxic behavior, the better. In Riot Games’ study, the faster they reprimanded the offender, the more likely the offender was to stop.

Molly Talbert

Molly is the Content Marketing Manager at Asana. Previously, she did client services and social media for a small leadership development company. In her downtime, Molly reads through the internet, bikes, hikes and daydreams about her home state, New Mexico. She graduated from Middlebury College in Vermont where she studied the environment and writing, learned how to mountain bike through mud, and helped edit the student newspaper.