
Aligning content moderation operations to new legislation for a games client

Learn about TELUS Digital's work helping a successful games company maintain player safety on its social media channels with content moderation.


The challenge

Games companies that foster communities on social media platforms need to keep up with a rapid pace of change. Not only are there new platforms and trends, but also new online threats and corresponding legislation. Failure to adhere to local laws can result in damage to a company's reputation and player community, as well as significant fines.

Our client is a hugely successful games company with a player community spanning numerous social media platforms. Now legally obligated to moderate the content on their owned and controlled sites, our client was required to remove toxic and defamatory content within 24 hours and terrorist threats within one hour.

Adding to the challenge, the company was experiencing unprecedented user growth, which strained existing systems and resulted in a rise in the number of reported scams and offensive user-generated content (UGC).

The TELUS Digital solution

TELUS Digital has been a trusted player support partner to the games client for more than ten years. As soon as the games company recognized the need to adapt, it knew exactly where to turn for a solution.

To ensure compliance with the new legislation, the TELUS Digital team immediately identified the need for comprehensive player moderation and social media moderation services across multiple platforms and languages.

Trust and safety experts were deployed to moderate content on multiple social media platforms, reviewing approximately one million posts or messages per month. The team issued warnings, account restrictions and permanent bans for bad actors, and moderated violative comments, images and links. Content moderators also interacted directly with the player community, responding to player reports and de-escalating situations via chat.

Additionally, the solution involved moderating the client's live and high-profile esports events to maintain player safety on multiple channels. The ongoing need for this form of moderation has led to the introduction of managed automation services, including a language translation bot to moderate UGC in a number of languages on Discord.

The results

The solution helped our client maintain legal compliance and build trust with their online player community. Partnership highlights include:

  • Established a customized, dedicated, professional content moderation team tasked with continually updating policies to ensure compliance with new regulations while maintaining player safety
  • Scaled the content moderation team by 700% in three years
  • Expanded content moderation language capabilities to include Czech, English, French, German, Hungarian, Italian, Polish, Romanian and Spanish
  • Broadened the range of supported channels for social media and platform management from Facebook, Instagram, X and YouTube to include Discord and Twitch
  • Managed content moderation on Twitch and YouTube for live events and competitions, including large events with hundreds of thousands of players attending and participating in chat
  • Introduced a player-harm team that coordinates with local authorities and emergency services in order to protect the physical and mental health of players

Check out our solutions

Protect the safety and well-being of your user communities to maintain customer trust.

