Glossary

Content moderation

What is content moderation?

Content moderation refers to the process of reviewing, screening and filtering user-generated content (UGC) to identify and address items that violate community guidelines or applicable laws. Content moderation can be automated with technology, performed manually by human moderators, or handled through a combination of the two.

Moderating UGC can take many different forms, including:

  • Text: This includes social media comments, product reviews and more. Human moderators and artificial intelligence tools filter spam, detect profanity and ensure the content adheres to community standards and laws (a minimal sketch follows this list).
  • Audio: Recorded and live audio is screened so that hate speech and harassment can be removed while honest dialogue is preserved.
  • Image: Image moderation combats inappropriate use of a company’s brand, depictions of violence, illegal activity and more.
  • Video: Whether livestreamed or prerecorded, videos that show explicit content, bullying and other violations are removed.
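
To make the text bullet concrete, here is a minimal sketch of a keyword-and-pattern filter in Python. The blocklist terms, the three-link spam threshold and the `moderate_text` helper are all hypothetical placeholders; real deployments pair far larger curated lists with machine learning classifiers and context-aware rules.

```python
import re

# Hypothetical blocklist for illustration only; production systems
# maintain much larger, curated and regularly updated lists.
BLOCKED_TERMS = {"slur1", "slur2"}
URL_PATTERN = re.compile(r"https?://\S+", re.IGNORECASE)

def moderate_text(comment: str) -> str:
    """Return a simple verdict for a user comment: 'approve' or 'reject'."""
    words = set(re.findall(r"[a-z']+", comment.lower()))
    if words & BLOCKED_TERMS:
        return "reject"                       # contains a blocked term
    if len(URL_PATTERN.findall(comment)) >= 3:
        return "reject"                       # link-heavy comments often signal spam
    return "approve"

print(moderate_text("Great product, highly recommend!"))              # approve
print(moderate_text("Buy now http://a.ex http://b.ex http://c.ex"))   # reject
```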

There are a number of different types of content moderation, including:

  1. Pre-moderation: All user submissions are placed in a queue for moderation before they are displayed. This method of moderation is often implemented on platforms that require a high level of protection, such as those used by children.
  2. Post-moderation: Users are able to publish their submissions in real-time, but the content is automatically added to a queue for moderation.
  3. Reactive moderation: Users are asked to flag any content that they find offensive or that violates community guidelines.
  4. Supervisor moderation: Also known as unilateral moderation, this type of moderation involves selecting a group of moderators from an online community and providing them with advanced abilities to enforce guidelines.
  5. Commercial content moderation (CCM): Often outsourced to specialists, CCM involves monitoring content for large, established brands like social media platforms, games companies and other tech giants.
  6. Distributed moderation: Enables users to vote on UGC and flag content that goes against any guidelines. This type of moderation usually takes place under the guidance of experienced moderators.
  7. Automated moderation: Involves the use of a variety of tools such as filters and machine learning algorithms to sort, flag and reject UGC; a minimal sketch combining automated checks with pre- and post-moderation follows this list.
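
As a rough illustration of how pre-moderation, post-moderation and automated moderation fit together, here is a minimal Python sketch. The `risk_score` function, the 0.8 threshold and the `Platform` class are hypothetical stand-ins; an actual platform would tune its classifiers and thresholds against its own policies.

```python
from collections import deque
from dataclasses import dataclass, field

def risk_score(text: str) -> float:
    """Hypothetical stand-in for a machine learning classifier;
    returns a risk score between 0 and 1."""
    flagged = {"hate", "threat", "scam"}
    words = text.lower().split()
    return min(1.0, 5 * sum(w in flagged for w in words) / max(len(words), 1))

@dataclass
class Platform:
    pre_moderate: bool                        # True on high-protection platforms
    published: list = field(default_factory=list)
    review_queue: deque = field(default_factory=deque)

    def submit(self, text: str) -> None:
        if risk_score(text) > 0.8:
            return                            # automated moderation rejects clear violations
        if self.pre_moderate:
            self.review_queue.append(text)    # pre-moderation: hold until a human approves
        else:
            self.published.append(text)       # post-moderation: publish in real time...
            self.review_queue.append(text)    # ...but still queue the item for review

childrens_site = Platform(pre_moderate=True)
forum = Platform(pre_moderate=False)
childrens_site.submit("Look at my drawing!")  # queued, not yet visible
forum.submit("Nice drawing!")                 # visible immediately, queued for review
```

Auto-rejecting only clear violations while routing everything else to human reviewers is a common way to balance the speed of automation against the judgment of human moderators.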

Benefits of content moderation

As the volume of user-generated content continues to grow, it is essential that any company with an online presence have a content moderation strategy in place. An effective strategy will:

  • Protect brand reputation: An organization’s content moderation practices reflect its brand values and personality and directly shape how it is perceived by customers.
  • Build trust with customers: Consumers expect a brand’s online community to be safe. Providing a welcoming environment that is free from inappropriate UGC will help build trust with your users.
  • Improve user experience: Creating positive online communities is essential in providing a high-quality customer experience.
