Content moderation (CM) is a priority for a wide variety of online platforms that want to ensure a safe environment for their users in line with their Trust and Safety programs. 

Imagga has the right solution for accomplishing just that. Our content moderation is fully automatic (or semi-automatic, if you decide to involve humans in the loop): a powerful, easy-to-use, and scalable option that helps you monitor all the visual content users generate on your platform. 

With Imagga, you overcome several crucial issues at once: 

  • The harm that unmoderated user-generated content can bring to both your users and your brand;
  • The limitations of manual moderation, which can’t cope with the ever-growing content volume;
  • The ethical problems and psychological burden that disturbing content imposes on human moderators. 

One of the superpowers of our platform is that it lets you handle content faster and more efficiently than ever before. Quick moderation is essential when working with large amounts of content, and it’s important for boosting the productivity and growth trajectory of your online business. 

Let’s dig into the features and benefits that Imagga’s full-stack content moderation platform offers, and how to make the most of them for managing your visual and livestream content. 

The Tools in Imagga’s CM Platform 

Our content moderation platform consists of three modules. It is designed to flexibly accommodate your needs, combining automatic AI-powered moderation with the necessary input from your in-house or outsourced human moderators. 

The API (Application Programming Interface) is where the AI rolls up its sleeves to get the work done. It boasts self-improving, state-of-the-art deep learning algorithms that identify inappropriate visual content using image recognition technology. You can use the platform in the way that best suits your business operations and legal framework: in the cloud or on-premise. You can stick with just the API component if you don’t need human moderation. 

The Admin Dashboard is the web and mobile UI where you get all the functionalities and settings in one place. You can control the different aspects of the moderation process, so that you can skillfully combine the automatic features of the platform for filtering and flagging with human moderation whenever it’s necessary.  

The Moderation Interface is where your human moderators can easily interact with the CM platform. When they open the Imagga Content Moderation UI, they’ll see the batches of items assigned to them for moderation and will be able to act on them immediately. 


The API

Once you create an Imagga API account, you can securely analyze and moderate your visual data using our REST API. 

The AI-powered pre-trained model will start processing the information you provide, screening it for different categories of inappropriate content. You can flexibly set the categories for filtration, depending on your moderation goals. The best part is that while the process is automated, the system can also learn on the go from the decisions of human moderators, if they are involved in the process.
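As a sketch of what a first screening call might look like, here is a minimal Python example that queries the NSFW categorizer over the REST API. The `nsfw_beta` categorizer name and the response shape are assumptions based on Imagga's public v2 API; check your account's API documentation for the exact paths and fields.

```python
# A minimal sketch of screening one uploaded image through the REST API.
# The v2 endpoint path and the "nsfw_beta" categorizer name are assumptions
# based on Imagga's public docs; check your account's API documentation.
import base64
import json
import urllib.parse
import urllib.request

def build_request(image_url, api_key, api_secret):
    """Construct an authenticated GET request for the NSFW categorizer."""
    query = urllib.parse.urlencode({"image_url": image_url})
    url = "https://api.imagga.com/v2/categories/nsfw_beta?" + query
    token = base64.b64encode(f"{api_key}:{api_secret}".encode()).decode()
    return urllib.request.Request(
        url, headers={"Authorization": "Basic " + token}
    )

def classify(image_url, api_key, api_secret):
    """Return the categorizer's confidence scores for the image."""
    req = build_request(image_url, api_key, api_secret)
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)["result"]["categories"]

if __name__ == "__main__":
    # Each entry looks roughly like {"confidence": 99.2, "name": {"en": "safe"}}
    print(classify("https://example.com/upload.jpg", "api_key", "api_secret"))
```

The confidence scores returned here are what the platform compares against your per-category thresholds when deciding whether an item needs human review.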

While the powerful Admin Dashboard lets you control a broad range of settings for the content moderation process, you can also use the Admin API for that purpose, and not only for feeding in items for moderation. You can: 

  • Create projects, add moderators, and set categories of inappropriate content and rules for moderation 
  • Access moderation items’ status and logs 

You can also import data from different sources through the API endpoints specified for this purpose. 
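To illustrate the idea, here is a hypothetical sketch of preparing a "create project" call. The endpoint path and every payload field below are illustrative assumptions made up for this example, not documented Admin API names.

```python
# Illustrative only: the endpoint path and payload fields below are
# assumptions made up for this sketch, not documented Admin API names.
def project_payload(name, categories, sla_hours=24, priority="normal"):
    """Build the JSON body for a hypothetical 'create project' call."""
    return {
        "name": name,
        "sla_hours": sla_hours,  # moderation deadline for items
        "priority": priority,    # can override SLA-based ordering
        "categories": [{"name": c} for c in categories],
    }

# A (hypothetical) call could then look like:
#   POST https://api.imagga.com/admin/projects
#   body: project_payload("Accommodation Photos",
#                         ["Inappropriate", "Irrelevant"])
```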

The Admin Dashboard: Your Control Hub

Sorting content for quick moderation is an easy and intuitive process with Imagga’s Admin Dashboard. 

When you open your dashboard, you have access to three important sections: Projects, Moderators, and Rules. Let’s review what you can do in each of them in detail. 


Projects

In your Admin Dashboard, you can keep tabs on a number of projects at once. You can create a new project based on the different types of content (or supported language) you want to moderate. 

Let’s say that you want to use content moderation for your travel platform. You can set up, for example, two separate projects for the different streams of content — Accommodation Reviews for monitoring user reviews of properties, and Accommodation Photos for monitoring the visuals uploaded for each property. 

For each project, there are a number of settings you can control.


SLA

You can choose the number of hours within which items in this project have to be moderated. 

This is especially useful when you have a single moderation team that needs to handle different projects simultaneously.

Priority Level

You can further prioritize a project by setting its overall priority level in your dashboard.

This priority level overrides the SLA setting, so it pushes a project up a moderator’s list. 

Batch Size

You can set the number of items that a moderator should handle at once when working on a project. Only when they complete one batch will they be able to review items from the next.

With this setting, you can manage your moderators’ workload while ensuring that content is reviewed in the best possible order. 
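As a sketch of how the three settings above (SLA, priority level, batch size) could interact when building a moderator's queue: the field names and the ordering rule (priority first, then earliest SLA deadline) are illustrative assumptions, not Imagga's actual schema.

```python
# Sketch: order items by project priority (high first), then by SLA
# deadline (earliest first), and split them into fixed-size batches.
# Field names and the ordering rule are illustrative assumptions.
from datetime import datetime

def build_batches(items, batch_size):
    """Return the moderator's queue as a list of batches."""
    ordered = sorted(items, key=lambda it: (-it["priority"], it["deadline"]))
    return [ordered[i:i + batch_size]
            for i in range(0, len(ordered), batch_size)]

items = [
    {"id": "a", "priority": 1, "deadline": datetime(2024, 1, 2)},
    {"id": "b", "priority": 2, "deadline": datetime(2024, 1, 3)},
    {"id": "c", "priority": 2, "deadline": datetime(2024, 1, 1)},
]
# With batch_size=2: the first batch holds "c" then "b" (higher priority,
# ordered by deadline); the second batch holds "a".
batches = build_batches(items, 2)
```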

Content Privacy 

You have two options for ensuring you meet GDPR and content privacy regulations: blurring people’s faces and blurring car license plates.

This setting is especially relevant if you’re working in a heavily regulated field. 

Retention Policy

You can choose how long an item stays in the system before it gets deleted.

This is necessary for the learning purposes of the AI algorithm, which improves over time based on moderators’ feedback on its previous work. 

Add Moderators

You can assign different moderators to different projects. Once you assign a moderator to a project, they’re allowed to flag items for all categories of inappropriate content in this project.

That’s how you make sure the right person is working on specific projects. It also helps you stay on top of managing moderators’ workloads.  

Categories Management

You can set the different categories of inappropriate content that you’d like to moderate. You can create new categories and name them according to your platform’s needs. For example, you can set categories like ‘Inappropriate’, ‘Irrelevant’, and others.

For each category, you can choose different options for:

  • The AI model used for content moderation;
  • The threshold range for forwarding an item to a human moderator; anything scored outside this range is finalized automatically;
  • The number of moderators who review a single item in this category, for better quality and less bias;
  • The moderators you want to exclude from working on this category within a project.

In addition, you can write down the guidelines for moderation of this specific category, so moderators can access them easily whenever they work on it. 

When you add a new category, it is added to the system in real-time, so it can be used immediately. 
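The threshold idea above can be sketched as a small routing function. The review band (40 to 80) and the field names are illustrative choices for this example, not platform defaults.

```python
# Sketch of per-category routing. The review band (40-80) and the
# field names are illustrative choices, not platform defaults.
def route(confidence, band=(40.0, 80.0), reviewers=1):
    """Decide what happens to an item given the model's confidence
    that it belongs to an inappropriate-content category."""
    low, high = band
    if confidence < low:
        return {"route": "auto_approve"}   # clearly safe
    if confidence > high:
        return {"route": "auto_reject"}    # clearly inappropriate
    # Scores inside the band are uncertain: forward to human review,
    # by as many moderators as the category requires.
    return {"route": "human_review", "reviewers": reviewers}
```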


Moderators

You can create profiles for the different moderators on your team. They consist of the moderator’s name, photo, and the languages they work in. 

You can set flexible rules individually for each moderator and assign priorities. You’re also able to review the number of items assigned to each person, as well as the rules.  


Rules

In the Rules section of the Admin Dashboard, you can create custom rules for your moderation process. 

For example, you can create rules for the different languages used and their priority levels. You can then assign the rules to specific moderators, e.g. one person prioritizes English-language content, while another focuses on Spanish. 
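One way to express such language rules, as a sketch (the rule shape and the moderator names are made up for illustration):

```python
# Sketch of language-based assignment rules. The rule shape and the
# moderator names are made up for illustration.
RULES = [
    {"language": "en", "moderator": "alice", "priority": 1},
    {"language": "es", "moderator": "bob", "priority": 1},
]

def assign(item, rules):
    """Route an item to the moderator whose highest-priority rule
    matches the item's language."""
    for rule in sorted(rules, key=lambda r: r["priority"]):
        if rule["language"] == item["language"]:
            return rule["moderator"]
    return None  # no matching rule: leave the item in the default queue
```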

The Moderation Interface

The moderators on your team have access to Imagga’s CM platform through a dedicated interface — the Moderation web and mobile UI. 

When a moderator logs in, they can immediately see their assigned projects and the respective batches of items within each project. On the left-hand side of the screen, they can review attributes of each item, like its ID and URL, along with additional information such as the date the item was submitted for moderation. There is also an option to translate content that isn’t in English, which is great for multi-language moderation.

On the right-hand side, the moderator can see the categories for moderation and click on the ones the item belongs to, e.g. ‘Irrelevant’ or ‘Inappropriate’, or alternatively approve the item if it doesn’t breach your platform’s guidelines. 

Moderators can use hotkeys to make the moderation process as quick as possible. The reasons an item can be flagged as inappropriate are numbered from 1 to 9, so moderators can flag with a single keystroke, or use the Skip / Approve / Disapprove hotkeys. 

Ace Your Content Moderation with Imagga

With Imagga’s semi-automatic content moderation, you can combine the best of machine and human moderation in one. Our AI-powered system helps you optimize the moderation process, while also protecting moderators from vast amounts of harmful content. 

Don’t have an internal moderation team? Don’t worry: we can connect you with a highly qualified external one too. 

Ready to give it a go? Get in touch with us to boost your content moderation with Imagga.

How to Use Imagga’s CM Platform Video