ViewBug is a web and mobile platform for visual creators with more than 2 million members and tens of thousands of image uploads every day. Like a Behance for enthusiast photographers, it gives its members opportunities to be discovered and improve their skills, offering a full suite of tools that help them learn, work better, and earn more.

Like many user-generated content platforms, ViewBug needs a safety shield against offensive content that could harm the brand's reputation, disturb vulnerable groups, and cause compliance and legal issues. Filtering out adult images, which are not allowed on the platform, posed a challenge for the ViewBug team given the sheer volume of visual content their users generate every day. They needed automated content moderation software that was reliable, precise, accessible, and easy to implement.

After looking at a few solutions, they chose Imagga's Not Safe for Work (NSFW) classifier, an adult image content moderation categorizer built on state-of-the-art image recognition technology. It was integrated into ViewBug's upload pipeline, checking every single image in real time to make sure it does not contain explicit content.
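As an illustration, a moderation check of this kind can be wired into an upload flow with just a few lines of code. The sketch below is not ViewBug's actual integration; it assumes Imagga's v2 categorization endpoint, the nsfw_beta categorizer ID, and an example confidence threshold of 50%, all of which should be verified against the current API documentation.

```python
# Illustrative sketch only: ViewBug's real integration code is not public.
# Assumes Imagga's v2 categorization endpoint and the "nsfw_beta" categorizer ID.
import requests

IMAGGA_API_KEY = "your_api_key"        # placeholder credentials
IMAGGA_API_SECRET = "your_api_secret"

def is_image_safe(image_url: str, nsfw_threshold: float = 50.0) -> bool:
    """Return True if the image's 'nsfw' confidence stays below the threshold."""
    response = requests.get(
        "https://api.imagga.com/v2/categories/nsfw_beta",
        params={"image_url": image_url},
        auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
        timeout=10,
    )
    response.raise_for_status()
    categories = response.json()["result"]["categories"]
    # Each entry looks roughly like {"confidence": 97.3, "name": {"en": "nsfw"}}.
    nsfw_score = next(
        (c["confidence"] for c in categories if c["name"]["en"] == "nsfw"), 0.0
    )
    return nsfw_score < nsfw_threshold

# Example: reject an upload before it is published.
if not is_image_safe("https://example.com/upload.jpg"):
    print("Upload rejected: flagged as explicit content")
```

In a production setup, the threshold and the handling of borderline scores (for example, routing them to human review) would be tuned to the platform's own policies.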

The NSFW classifier is part of Imagga's AI-powered Content Moderation solution. It's designed to help visual content platforms of any size keep their reputation and content safe in a cost-efficient and scalable way. It offers automated detection of diverse, customizable categories of inappropriate content and can be deployed in the cloud or on-premise.

Read the full case study.

Do you have a similar need and aren't sure what the best solution is for you? Get in touch with us to discuss your case.