{"id":1684,"date":"2018-06-10T16:19:53","date_gmt":"2018-06-10T13:19:53","guid":{"rendered":"https:\/\/imagga.com\/blog\/?p=1684"},"modified":"2020-05-08T12:19:04","modified_gmt":"2020-05-08T09:19:04","slug":"securing-images-in-python-with-the-imagga-nsfw-categorization-api","status":"publish","type":"post","link":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/","title":{"rendered":"Securing Images in Python With the Imagga NSFW Categorization API"},"content":{"rendered":"<p>In web and mobile applications, as well as any other digital media, the use of images as part of their content is very common. With images being so ubiquitous, there comes a need to ensure that the images posted are appropriate to the medium they appear on. This is especially true for any medium accepting user-generated content. Even with set rules for what can and cannot be posted, you can never trust users to adhere to them. Whenever you have a website or medium accepting user-generated content, you will need to moderate that content.<\/p>\n<h2>Why Moderate Content?<\/h2>\n<p>There are various reasons why content moderation might be in your best interest as the owner\/maintainer of a digital medium. Some common ones are:<\/p>\n<ul>\n<li><strong>Legal obligations<\/strong>\u00a0&#8211; If your application accommodates underage users, then you are obligated to protect them from adult content.<\/li>\n<li><strong>Brand protection<\/strong>\u00a0&#8211; How your brand is perceived by users is important, so you might want to block some content that may negatively affect your image.<\/li>\n<li><strong>Protect your users<\/strong>\u00a0&#8211; You might want to protect your users against harassment from other users. The harassment can be in the form of users attacking others by posting offensive content. 
An example of this is\u00a0<a href=\"https:\/\/techcrunch.com\/2017\/11\/07\/facebook-revenge-porn-strategy-involes-sending-nudes-to-self\/\">Facebook\u2019s recent techniques<\/a>\u00a0of combating revenge porn on their platform.<\/li>\n<li><strong>Financial<\/strong>\u00a0&#8211; It might be in your best interest financially to moderate the content shown on your applications. For instance, if your content is somewhat problematic, other businesses might not want to associate with you by advertising on your platform or accepting you as an affiliate. For some ad networks, keeping your content clean is a rule that you have to comply with if you want to use them. Google AdSense is an example of this. They strictly\u00a0<a href=\"https:\/\/support.google.com\/adsense\/answer\/4410771?hl=en\">forbid users of the service from placing their ads on pages with adult content<\/a>.<\/li>\n<li><strong>Platform rules<\/strong>\u00a0&#8211; You might be forced to implement some form of content moderation if the platform your application is on requires it. For instance, <a href=\"https:\/\/developer.apple.com\/app-store\/review\/guidelines\/#user-generated-content\">Apple requires applications to have a way of moderating and restricting user-generated content<\/a>\u00a0before they can be placed on the App Store, and\u00a0<a href=\"https:\/\/play.google.com\/about\/restricted-content\/\">Google also restricts apps that contain sexually explicit content<\/a>.<\/li>\n<\/ul>\n<p>As you can see, if your application accepts user-generated content, moderation might be a requirement that you can\u2019t ignore. There are different ways moderation can be carried out:<\/p>\n<ul>\n<li><strong>Individual driven<\/strong>\u00a0&#8211; an example of this is a website that has admins who moderate the content. 
The website might either restrict the display of any uploaded content until it has been approved by an admin, or allow immediate display of uploaded content but have admins who constantly check posted content. This method tends to be very accurate in identifying inappropriate content, as the admins will most likely be clear as to what is appropriate\/inappropriate for the medium. The obvious problem with this is the human labor needed. Hiring moderators might get costly, especially as the application\u2019s usage grows. Relying on human moderators can also affect the app\u2019s user experience. The human response will always be slower than an automated one. Even if you have people working on moderation at all times, there will still be a delay in identifying and removing problematic content. By the time it is removed, a lot of users could have seen it. On systems that restrict showing uploaded content until it has been approved by an admin, this delay can become annoying to users.<\/li>\n<li><strong>Community driven<\/strong>\u00a0&#8211; with this type of moderation, the owner of the application puts in place features that enable the app\u2019s users to report any inappropriate content, e.g. by flagging it. After a user flags a post, an admin will then be notified. This also suffers from a delay in identifying inappropriate content from both the community (who might not act as soon as the content is posted) and the administrators (who might be slow to respond to flagged content). Leaving moderation up to the community might also result in false positives, as content that is safe may be seen by some users as inappropriate. 
With a large community, you will always have differing opinions, and because many people will probably not have read the Terms and Conditions of the medium, they will not have clear-cut rules of what is and isn\u2019t okay.<\/li>\n<li><strong>Automated<\/strong>\u00a0&#8211; with this, a computer system, usually based on some machine learning algorithm, is used to classify and identify problematic content. It can then act by removing the content or flagging it and notifying an admin. With this, there is a decreased need for human labor, but the downside is that it might be less accurate than a human moderator.<\/li>\n<li><strong>A mix of some or all of the above methods<\/strong>\u00a0&#8211; Each of the methods described above comes with a shortcoming. The best outcome might be achieved by combining some or all of them, e.g. you might have in place an automated system that flags suspicious content while at the same time enabling the community to also flag content. An admin can then come in to determine what to do with the content.<\/li>\n<\/ul>\n<h2><a id=\"A_Look_at_the_Imagga_NSFW_Categorization_API_21\"><\/a>A Look at the Imagga NSFW Categorization API<\/h2>\n<p>Imagga makes available the\u00a0<strong>NSFW (<a href=\"https:\/\/en.wikipedia.org\/wiki\/Not_safe_for_work\">not safe for work<\/a>) Categorization API<\/strong>\u00a0that you can use to build a system that can detect adult content. The API works by categorizing images into three categories:<\/p>\n<ul>\n<li><strong>nsfw<\/strong>\u00a0&#8211; these are images considered not safe. Chances are high that they contain pornographic content and\/or display nude bodies or inappropriate body parts.<\/li>\n<li><strong>underwear<\/strong>\u00a0&#8211; this categorizes medium-safe images. 
These might be images displaying lingerie, underwear, swimwear, etc.<\/li>\n<li><strong>safe<\/strong>\u00a0&#8211; these are completely safe images with no nudity.<\/li>\n<\/ul>\n<p>For each submitted image, the API returns a confidence level per category: a percentage that indicates the probability of the image belonging to that category.<\/p>\n<p>To see the NSFW API in action, we\u2019ll create two simple programs that will process some images using the API. The first program will demonstrate how to categorize a single image while the second will batch process several images.<\/p>\n<h2><a id=\"Setting_up_the_Environment_33\"><\/a>Setting up the Environment<\/h2>\n<p>Before writing any code, we\u2019ll first set up a\u00a0<a href=\"https:\/\/realpython.com\/blog\/python\/python-virtual-environments-a-primer\/\">virtual environment<\/a>. This isn\u2019t necessary but is recommended as it prevents package clutter and version conflicts in your system\u2019s global Python interpreter.<\/p>\n<p>First, create a directory where you\u2019ll put your code files.<\/p>\n<p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\">$ mkdir nsfw_test<\/pre><\/p>\n<p>Then navigate to that directory with your Terminal application.<\/p>\n<p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\">$ cd nsfw_test<\/pre><\/p>\n<p>Create the virtual environment by running:<\/p>\n<p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\">$ python3 -m venv venv<\/pre><\/p>\n<p>We\u2019ll use Python 3 in our code. In the above, we create a virtual environment with Python 3. 
This makes Python 3 the default interpreter inside the environment.<\/p>\n<p>Activate the environment with (on macOS and Linux):<\/p>\n<p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\">$ source venv\/bin\/activate<\/pre><\/p>\n<p>On Windows:<\/p>\n<p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\">$ venv\\Scripts\\activate<\/pre><\/p>\n<h2><a id=\"Categorizing_Images_65\"><\/a>Categorizing Images<\/h2>\n<p>To classify an image with the NSFW API, you can either send a GET request with the image URL to the <code class=\"EnlighterJSRAW\">\/categories\/&lt;categorizer_id&gt;<\/code> endpoint, or you can upload the image to <code class=\"EnlighterJSRAW\">\/uploads<\/code> and get back an <code class=\"EnlighterJSRAW\">upload_id<\/code> value, which you then use in the call to the <code class=\"EnlighterJSRAW\">\/categories\/&lt;categorizer_id&gt;\/<\/code> endpoint. We\u2019ll create two applications that demonstrate these two scenarios.<\/p>\n<h3>Processing a Single Image<\/h3>\n<p>The first app we\u2019ll create is a simple web application that can be used to check if an image is safe or not. 
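At its core, that check is a single authenticated GET request against the categorizer endpoint. Stripped of any web framework, it might be sketched like this (the credential values are placeholders, and passing the URL via params lets requests handle URL-encoding):

```python
import requests
from requests.auth import HTTPBasicAuth

# Hypothetical placeholder credentials -- substitute your own
# API Key and Secret from the Imagga dashboard.
API_KEY = 'acc_xxxxxxxxxxxxxxx'
API_SECRET = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
ENDPOINT = 'https://api.imagga.com/v2/categories/nsfw_beta'


def categorize_url(image_url):
    """Submit one image URL to the NSFW categorizer and return the parsed JSON."""
    response = requests.get(
        ENDPOINT,
        params={'image_url': image_url},  # requests URL-encodes the value
        auth=HTTPBasicAuth(API_KEY, API_SECRET))
    return response.json()


# Example (needs valid credentials and network access):
# result = categorize_url('https://example.com/photo.jpg')
# for category in result['result']['categories']:
#     print(category['name']['en'], category['confidence'])
```

The Flask app below wraps essentially this one request in a form-driven page.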
We\u2019ll create the app with\u00a0<a href=\"http:\/\/flask.pocoo.org\/\">Flask<\/a>.<\/p>\n<p>To start off, install the following dependencies.<\/p>\n<p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\">$ pip install flask flask-bootstrap requests<\/pre><\/p>\n<p>Then create a folder named\u00a0<code class=\"EnlighterJSRAW\">templates<\/code>\u00a0and inside that folder, create a file named\u00a0<code class=\"EnlighterJSRAW\">index.html<\/code>\u00a0and add the following code to it.<\/p>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"html\">{% extends \"bootstrap\/base.html\" %}\r\n\r\n{% block title %}Imagga NSFW API Test{% endblock %}\r\n\r\n{% block navbar %}\r\n\r\n&lt;nav class=\"navbar navbar-inverse\" role=\"navigation\"&gt;\r\n&lt;div class=\"container\"&gt;&lt;a class=\"navbar-brand\" href=\"{{ url_for('index') }}\"&gt;NSFW API Test&lt;\/a&gt;&lt;\/div&gt;\r\n&lt;\/nav&gt;{% endblock %}\r\n\r\n{% block content %}\r\n&lt;div class=\"container\"&gt;\r\n    &lt;div class=\"row\"&gt;\r\n        &lt;div class=\"col-md-8\"&gt;\r\n            &lt;form action=\"\" method=\"POST\"&gt;\r\n\r\n                &lt;div class=\"form-group\"&gt;\r\n                    &lt;label for=\"image_url\"&gt;Image URL&lt;\/label&gt;\r\n                    &lt;input type=\"url\" id=\"image_url\" name=\"image_url\"\/&gt;\r\n                    &lt;button class=\"btn btn-primary\" type=\"submit\"&gt;Submit&lt;\/button&gt;\r\n                &lt;\/div&gt;\r\n\r\n            &lt;\/form&gt;\r\n        &lt;\/div&gt;\r\n    &lt;\/div&gt;\r\n    {% if image_url %}\r\n    &lt;div class=\"row\"&gt;\r\n        &lt;div class=\"col-md-4\"&gt;&lt;img class=\"img-thumbnail\" src=\"{{ image_url }}\" \/&gt;&lt;\/div&gt;\r\n        &lt;div class=\"col-md-4\"&gt;{{ res }}&lt;\/div&gt;\r\n    &lt;\/div&gt;\r\n    {% endif %}\r\n\r\n&lt;\/div&gt;\r\n{% endblock %}\r\n<\/pre>\n<p>In the above code, we create an HTML template containing a form that the user can use to submit an 
image URL to the Imagga API. When the response comes back from the server, it will be shown next to the processed image.<\/p>\n<p>Next, create a file named\u00a0<code class=\"EnlighterJSRAW\">app.py<\/code>\u00a0in the root directory of your project and add the following code to it. Be sure to replace the placeholder values of\u00a0<code class=\"EnlighterJSRAW\">API_KEY<\/code>\u00a0and\u00a0<code class=\"EnlighterJSRAW\">API_SECRET<\/code>\u00a0with your Imagga API Key and Secret. You can\u00a0<a href=\"https:\/\/imagga.com\/auth\/signup\/hacker\">sign up<\/a>\u00a0for a free account to get these credentials. After creating an account, you\u2019ll find these values on your\u00a0<a href=\"https:\/\/imagga.com\/profile\/dashboard\">dashboard<\/a>:<\/p>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\">from flask import Flask, render_template, request\r\nfrom flask_bootstrap import Bootstrap\r\nimport requests\r\nfrom requests.auth import HTTPBasicAuth\r\n\r\n\r\napp = Flask(__name__)\r\nBootstrap(app)\r\n\r\n# API Credentials. 
Set your API Key and Secret here\r\nAPI_KEY = 'acc_xxxxxxxxxxxxxxx'\r\nAPI_SECRET = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'\r\nENDPOINT = 'https:\/\/api.imagga.com\/v2\/categories\/nsfw_beta'\r\nauth = HTTPBasicAuth(API_KEY, API_SECRET)\r\n\r\n\r\n@app.route('\/', methods=['GET', 'POST'])\r\ndef index():\r\n    image_url = None\r\n    res = None\r\n\r\n    if request.method == 'POST' and 'image_url' in request.form:\r\n        image_url = request.form['image_url']\r\n        # Pass the URL via params so it gets URL-encoded\r\n        response = requests.get(\r\n           ENDPOINT,\r\n           params={'image_url': image_url},\r\n           auth=auth)\r\n\r\n        try:\r\n            res = response.json()\r\n        except Exception as e:\r\n            print('Exception in JSON decode:')\r\n            print(e)\r\n            print(response.content, response.status_code)\r\n\r\n    return render_template('index.html', image_url=image_url, res=res)\r\n\r\n\r\nif __name__ == '__main__':\r\n    app.run(debug=True)\r\n<\/pre>\n<p>Every call to the Imagga API must be authenticated. Currently, the only supported method for authentication is\u00a0<a href=\"https:\/\/developer.mozilla.org\/en-US\/docs\/Web\/HTTP\/Authentication#Basic_authentication_scheme\">Basic<\/a>. With Basic Auth, credentials are transmitted as user ID\/password pairs, encoded using base64. In the above code, we achieve this with a call to\u00a0<code class=\"EnlighterJSRAW\">HTTPBasicAuth()<\/code>.<\/p>\n<p>We then create a function that will be triggered by GET and POST requests to the\u00a0<code class=\"EnlighterJSRAW\">\/<\/code>\u00a0route. If the request is a POST, we get the data submitted by the form and send it to the Imagga API for classification.<\/p>\n<p>The NSFW Categorizer is one of a few\u00a0<a href=\"https:\/\/docs.imagga.com\/?python#available-categorizers\">categorizers<\/a>\u00a0made available by the Imagga API. A Categorizer is used to recognize various objects and concepts. 
There are a couple predefined ones available (<a href=\"https:\/\/docs.imagga.com\/?python#personal_photos-categorizer\">Personal Photos<\/a>\u00a0and\u00a0<a href=\"https:\/\/docs.imagga.com\/?python#nsfw_beta-categorizer\">NSFW Beta<\/a>) but if none of them fit your needs we can\u00a0<a href=\"https:\/\/imagga.com\/solutions\/custom-categorization.html\">build a custom one for you<\/a>.<\/p>\n<p>As mentioned previously, to send an image for classification, you send a GET request to the <code class=\"EnlighterJSRAW\">\/categories\/&lt;categorizer_id&gt;<\/code> endpoint. The <code class=\"EnlighterJSRAW\">categorizer_id<\/code> for the NSFW API is <code class=\"EnlighterJSRAW\">nsfw_beta<\/code>. You can send the following parameters with the request:<\/p>\n<ul>\n<li><strong>image_url<\/strong>: URL of an image to submit for categorization.<\/li>\n<li><strong>image_upload_id<\/strong>: You can also directly send image files for categorization by uploading the images to our <code class=\"EnlighterJSRAW\">\/uploads<\/code> endpoint and then provide the received content identifiers via this parameter.<\/li>\n<li><strong>language<\/strong>: If you\u2019d like to get a translation of the tags in other languages, you should use the language parameter. Its value should be the code of the language you\u2019d like to receive tags in. You can apply this parameter multiple times to request tags translated in several languages. 
See all available languages\u00a0<a href=\"https:\/\/docs.imagga.com\/?python#multi-language-support\">here<\/a>.<\/li>\n<\/ul>\n<p>After processing the request, the API sends back a JSON object holding the image\u2019s categorization data if processing was successful, or an error message in case there was a problem processing the image.<\/p>\n<p>Below you can see the response of a successful categorization:<\/p>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"json\">{\r\n  \"result\": {\r\n    \"categories\": [\r\n      {\r\n        \"confidence\": 99.1496658325195,\r\n        \"name\": {\r\n          \"en\": \"safe\"\r\n        }\r\n      }\r\n    ]\r\n  },\r\n  \"status\": {\r\n    \"text\": \"\",\r\n    \"type\": \"success\"\r\n  }\r\n}<\/pre>\n<p>Note that you might not always get JSON with the three categories displayed. If the confidence of a category is\u00a0<code class=\"EnlighterJSRAW\">0<\/code>, this category will not be included in the JSON object.<\/p>\n<p>Below you can see the response of a failed categorization.<\/p>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"json\">{\r\n  \"status\": {\r\n    \"text\": \"Unexpected error while running the classification job.\",\r\n    \"type\": \"error\"\r\n  }\r\n}<\/pre>\n<p>Back to our app, you can save your code and run it with:<\/p>\n<p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\">\r\n$ python app.py\r\n<\/pre><\/p>\n<p>If you navigate to\u00a0<a href=\"http:\/\/127.0.0.1:5000\/\">http:\/\/127.0.0.1:5000\/<\/a>\u00a0you should see a form with one input field. Paste in the URL of an image and submit it. The image will be processed and you will get back a page displaying the image and the JSON returned from the server. 
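In a real application, a successful response like the one shown above would be parsed into a go/no-go decision rather than just displayed. A minimal sketch of such a decision helper, following the response shape above (the 20% cutoff is an assumption, not an API constant):

```python
def is_image_safe(api_response, nsfw_threshold=20.0):
    """Return True when the 'nsfw' confidence stays below the threshold.

    Categories with confidence 0 are omitted by the API, so a missing
    'nsfw' entry counts as fully safe.
    """
    if api_response.get('status', {}).get('type') != 'success':
        # Fail closed on errors rather than silently approving the image.
        raise ValueError('categorization failed: %s'
                         % api_response.get('status', {}).get('text', ''))

    categories = api_response['result']['categories']
    nsfw_confidence = next(
        (c['confidence'] for c in categories if c['name']['en'] == 'nsfw'),
        0.0)
    return nsfw_confidence < nsfw_threshold


# The 'safe' response shown earlier is approved:
safe_response = {
    'result': {'categories': [{'confidence': 99.1496658325195,
                               'name': {'en': 'safe'}}]},
    'status': {'text': '', 'type': 'success'},
}
print(is_image_safe(safe_response))  # -> True
```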
To keep it simple, we just display the raw JSON, but in a more sophisticated app, it would be parsed and used to make some decision.<\/p>\n<p>Below, you can see the results of some images we tested the API with.<\/p>\n<p><img class=\"alignnone wp-image-1754 hoverZoomLink\" src=\"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/CAR-800x450.jpg\" alt=\"\" width=\"300\" height=\"169\" srcset=\"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/CAR-800x450.jpg 800w, https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/CAR-768x432.jpg 768w, https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/CAR-1024x576.jpg 1024w, https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/CAR.jpg 1920w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/p>\n<p><img class=\"alignnone wp-image-1756 hoverZoomLink\" src=\"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/PEOPLE-800x450.jpg\" alt=\"\" width=\"300\" height=\"169\" srcset=\"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/PEOPLE-800x450.jpg 800w, https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/PEOPLE-768x432.jpg 768w, https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/PEOPLE-1024x576.jpg 1024w, https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/PEOPLE.jpg 1920w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/p>\n<p><img class=\"alignnone wp-image-1771 hoverZoomLink\" src=\"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/Underwear-800x450.jpg\" alt=\"\" width=\"300\" height=\"169\" srcset=\"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/Underwear-800x450.jpg 800w, https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/Underwear-768x432.jpg 768w, https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/Underwear-1024x576.jpg 1024w, https:\/\/imagga.com\/blog\/wp-content\/uploads\/2018\/01\/Underwear.jpg 1920w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/p>\n<p>As you can see, the images have been categorized quite 
accurately. The first two have\u00a0<code class=\"EnlighterJSRAW\">safe<\/code>\u00a0confidence scores of\u00a0<code class=\"EnlighterJSRAW\">99.22<\/code>\u00a0and\u00a0<code class=\"EnlighterJSRAW\">99.23<\/code>\u00a0respectively, while the last one has an\u00a0<code class=\"EnlighterJSRAW\">underwear<\/code>\u00a0score of\u00a0<code class=\"EnlighterJSRAW\">96.21<\/code>. Of course, we can\u2019t show an\u00a0<code class=\"EnlighterJSRAW\">nsfw<\/code>\u00a0image here on this blog, but you are free to test that on your own.<\/p>\n<p>To find the right confidence threshold for your app, you should first test the API with several images. Looking at the results of several images will help you judge what number to look for in your code when filtering acceptable and unacceptable images. If you are still not sure, our suggestion is to set the confidence threshold at 15-20%. However, if you\u2019d like to be stricter about the accuracy of the results, setting the confidence threshold at 30% might do the trick.<\/p>\n<p>You should know that the technology is far from perfect and that the NSFW API is still in beta. From time to time, you might get an incorrect classification.<\/p>\n<p>Note that the API has a limit of 5 seconds for downloading the image. If the limit is exceeded with the URL you send, the analysis will be unsuccessful. If you find that most of your requests are unsuccessful due to a timeout error, we suggest uploading the images to our <code class=\"EnlighterJSRAW\">\/uploads<\/code> endpoint first (which is free and not counted towards your usage) and then using the upload id returned to submit the images for processing via the <code class=\"EnlighterJSRAW\">image_upload_id<\/code> parameter. We\u2019ll see this in action in the next section.<\/p>\n<h3><a id=\"Batch_Processing_Several_Images_239\"><\/a>Batch Processing Several Images<\/h3>\n<p>The last app we created allowed the user to process one image at a time. 
In this section, we are going to create a program that can batch process several images. This won\u2019t be a web app; it will be a simple script that you can run from the command line.<\/p>\n<p>Create a file named\u00a0<code class=\"EnlighterJSRAW\">upload.py<\/code>\u00a0and add the code below to it. If you are still using the virtual environment created earlier, then the needed dependencies have already been installed; otherwise, install them with\u00a0<code class=\"EnlighterJSRAW\">pip install requests<\/code>.<\/p>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\">import os\r\nimport json\r\nimport argparse\r\nimport requests\r\nfrom requests.auth import HTTPBasicAuth\r\n\r\n# API Credentials. Set your API Key and Secret here\r\nAPI_KEY = 'acc_xxxxxxxxxxxxxxx'\r\nAPI_SECRET = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'\r\n\r\nAPI_ENDPOINT = 'https:\/\/api.imagga.com\/v2'\r\nFILE_TYPES = ['.png', '.jpg', '.jpeg', '.gif']\r\n\r\n\r\nclass ArgumentException(Exception):\r\n    pass\r\n\r\n\r\n# Refuse to run while the placeholder credentials are still in place\r\nif API_KEY == 'acc_xxxxxxxxxxxxxxx' or \\\r\n   API_SECRET == 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx':\r\n    raise ArgumentException('You haven\\'t set your API credentials. 
'\r\n                            'Edit the script and set them.')\r\n\r\nauth = HTTPBasicAuth(API_KEY, API_SECRET)\r\n\r\n\r\ndef upload_image(image_path):\r\n    if not os.path.isfile(image_path):\r\n        raise ArgumentException('Invalid image path')\r\n\r\n    # Open the desired file\r\n    with open(image_path, 'rb') as image_file:\r\n        # Upload the image with a POST\r\n        # request to the \/uploads endpoint\r\n        uploads_response = requests.post(\r\n            '%s\/uploads' % API_ENDPOINT,\r\n            auth=auth,\r\n            files={'image': image_file})\r\n\r\n    # Example \/uploads response:\r\n    # {\r\n    #   \"result\": {\r\n    #     \"upload_id\": \"i05e132196706b94b1d85efb5f3SaM1j\"\r\n    #   },\r\n    #   \"status\": {\r\n    #     \"text\": \"\",\r\n    #     \"type\": \"success\"\r\n    #   }\r\n    # }\r\n    try:\r\n        upload_id = uploads_response.json()['result']['upload_id']\r\n    except Exception as e:\r\n        print('Error when reading upload response:', e)\r\n        raise\r\n\r\n    return upload_id\r\n\r\n\r\ndef check_image(upload_id):\r\n    # Using the upload id, make a GET request to the \/categories\/nsfw_beta\r\n    # endpoint to check if the image is safe\r\n    params = {\r\n       'image_upload_id': upload_id\r\n    }\r\n    response = requests.get(\r\n        '%s\/categories\/nsfw_beta' % API_ENDPOINT,\r\n        auth=auth,\r\n        params=params)\r\n    return response.json()\r\n\r\n\r\ndef main():\r\n    parser = argparse.ArgumentParser(description='Tags images in a folder')\r\n    parser.add_argument(\r\n        'input', help='The input - a folder containing images')\r\n    parser.add_argument(\r\n        'output', help='The output - a folder to output the results')\r\n    args = parser.parse_args()\r\n\r\n    tag_input = args.input\r\n    tag_output = args.output\r\n    results = {}\r\n\r\n    if not os.path.exists(tag_output):\r\n        os.makedirs(tag_output)\r\n\r\n    if not 
os.path.isdir(tag_input):\r\n        raise ArgumentException(\r\n            'The input directory does not exist: %s' % tag_input)\r\n\r\n    images = []\r\n    for img in os.scandir(tag_input):\r\n        if not os.path.isfile(img.path):\r\n            print('Bad file path', img.path)\r\n            continue\r\n\r\n        name, extension = os.path.splitext(img.name)\r\n\r\n        if extension.lower() not in FILE_TYPES:\r\n            print('Extension %s not in allowed' % extension, FILE_TYPES)\r\n            continue\r\n\r\n        images.append(img.path)\r\n\r\n    images_count = len(images)\r\n\r\n    for i, image_path in enumerate(images):\r\n        print('[%s \/ %s] %s uploading' %\r\n              (i + 1, images_count, image_path))\r\n\r\n        upload_id = upload_image(image_path)\r\n        nsfw_result = check_image(upload_id)\r\n        results[image_path] = nsfw_result\r\n\r\n        print('[%s \/ %s] %s checked' % (\r\n            i + 1, images_count, image_path))\r\n\r\n    for image_path, result in results.items():\r\n        image_name = os.path.basename(image_path)\r\n        result_path = os.path.join(tag_output, 'result_%s.json' % image_name)\r\n\r\n        with open(result_path, 'w') as results_file:\r\n            json.dump(result, results_file, indent=4)\r\n\r\n    print('Done. Check your output folder for the results')\r\n\r\n\r\nif __name__ == '__main__':\r\n    main()\r\n<\/pre>\n<p>We use the\u00a0<code class=\"EnlighterJSRAW\">argparse<\/code>\u00a0module to parse arguments from the command line. The first argument passed in will be the path to a folder containing images to be processed while the second argument is a path to a folder where the results will be saved.<\/p>\n<p>For each image in the input folder, the script uploads it with a POST request to the <code class=\"EnlighterJSRAW\">\/uploads<\/code> endpoint. 
After getting a content id back, it makes another call to the <code class=\"EnlighterJSRAW\">\/categories\/&lt;categorizer_id&gt;<\/code> endpoint. It then writes the response of that request to a file in the output folder.<\/p>\n<p>Note that all uploaded files sent to <code class=\"EnlighterJSRAW\">\/uploads<\/code> remain available for 24 hours. After this period, they are automatically deleted. If you need the file, you have to upload it again. You can also manually delete an image by making a DELETE request to <code class=\"EnlighterJSRAW\">https:\/\/api.imagga.com\/v2\/uploads\/&lt;upload_id&gt;<\/code>.<\/p>\n<p>Add some images to a folder and test the script with:<\/p>\n<p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\">$ python upload.py path\/to\/input\/folder path\/to\/output\/folder<\/pre><\/p>\n<p>If you look at the output folder you selected, you should see a JSON file for each processed image.<\/p>\n<p>Feel free to test out the Imagga NSFW Categorization API. If you have any suggestions on ways to improve it or just general comments on the API, you can post them in the Comment Section below or get in touch with us directly. 
We are always happy to get feedback on our products.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In web and mobile applications, as well as any other digital media, the use of images as part of their [&hellip;]<\/p>\n","protected":false},"author":11,"featured_media":1761,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[5,211],"tags":[14,48,75,76,143,170,171,172,173,174,175,176,177,178,179,180,181,182,183,184,185,186,187,188,189,190,191,192],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v17.3 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Securing Images in Python With the Imagga NSFW Categorization API<\/title>\n<meta name=\"description\" content=\"Create an NSFW Categorizer for your Application with Imagga. Build it with Python and start using our FREE API today for your project.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Securing Images in Python With the Imagga NSFW Categorization API\" \/>\n<meta property=\"og:description\" content=\"Create an NSFW Categorizer for your Application with Imagga. 
Build it with Python and start using our FREE API today for your project.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/\" \/>\n<meta property=\"og:site_name\" content=\"Imagga Blog\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/imagga\/\" \/>\n<meta property=\"article:published_time\" content=\"2018-06-10T13:19:53+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2020-05-08T09:19:04+00:00\" \/>\n<meta property=\"og:image\" content=\"http:\/\/imagga.com\/blog\/wp-content\/uploads\/2017\/04\/NSFWAPITest.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1920\" \/>\n\t<meta property=\"og:image:height\" content=\"1080\" \/>\n<meta name=\"twitter:card\" content=\"summary\" \/>\n<meta name=\"twitter:creator\" content=\"@imagga\" \/>\n<meta name=\"twitter:site\" content=\"@imagga\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Joyce Echessa\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"17 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Organization\",\"@id\":\"https:\/\/imagga.com\/blog\/#organization\",\"name\":\"Imagga\",\"url\":\"https:\/\/imagga.com\/blog\/\",\"sameAs\":[\"https:\/\/www.facebook.com\/imagga\/\",\"https:\/\/twitter.com\/imagga\",\"https:\/\/www.linkedin.com\/company\/imagga\/\",\"https:\/\/twitter.com\/imagga\"],\"logo\":{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/imagga.com\/blog\/#logo\",\"inLanguage\":\"en-US\",\"url\":\"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2017\/04\/logo_white_blog.svg\",\"contentUrl\":\"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2017\/04\/logo_white_blog.svg\",\"width\":\"27\",\"height\":\"29\",\"caption\":\"Imagga\"},\"image\":{\"@id\":\"https:\/\/imagga.com\/blog\/#logo\"}},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/imagga.com\/blog\/#website\",\"url\":\"https:\/\/imagga.com\/blog\/\",\"name\":\"Imagga Blog\",\"description\":\"Image recognition in the cloud\",\"publisher\":{\"@id\":\"https:\/\/imagga.com\/blog\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/imagga.com\/blog\/?s={search_term_string}\"},\"query-input\":\"required 
name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#primaryimage\",\"inLanguage\":\"en-US\",\"url\":\"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2017\/04\/NSFWAPITest.jpg\",\"contentUrl\":\"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2017\/04\/NSFWAPITest.jpg\",\"width\":1920,\"height\":1080},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#webpage\",\"url\":\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/\",\"name\":\"Securing Images in Python With the Imagga NSFW Categorization API\",\"isPartOf\":{\"@id\":\"https:\/\/imagga.com\/blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#primaryimage\"},\"datePublished\":\"2018-06-10T13:19:53+00:00\",\"dateModified\":\"2020-05-08T09:19:04+00:00\",\"description\":\"Create NFSW Categorizer for you Application with Imagga. 
Build it with Python and starting using our FREE API today for your project.\",\"breadcrumb\":{\"@id\":\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/imagga.com\/blog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Securing Images in Python With the Imagga NSFW Categorization API\"}]},{\"@type\":\"Article\",\"@id\":\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#webpage\"},\"author\":{\"@id\":\"https:\/\/imagga.com\/blog\/#\/schema\/person\/db7025693faf295ed72d8e65eacb90a6\"},\"headline\":\"Securing Images in Python With the Imagga NSFW Categorization API\",\"datePublished\":\"2018-06-10T13:19:53+00:00\",\"dateModified\":\"2020-05-08T09:19:04+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#webpage\"},\"wordCount\":2562,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/imagga.com\/blog\/#organization\"},\"image\":{\"@id\":\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2017\/04\/NSFWAPITest.jpg\",\"keywords\":[\"api\",\"python\",\"adult content\",\"content moderation\",\"application\",\"json\",\"nfsw categorization\",\"div class\",\"output folder\",\"content 
endpoint\",\"content id\",\"post request\",\"uploaded content\",\"input folder\",\"process image\",\"response request\",\"request auth\",\"single image\",\"safe image\",\"flag content\",\"content parameter\",\"request form\",\"import request\",\"user\",\"categorize\",\"course\",\"grade\",\"length\"],\"articleSection\":[\"Tech Insider\",\"Code Hacks\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#respond\"]}]},{\"@type\":\"Person\",\"@id\":\"https:\/\/imagga.com\/blog\/#\/schema\/person\/db7025693faf295ed72d8e65eacb90a6\",\"name\":\"Joyce Echessa\",\"url\":\"https:\/\/imagga.com\/blog\/author\/joyce\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Securing Images in Python With the Imagga NSFW Categorization API","description":"Create NFSW Categorizer for you Application with Imagga. Build it with Python and starting using our FREE API today for your project.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/","og_locale":"en_US","og_type":"article","og_title":"Securing Images in Python With the Imagga NSFW Categorization API","og_description":"Create NFSW Categorizer for you Application with Imagga. 
Build it with Python and starting using our FREE API today for your project.","og_url":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/","og_site_name":"Imagga Blog","article_publisher":"https:\/\/www.facebook.com\/imagga\/","article_published_time":"2018-06-10T13:19:53+00:00","article_modified_time":"2020-05-08T09:19:04+00:00","og_image":[{"width":1920,"height":1080,"url":"http:\/\/imagga.com\/blog\/wp-content\/uploads\/2017\/04\/NSFWAPITest.jpg","type":"image\/jpeg"}],"twitter_card":"summary","twitter_creator":"@imagga","twitter_site":"@imagga","twitter_misc":{"Written by":"Joyce Echessa","Est. reading time":"17 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Organization","@id":"https:\/\/imagga.com\/blog\/#organization","name":"Imagga","url":"https:\/\/imagga.com\/blog\/","sameAs":["https:\/\/www.facebook.com\/imagga\/","https:\/\/twitter.com\/imagga","https:\/\/www.linkedin.com\/company\/imagga\/","https:\/\/twitter.com\/imagga"],"logo":{"@type":"ImageObject","@id":"https:\/\/imagga.com\/blog\/#logo","inLanguage":"en-US","url":"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2017\/04\/logo_white_blog.svg","contentUrl":"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2017\/04\/logo_white_blog.svg","width":"27","height":"29","caption":"Imagga"},"image":{"@id":"https:\/\/imagga.com\/blog\/#logo"}},{"@type":"WebSite","@id":"https:\/\/imagga.com\/blog\/#website","url":"https:\/\/imagga.com\/blog\/","name":"Imagga Blog","description":"Image recognition in the cloud","publisher":{"@id":"https:\/\/imagga.com\/blog\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/imagga.com\/blog\/?s={search_term_string}"},"query-input":"required 
name=search_term_string"}],"inLanguage":"en-US"},{"@type":"ImageObject","@id":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#primaryimage","inLanguage":"en-US","url":"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2017\/04\/NSFWAPITest.jpg","contentUrl":"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2017\/04\/NSFWAPITest.jpg","width":1920,"height":1080},{"@type":"WebPage","@id":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#webpage","url":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/","name":"Securing Images in Python With the Imagga NSFW Categorization API","isPartOf":{"@id":"https:\/\/imagga.com\/blog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#primaryimage"},"datePublished":"2018-06-10T13:19:53+00:00","dateModified":"2020-05-08T09:19:04+00:00","description":"Create NFSW Categorizer for you Application with Imagga. 
Build it with Python and starting using our FREE API today for your project.","breadcrumb":{"@id":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/imagga.com\/blog\/"},{"@type":"ListItem","position":2,"name":"Securing Images in Python With the Imagga NSFW Categorization API"}]},{"@type":"Article","@id":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#article","isPartOf":{"@id":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#webpage"},"author":{"@id":"https:\/\/imagga.com\/blog\/#\/schema\/person\/db7025693faf295ed72d8e65eacb90a6"},"headline":"Securing Images in Python With the Imagga NSFW Categorization API","datePublished":"2018-06-10T13:19:53+00:00","dateModified":"2020-05-08T09:19:04+00:00","mainEntityOfPage":{"@id":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#webpage"},"wordCount":2562,"commentCount":0,"publisher":{"@id":"https:\/\/imagga.com\/blog\/#organization"},"image":{"@id":"https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#primaryimage"},"thumbnailUrl":"https:\/\/imagga.com\/blog\/wp-content\/uploads\/2017\/04\/NSFWAPITest.jpg","keywords":["api","python","adult content","content moderation","application","json","nfsw categorization","div class","output folder","content endpoint","content id","post request","uploaded content","input folder","process image","response request","request auth","single image","safe 
image","flag content","content parameter","request form","import request","user","categorize","course","grade","length"],"articleSection":["Tech Insider","Code Hacks"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/imagga.com\/blog\/securing-images-in-python-with-the-imagga-nsfw-categorization-api\/#respond"]}]},{"@type":"Person","@id":"https:\/\/imagga.com\/blog\/#\/schema\/person\/db7025693faf295ed72d8e65eacb90a6","name":"Joyce Echessa","url":"https:\/\/imagga.com\/blog\/author\/joyce\/"}]}},"_links":{"self":[{"href":"https:\/\/imagga.com\/blog\/wp-json\/wp\/v2\/posts\/1684"}],"collection":[{"href":"https:\/\/imagga.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/imagga.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/imagga.com\/blog\/wp-json\/wp\/v2\/users\/11"}],"replies":[{"embeddable":true,"href":"https:\/\/imagga.com\/blog\/wp-json\/wp\/v2\/comments?post=1684"}],"version-history":[{"count":58,"href":"https:\/\/imagga.com\/blog\/wp-json\/wp\/v2\/posts\/1684\/revisions"}],"predecessor-version":[{"id":3747,"href":"https:\/\/imagga.com\/blog\/wp-json\/wp\/v2\/posts\/1684\/revisions\/3747"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/imagga.com\/blog\/wp-json\/wp\/v2\/media\/1761"}],"wp:attachment":[{"href":"https:\/\/imagga.com\/blog\/wp-json\/wp\/v2\/media?parent=1684"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/imagga.com\/blog\/wp-json\/wp\/v2\/categories?post=1684"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/imagga.com\/blog\/wp-json\/wp\/v2\/tags?post=1684"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}