Predictive Hacks

Docker + Flask | Dockerizing a Python API


Docker containers are one of the hottest trends in software development right now. Not only do they make it easier to create, deploy and run applications, but containers also give you confidence that your application will run on any machine, regardless of how that machine differs from the one where you wrote and tested the code.

In this tutorial, we will show you how to easily dockerize a Flask API. We will use this Python Rest API Example: a simple API that, given an image URL, returns the dominant colors of the image.

We highly recommend creating a new Python environment (using Conda or venv) so you can easily generate a requirements.txt file that contains only the libraries the project actually uses.
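With the built-in venv module this takes two commands (the environment name `colors-api-env` below is just an example):

```shell
# create an isolated environment; once activated, `pip freeze`
# will list only the packages installed for this project
python3 -m venv colors-api-env
. colors-api-env/bin/activate
```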


The Flask API that we will dockerize consists of two .py files.

The colors.py

from PIL import Image
import requests
from io import BytesIO
import webcolors
import pandas as pd


def closest_colour(requested_colour):
    """Return the CSS3 color name closest to the given RGB triple."""
    min_colours = {}
    for key, name in webcolors.css3_hex_to_names.items():
        r_c, g_c, b_c = webcolors.hex_to_rgb(key)
        rd = (r_c - requested_colour[0]) ** 2
        gd = (g_c - requested_colour[1]) ** 2
        bd = (b_c - requested_colour[2]) ** 2
        min_colours[(rd + gd + bd)] = name
    return min_colours[min(min_colours.keys())]


def top_colors(url, n=10):
    """Return the n most frequent (nearest-named) colors in the image at url."""
    # read the image from the URL
    response = requests.get(url)
    img = Image.open(BytesIO(response.content))
    # convert the image to RGB
    image = img.convert('RGB')

    # resize the image to 100 x 100 to keep the pixel loop fast
    image = image.resize((100, 100))

    detected_colors = []
    for x in range(image.width):
        for y in range(image.height):
            detected_colors.append(closest_colour(image.getpixel((x, y))))
    # share of pixels mapped to each color name
    series_colors = pd.Series(detected_colors)
    output = series_colors.value_counts() / len(series_colors)
    return output.head(n).to_dict()
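The matching in closest_colour is a nearest-neighbour search in RGB space. The idea can be illustrated without webcolors, using a tiny hardcoded palette (the names and values below are just an illustrative subset, not the full CSS3 list):

```python
# toy palette: a few named colors and their RGB values (illustrative subset)
PALETTE = {
    "black": (0, 0, 0),
    "white": (255, 255, 255),
    "red": (255, 0, 0),
    "navy": (0, 0, 128),
}

def nearest_color(rgb):
    """Return the palette name whose RGB value has the smallest
    squared Euclidean distance to the requested color."""
    return min(
        PALETTE,
        key=lambda name: sum((c - q) ** 2 for c, q in zip(PALETTE[name], rgb)),
    )

print(nearest_color((250, 10, 5)))  # → red
```

closest_colour does exactly this, but over the full CSS3 name table shipped with webcolors.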

The main.py

from flask import Flask, jsonify, request
# import our function from the colors.py file
from colors import top_colors

app = Flask(__name__)


@app.route("/", methods=['GET', 'POST'])
def index():
    if request.method == 'GET':
        # get the url argument
        url = request.args.get('url')
        result = top_colors(str(url))
        return jsonify(result)
    else:
        return jsonify({'Error': "This is a GET API method"})


if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0', port=9007)

As we said before, we have to create the requirements.txt file. We use the pip freeze command after activating the project's environment.

pip freeze > requirements.txt

If you open requirements.txt, you should see all the required libraries of the project listed.

certifi==2020.6.20
chardet==3.0.4
click==7.1.2
Flask==1.1.2
idna==2.10
itsdangerous==1.1.0
Jinja2==2.11.2
jsonify==0.5
MarkupSafe==1.1.1
numpy==1.19.2
pandas==1.1.3
Pillow==8.0.1
python-dateutil==2.8.1
pytz==2020.1
requests==2.24.0
six==1.15.0
urllib3==1.25.11
webcolors==1.4
Werkzeug==1.0.1

Dockerizing

Let’s start the dockerizing process. We only need to create a new file called Dockerfile and add a few lines of code inside.

The Dockerfile consists of simple commands that define how to build the image. The first line sets our base image. There are many images you can use, such as plain Linux, Linux with Python and common libraries preinstalled, or images made especially for data science projects. You can explore them all on Docker Hub. We will use the python:3.8 image.

FROM python:3.8

Then we need to copy the required files from our host machine into the filesystem of the container. To keep it simple, we will not use any subfolders.

FROM python:3.8

COPY requirements.txt ./requirements.txt
COPY colors.py ./colors.py
COPY main.py ./main.py

Then we have to install the libraries, so we add a pip install command to be run at build time.

FROM python:3.8

COPY requirements.txt ./requirements.txt
COPY colors.py ./colors.py
COPY main.py ./main.py

RUN pip install -r requirements.txt

Lastly, we have to specify what command to run within the container using CMD. In our case it is python main.py.

FROM python:3.8

COPY requirements.txt ./requirements.txt
COPY colors.py ./colors.py
COPY main.py ./main.py

RUN pip install -r requirements.txt

CMD ["python", "./main.py"]

How to build the Image and run the Container

To build the Docker image, go to the working directory where the Dockerfile is placed and run the following.

docker build -t your_docker_image_name -f Dockerfile .

You just built your image! The next step is to run our container. The tricky part here is the mapping of the ports: the first number is the local port we will use, and the second is the port the API listens on inside the container.

docker run -d -p 5000:9007 your_docker_image_name

If everything is OK, you should get a response when you hit the following URL in your browser.

http://localhost:5000/?url=https://image.shutterstock.com/z/stock-photo-at-o-clock-at-the-top-of-the-mountains-sunrise-1602307492.jpg
{
  "burlywood": 0.1212,
  "cornsilk": 0.0257,
  "darksalmon": 0.229,
  "darkslategrey": 0.0928,
  "indianred": 0.1663,
  "lemonchiffon": 0.021,
  "lightsalmon": 0.0479,
  "navajowhite": 0.0426,
  "rosybrown": 0.097,
  "wheat": 0.0308
}

You made it! You’ve just dockerized your Flask API! Simple as that.


Some useful commands for Docker

Get the list of the running containers

docker container list
CONTAINER ID        IMAGE               COMMAND              CREATED             STATUS              PORTS                    NAMES
fe7726349933        image_name          "python ./main.py"   About an hour ago   Up About an hour    0.0.0.0:5000->9007/tcp   eager_chaum

If you want to stop the container, take the first 3-4 characters of the container ID from the previous command and run the following.

docker stop fe77

Get the logs of the API

docker logs fe77
