This repository contains a demonstration project showcasing skills mainly with Python, Keras, Flask, Docker, Jupyter Notebook, microservices, REST APIs and GitHub Actions.
- PURPOSE
- DEPENDENCIES
- REPOSITORY CONTENT
- ARCHITECTURE
- DEEP LEARNING MODEL
- HOW TO RUN DEEP LEARNING ON FLASK WITH DOCKER COMPOSE
- TEST SERVER & REST API
- CREDITS
The goal is to deploy a Deep Learning model on Flask as a microservice. The model predicts handwritten digits and has been previously trained in a Jupyter Notebook. A REST API is used to communicate with the deployed model, e.g. sending an image to be analyzed and returning the generated predictions to the client. GitHub Actions are employed to implement CI/CD workflows in the project.
The code has been tested using:
- Python (3.12): an interpreted high-level programming language for general-purpose programming.
- Jupyter Lab (4.3): a web-based interactive development environment for Jupyter Notebooks, code and data.
- Flask (3.1): a microframework for Python based on Werkzeug, Jinja 2 and good intentions.
- Gunicorn (23.0): a Python WSGI HTTP Server for UNIX.
- NGINX (1.27): a free, open-source, high-performance HTTP server, reverse proxy, and IMAP/POP3 proxy server.
- Docker (27.4): an open platform for developers and sysadmins to build, ship, and run distributed applications, whether on laptops, data center VMs, or the cloud.
- Docker Compose (2.32): a tool for defining and running multi-container Docker applications.
- Keras (TensorFlow built-in): a high-level neural networks API, written in Python and capable of running on top of TensorFlow.
- TensorFlow (2.18): an open source software Deep Learning library for high performance numerical computation using data flow graphs.
- Matplotlib (3.10): a plotting library for Python and its numerical mathematics extension NumPy.
- NumPy (2.0): a library for Python, adding support for large, multi-dimensional arrays and matrices, along with a large collection of high-level mathematical functions to operate on these arrays.
- Ruff (0.9): an extremely fast Python linter and code formatter, written in Rust.
- scikit-image (0.25): a collection of algorithms for image processing with Python.
A virtual environment (<env_name> = .venv) can be created from the requirements_dev.txt file located in the repository.
Commands to configure the virtual environment with venv:
~/deeplearning_flask$ python3 -m venv .venv
~/deeplearning_flask$ source .venv/bin/activate
(.venv)~/deeplearning_flask$ python3 -m pip install pip==24.3.1
(.venv)~/deeplearning_flask$ python3 -m pip install setuptools==75.8.0
(.venv)~/deeplearning_flask$ python3 -m pip install -r requirements_dev.txt
(.venv)~/deeplearning_flask$ pre-commit install
The repository main folder contains:
deeplearning_flask
├── .env.example
├── .env.test
├── .github
│ └── workflows
│ └── ci_tests.yml
├── .gitignore
├── .pre-commit-config.yaml
├── app
│ ├── app
│ │ ├── __init__.py
│ │ ├── api.py
│ │ ├── model.py
│ │ ├── static
│ │ │ └── 4.jpg
│ │ └── templates
│ │ └── dlflask.html
│ ├── config.py
│ ├── Makefile
│ ├── mnist_model.keras
│ ├── server.py
│ └── tests
│ ├── __init__.py
│ ├── conftest.py
│ └── test_app.py
├── Deep Learning MNIST prediction model with Keras.ipynb
├── docker-compose.yml
├── Dockerfile
├── nginx
│ └── conf.d
│ └── local.conf
├── pyproject.toml
├── README.md
├── requirements.txt
└── requirements_dev.txt
The architecture created with Docker Compose uses two Docker containers: one for the NGINX reverse proxy and one for the web service, where Gunicorn serves the Flask application.
The following diagram illustrates the architecture in blocks:
flowchart LR;
Client<-->NGINX;
NGINX<--bridge-->Gunicorn;
subgraph web;
Gunicorn<-->Flask;
end;
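The two services are wired together in docker-compose.yml. The actual file in the repository is authoritative, but a minimal sketch of such a two-service setup (service names, ports and paths are assumptions) could look like:

```yaml
services:
  web:
    build: .                               # Dockerfile installs Flask, Gunicorn, TensorFlow
    expose:
      - "8000"                             # Gunicorn listens here, reachable only on the bridge network
  nginx:
    image: nginx:1.27
    ports:
      - "80:80"                            # the only port published to the host
    volumes:
      - ./nginx/conf.d:/etc/nginx/conf.d   # local.conf contains a proxy_pass to web:8000
    depends_on:
      - web
```

Only NGINX is exposed to the host; requests reach Gunicorn over the internal bridge network, matching the diagram above.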
The definition and training of the Deep Learning MNIST model were done in a Jupyter Lab notebook. The notebook is stored in the main folder; to run it, use the command shown below:
(.venv)~/deeplearning_flask$ jupyter lab Deep\ Learning\ MNIST\ prediction\ model\ with\ Keras.ipynb
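The model trained in the notebook consumes 28×28 grayscale MNIST images with pixel values scaled to [0, 1]. TensorFlow is not needed to illustrate that input format; below is a NumPy-only sketch of the preprocessing any submitted image must go through (the helper name and exact shapes are assumptions based on standard Keras MNIST models):

```python
import numpy as np

def preprocess(img: np.ndarray) -> np.ndarray:
    """Turn a 2-D uint8 grayscale digit into a model-ready batch."""
    x = img.astype("float32") / 255.0  # scale pixel values to [0, 1]
    return x.reshape(1, 28, 28, 1)     # add batch and channel dimensions

# a blank 28x28 "image", just to show the resulting shape
batch = preprocess(np.zeros((28, 28), dtype=np.uint8))
print(batch.shape)  # (1, 28, 28, 1)
```

An array of this shape is what a Keras model's predict call would receive for a single digit.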
The steps and commands to run the Deep Learning model on the Flask server with Docker Compose are described below.
Before executing Docker Compose, it is strongly recommended to close other applications to free up resources and ports and avoid potential issues. Then Docker Compose can be executed to build the services.
~/deeplearning_flask$ docker compose build
The next step consists of executing the Docker Compose up command.
~/deeplearning_flask$ docker compose up
If everything goes well, at the end something similar to the following should appear:
...
...
web_1 | 2020-06-04 19:30:17.818273: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
There are different ways to check that the server is running properly. One is to open a web browser such as Chrome or Firefox and paste in the following URL:
http://127.0.0.1/
The web browser should show the text "Deep Learning on Flask".
REST API can be tested with pytest or curl.
The tests of the Flask microservice, created with pytest, can be executed from inside the Flask Docker container using the Makefile:
~/deeplearning_flask$ docker exec -it deeplearning_flask-web-1 /bin/bash
~/app# make test
...
============================= test session starts ==============================
platform linux -- Python 3.12.8, pytest-8.3.4, pluggy-1.5.0
rootdir: /app/tests
collected 2 items
test_app.py .. [100%]
Those tests are also executed automatically by the CI/CD workflows implemented with GitHub Actions on every push and pull request in the project repository.
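The actual workflow lives in .github/workflows/ci_tests.yml; a minimal sketch of what such a workflow typically contains (step names, action versions and paths are assumptions) is:

```yaml
name: CI tests
on: [push, pull_request]
jobs:
  tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: python -m pip install -r requirements_dev.txt
      - run: python -m pytest app/tests
```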
A POST example using curl from outside the Docker container is shown below:
~/deeplearning_flask$ curl -F file=@app/app/static/4.jpg -X POST 'http://127.0.0.1/api/predictlabel' | json_pp
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 11650 100 489 100 11161 321 7347 0:00:01 0:00:01 --:--:-- 7664
{
"most_probable_label" : "4",
"predictions" : [
{
"label" : "0",
"probability" : "8.270098e-08"
},
{
"label" : "1",
"probability" : "0.00016669065"
},
{
"label" : "2",
"probability" : "4.821898e-05"
},
{
"label" : "3",
"probability" : "2.3290573e-05"
},
{
"label" : "4",
"probability" : "0.99914443"
},
{
"label" : "5",
"probability" : "1.4074722e-06"
},
{
"label" : "6",
"probability" : "2.4940262e-05"
},
{
"label" : "7",
"probability" : "0.0004908524"
},
{
"label" : "8",
"probability" : "4.4384862e-05"
},
{
"label" : "9",
"probability" : "5.569217e-05"
}
],
"success" : true
}
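Note that `most_probable_label` is simply the label whose probability is highest in `predictions`, so a client can recompute or sanity-check it from the response. A small stdlib-only sketch (the function name is hypothetical; probability values arrive as strings, as shown above):

```python
def most_probable(predictions):
    """Return the label with the highest probability from the API response."""
    best = max(predictions, key=lambda p: float(p["probability"]))
    return best["label"]

# trimmed sample taken from the curl response above
sample = [
    {"label": "1", "probability": "0.00016669065"},
    {"label": "4", "probability": "0.99914443"},
    {"label": "7", "probability": "0.0004908524"},
]
print(most_probable(sample))  # 4
```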
author: alvertogit copyright: 2018-2025