This repository contains the backend application for the project's room acoustics user interface, implemented in Python using the Flask framework. It has currently been tested only in the development environment; support for Docker and deployment to production environments is planned. The application uses SQLite3 as the default database, with PostgreSQL available as an alternative.
- Technology
- Requirements
- Environments
- Flask Commands
- Database Commands
- Swagger
- Reference
- Contribution
- Operating System: Windows, Ubuntu
- Web Framework: Flask
- ORM: Flask-SQLAlchemy
- Swagger: Swagger-UI
- Serialization: Marshmallow
- Deserialization: Marshmallow
- Validation: Marshmallow
- Database migrations: Flask-Migrate
- Environment manager: Anaconda/Miniconda
- Containerization: Docker, docker-compose
- Database: PostgreSQL, SQLite3
- Python WSGI HTTP Server: Gunicorn (env specification)
- Proxy: Nginx
- Tests: Under Planning
- Deployment platform: Under planning for AWS
- CI/CD: Under planning for GitHub Actions
- Job queue: Celery
- Python
- Anaconda/Miniconda
- Docker
- Docker-Compose
- Other requirements, such as the Python libraries listed in requirements.txt
The development environment uses SQLite3 or PostgreSQL locally and runs the Flask server in debug mode. You can customize the environment variables in the corresponding .env file.
0. Set up Celery
celery -A app.celery worker --loglevel=info -P eventlet
- Create environment and install packages
In general, conda is used to manage the virtual environment and package installation; however, you can use other tools to install the Python dependencies as well.
conda create -n NAME_OF_VENV python=3.10
conda activate NAME_OF_VENV
pip install -r requirements.txt
- Create a PostgreSQL database on Linux [Ubuntu] (optional)
# Install PostgreSQL
sudo apt-get install postgresql-12
# Access PostgreSQL
sudo -u postgres psql
# Create user and password
CREATE USER db_user WITH PASSWORD 'db_password';
# Create Database dev
CREATE DATABASE db_dev;
# Grant the user privileges on the database
GRANT ALL PRIVILEGES ON DATABASE db_dev TO db_user;
Note: remember to change the default env configuration to switch from SQLite to PostgreSQL.
Note: if you are using a Windows machine, you can simply download and install PostgreSQL from its website. During installation, make a note of the superuser password.
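If you want to verify the database and credentials before wiring them into the app, a quick standalone check with SQLAlchemy is enough. This is a minimal sketch, assuming the db_user / db_password / db_dev values created above and that a PostgreSQL driver such as psycopg2-binary is installed:

```python
# check_db.py - quick connectivity check (illustrative; not part of the app)
from sqlalchemy import create_engine, text

# Assumes the user and database created above; adjust if you used different names.
engine = create_engine("postgresql+psycopg2://db_user:db_password@localhost/db_dev")

with engine.connect() as conn:
    # Prints the PostgreSQL server version if the connection works
    print(conn.execute(text("SELECT version()")).scalar())
```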
- Create or update the .env file
# APP configuration
APP_NAME=Flask API Rest Template
APP_ENV=develop
# Flask Configuration
FLASK_APP=app:app
FLASK_DEBUG=true
APP_SETTINGS_MODULE=config.DevelopConfig
APP_TEST_SETTINGS_MODULE=config.TestingConfig
FLASK_RUN_HOST=192.168.0.104
FLASK_RUN_PORT=5000
# Database service configuration
DATABASE_URL=postgresql://db_user:db_password@localhost/db_dev
DATABASE_TEST_URL=postgresql://db_user:db_password@localhost/db_test
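For reference, the config classes named in APP_SETTINGS_MODULE and APP_TEST_SETTINGS_MODULE map these variables onto Flask settings. The real classes live in config.py in this repository and may differ; the following is only a hedged sketch of what DevelopConfig and TestingConfig could look like, assuming python-dotenv loads the .env file:

```python
# config.py (illustrative sketch only; the repository's real classes may differ)
import os

from dotenv import load_dotenv

load_dotenv()  # read the .env file into os.environ


class BaseConfig:
    SECRET_KEY = os.environ.get("SECRET_KEY", "change-me")
    SQLALCHEMY_TRACK_MODIFICATIONS = False


class DevelopConfig(BaseConfig):
    DEBUG = True
    # Falls back to SQLite when DATABASE_URL is not set
    SQLALCHEMY_DATABASE_URI = os.environ.get("DATABASE_URL", "sqlite:///develop.db")


class TestingConfig(BaseConfig):
    TESTING = True
    SQLALCHEMY_DATABASE_URI = os.environ.get("DATABASE_TEST_URL", "sqlite:///test.db")
```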
- Run the application
Once you are done with the above steps, you are ready to run the application.
# Create database
flask create-db
# Run a development server
flask run
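The create-db command is a custom Flask CLI command defined in the application; the sketch below shows one common way such a command can be registered, assuming Flask-SQLAlchemy (the actual implementation in this repository may differ):

```python
# Illustrative only: registering a create-db command with the Flask CLI
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()
app = Flask(__name__)
app.config.from_object("config.DevelopConfig")
db.init_app(app)


@app.cli.command("create-db")
def create_db():
    """Create all tables known to the SQLAlchemy metadata."""
    # Commands registered via app.cli run inside an application context
    db.create_all()
```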
Some of the planned commands:
- Run all tests
flask tests
- Run coverage
flask coverage
- Run coverage report
flask coverage_report
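These commands do not exist yet. As a rough, hedged sketch, they could be wired up as Flask CLI commands that shell out to pytest and coverage.py; the tool names here are assumptions, not the project's decided design:

```python
# Illustrative sketch only: possible wiring for the planned commands
import subprocess

from flask import Flask

app = Flask(__name__)


@app.cli.command("tests")
def run_tests():
    """Run the test suite (assumes pytest)."""
    subprocess.run(["pytest", "-v"], check=False)


@app.cli.command("coverage")
def run_coverage():
    """Run the tests under coverage (assumes coverage.py)."""
    subprocess.run(["coverage", "run", "-m", "pytest"], check=False)


@app.cli.command("coverage_report")
def coverage_report():
    """Print the coverage report."""
    subprocess.run(["coverage", "report"], check=False)
```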
The services are containerized separately with Docker and docker-compose: a PostgreSQL database (db), the API (api), and an Nginx reverse proxy (nginx).
- Create the .env.api.local and .env.db.local files
- .env.api.local:
# APP configuration
APP_NAME=[Name APP] # For example Flask API Rest Template
APP_ENV=local
# Flask configuration
API_ENTRYPOINT=app:app
APP_SETTINGS_MODULE=config.LocalConfig
APP_TEST_SETTINGS_MODULE=config.TestingConfig
# API service configuration
API_HOST=<api_host> # For example 0.0.0.0
API_PORT=<port_api> # For example 5000
# Database service configuration
DATABASE=postgres
DB_HOST=<name_container_bbdd> # For example db_service (name service in docker-compose)
DB_PORT=<port_container_bbdd> # For example 5432 (port service in docker-compose)
POSTGRES_DB=<name_database> # For example db_dev
POSTGRES_USER=<name_user> # For example db_user
PGPASSWORD=<password_user> # For example db_password
# Secret key
SECRET_KEY=<your-secret-key>
JWT_SECRET_KEY=<your-jwt-secret-key>
DATABASE_TEST_URL=<url database test> # For example postgresql+psycopg2://db_user:db_password@db_service:5432/db_test
DATABASE_URL=<url database> # For example postgresql+psycopg2://db_user:db_password@db_service:5432/db_dev
- .env.db.local:
POSTGRES_USER=<name_user> # For example db_user
POSTGRES_PASSWORD=<password> # For example db_password
POSTGRES_DB=<name_DB> # For example db_dev
1. Build and run services:
docker-compose up --build
2. Stop services:
docker-compose stop
3. Delete services:
docker-compose down
4. Remove services (removing volumes):
docker-compose down -v
5. Remove services (removing volumes and images):
docker-compose down -v --rmi all
6. View services:
docker-compose ps
NOTE: The REST API defaults to host localhost and port 80.
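Inside the api container the database is usually not ready the instant the service starts. Whether this repository already handles that in an entrypoint script is not shown here; the snippet below is only a hedged sketch of a small wait-for-Postgres helper that could run before Gunicorn starts, using the DB_HOST/DB_PORT variables from .env.api.local:

```python
# wait_for_db.py - illustrative helper, not necessarily part of this repo
import os
import socket
import time

host = os.environ.get("DB_HOST", "db_service")
port = int(os.environ.get("DB_PORT", "5432"))

# Retry a plain TCP connection until PostgreSQL accepts it
while True:
    try:
        with socket.create_connection((host, port), timeout=2):
            print(f"PostgreSQL is reachable at {host}:{port}")
            break
    except OSError:
        print("Waiting for PostgreSQL...")
        time.sleep(1)
```

A container entrypoint could call this script first and then start Gunicorn once it exits.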
Apply CI/CD with GitHub Actions to automatically deploy to the AWS platform, using EC2 and RDS PostgreSQL.
- Create the file .env.pro and enter the environment variables needed for production. For example:
# APP configuration
APP_NAME=Flask API Rest Template
APP_ENV=production
# Flask configuration
API_ENTRYPOINT=app:app
APP_SETTINGS_MODULE=config.ProductionConfig
# API service configuration
API_HOST=<api_host> # For example 0.0.0.0
# Secret key
SECRET_KEY=<your-secret-key>
JWT_SECRET_KEY=<your-jwt-secret-key>
# Database service configuration
DATABASE_URL=<url_database> # For example sqlite:///production.db
# Deploy platform
PLATFORM_DEPLOY=AWS
You can run the following commands in order to use specific features of the application. Note that all of these commands are also run in the pipeline on every commit, but it is recommended to run them locally too, most notably PEP8 verification, import sorting, and the test cases, as committing without satisfying these tools will cause the pipeline to fail.
You can check whether your code adheres to the PEP8 code style standard by first changing into the painting_explorer folder:
cd painting_explorer
and then running the following:
flake8
If you want to write the output of this to file as well, you can run the following:
flake8 --output-file ../reports/flake8/flake8stats.txt
To sort all of your imported Python packages, similarly make sure you are in the painting_explorer folder, and then run the following:
isort .
If you just want to check what changes isort would make, you can run the following:
isort . --check --diff
For code coverage as well, make sure you are in the painting_explorer folder. You can then run the following command to first erase any previous coverage reports:
coverage erase --data-file=../reports/coverage/.coverage
Next, calculate the new coverage using:
coverage run --source='.' --data-file=../reports/coverage/.coverage manage.py test
To print a report of this coverage, run:
coverage report --data-file=../reports/coverage/.coverage
Lastly, if you want to generate a report in XML format too, you can use this:
coverage xml --data-file=../reports/coverage/.coverage -o ../reports/coverage/coverage.xml
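For the coverage run to report anything, there has to be at least one test for the test command to discover. The module below is only a hedged, minimal placeholder example; it is not taken from the repository:

```python
# tests/test_smoke.py - illustrative placeholder test
import unittest


class SmokeTest(unittest.TestCase):
    def test_truth(self):
        """A trivial check so the coverage commands have something to run."""
        self.assertTrue(True)


if __name__ == "__main__":
    unittest.main()
```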
Install the git hook scripts: run pre-commit install to set up the git hook scripts, then run: pre-commit
Start the Celery worker:
celery -A app.celery worker --loglevel=info -P eventlet
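The worker command above points Celery at app.celery, i.e. a Celery instance exposed by the Flask app module. The repository's own setup may differ; this is only a hedged sketch of a common pattern for creating that instance (the Redis broker URL is an assumption):

```python
# app.py (illustrative sketch of how app.celery could be exposed)
from celery import Celery
from flask import Flask

app = Flask(__name__)
app.config.setdefault("CELERY_BROKER_URL", "redis://localhost:6379/0")  # assumed broker

# Create the Celery instance that `-A app.celery` refers to
celery = Celery(app.import_name, broker=app.config["CELERY_BROKER_URL"])
celery.conf.update(app.config)


@celery.task
def ping():
    """Trivial example task."""
    return "pong"
```

Running the worker with -P eventlet additionally requires the eventlet package to be installed.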
- Create all tables in the database:
flask create_db
- Delete all tables in the database:
flask drop_db
- Reset the database:
flask reset-db
- Run tests with coverage, without an HTML report:
flask cov
- Run tests with coverage, with an HTML report:
flask cov-html
- Create a migration repository:
flask db init
- Generate a migration version:
flask db migrate -m "Init"
- Apply migrations to the database:
flask db upgrade
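The db init / migrate / upgrade commands come from Flask-Migrate (Alembic) and operate on whatever models are registered with SQLAlchemy. As a hedged illustration only, a minimal model that a first migration could pick up might look like this; the Room model is invented for the example, not taken from the repository:

```python
# Illustrative model and Flask-Migrate wiring; the real models live in this repo
from flask import Flask
from flask_migrate import Migrate
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///develop.db"

db = SQLAlchemy(app)
migrate = Migrate(app, db)  # enables the `flask db ...` commands


class Room(db.Model):
    """Hypothetical room record, used only to illustrate a migration."""
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(120), nullable=False)
    reverberation_time = db.Column(db.Float)  # seconds
```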
Once the server is running, the Swagger UI is available at http://localhost:<port>/swagger-ui
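How the UI is mounted is an implementation detail of this repository; one common approach, shown here only as a hedged sketch, uses the flask-swagger-ui package to serve the UI at /swagger-ui from an OpenAPI spec (the spec path below is an assumption):

```python
# Illustrative sketch using flask-swagger-ui; the repo may mount Swagger differently
from flask import Flask
from flask_swagger_ui import get_swaggerui_blueprint

app = Flask(__name__)

SWAGGER_URL = "/swagger-ui"             # path from the URL above
API_SPEC_URL = "/static/swagger.json"   # assumed location of the OpenAPI spec

swaggerui_blueprint = get_swaggerui_blueprint(
    SWAGGER_URL,
    API_SPEC_URL,
    config={"app_name": "Flask API Rest Template"},
)
app.register_blueprint(swaggerui_blueprint, url_prefix=SWAGGER_URL)
```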
Feel free to make any suggestions or improvements to the project.
- Initialize the documentation skeleton: sphinx-quickstart docs
- Build the HTML documentation: sphinx-build -M html docs/source/ docs/build/