For this project you must have Docker and docker-compose installed.
If you don't have Poetry installed yet, just execute:
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -
If anything goes wrong, check the docs.
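You can confirm the installation worked by checking the version:
poetry --version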
If you have Python 3.11 on your machine, just run the command below to create a new env for this project.
poetry env use python3.11
You may check the other envs by running:
poetry env list
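The output will look something like the line below; the exact environment name depends on your project name and Python version:
my-project-py3.11 (Activated)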
If you prefer to work the old way, with requirements.txt, you may export it with:
poetry export --without-hashes -o requirements.txt
To also include the dev dependencies in the export:
poetry export --without-hashes -o requirements-dev.txt --with dev
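If you go that route, the exported file can be installed with pip inside a plain virtual environment, for example:
python3.11 -m venv .venv
source .venv/bin/activate
pip install -r requirements-dev.txt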
To install the project dependencies, all you need to do is run:
poetry install
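Once installed, run commands through the project environment with poetry run (or open a shell in it with poetry shell), for example:
poetry run python -V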
First, you have to set the Airflow user ID and group ID by running:
echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" >> .env
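After running it, the .env file should contain something like the following (1000 is just an example; the actual value comes from your user ID):
AIRFLOW_UID=1000
AIRFLOW_GID=0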
Then, initialize the database by running:
docker-compose up initdb
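You can follow the initialization output (assuming the service is named initdb, as in the command above) with:
docker-compose logs -f initdb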
All configuration parameters are set in the .env file.
Finally, to bring up all services, run:
docker-compose up -d
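To check that all containers are up and healthy, run:
docker-compose ps
and, if needed, tail the logs of a single service (the service name below is just an example; use the names from your docker-compose.yml):
docker-compose logs -f webserver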
Access the metrics here.
If you want to add DAGs, all you have to do is mount a volume with your DAGs under the x-dags-and-logs anchor in the YAML file. After that, just update the DAG_FOLDERS environment variable under the x-airflow-env anchor so Airflow knows where to find the new DAGs.
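As a rough sketch of what that might look like in docker-compose.yml; the anchor names come from the text above, but the paths, the DAG_FOLDERS format and the service shown are only illustrative and depend on your actual compose file:

x-dags-and-logs: &dags-and-logs
  - ./dags:/opt/airflow/dags                  # DAGs already shipped with the project
  - ./my-extra-dags:/opt/airflow/extra-dags   # example mount for your own DAGs
  - ./logs:/opt/airflow/logs

x-airflow-env: &airflow-env
  DAG_FOLDERS: /opt/airflow/dags,/opt/airflow/extra-dags  # assumed comma-separated list

services:
  webserver:
    volumes: *dags-and-logs
    environment: *airflow-env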