This section provides a high-level overview of the project, including its objectives, target audience, and key features.
- Create a real-time leaderboard of the top TFT players on the EUW server
- Gather the last 10 games for each top player for further data analysis
- Develop a data pipeline using Docker, PostgreSQL, Airflow, and Python
- Docker: Used for containerization and managing the infrastructure of the project.
- Postgres: The chosen relational database for storing both the tools' metadata and the game data.
- Airflow: Utilized for scheduling and orchestrating the data pipelines, ensuring seamless and timely execution.
- Python: The primary language for developing data pipelines, handling API requests, and performing data analysis.
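The extraction step implied by the objectives above can be sketched roughly as follows. This is only an illustration, not the project's actual code: the endpoint paths are based on the public Riot Games API (TFT-LEAGUE-V1 and TFT-MATCH-V1), and the `RIOT_API_KEY` environment variable name is an assumption.

```python
import os

# Platform host for the EUW server (assumption: the project targets EUW1).
BASE_URL = "https://euw1.api.riotgames.com"

def challenger_league_url() -> str:
    """URL for the TFT Challenger leaderboard (the top players on the server)."""
    return f"{BASE_URL}/tft/league/v1/challenger"

def recent_matches_url(puuid: str, count: int = 10) -> str:
    """URL for a player's most recent TFT match IDs (default: last 10 games)."""
    # Match data is served from a regional host rather than the platform host.
    region_url = "https://europe.api.riotgames.com"
    return f"{region_url}/tft/match/v1/matches/by-puuid/{puuid}/ids?count={count}"

def auth_headers() -> dict:
    """Riot expects the API key in the X-Riot-Token request header."""
    return {"X-Riot-Token": os.environ["RIOT_API_KEY"]}
```

An Airflow task could call these URLs with `requests` on a schedule and load the responses into Postgres for analysis.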
To use the template, you will need the following:
- git
- A GitHub account
- Docker with at least 4GB of RAM and Docker Compose v1.27.0 or later
Setup Docker Environment:
- To set up the environment with Docker, run `docker-compose up -d`. Once it finishes, run `docker ps` to check that the containers are up and running.
- Make sure to create your `.env` file with the required variables before running the scripts.
- The `init.sql` file runs the first time the database is created and grants the Metabase and Airflow databases the required privileges (yes, we migrate the tools' SQLite databases to Postgres).
- The `docker-compose` file brings up all the services: Metabase, Airflow, and Postgres.
Accessing the tools:
- Airflow
- Snowflake / Postgres:
Since visualizing data in the terminal with Postgres is a bit messy, I'd just rather stick with Snowflake, since it's more than enough.
- Metabase (Data Visualization)
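If you prefer querying the dockerized Postgres programmatically rather than from the terminal, a minimal connection sketch could look like this. The defaults and the `matches` table are assumptions for illustration; match them to the services defined in the `docker-compose` file:

```python
import os

def postgres_dsn(host: str = "localhost", port: int = 5432,
                 db: str = "postgres", user: str = "postgres") -> str:
    """Build a libpq-style connection string for the dockerized Postgres.

    All defaults here are hypothetical -- align them with docker-compose.yml.
    """
    password = os.environ.get("POSTGRES_PASSWORD", "postgres")
    return f"host={host} port={port} dbname={db} user={user} password={password}"

# With psycopg2 installed, a query session would look like:
# import psycopg2
# with psycopg2.connect(postgres_dsn()) as conn:
#     with conn.cursor() as cur:
#         cur.execute("SELECT COUNT(*) FROM matches;")  # hypothetical table
#         print(cur.fetchone())
```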
Browse Data: