Apache Airflow is a powerful workflow orchestration tool used for scheduling, monitoring, and managing complex workflows. Read my previous blog on the Fundamentals of Apache Airflow. Installing Airflow can be tricky because of its many dependencies, but Docker simplifies the process by letting you set up Airflow in a containerized environment, ensuring a smooth and consistent deployment.
In this guide, you’ll learn how to install Apache Airflow using Docker step by step. With Docker, you can:
- Run Airflow without worrying about system dependencies
- Easily manage and scale your workflows
- Ensure a clean and isolated Airflow environment
By the end of this guide, you’ll have a fully functional Apache Airflow instance running on Docker, ready to orchestrate your workflows!
Prerequisites
Before we begin, make sure you have the following installed:
- Docker Desktop → Download Docker
- Visual Studio Code → Download Visual Studio Code
Important: Docker requires administrator privileges to function correctly. Ensure you have the necessary permissions before proceeding.
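Before moving on, it is worth confirming that Docker is actually installed and reachable from your terminal. A quick sanity check (the fallback messages are only illustrative):

```shell
# Confirm Docker and Docker Compose are installed and on the PATH.
# If either command is missing, (re)install Docker Desktop first.
docker --version || echo "docker not found - install Docker Desktop"
docker compose version || echo "docker compose not found - install Docker Desktop"
```

If both commands print a version number, you are ready to proceed.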
Step-by-Step Installation Guide
1. Create a Working Directory
- Open your Documents folder.
- Create a new folder named “airflow”.
2. Download the Docker Compose File
- Open a Command Prompt and navigate into the airflow folder.
- Download the docker-compose.yaml file into the airflow folder:
curl "https://airflow.apache.org/docs/apache-airflow/2.10.4/docker-compose.yaml" -o docker-compose.yaml
or open the above URL directly in your browser and download the file, but make sure it is saved as docker-compose.yaml and not as docker-compose.yaml.txt.
3. Set Up Your Environment File
- Open Command Prompt and navigate to the airflow folder:
cd ~/Documents/airflow
- Open Visual Studio Code from this folder by running:
code .
- Inside VS Code, right-click in the Explorer pane below docker-compose.yaml and create a new file named .env (don’t forget the dot before env).
- Open the .env file and add the following lines (check the docker-compose.yaml file for the values):
AIRFLOW_IMAGE_NAME=apache/airflow:2.10.4
AIRFLOW_UID=50000
- Save the file.
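A note on AIRFLOW_UID: on a Linux host, this value should match your own user ID so that files Airflow creates in mounted folders (dags/, logs/, plugins/) are owned by you rather than by root; 50000 is simply the image's default. You can check your ID like this:

```shell
# Print your host user ID; on Linux, put this number in .env as AIRFLOW_UID
# so files written into mounted volumes are owned by your user, not root
id -u
```

On Windows with Docker Desktop, the default of 50000 is fine as-is.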
4. Run Apache Airflow
- Open a new terminal in VS Code: Click Terminal in the top menu → Select New Terminal.
- Start Apache Airflow by running:
docker-compose up -d
- Wait for the installation to complete. This can take a few minutes, depending on your internet speed.
If you encounter a download error: Ensure you’re not behind a proxy, VPN, or corporate network that blocks Docker image downloads. If needed, switch to a personal internet connection.
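Once the command finishes, you can confirm that the containers came up before opening the UI (the echo fallback is only there so the line degrades gracefully on machines without Compose):

```shell
# All Airflow services should eventually show a "running" or "healthy" state;
# the airflow-init container is expected to exit once initialization completes
docker-compose ps || echo "docker-compose is not available on this machine"
```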
Success! You’ve installed Apache Airflow with Docker!
5. Access the Airflow Web UI
- Open your web browser.
- Go to: http://localhost:8080
Note: If your web UI is not loading, check if port 8080 is in use. If something else is running on port 8080, Airflow won’t load.
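A quick way to see whether another process has already claimed port 8080 (this uses lsof, available on Linux and macOS; on Windows, netstat -ano | findstr :8080 does the same job):

```shell
# Print any process listening on port 8080; if nothing shows up,
# the port is free and Airflow should be able to bind to it
lsof -i :8080 && echo "port 8080 is in use" || echo "port 8080 looks free"
```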
If you do NOT want example DAGs to populate the Airflow UI, update the following line in your docker-compose.yaml file (it ships set to 'true'):
AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
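For context, this setting lives in the environment block shared by the Airflow services in docker-compose.yaml. An abridged sketch of the relevant excerpt (the surrounding keys may differ slightly between Airflow versions):

```yaml
# Abridged excerpt from docker-compose.yaml
x-airflow-common:
  &airflow-common
  environment:
    AIRFLOW__CORE__LOAD_EXAMPLES: 'false'   # shipped as 'true' by default
```

Restart the containers (docker-compose down, then docker-compose up -d) for the change to take effect.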
Cheat sheet on docker-compose commands
1. Starting & Stopping
# Start all services (-d = detached mode)
docker-compose up -d
# Stop and remove containers and networks (named volumes are kept)
docker-compose down
# Stop and remove containers, networks, AND volumes (data is deleted)
docker-compose down --volumes
# Restart a specific service (e.g., web)
docker-compose restart web
2. Managing Services
# List running containers from the compose file
docker-compose ps
# View logs for all services (follow in real time)
docker-compose logs -f
# View logs for a specific service (e.g., db)
docker-compose logs -f db
# Open an interactive Bash shell inside a running service (e.g., web)
docker-compose exec web /bin/bash
# The same with plain docker, addressing the container by name
docker exec -it <container-name> /bin/bash
# Example: open a shell in the Postgres container, then start psql
docker exec -it airflow-postgres-1 /bin/bash
psql -U airflow
3. Building & Rebuilding
# Build/rebuild images (force a fresh build)
docker-compose build
# Rebuild and restart containers (e.g., after code changes)
docker-compose up -d --build
# Pull the latest images for services (e.g., from Docker Hub)
docker-compose pull
4. Configuration & Debugging
# Validate your docker-compose.yaml file
docker-compose config
# Remove stopped containers
docker-compose rm
# List all environment variables for a service
docker-compose run web env
# Use a custom compose file (e.g., docker-compose.prod.yaml)
docker-compose -f docker-compose.prod.yaml up -d
5. Cleanup
# Stop and remove containers, including orphans left over from deleted services
docker-compose down --remove-orphans
# Remove all unused volumes (irreversible!)
docker volume prune
Pro Tips
- Place environment variables (e.g., DB_PASSWORD=secret) in a .env file in the same folder as your docker-compose.yaml.
- Use depends_on in docker-compose.yaml to control startup order (e.g., database starts before app).
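A minimal, hypothetical sketch of both tips together — a password supplied via .env and depends_on controlling startup order (the service and image names here are placeholders, not from the Airflow compose file):

```yaml
# Illustrative docker-compose.yaml sketch (not the Airflow file)
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: ${DB_PASSWORD}   # read from the .env file
  app:
    image: my-app:latest                  # placeholder image name
    depends_on:
      - db                                # db is started before app
```

Note that depends_on only controls start order; it does not wait for the database to be ready to accept connections.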
Final Thoughts
You’ve successfully installed Apache Airflow using Docker! Now you can start creating and managing your workflows like a pro.
If you run into issues, feel free to check out the official Airflow documentation.