How to Run your first Airflow DAG in Docker

We have already seen how to start running Airflow in Docker; in this post we will provide an example of how you can run a DAG in Docker. We assume that you have already followed the steps of running Airflow in Docker and that you are ready to run the compose file.

Run the docker-compose

The first thing that you need to do is to set your working directory to your airflow directory, which most probably contains the following folders.

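If you followed the official docker-compose setup, the directory should look roughly like the sketch below (the exact files depend on the version of docker-compose.yaml you downloaded):

airflow/
├── dags/
├── logs/
├── plugins/
├── docker-compose.yaml
└── .env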

At this point, you can run the docker-compose command.

docker-compose up
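If you prefer to keep this terminal free, you can also start the services in the background with detached mode (a standard docker-compose flag):

docker-compose up -d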

Now, you should be able to go to http://localhost:8080/home and see the Airflow UI. Note that both the default username and password are airflow.


You can open a new terminal and run the command docker ps to see the running containers.

docker ps
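If you have many containers running, you can narrow the listing down to the Airflow ones, for example (assuming the container names contain "airflow", as they do with the default compose file):

docker ps --filter "name=airflow"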

Build your Python Operator DAG

Bear in mind that we have used Docker volumes, which means that whatever we write in the mounted folders on our local machine is also visible inside the containers. The folders that we have mounted are dags, logs and plugins. So we can go to the dags folder on our local computer and create a new .py file, which we will call my_test_dag.py.
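For reference, these mounts come from the volumes section of docker-compose.yaml; in the official file it looks roughly like this (paths may vary slightly between Airflow versions):

volumes:
  - ./dags:/opt/airflow/dags
  - ./logs:/opt/airflow/logs
  - ./plugins:/opt/airflow/plugins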


This DAG is simple and is for demonstration purposes. All it does is print “I am coming first” and then “I am coming last”, just to show the order of the tasks. The my_test_dag.py is the following:

from airflow.models import DAG
from airflow.utils.dates import days_ago
from airflow.operators.python_operator import PythonOperator

# Default arguments applied to every task in the DAG
args = {
    'owner': 'pipis',
    'start_date': days_ago(1)
}

# schedule_interval=None means the DAG runs only when triggered manually
dag = DAG(dag_id='my_sample_dag', default_args=args, schedule_interval=None)


def run_this_func():
    print('I am coming first')


def run_also_this_func():
    print('I am coming last')


with dag:
    run_this_task = PythonOperator(
        task_id='run_this_first',
        python_callable=run_this_func
    )

    run_this_task_too = PythonOperator(
        task_id='run_this_last',
        python_callable=run_also_this_func
    )

    # run_this_first must finish before run_this_last starts
    run_this_task >> run_this_task_too
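Note that in Airflow 2.x the airflow.operators.python_operator import path still works but is deprecated; if you are on a recent version you can use the newer module instead (only the import changes, the rest of the DAG stays the same):

from airflow.operators.python import PythonOperator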

Check that your DAG is in Docker

Thanks to the Docker volumes, we should be able to see the new DAG (my_sample_dag, defined in my_test_dag.py) in the UI.
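If the DAG does not appear after a minute or so, you can check whether Airflow failed to parse the file. Assuming Airflow 2.1+ and a shell inside one of the containers (see the last section of this post), something like the following should report any import errors:

airflow dags list-import-errors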


Run your DAG

In order to run your DAG, you need to “unpause” it.


Then, click on the DAG and press the play button to trigger it:
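If you prefer the command line, the same two actions (unpausing and triggering) can also be done from inside one of the Airflow containers, as described in the last section; a sketch, assuming the Airflow 2.x CLI:

airflow dags unpause my_sample_dag
airflow dags trigger my_sample_dag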


Once you trigger it, it will run and you will get the status of each task.


The dark green color means success. We can click on each green circle and rectangle to get more details. Let’s click on run_this_first and go to the logs.


As we can see in the logs, there is the print statement “I am coming first” for the first task, and similarly you will see the other print statement for the second task. Great, you ran your first DAG with Docker!

Interact with Docker Environment

In case you want to get access to the Airflow environment, follow these steps (a one-step shortcut is shown after the list).

  • Get the container ID of the airflow-docker_airflow-worker_1 container
  • Run the command docker exec -it <container id> bash
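If you do not want to copy the container ID by hand, you can combine the two steps; a sketch, assuming the worker container name contains the word "worker":

docker exec -it $(docker ps -qf "name=worker") bash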

And now you are inside the Airflow Docker environment. For example, let’s look at the folders.


Let’s list all the DAGs.
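With the Airflow 2.x CLI this is done with the command below (on Airflow 1.10 the equivalent was airflow list_dags):

airflow dags list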


As we can see, the my_test_dag is there. Finally, you can exit the container shell by typing exit, and you can shut down Docker by running docker-compose down.
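If you also want to remove the volumes created by the compose file (including the metadata database) and start completely fresh next time, you can pass the --volumes flag:

docker-compose down --volumes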

