Channel: DELIVERBI Blog GCP Google | BIG DATA | OBIEE | OBIA | Oracle | Trino

Apache Airflow Check Previous Run Status for a DAG



The Problem

We encountered a scenario where, if any previous DAG run has failed, the next scheduled DAG run should not proceed. We searched for a solution and did not find a robust one, so I created one together with Shahed Munir.

The Solution

I came up with a very generic approach that can be applied to any DAG created within Apache Airflow. Below is a sample DAG that applies this concept in practice.

To enable checking any of the previous DAG runs (a self-reference), the first task needs to be created along the lines of the sample DAG in GitHub: https://github.com/deliverbi/Airflow_Code.git

This solution relies on an Airflow Connection to the MySQL database used to store the Airflow metadata. In the sample DAG code this Airflow connection is called deliverbi_mysql_airflow.
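The gating logic can be sketched in plain Python, independent of the operator that wraps it. This is a minimal sketch, not the exact code from the repo: the SQL string and the function names below are illustrative, and in the real DAG the query would run against the metadata database through the deliverbi_mysql_airflow connection (for example via Airflow's MySqlHook).

```python
# Illustrative sketch of the "check previous runs" gate. The check itself is
# shown as a pure function so the logic is easy to follow and test; the full
# task definition lives in the DeliverBI GitHub repo.

# Query the Airflow metadata table that records every DAG run and its final
# state, excluding the run that is currently executing. (Parameter style
# assumes a MySQL client such as MySQLdb/mysqlclient.)
PREVIOUS_RUN_STATES_SQL = """
    SELECT state
    FROM dag_run
    WHERE dag_id = %(dag_id)s
      AND execution_date < %(execution_date)s
"""

def previous_runs_all_successful(states):
    """Return True only when every previous DAG run finished in 'success'."""
    return all(state == "success" for state in states)

def gate_on_previous_runs(states):
    """Raise to fail this task (and so halt the DAG) if any prior run failed."""
    if not previous_runs_all_successful(states):
        raise RuntimeError("A previous DAG run is not in 'success'; aborting.")
```

Making this the first task means a single failed or still-running historical run fails the gate, so no downstream task of the new run starts until the operator sets the old runs back to success.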

Then place the sample DAG in your Airflow dags directory.

Notes

All previous DAG runs must be in the success state for the current DAG run to proceed. This check should be the first task in any DAG that needs this functionality.

Here is the link to the DeliverBI GitHub repo: https://github.com/deliverbi/Airflow_Code.git

Over and Out. Krishna

