Newly launched: Airflow integration!
Discover our new integration with Apache Airflow. Author, schedule, and monitor workflows programmatically with our latest release.
By popular demand, we’ve released our latest integration with Apache Airflow, the open-source platform for programmatically authoring, scheduling, and monitoring workflows. This integration lets you track and manage your Airflow pipelines alongside the rest of your data assets in decube.
Centralizing Metadata in the Catalog
Connecting your Airflow to decube allows us to automatically add all your Airflow DAGs into our Catalog, giving you a central repository for all your organization’s data assets. This helps your team search for and discover relevant workflows, and understand the relationships between different workflows and data assets.
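To make this concrete, here’s a minimal sketch of the kind of Airflow DAG that would be picked up and added to the Catalog once connected. The DAG id, schedule, and task logic below are hypothetical placeholders; any DAG in your connected Airflow instance is ingested the same way.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pulling source data")


def transform():
    print("cleaning and joining")


def load():
    print("writing to the warehouse")


# Hypothetical DAG: the id, schedule, and tasks are placeholders.
with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Task dependencies; each task shows up individually in decube.
    extract_task >> transform_task >> load_task
```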
From the Catalog page, you can get a high-level overview of each Airflow job: its schedule, the status of its most recent run, and when it last ran.
Navigating into the Asset Details page, the Overview tab gives you a full picture of each task within your DAG, along with the status of its last 25 runs. It’s a great way to see on one page which runs succeeded, were skipped, or failed, without logging in to your Airflow account.
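For comparison, this is the same run-state information Airflow exposes through its stable REST API (Airflow 2.x). The sketch below, with a placeholder host, credentials, and DAG id, shows what fetching the last 25 runs yourself would look like; decube collects and displays it for you.

```python
import requests

# Placeholders: point these at your own Airflow instance.
AIRFLOW_HOST = "http://localhost:8080"
DAG_ID = "daily_sales_pipeline"

# Fetch the 25 most recent runs of a DAG via Airflow's stable REST API
# (requires the API and an auth backend, e.g. basic auth, to be enabled).
resp = requests.get(
    f"{AIRFLOW_HOST}/api/v1/dags/{DAG_ID}/dagRuns",
    params={"limit": 25, "order_by": "-execution_date"},
    auth=("airflow", "airflow"),  # placeholder credentials
)
resp.raise_for_status()

for run in resp.json()["dag_runs"]:
    # Each run reports a state such as "success", "failed", or "running".
    print(run["dag_run_id"], run["state"])
```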
Within each Data Job, you’ll also be able to see the definition of the DAG directly within decube itself.
Monitoring your Airflow Jobs
Upon connecting your Airflow to decube, pipeline monitoring is automatically enabled. If there’s an issue with any of your Airflow jobs, we’ll immediately raise an alert in the form of an incident of type “Job Failure”.
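As a hedged illustration of what triggers this: a DAG run fails in Airflow when one of its tasks raises an unhandled exception, and that failed run state is what surfaces in decube as a “Job Failure” incident. Everything in the snippet below is hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_report():
    # Simulated failure: an unhandled exception fails the task, and with
    # it the DAG run, which is what raises a "Job Failure" incident.
    raise RuntimeError("upstream file missing")


with DAG(
    dag_id="nightly_export",  # hypothetical name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_report", python_callable=load_report)
```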
Let us know whether you’d like to be notified via Slack or via the email you’ve authorized on decube’s My Account page by turning on the Notify toggle.
Set up pipeline owners
Monitoring is only half the story: alerts aren’t much use if no one is accountable for fixing issues or keeping information updated.
One of the most important features within decube is the ability to assign ownership to your pipelines. This makes it easy to identify who is responsible for each pipeline and ensures that information on the pipeline, such as metadata and documentation, is up-to-date and easily accessible to others in your team. It’s also a way to quickly see who is responsible for resolving issues, so that problems are dealt with in a timely manner.
If you’re interested in setting up the Airflow integration today, check out our docs here.
What’s next?
We’re working on integrations for more tools across the modern data stack, so stay tuned to our Public Roadmap. If you have a connector to suggest, just reach out to us via the Live Chat!