Abstract
Orchestrating data pipelines with data
Date
Apr 27, 2022
The fragmented modern data stack has emerged as the unbundling of Airflow, with the various tools operating in silos. Dagster, as a next-generation data orchestrator, lets you clearly see the data dependencies of the individual pipelines on your data factory floor. Following my blog post series about Dagster, I will cover:
- Getting started with Dagster and building simple data pipelines (see the first sketch below)
- How software-defined assets turn data pipelines around, shifting the focus from tasks to the data they produce, and lead to higher quality by integrating data quality tests directly into the pipelines and by separating business logic from infrastructure for better testability (see the second sketch below)
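To give a flavour of the getting-started part, here is a minimal sketch of a Dagster pipeline built from two ops wired into a job. The op names (`fetch_orders`, `total_revenue`) and the hard-coded data are illustrative placeholders, not code from the series:

```python
from dagster import job, op


@op
def fetch_orders():
    # Stand-in for an extract step; a real op would read from an API or warehouse.
    return [{"id": 1, "amount": 42.0}, {"id": 2, "amount": 13.5}]


@op
def total_revenue(orders):
    # Dagster derives the dependency on fetch_orders from how the job wires the ops.
    return sum(order["amount"] for order in orders)


@job
def orders_job():
    # The job defines the dependency graph: total_revenue consumes fetch_orders' output.
    total_revenue(fetch_orders())


if __name__ == "__main__":
    # Run the job in-process, e.g. during local development or in a test.
    result = orders_job.execute_in_process()
    print(result.output_for_node("total_revenue"))
```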
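For the software-defined-asset part, here is a minimal sketch of how assets can embed data quality checks while keeping business logic separate from orchestration. The asset names, the pandas-based logic, and the plain `assert` checks are assumptions for illustration, not the actual code from the series:

```python
import pandas as pd
from dagster import asset


def deduplicate(df: pd.DataFrame) -> pd.DataFrame:
    # Pure business logic: unit-testable without any Dagster machinery or infrastructure.
    return df.drop_duplicates(subset=["id"])


@asset
def raw_orders() -> pd.DataFrame:
    # Stand-in for a source asset; a real asset would load from a source system.
    return pd.DataFrame({"id": [1, 1, 2], "amount": [42.0, 42.0, 13.5]})


@asset
def clean_orders(raw_orders: pd.DataFrame) -> pd.DataFrame:
    # Dagster infers the dependency on raw_orders from the parameter name.
    df = deduplicate(raw_orders)
    # Data quality tests live directly in the pipeline: a failed check fails
    # the materialization, so bad data never reaches downstream assets.
    assert df["id"].is_unique, "duplicate order ids after deduplication"
    assert df["amount"].notna().all(), "null amounts in clean_orders"
    return df
```

Because `deduplicate` is a plain function, it can be unit-tested with an in-memory DataFrame, while the assets themselves can be materialized from the Dagit UI or, in a test, with `dagster.materialize([raw_orders, clean_orders])`.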
URLs for reference: