Data science and engineering are becoming an increasingly important part of application development. However, many data tools and platforms were designed to produce artifacts that were never intended to be operationalized. Dagster, by contrast, was built to support data teams as they develop and test, deploy and execute, and monitor and observe data applications. In this post, Nick Schrock compares Dagster to Airflow, a popular data workflow orchestrator. He describes how Dagster’s core abstractions (e.g., solids that model inputs and outputs, and a structured event log that records all computations) and design decisions (e.g., decoupling I/O from compute, and scheduling runs rather than tasks) enable better developer experiences, afford teams more flexibility, and lead to more resilient data applications.
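For readers new to Dagster, here is a minimal sketch of the solid abstraction mentioned above, written against Dagster's pre-1.0 API (where solids were the core unit of computation; they were later renamed ops). The solid and pipeline names are illustrative, not from the post itself:

```python
from dagster import solid, pipeline, execute_pipeline


@solid
def load_numbers(context):
    # A solid declares what it produces; here, a single output.
    return [1, 2, 3]


@solid
def sum_numbers(context, numbers):
    # Inputs are modeled explicitly in the function signature,
    # so dependencies are expressed as data flow, not task ordering.
    return sum(numbers)


@pipeline
def arithmetic_pipeline():
    # Wiring one solid's output into another's input defines the DAG.
    sum_numbers(load_numbers())


if __name__ == "__main__":
    execute_pipeline(arithmetic_pipeline)
```

Because solids declare their inputs and outputs rather than just their execution order, Dagster can type-check the graph, test solids in isolation, and record each computation in the structured event log.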