airflow

Purpose

Airflow is an open-source workflow orchestration tool for defining, scheduling, and monitoring data pipelines as code. It uses Python to create Directed Acyclic Graphs (DAGs) that represent task dependencies and execution flows.

When to Use

Use Airflow for scenarios involving recurring data tasks, such as ETL processes, batch jobs, or complex workflows with dependencies. It's ideal when you need dynamic scheduling, retries, and monitoring in data engineering pipelines, especially for production-scale operations with tools like Spark or databases.

Key Capabilities

  • Define workflows as DAGs in Python, specifying tasks, dependencies, and schedules.
  • A built-in scheduler that triggers tasks at defined intervals (e.g., cron expressions or presets like @daily).
  • Web UI for real-time monitoring, including task logs and per-DAG run status.
  • Operators like BashOperator for shell commands or PythonOperator for custom functions.
  • Extensible hooks for integrations, such as PostgresHook for database connections.
  • Configuration via airflow.cfg file, e.g., set [core] executor = LocalExecutor for local testing.

Usage Patterns

To use Airflow, install it with pip install apache-airflow (ideally inside a virtual environment, pinned against the official constraints file to avoid dependency conflicts), then initialize the metadata database with airflow db init. Define DAGs as Python files in the dags folder of your Airflow home directory (~/airflow by default). Note that the AIRFLOW_UID environment variable is used by the official Docker Compose setup to keep files in mounted volumes owned by your host user; it is not an authentication setting.
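On the configuration side, any airflow.cfg setting can also be supplied as an AIRFLOW__&lt;SECTION&gt;__&lt;KEY&gt; environment variable, which is convenient for containerized deployments. A minimal sketch, with illustrative values:

```python
import os

# [core] executor = LocalExecutor in airflow.cfg becomes:
os.environ["AIRFLOW__CORE__EXECUTOR"] = "LocalExecutor"

# [core] dags_folder — the path here is illustrative; Airflow's
# default is ~/airflow/dags.
os.environ["AIRFLOW__CORE__DAGS_FOLDER"] = os.path.expanduser("~/airflow/dags")

# Environment variables take precedence over airflow.cfg when
# Airflow processes start, so this pattern works well in Docker images.
print(os.environ["AIRFLOW__CORE__EXECUTOR"])
```

These variables must be set in the environment of the scheduler and webserver processes before they start.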
