testing-dags
Iterative test-debug-fix cycles for Airflow DAGs with comprehensive failure diagnosis.
- Start with `af runs trigger-wait <dag_id>` to run a DAG and wait for completion; no pre-flight checks needed
- On failure, use `af runs diagnose` for a comprehensive failure summary and `af tasks logs` to inspect error details from specific tasks
- Supports custom configuration, timeouts, and retry attempts; handles success, failure, and timeout scenarios with clear response interpretation
- Quick validation available via `astro dev parse` and `astro dev pytest` for fast feedback without a running Airflow instance
DAG Testing Skill
Use af commands to test, debug, and fix DAGs in iterative cycles.
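The test-debug-fix cycle can be sketched as follows. This is a sketch, not confirmed syntax: the `<dag_id>`/`<task_id>` placeholders and the exact arguments accepted by `diagnose` and `logs` are assumptions beyond what this page states.

```shell
# Iterative test-debug-fix loop (sketch; placeholder IDs, argument shapes assumed)
af runs trigger-wait <dag_id>     # trigger the DAG and wait for the run to finish
af runs diagnose                  # on failure, get a comprehensive failure summary
af tasks logs <dag_id> <task_id>  # drill into the failing task's log output
# ...fix the DAG code, then re-run trigger-wait and repeat until the run succeeds
```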
Running the CLI
These commands assume `af` is on `PATH`. Run via `astro otto` to get it automatically, or install standalone with `uv tool install astro-airflow-mcp`.
Quick Validation with Astro CLI
If the user has the Astro CLI available, these commands provide fast feedback without needing a running Airflow instance:
```shell
# Parse DAGs to catch import errors, syntax issues, and DAG-level problems
astro dev parse

# Run pytest against DAGs (runs tests in tests/ directory)
astro dev pytest
```