migrating-airflow-2-to-3
Automated detection and code migration for upgrading Apache Airflow 2.x DAGs to Airflow 3.x.
- Provides Ruff-based auto-fix rules (AIR30/AIR301/AIR302/AIR31/AIR311/AIR312) to detect and resolve breaking changes in imports, operators, hooks, and context variables
- Covers critical architecture shifts: workers no longer access metadata DB directly; use the Airflow Python client or REST API instead of ORM session queries
- Includes manual migration checklist for issues Ruff cannot auto-fix: cron scheduling semantics, `.airflowignore` glob syntax, OAuth callback URL prefixes, and shared utility imports
- Recommends upgrade path: Airflow 2.11 → 3.0.11+ (ideally 3.1) to avoid rollback issues and early 3.0 bugs
This skill uses Claude hooks which can execute code automatically in response to events. Review carefully before installing.
Airflow 2 to 3 Migration
This skill helps migrate Airflow 2.x DAG code to Airflow 3.x, focusing on code changes (imports, operators, hooks, context, API usage).
Important: Before migrating to Airflow 3, we strongly recommend upgrading to Airflow 2.11 first, then to at least Airflow 3.0.11 (ideally directly to 3.1). Other upgrade paths can make rollbacks impossible. See: https://www.astronomer.io/docs/astro/airflow3/upgrade-af3#upgrade-your-airflow-2-deployment-to-airflow-3. Additionally, early 3.0 versions have many bugs; 3.1 provides a much better experience.
Migration at a Glance
- Run Ruff's Airflow migration rules to auto-fix detectable issues (AIR30/AIR301/AIR302/AIR31/AIR311/AIR312).
  ```
  ruff check --preview --select AIR --fix --unsafe-fixes .
  ```
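If you prefer to pin these rules in project configuration instead of passing flags on every run, a minimal sketch (table and key names assume the current Ruff `pyproject.toml` layout; adjust to your setup):

```toml
[tool.ruff]
# The AIR migration rules are preview-only.
preview = true

[tool.ruff.lint]
# Add the Airflow 2-to-3 migration rules on top of your existing selection.
extend-select = ["AIR301", "AIR302", "AIR311", "AIR312"]
```

With this in place, `ruff check --fix --unsafe-fixes .` picks up the AIR rules without extra flags.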
- Scan for remaining issues using the manual search checklist in reference/migration-checklist.md.
- Focus on: direct metadata DB access, legacy imports, scheduling/context keys, XCom pickling, datasets-to-assets, REST API/auth, plugins, and file paths.
- Hard behavior/config gotchas to explicitly review:
  - Cron scheduling semantics: consider setting `AIRFLOW__SCHEDULER__CREATE_CRON_DATA_INTERVAL=True` if you need Airflow 2-style cron data intervals.
  - `.airflowignore` syntax changed from regexp to glob; set `AIRFLOW__CORE__DAG_IGNORE_FILE_SYNTAX=regexp` if you must keep regexp behavior.
  - OAuth callback URLs add an `/auth/` prefix (e.g. `/auth/oauth-authorized/google`).
  - Shared utility imports: bare imports like `import common` from `dags/common/` no longer work on Astro. Use fully qualified imports: `import dags.common`.
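To make the ignore-file change concrete, here is a sketch of equivalent patterns under each default (the filenames are hypothetical examples):

```
# Airflow 2 default: regexp syntax, e.g. ignore anything ending in _wip.py
#   .*_wip\.py
# Airflow 3 default: glob syntax equivalent
*_wip.py
```

If a migrated `.airflowignore` suddenly ignores nothing (or everything), the regexp-vs-glob default is the first thing to check.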
- Plan changes per file and issue type: fix imports, update operators/hooks/providers, refactor metadata access to use the Airflow client instead of direct DB access, fix use of outdated context variables, and fix scheduling logic.
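For the metadata-access refactor, the direction is: replace ORM session queries in worker code with calls to the Airflow REST API or Python client. A minimal sketch of the URL side, assuming the Airflow 3 public API lives under `/api/v2` (verify the path and auth scheme against your deployment):

```python
from urllib.parse import quote

# Sketch only: instead of session.query(DagRun) from inside a task,
# build a request against the Airflow 3 public REST API.
def dag_runs_url(base_url: str, dag_id: str) -> str:
    """Build the URL for listing a DAG's runs via the Airflow 3 REST API."""
    # rstrip avoids a double slash when base_url ends with "/";
    # quote() percent-encodes unusual characters in the DAG id.
    return f"{base_url.rstrip('/')}/api/v2/dags/{quote(dag_id)}/dagRuns"
```

Pair a helper like this with an authenticated HTTP client (or the official `apache-airflow-client` package) rather than importing ORM models in task code.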