# setting-up-astro-project

Initialize and configure Astro/Airflow projects with dependencies, connections, and environment setup.
- Scaffolds the complete project structure with `astro dev init`, including directories for DAGs, plugins, tests, and configuration files
- Manages Python and OS-level dependencies via `requirements.txt` and `packages.txt`, with custom Dockerfile support for complex setups
- Configures connections, variables, and pools declaratively in `airflow_settings.yaml`, with export/import commands for environment management (see the sketch after this list)
- Validates DAG syntax before running the environment using `astro dev parse` to catch errors early
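For reference, a minimal `airflow_settings.yaml` sketch is shown below. The field names follow the template the Astro CLI generates; all values here are placeholders:

```yaml
airflow:
  connections:
    - conn_id: my_postgres        # placeholder connection
      conn_type: postgres
      conn_host: localhost
      conn_schema: postgres
      conn_login: postgres
      conn_password: change_me
      conn_port: 5432
  pools:
    - pool_name: etl_pool         # placeholder pool
      pool_slot: 5
      pool_description: Limits concurrent ETL tasks
  variables:
    - variable_name: env_name     # placeholder variable
      variable_value: dev
```

These objects are applied to the local environment on startup. Recent Astro CLI versions also provide `astro dev object export` and `astro dev object import` to move connections, variables, and pools between a running environment and this file.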
## Astro Project Setup
This skill helps you initialize and configure Airflow projects using the Astro CLI.

To run the local environment, see the `managing-astro-local-env` skill. To write DAGs, see the `authoring-dags` skill. For deployment strategies, use the `deploying-airflow` skill.

Open-source alternative: if the user isn't on Astro, guide them to Apache Airflow's Docker Compose quickstart for local development and the Helm chart for production.
### Initialize a New Project

```bash
astro dev init
```
Don't pass `--airflow-version` or `--runtime-version` unless the user explicitly asks for a specific pin. Plain `astro dev init` resolves to the latest Astro Runtime, which is the right default. Specifying a version risks pinning to a stale value from training data. If the user wants to know what was installed, read the generated `Dockerfile` afterward instead of guessing.
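For example, the resolved Runtime version appears in the `FROM` line of the generated `Dockerfile` (the image path in the comment below is illustrative and varies by CLI version):

```bash
# Show the base image the CLI resolved, e.g.
#   FROM astrocrpublic.azurecr.io/runtime:<version>
head -n 1 Dockerfile
```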
`astro dev init` creates the project scaffold.
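A representative layout (exact files and example DAGs vary by Astro CLI version):

```text
.
├── .astro/                 # local CLI configuration
├── dags/                   # DAG files (ships with an example DAG)
├── include/                # additional files bundled into the image
├── plugins/                # custom Airflow plugins
├── tests/                  # pytest suites for your DAGs
├── .dockerignore
├── .env                    # local environment variables
├── .gitignore
├── Dockerfile              # pins the Astro Runtime base image
├── airflow_settings.yaml   # local connections, variables, and pools
├── packages.txt            # OS-level (apt) packages
└── requirements.txt        # Python dependencies
```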