cosmos-dbt-fusion

Summary

Configure Astronomer Cosmos for dbt Fusion projects on Snowflake, Databricks, BigQuery, or Redshift with local execution.

  • Requires Cosmos 1.11.0+, dbt Fusion binary installed separately in the Airflow runtime, and ExecutionMode.LOCAL with subprocess invocation
  • Supports three parsing strategies: dbt_manifest (fastest for large projects), dbt_ls (for complex selectors), or automatic (simple setups)
  • Covers ProfileConfig setup for warehouse connections, ProjectConfig for dbt project paths, and RenderConfig for parsing behavior
  • Includes step-by-step checklist, Dockerfile installation example, and both DbtDag and DbtTaskGroup assembly patterns with validation steps
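The pieces above can be sketched as a single DAG file. This is a minimal, hedged example, not the skill's canonical setup: the project path, profile name, and executable path are assumptions to adapt to your deployment.

```python
from datetime import datetime

from cosmos import (
    DbtDag,
    ExecutionConfig,
    ProfileConfig,
    ProjectConfig,
    RenderConfig,
)
from cosmos.constants import ExecutionMode, InvocationMode, LoadMode

# Profile pointing at an existing profiles.yml (path is an assumption)
profile_config = ProfileConfig(
    profile_name="my_project",   # hypothetical profile name
    target_name="dev",
    profiles_yml_filepath="/usr/local/airflow/dbt/profiles.yml",
)

dbt_fusion_dag = DbtDag(
    dag_id="dbt_fusion_example",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    # Path to the dbt project inside the Airflow image (assumption)
    project_config=ProjectConfig("/usr/local/airflow/dbt/my_project"),
    profile_config=profile_config,
    execution_config=ExecutionConfig(
        # Fusion only supports local execution via subprocess
        execution_mode=ExecutionMode.LOCAL,
        invocation_mode=InvocationMode.SUBPROCESS,
        # Location of the separately installed Fusion binary (assumption)
        dbt_executable_path="/usr/local/bin/dbt",
    ),
    render_config=RenderConfig(
        # DBT_MANIFEST is fastest for large projects;
        # use LoadMode.DBT_LS for complex selectors, or AUTOMATIC for simple setups
        load_mode=LoadMode.DBT_MANIFEST,
    ),
)
```

The same configuration objects can be passed to `DbtTaskGroup` instead of `DbtDag` to embed the dbt project inside a larger Airflow DAG.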
SKILL.md

Cosmos + dbt Fusion: Implementation Checklist

Execute steps in order. This skill covers Fusion-specific constraints only.

Version note: dbt Fusion support was introduced in Cosmos 1.11.0, so Cosmos ≥ 1.11.0 is required.

Reference: See reference/cosmos-config.md for ProfileConfig, operator_args, and Airflow 3 compatibility details.

Before starting, confirm: (1) the dbt engine is Fusion (if it is Core, use cosmos-dbt-core instead), and (2) the warehouse is Snowflake, Databricks, BigQuery, or Redshift.

Fusion-Specific Constraints

| Constraint | Details |
| --- | --- |
| No async | AIRFLOW_ASYNC is not supported |
| No virtualenv | Fusion is a standalone binary, not a Python package |
| Warehouse support | Only Snowflake, Databricks, BigQuery, and Redshift while Fusion support is in preview |
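Because Fusion is a standalone binary rather than a pip package, it must be installed into the Airflow image alongside Cosmos. A minimal Dockerfile sketch, assuming an Astro Runtime base image; the installer URL and install location should be verified against dbt's current documentation:

```Dockerfile
FROM quay.io/astronomer/astro-runtime:12.0.0

# Cosmos >= 1.11.0 is required for dbt Fusion support
RUN pip install --no-cache-dir "astronomer-cosmos>=1.11.0"

# Install the dbt Fusion binary separately (installer URL is an assumption;
# verify against dbt's install docs, and check where the binary lands)
RUN curl -fsSL https://public.cdn.getdbt.com/fs/install/install.sh | sh
```

Whatever path the installer places the binary at should match the `dbt_executable_path` configured in `ExecutionConfig`.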
