# Databricks Multi-Environment Setup
## Overview
Configure Databricks across dev, staging, and production with isolated workspaces (or catalog-level isolation), per-environment secrets, Asset Bundle targets, and Terraform for workspace provisioning. Each environment gets its own credentials, Unity Catalog namespace, and compute policies.
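The Asset Bundle targets mentioned above can be sketched in a `databricks.yml` like the one below. The workspace hostnames, bundle name, and service principal ID are placeholders; the catalog names follow the table later in this document.

```yaml
# databricks.yml — one bundle, one target per environment.
bundle:
  name: multi_env_pipeline    # placeholder name

variables:
  catalog:
    description: Unity Catalog name for the target environment

targets:
  dev:
    mode: development         # prefixes resources and relaxes permissions
    default: true
    workspace:
      host: https://dev-workspace.cloud.databricks.com   # placeholder
    variables:
      catalog: dev_catalog

  staging:
    workspace:
      host: https://staging-workspace.cloud.databricks.com  # placeholder
    variables:
      catalog: staging_catalog

  prod:
    mode: production          # enforces locked, non-interactive deployments
    workspace:
      host: https://prod-workspace.cloud.databricks.com   # placeholder
    run_as:
      service_principal_name: "prod-sp-application-id"    # placeholder
    variables:
      catalog: prod_catalog
```

Deploy with `databricks bundle deploy -t staging` (or `-t prod`); the `catalog` variable can then be referenced from job and pipeline definitions so the same bundle writes to the right namespace per environment.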
## Prerequisites
- Databricks account with multiple workspaces (or Premium for catalog-level isolation)
- Service principals per environment
- Secret management (Databricks Secret Scopes, AWS Secrets Manager, or GCP Secret Manager)
- CI/CD pipeline (GitHub Actions, Azure DevOps, etc.)
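Per-environment secret scopes can be provisioned alongside the workspaces themselves. A minimal Terraform sketch using the `databricks/databricks` provider is shown below; the scope name, secret key, and variable names are illustrative.

```hcl
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

variable "environment" {
  type = string # "dev", "staging", or "prod"
}

variable "db_password" {
  type      = string
  sensitive = true
}

# One scope per environment, e.g. "app-staging" (illustrative naming).
resource "databricks_secret_scope" "app" {
  name = "app-${var.environment}"
}

# A secret stored inside that scope.
resource "databricks_secret" "db_password" {
  scope        = databricks_secret_scope.app.name
  key          = "db-password"
  string_value = var.db_password
}
```

Running `terraform apply -var environment=staging` against the staging workspace keeps each environment's credentials isolated in its own scope.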
## Environment Strategy
| Environment | Workspace | Catalog | Auth | Compute |
|---|---|---|---|---|
| Development | Shared or dedicated | dev_catalog | Personal PAT | Single-node, 15 min auto-stop |
| Staging | Dedicated | staging_catalog | Service principal | Production-like, spot instances |
| Production | Dedicated | prod_catalog | Service principal (OAuth M2M) | Instance pools, auto-scale |
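One way to apply this table in application code is to resolve per-environment settings from a single mapping keyed by an environment variable. The sketch below is illustrative, not a Databricks API; `DATABRICKS_ENV` and the staging/production auto-termination values are assumptions.

```python
import os

# Per-environment settings mirroring the table above.
# Auto-termination minutes for staging/prod are assumed values.
ENVIRONMENTS = {
    "dev": {
        "catalog": "dev_catalog",
        "auth": "personal_pat",
        "autotermination_minutes": 15,
        "autoscale": False,
    },
    "staging": {
        "catalog": "staging_catalog",
        "auth": "service_principal",
        "autotermination_minutes": 30,
        "autoscale": False,
    },
    "prod": {
        "catalog": "prod_catalog",
        "auth": "oauth_m2m",
        "autotermination_minutes": 60,
        "autoscale": True,
    },
}


def get_env_config(env=None):
    """Resolve settings for the current environment.

    Falls back to the DATABRICKS_ENV variable (assumed name), then "dev".
    """
    env = env or os.environ.get("DATABRICKS_ENV", "dev")
    try:
        return ENVIRONMENTS[env]
    except KeyError:
        raise ValueError(f"unknown environment: {env!r}") from None


print(get_env_config("staging")["catalog"])  # staging_catalog
```

Keeping this resolution in one place means job code never hard-codes a catalog name, so the same artifact can be promoted unchanged from dev through production.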
## Instructions
## Related skills