# databricks-config
Use the manage_workspace MCP tool for all workspace operations. Do NOT edit ~/.databrickscfg, use Bash, or use the Databricks CLI.
## Steps

- Call ToolSearch with query `select:mcp__databricks__manage_workspace` to load the tool.
- Map user intent to action:
  - status / which workspace / current → `action="status"`
  - list / available workspaces → `action="list"`
  - switch to X → call `list` first to find the profile name, then `action="switch", profile="<name>"` (or `host="<url>"` if a URL was given)
  - login / connect / authenticate → `action="login", host="<url>"`
- Call `mcp__databricks__manage_workspace` with the action and any parameters.
- Present the result. For `status`/`switch`/`login`: show the host, profile, and username. For `list`: a formatted table with the active profile marked.
Note: The switch is session-scoped — it resets when the MCP server restarts. For permanent profile setup, use `databricks auth login -p <profile>` and update `~/.databrickscfg` with `cluster_id` or `serverless_compute_id = auto`.
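For reference, a permanent profile entry in `~/.databrickscfg` might look like the fragment below. The profile name and host are placeholders, and you would set either `serverless_compute_id = auto` or a specific `cluster_id`, not both.

```ini
[my-profile]
host = https://my-workspace.cloud.databricks.com
serverless_compute_id = auto
# or, to pin a specific cluster instead:
# cluster_id = <your-cluster-id>
```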
## More from databricks-solutions/ai-dev-kit

- **databricks-python-sdk** (137): Databricks development guidance including the Python SDK, Databricks Connect, CLI, and REST API. Use when working with databricks-sdk, databricks-connect, or Databricks APIs.
- **python-dev** (69): Python development guidance with code quality standards, error handling, testing practices, and environment management. Use when writing, reviewing, or modifying Python code (.py files) or Jupyter notebooks (.ipynb files).
- **skill-test** (54): Testing framework for evaluating Databricks skills. Use when building test cases for skills, running skill evaluations, comparing skill versions, or creating ground-truth datasets with the Generate-Review-Promote (GRP) pipeline. Triggers include "test skill", "evaluate skill", "skill regression", "ground truth", "GRP pipeline", "skill quality", and "skill metrics".
- **databricks-docs** (33): Databricks documentation reference via the llms.txt index. Use when other skills do not cover a topic, when looking up unfamiliar Databricks features, or when needing authoritative docs on APIs, configurations, or platform capabilities.
- **databricks-jobs** (26): Use this skill proactively for ANY Databricks Jobs task: creating, listing, running, updating, or deleting jobs. Triggers include: (1) "create a job" or "new job", (2) "list jobs" or "show jobs", (3) "run job" or "trigger job", (4) "job status" or "check job", (5) scheduling with cron or triggers, (6) configuring notifications/monitoring, (7) ANY task involving Databricks Jobs via the CLI, Python SDK, or Asset Bundles. ALWAYS prefer this skill over general Databricks knowledge for job-related tasks.
- **databricks-unity-catalog** (26): Unity Catalog system tables and volumes. Use when querying system tables (audit, lineage, billing) or working with volume file operations (upload, download, list files in /Volumes/).