# Spark Connect — Local Execution Against a Databricks Cluster
Databricks Connect (built on Apache Spark Connect) lets you write and run PySpark code locally while the actual computation happens on a remote Databricks cluster.
Benefits:
- Local IDE/Jupyter development with full Spark semantics
- No need to upload notebooks manually — scripts run in-place from your machine
- Enables autonomous loops (autoresearch, CI) that submit Spark jobs per iteration
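Setup is a pip install plus connection configuration. A minimal sketch follows; the host URL, token, and cluster ID values are placeholders you must supply from your own workspace, and passing credentials via the `DATABRICKS_*` environment variables is one of several supported authentication options:

```shell
# Install the client (its major version should track the cluster's
# Databricks Runtime version).
pip install databricks-connect

# Point the client at your workspace and cluster.
# These are placeholder values — substitute your own.
export DATABRICKS_HOST="https://<your-workspace>.cloud.databricks.com"
export DATABRICKS_TOKEN="<personal-access-token>"
export DATABRICKS_CLUSTER_ID="<cluster-id>"
```

With these set, a locally started session picks up the configuration from the environment.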
## How It Works
```
Local Machine (your IDE / script)
        │
        │  Spark Connect protocol (gRPC)
        ▼
Databricks Cluster (<DATABRICKS_CLUSTER_ID>)
```
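In code, the flow above amounts to building a remote `SparkSession` that serializes each DataFrame operation over gRPC to the cluster. A minimal sketch, assuming `databricks-connect` is installed and credentials are available as environment variables (`make_remote_session` is a hypothetical helper name, not part of the library):

```python
import os


def make_remote_session(cluster_id: str):
    """Hypothetical helper: build a SparkSession backed by a remote
    Databricks cluster via Spark Connect (requires databricks-connect)."""
    # Imported lazily so the module loads even where the client
    # library is not installed.
    from databricks.connect import DatabricksSession

    return (
        DatabricksSession.builder
        .remote(
            host=os.environ["DATABRICKS_HOST"],    # workspace URL
            token=os.environ["DATABRICKS_TOKEN"],  # personal access token
            cluster_id=cluster_id,                 # target cluster
        )
        .getOrCreate()
    )


if __name__ == "__main__":
    # Runs locally; the query itself executes on the remote cluster.
    spark = make_remote_session(os.environ["DATABRICKS_CLUSTER_ID"])
    spark.range(10).filter("id % 2 = 0").show()
```

Because the returned object behaves like an ordinary `SparkSession`, existing PySpark DataFrame code typically runs unchanged; only session construction differs.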