databricks-jobs

Lakeflow Jobs Development

FIRST: Use the parent databricks-core skill for CLI basics, authentication, profile selection, and data exploration commands.

Lakeflow Jobs are scheduled workflows that run notebooks, Python scripts, SQL queries, and other tasks on Databricks.
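As an illustrative sketch (the job name, task key, notebook path, and schedule here are hypothetical, not part of this skill), a job declared in a bundle's databricks.yml resources block might look like:

```yaml
# Hypothetical job resource inside a Databricks Asset Bundle (databricks.yml).
# "my_job", the task key, and the notebook path are placeholders.
resources:
  jobs:
    my_job:
      name: my_job
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ../src/notebook.ipynb
      schedule:
        quartz_cron_expression: "0 0 6 * * ?"  # daily at 06:00
        timezone_id: UTC
```

Scaffolding (below) generates a working version of this structure for you, so hand-writing it is rarely necessary.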

Scaffolding a New Job Project

Use databricks bundle init with a config file to scaffold non-interactively. This creates a project in the <project_name>/ directory:

databricks bundle init default-python --config-file <(echo '{"project_name": "my_job", "include_job": "yes", "include_pipeline": "no", "include_python": "yes", "serverless": "yes"}') --profile <PROFILE> < /dev/null
  • project_name: letters, numbers, underscores only
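Because bundle init rejects other characters, a quick pre-check of the name can save a failed run. A minimal sketch (the helper name is ours, not part of the Databricks CLI):

```python
import re

def is_valid_project_name(name: str) -> bool:
    """Return True if name contains only letters, numbers, and underscores,
    matching the project_name constraint enforced by bundle init."""
    return bool(re.fullmatch(r"[A-Za-z0-9_]+", name))

print(is_valid_project_name("my_job"))  # True
print(is_valid_project_name("my-job"))  # False: hyphen not allowed
```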

After scaffolding, create CLAUDE.md and AGENTS.md in the project directory. These files are essential: they give agents guidance on how to work with the project. Use this content:
