metric-views
Unity Catalog Metric Views
Define reusable, governed business metrics in YAML that separate measure definitions from dimension groupings for flexible querying.
When to Use
Use this skill when:
- Defining standardized business metrics (revenue, order counts, conversion rates)
- Building KPI layers shared across dashboards, Genie, and SQL queries
- Creating metrics with complex aggregations (ratios, distinct counts, filtered measures)
- Defining window measures (moving averages, running totals, period-over-period, YTD)
- Modeling star or snowflake schemas with joins in metric definitions
- Enabling materialization for pre-computed metric aggregations
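As a rough sketch of the shape of a definition (all catalog, table, and column names below are hypothetical, and the exact YAML fields for your runtime should be confirmed against the Databricks documentation), a metric view is created with SQL DDL wrapping a YAML body that keeps measures and dimensions separate:

```sql
-- Hedged sketch: main.sales.orders and its columns are hypothetical.
CREATE VIEW main.sales.order_metrics
WITH METRICS
LANGUAGE YAML
AS $$
version: 1.1
source: main.sales.orders
dimensions:
  - name: order_date
    expr: order_date
  - name: region
    expr: region
measures:
  - name: total_revenue
    expr: SUM(amount)
  - name: order_count
    expr: COUNT(DISTINCT order_id)
$$;
```

Because dimensions and measures are defined independently, consumers can group any measure by any dimension at query time rather than relying on pre-aggregated tables.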
Prerequisites
- Databricks Runtime 17.2+ (for YAML version 1.1)
- SQL warehouse with `CAN USE` permission
- `SELECT` on source tables; `CREATE TABLE` + `USE SCHEMA` in the target schema
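Once the prerequisites are in place, measures in a metric view are evaluated with the `MEASURE()` aggregate function; a minimal query sketch (view and column names are hypothetical, continuing the example naming above) looks like:

```sql
-- Hedged sketch: measures must be wrapped in MEASURE();
-- dimensions are referenced directly and drive the GROUP BY.
SELECT
  region,
  MEASURE(total_revenue) AS total_revenue,
  MEASURE(order_count)   AS order_count
FROM main.sales.order_metrics
GROUP BY region;
```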