ai-infrastructure-litellm


LiteLLM Proxy Patterns

Quick Guide: LiteLLM is an OpenAI-compatible proxy (AI gateway) that routes requests to 100+ LLM providers. TypeScript clients connect via the standard OpenAI SDK with baseURL pointed at the proxy. Configure models, fallbacks, load balancing, and budgets in config.yaml. Use provider/model-name format in litellm_params.model (e.g., anthropic/claude-sonnet-4-20250514). The model_name in config is the user-facing alias clients request. Virtual keys require PostgreSQL. Master key must start with sk-.
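A minimal `config.yaml` sketch illustrating these conventions. The aliases, key names, and fallback pairing here are placeholders; adjust them for your deployment:

```yaml
model_list:
  - model_name: claude-sonnet                      # user-facing alias clients request
    litellm_params:
      model: anthropic/claude-sonnet-4-20250514    # provider/model-name format
      api_key: os.environ/ANTHROPIC_API_KEY        # read from environment, not inlined
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY

litellm_settings:
  fallbacks: [{"claude-sonnet": ["gpt-4o"]}]       # retry on gpt-4o if claude-sonnet errors

general_settings:
  master_key: sk-litellm-master                    # must start with sk-
  database_url: os.environ/DATABASE_URL            # PostgreSQL; required for virtual keys
```

With a config like this, the proxy is typically started with `litellm --config config.yaml`, after which clients request models by alias (`claude-sonnet`, `gpt-4o`) rather than by provider model ID.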


<critical_requirements>

CRITICAL: Before Using This Skill

- All code must follow project conventions in CLAUDE.md (kebab-case, named exports, import ordering, `import type`, named constants).

- You MUST use the provider/model-name format in `litellm_params.model` (e.g., `anthropic/claude-sonnet-4-20250514`, `openai/gpt-4o`, `azure/my-deployment`); the provider prefix is how LiteLLM routes the request to the correct API.

- You MUST set `model_name` to the user-facing alias that clients request; this is NOT the provider model ID, it is the name your TypeScript client passes as `model`.

- You MUST point the OpenAI SDK `baseURL` at the proxy URL (e.g., `http://localhost:4000`) and pass the proxy key as `apiKey`; do NOT use provider API keys directly in client code.

- You MUST start master keys with `sk-`; LiteLLM rejects master keys that do not follow this prefix convention.

</critical_requirements>

Installs: 2 · GitHub Stars: 6 · First Seen: Apr 7, 2026