LiteLLM

Unified Python interface for calling 100+ LLM APIs through a consistent OpenAI-style request/response format. Provides standardized exception handling, retry/fallback logic, and cost tracking across providers.
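A minimal sketch of the unified call shape, assuming `litellm` is installed and a provider API key is set in the environment; the model name is illustrative and the `make_messages` helper is hypothetical:

```python
def make_messages(prompt: str) -> list[dict]:
    # OpenAI chat format; LiteLLM accepts this same shape for every provider.
    return [{"role": "user", "content": prompt}]

# Live call (requires `pip install litellm` and e.g. OPENAI_API_KEY):
#   from litellm import completion
#   resp = completion(model="gpt-4o-mini", messages=make_messages("Hello!"))
#   print(resp.choices[0].message.content)
```

Because every provider is driven through the same call, switching providers is typically just a change to the `model` string.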

When to Use This Skill

Use this skill when:

  • Integrating with multiple LLM providers through a single interface
  • Routing requests to local llamafile servers using OpenAI-compatible endpoints
  • Implementing retry and fallback logic for LLM calls
  • Building applications requiring consistent error handling across providers
  • Tracking LLM usage costs across different providers
  • Converting between provider-specific APIs and OpenAI format
  • Deploying LLM proxy servers with unified configuration
  • Testing applications against both cloud and local LLM endpoints
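For the local-server case above, a sketch of pointing LiteLLM at an OpenAI-compatible endpoint such as a llamafile server. Assumptions: llamafile's default port of 8080, LiteLLM's `openai/` model prefix for custom `api_base` endpoints, and a hypothetical `local_llamafile_kwargs` helper:

```python
def local_llamafile_kwargs(model: str,
                           base_url: str = "http://localhost:8080/v1") -> dict:
    # Assumption: the "openai/" prefix tells LiteLLM to speak the OpenAI
    # wire format to a custom api_base instead of api.openai.com.
    return {
        "model": f"openai/{model}",
        "api_base": base_url,
        "api_key": "sk-no-key-required",  # local servers typically ignore the key
    }

# Live call (requires a running llamafile server and `pip install litellm`):
#   from litellm import completion
#   resp = completion(messages=[{"role": "user", "content": "Hi"}],
#                     **local_llamafile_kwargs("llamafile"))
```

The same kwargs pattern lets tests swap between cloud and local endpoints without touching the calling code.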

Core Capabilities
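The retry/fallback capability boils down to the control flow below. This is a plain-Python sketch of the pattern, not LiteLLM's own implementation (LiteLLM ships built-in retry and routing support); the function and parameter names are illustrative:

```python
import time

def call_with_fallbacks(call, models, retries_per_model=2, backoff=0.5):
    """Try each model in order, retrying transient failures with
    exponential backoff before falling back to the next model."""
    last_err = None
    for model in models:
        for attempt in range(retries_per_model):
            try:
                return call(model)
            except Exception as err:  # LiteLLM maps provider errors to OpenAI-style exceptions
                last_err = err
                time.sleep(backoff * (2 ** attempt))
    raise last_err
```

In practice `call` would wrap `litellm.completion`, and the exception handler would narrow to retryable error types (rate limits, timeouts) rather than bare `Exception`.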

From jamie-bitflight/claude_skills · 10 installs · 44 GitHub stars · first seen Mar 23, 2026