Multi-Provider LLM Proxy Chain Debugging

Problem

When building an API proxy that falls back across multiple LLM providers (e.g., Claude Max -> Google AI Studio -> OpenRouter), silent failures can occur in which the proxy reports "success" while the downstream consumer (the bot) receives an error body instead of an actual LLM response.
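
A minimal sketch of how this failure typically arises, assuming a generic fetch-based proxy. The provider URLs and the callProvider helper are illustrative assumptions, not from any particular codebase; the core bug is forwarding an upstream error body as if it were a successful completion.

```typescript
// Hypothetical fallback proxy; endpoints and helper names are assumptions.

type ProviderResult = { ok: boolean; body: string };

async function callProvider(url: string, payload: unknown): Promise<ProviderResult> {
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  // The bug: res.text() on a failed call yields a JSON error body.
  // If the caller ignores `ok`, that error body reaches the bot as
  // the "LLM response" and the proxy still reports success.
  return { ok: res.ok, body: await res.text() };
}

async function proxyWithFallback(payload: unknown): Promise<string> {
  const providers = [
    "https://claude-max.example/api",   // assumed endpoints
    "https://ai-studio.example/api",
    "https://openrouter.example/api",
  ];
  for (const url of providers) {
    const result = await callProvider(url, payload);
    // Cascade only on a verified success, not on any response-shaped body.
    if (result.ok) return result.body;
  }
  throw new Error("all providers failed"); // surface the failure loudly
}
```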

Context / Trigger Conditions

  • Bot responds instantly (<500ms for what should be a multi-second LLM call)
  • Bot produces empty or garbage responses (it parsed a JSON error as the "response")
  • Provider status shows lastSuccess: null despite requests > 0 and circuit: CLOSED
  • Low outputTokens count (e.g., 23 tokens for what should be a paragraph)
  • Fallback chain stops at the wrong provider: the first fallback fails but does not cascade to the next (see the validation sketch after this list)
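
The symptoms above can be caught mechanically before the bot ever sees a bad body. Below is a hedged validation sketch; the ProxyResponse shape, the field names (outputTokens, latencyMs), and the thresholds are illustrative assumptions, not a fixed interface.

```typescript
// Post-call sanity check for the trigger conditions listed above.
// Thresholds and field names are assumptions for illustration.

interface ProxyResponse {
  text: string;
  outputTokens: number;
  latencyMs: number;
}

function looksLikeSilentFailure(res: ProxyResponse): string | null {
  // A sub-second "LLM" response usually means a cached or error path,
  // not real generation.
  if (res.latencyMs < 500) return "suspiciously fast response";
  // A paragraph-length answer should be well above ~23 output tokens.
  if (res.outputTokens < 50) return "output token count too low";
  // An error body forwarded as text typically parses as {"error": ...}.
  try {
    const parsed = JSON.parse(res.text);
    if (parsed && typeof parsed === "object" && "error" in parsed) {
      return "response body is a provider error object";
    }
  } catch {
    // Not JSON: normal for plain-text LLM output.
  }
  return null;
}
```

Wiring a check like this between the proxy and the bot turns each silent failure into an explicit, loggable one.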

Three-Layer Debugging Pattern
