audit-prompt-caching


Prompt Cache Audit

Diagnose and fix LLM prompt/prefix cache misses. Treat caching as an engineering property of the request path: stable prefix, cache-aware routing, and cache entries that live long enough to be reused.

Caching is an optimization only when the prefix is stable, long enough, repeated, measurable, and safe. Do not add cache controls, cache keys, or routing hints blindly.
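Before adding any cache controls, it helps to measure whether a stable, sufficiently long shared prefix actually exists across real requests. A minimal sketch (the 1024-character threshold here is illustrative, not a provider limit):

```python
def shared_prefix_len(a: str, b: str) -> int:
    """Length of the common leading substring of two prompts."""
    n = 0
    for ca, cb in zip(a, b):
        if ca != cb:
            break
        n += 1
    return n


def audit_prefix_stability(prompts: list[str], min_prefix: int = 1024) -> dict:
    """Compare each prompt against the first and report the prefix length
    shared by ALL of them -- only that stable region can be cache-reused."""
    base = prompts[0]
    lens = [shared_prefix_len(base, p) for p in prompts[1:]]
    stable = min(lens) if lens else 0
    return {"stable_prefix_chars": stable, "cacheable": stable >= min_prefix}


# Two requests sharing a long system preamble but differing in the user turn:
prompts = [("You are a support agent. " * 60) + "User asks about refunds.",
           ("You are a support agent. " * 60) + "User asks about shipping."]
report = audit_prefix_stability(prompts)
```

If `cacheable` comes back false, fix the prefix (move variable content like timestamps or user IDs to the end) before reaching for cache keys or routing hints.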

When to use

Use this skill when reviewing or designing LLM calls where repeated prompt prefixes may reduce cost or latency through provider-native prompt caching, managed-router cache locality, or self-hosted KV reuse.

Installs: 17 · GitHub Stars: 23 · First Seen: Apr 30, 2026