Prompt Caching Patterns

Implement effective caching strategies to reduce LLM costs by up to 90%.

When to Use

  • Same or similar prompts are sent repeatedly
  • Large system prompts are reused across requests
  • Responses can be reused for identical queries
  • You need to reduce latency for common requests
  • You are optimizing costs for high-volume applications
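The third bullet above (reusing responses for identical queries) can be handled at the application layer before any provider feature is involved. A minimal sketch of an exact-match response cache, using a hash of the canonical request as the key; `ResponseCache` and the model name are hypothetical, not part of any provider SDK:

```python
import hashlib
import json

class ResponseCache:
    """Exact-match response cache: reuse a stored completion when the
    same model + prompt pair is requested again."""

    def __init__(self):
        self._store = {}

    def _key(self, model, prompt):
        # Serialize deterministically so equivalent requests map to one entry.
        payload = json.dumps({"model": model, "prompt": prompt}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    def get(self, model, prompt):
        # Returns the cached response, or None on a miss.
        return self._store.get(self._key(model, prompt))

    def put(self, model, prompt, response):
        self._store[self._key(model, prompt)] = response

cache = ResponseCache()
cache.put("my-model", "What is 2+2?", "4")
print(cache.get("my-model", "What is 2+2?"))  # hit -> "4"
print(cache.get("my-model", "What is 3+3?"))  # miss -> None
```

In production you would typically back this with Redis or similar and add a TTL, since model upgrades or prompt-template changes should invalidate old entries.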

Caching Strategies

1. Provider-Level Caching (Anthropic)

Anthropic offers built-in prompt caching: cached prefix tokens are read at roughly 10% of the normal input price (about a 90% reduction on those tokens), in exchange for a small surcharge when the cache entry is first written.
