ai-monitoring

Know When Your AI Breaks in Production

Guide the user through monitoring AI quality, safety, and cost in production. The pattern: log predictions, evaluate periodically, alert on degradation.
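The log → evaluate → alert loop above can be sketched in a few lines. This is a minimal illustration, not a prescribed implementation: the names `log_prediction`, `evaluate_window`, and `ALERT_THRESHOLD` are all hypothetical, and a real system would send the alert to a pager or chat channel rather than print it.

```python
# Minimal sketch of the pattern: log predictions, evaluate periodically,
# alert on degradation. All names here are illustrative assumptions.
import json
import statistics
import time

LOG_PATH = "predictions.jsonl"
ALERT_THRESHOLD = 0.8  # alert if accuracy over the window drops below this

def log_prediction(inputs, output, label=None):
    """Append each production prediction to a JSONL log."""
    record = {"ts": time.time(), "inputs": inputs, "output": output, "label": label}
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(record) + "\n")

def evaluate_window(path=LOG_PATH):
    """Periodically score recent labeled predictions; alert on degradation."""
    with open(path) as f:
        records = [json.loads(line) for line in f]
    labeled = [r for r in records if r["label"] is not None]
    if not labeled:
        return None  # nothing to score yet
    accuracy = statistics.mean(
        1.0 if r["output"] == r["label"] else 0.0 for r in labeled
    )
    if accuracy < ALERT_THRESHOLD:
        # In production, replace print with a pager/Slack/email hook.
        print(f"ALERT: accuracy {accuracy:.2f} below {ALERT_THRESHOLD}")
    return accuracy
```

Run `evaluate_window` on a schedule (cron, a worker, or your eval harness) rather than on every request, so scoring cost stays decoupled from serving cost.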

When you need monitoring

  • Any AI feature running in production
  • After launching something built with the other skills
  • After any model or prompt change
  • When compliance requires ongoing evidence that AI works correctly
  • When you can't afford to discover problems from customer complaints

What can go wrong (without monitoring)

| Problem | How it happens | Impact |
| --- | --- | --- |
| Silent model changes | Provider updates model behavior | Accuracy drops; nobody notices for weeks |
| Input drift | Users start asking questions you didn't design for | Quality degrades on new use cases |
| Gradual degradation | Prompts rot as the data distribution shifts | Slow decline: death by a thousand cuts |
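Input drift in particular can be caught with a crude statistical check before it shows up in quality metrics. The sketch below compares a single summary statistic (input length) between a baseline window and a recent window; the function names and the 3-sigma threshold are illustrative assumptions, and a production check would compare richer features such as embedding distributions.

```python
# Hypothetical input-drift check: flag when the mean input length of a
# recent window shifts far from the baseline, measured in baseline
# standard deviations. Names and threshold are illustrative.
import statistics

def drift_score(baseline_lengths, recent_lengths):
    """Absolute shift in mean input length, in baseline std deviations."""
    mu = statistics.mean(baseline_lengths)
    sigma = statistics.stdev(baseline_lengths) or 1.0  # guard against zero
    return abs(statistics.mean(recent_lengths) - mu) / sigma

def check_drift(baseline_inputs, recent_inputs, threshold=3.0):
    """True when recent inputs look unlike the baseline distribution."""
    lengths_b = [len(x) for x in baseline_inputs]
    lengths_r = [len(x) for x in recent_inputs]
    return drift_score(lengths_b, lengths_r) > threshold
```

A check like this will not catch semantic drift on its own, but it is cheap enough to run on every evaluation window and catches the gross shifts (much longer inputs, a new language, pasted documents) that often precede quality drops.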