LLM Streaming Response Handler

Expertise in building production-grade streaming interfaces for LLM responses, so output feels instant and responsive instead of arriving in one delayed block.

When to Use

Use for:

  • Chat interfaces with typing animation
  • Real-time AI assistants
  • Code generation with live preview
  • Document summarization with progressive display
  • Any UI where users expect immediate feedback from LLMs

NOT for:

  • Batch document processing (no user watching)
  • APIs that don't support streaming
  • WebSocket-based bidirectional chat (use Socket.IO)
  • Simple request/response (fetch is fine)
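The core pattern behind the use cases above is reading an LLM response body incrementally and appending tokens to the UI as they arrive. A minimal sketch, assuming a web `ReadableStream` of UTF-8 chunks (as `response.body` from `fetch` provides); the `streamTokens` name and the fake body are illustrative, not part of this skill's API:

```typescript
// Consume a streaming response body chunk by chunk, yielding decoded
// text tokens as they arrive. `{ stream: true }` keeps multi-byte
// characters intact across chunk boundaries.
async function* streamTokens(
  body: ReadableStream<Uint8Array>,
): AsyncGenerator<string> {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      yield decoder.decode(value, { stream: true });
    }
  } finally {
    reader.releaseLock();
  }
}

// Usage: a fake streaming body standing in for `response.body`.
const fakeBody = new ReadableStream<Uint8Array>({
  start(controller) {
    for (const tok of ["Hello", ", ", "world"]) {
      controller.enqueue(new TextEncoder().encode(tok));
    }
    controller.close();
  },
});

let text = "";
for await (const tok of streamTokens(fakeBody)) {
  text += tok; // in a real UI, append each token to the DOM here
}
console.log(text);
```

In a real chat interface the per-token append is what produces the typing animation; the same loop works unchanged whether tokens come from an SSE-parsed stream or a raw chunked response.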