langfuse

Summary

Complete observability and tracing for LLM applications with prompt management, evaluation, and cost tracking.

  • Automatic tracing of LLM calls, spans, and traces with user/session grouping; supports OpenAI SDK as drop-in replacement and integrates with LangChain via callback handlers
  • Built-in prompt versioning, A/B testing, dataset management, and scoring for evaluation and quality monitoring
  • Cost and performance tracking across traces with metadata tagging for production debugging and regression detection
  • Requires Python or TypeScript with Langfuse account (cloud or self-hosted) and LLM API keys; critical to flush batched traces in serverless environments
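
The bullets above can be sketched in Python. This is a hedged example, not the definitive integration: it assumes the `langfuse` package is installed and that `LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, and `OPENAI_API_KEY` are set in the environment; the import is guarded so the sketch stays inert without the SDK. The `user_id`/`session_id` kwargs follow the v2-style drop-in API.

```python
# Hedged sketch: Langfuse's OpenAI drop-in integration plus an explicit
# flush, which matters in short-lived serverless runtimes where batched
# traces can be lost when the environment freezes.
try:
    from langfuse.openai import openai  # drop-in replacement for the OpenAI SDK
    from langfuse import Langfuse
except ImportError:                     # SDK not installed: sketch only
    openai = Langfuse = None

def handler(event: dict) -> str:
    """Hypothetical serverless handler that traces one completion."""
    if openai is None:
        return "langfuse not installed"
    response = openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": event["question"]}],
        # Langfuse-specific kwargs (v2-style) for user/session grouping:
        user_id=event.get("user_id"),
        session_id=event.get("session_id"),
    )
    # Traces are batched in the background; flush before the runtime
    # suspends, or the pending batch may never be sent.
    Langfuse().flush()
    return response.choices[0].message.content
```

The explicit `flush()` is the key line for serverless: in a long-running server the background batcher drains on its own, but a frozen Lambda-style runtime gives it no chance to.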
SKILL.md

Langfuse

Expert in Langfuse - the open-source LLM observability platform. Covers tracing, prompt management, evaluation, datasets, and integration with LangChain, LlamaIndex, and OpenAI. Essential for debugging, monitoring, and improving LLM applications in production.

Role: LLM Observability Architect

You are an expert in LLM observability and evaluation. You think in terms of traces, spans, and metrics. You know that LLM applications need monitoring just like traditional software, but along different dimensions: cost, quality, and latency. You use data to drive prompt improvements and catch regressions.
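
To make "thinking in traces and spans" concrete, here is a hypothetical plain-Python model of the hierarchy Langfuse records. This is NOT the Langfuse API, just an illustration of how per-span cost and latency roll up to the trace level:

```python
from dataclasses import dataclass, field

# Hypothetical minimal model of a trace: one user/session-scoped trace
# containing ordered spans, each carrying the extra dimensions LLM
# observability adds over traditional APM (cost alongside latency).

@dataclass
class Span:
    name: str
    latency_ms: float = 0.0
    cost_usd: float = 0.0

@dataclass
class Trace:
    user_id: str
    session_id: str
    spans: list = field(default_factory=list)

    @property
    def total_latency_ms(self) -> float:
        return sum(s.latency_ms for s in self.spans)

    @property
    def total_cost_usd(self) -> float:
        return sum(s.cost_usd for s in self.spans)

trace = Trace(user_id="u1", session_id="s1")
trace.spans.append(Span("retrieve", latency_ms=120, cost_usd=0.0))
trace.spans.append(Span("generate", latency_ms=850, cost_usd=0.0031))
```

Rolling spans up like this is what lets you ask regression questions such as "did the generate step get slower or more expensive after the last prompt change?"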

Expertise

  • Tracing architecture
  • Prompt versioning
  • Evaluation strategies
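
As a sketch of the prompt-versioning workflow: fetch a managed prompt by name and fill in its variables. The prompt name `qa-answer` and the fallback template are hypothetical, and the Langfuse call is guarded so the sketch stays inert without the SDK and credentials; Langfuse prompt templates use `{{variable}}` placeholders.

```python
# Hedged sketch of Langfuse prompt management: fetch the latest
# production version of a named prompt and compile it with variables.
try:
    from langfuse import Langfuse
except ImportError:
    Langfuse = None

FALLBACK_TEMPLATE = "Answer concisely: {{question}}"  # hypothetical template

def compile_template(template: str, **variables) -> str:
    """Mimics prompt.compile(): substitute {{var}} placeholders."""
    for key, value in variables.items():
        template = template.replace("{{" + key + "}}", str(value))
    return template

def get_answer_prompt(question: str) -> str:
    if Langfuse is None:
        # SDK absent: fall back to local substitution for illustration.
        return compile_template(FALLBACK_TEMPLATE, question=question)
    langfuse = Langfuse()                      # reads LANGFUSE_* env vars
    prompt = langfuse.get_prompt("qa-answer")  # hypothetical prompt name
    return prompt.compile(question=question)
```

Because prompts are versioned server-side, the calling code stays fixed while prompt text is iterated, A/B tested, and rolled back in the Langfuse UI.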
Installs: 479
GitHub Stars: 37.3K
First Seen: Jan 19, 2026