effect-uai-model-council
Three models answer the same question, each scores the others'
answers, and the highest-rated answer is streamed as the winner.
Everything flows through one Stream<CouncilEvent> — no Queue, no
Deferred, no manual forks. The dependency graph is implicit in
where flatMap runs.
Reach for this when the user says any of:
- "Vote across multiple models for the best answer"
- "Cross-evaluation between models"
- "Pick a winner among ensemble outputs"
For a side-by-side comparison without scoring, use effect-uai-multi-model-compare.
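The council flow above can be sketched without any dependencies. Everything here is an illustrative stand-in, not effect-uai's API: the stubbed `answer` and toy `score` functions replace real model turns, and a plain generator replaces `Stream<CouncilEvent>`. In the real recipe each stage would be a streamed turn and the whole pipeline one Stream.

```typescript
// A model's answer to the shared question (stubbed; a real model call in the recipe).
type Answer = { model: string; text: string };

const answer = (model: string, question: string): Answer => ({
  model,
  text: `${model}: ${question}`,
});

// Toy scoring stand-in; in the recipe a judge model rates the answer.
const score = (judge: string, a: Answer): number =>
  judge === a.model ? 0 : a.text.length % 7;

// One event stream: answers, then cross-scores, then the winner.
type CouncilEvent =
  | { _tag: "Answer"; answer: Answer }
  | { _tag: "Score"; judge: string; model: string; value: number }
  | { _tag: "Winner"; answer: Answer };

function* council(models: string[], question: string): Generator<CouncilEvent> {
  const answers = models.map((m) => answer(m, question));
  for (const a of answers) yield { _tag: "Answer", answer: a };

  // Every model judges every other model's answer; no model judges itself.
  const totals = new Map<string, number>();
  for (const judge of models) {
    for (const a of answers) {
      if (judge === a.model) continue;
      const value = score(judge, a);
      totals.set(a.model, (totals.get(a.model) ?? 0) + value);
      yield { _tag: "Score", judge, model: a.model, value };
    }
  }

  // Highest total wins; ties keep the earliest answer.
  const winner = answers.reduce((best, a) =>
    (totals.get(a.model) ?? 0) > (totals.get(best.model) ?? 0) ? a : best
  );
  yield { _tag: "Winner", answer: winner };
}
```

Because the consumer sees one linear sequence of events, there is nothing to coordinate by hand: the ordering answers → scores → winner is exactly where each stage runs.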
Event taxonomy
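The recipe's concrete event definitions are not reproduced here. A plausible shape, inferred only from the flow described above (streamed answers, cross-scores, a re-streamed winner): every tag and field name below is an assumption, not the recipe's actual API.

```typescript
// Hypothetical event taxonomy; names are guesses from the description,
// not the recipe's real definitions.
type CouncilEvent =
  | { _tag: "AnswerChunk"; model: string; delta: string } // streamed answer text
  | { _tag: "AnswerDone"; model: string; text: string }   // a model finished answering
  | { _tag: "Score"; judge: string; model: string; score: number } // cross-evaluation
  | { _tag: "WinnerChunk"; model: string; delta: string }; // winning answer re-streamed

// Exhaustive matcher over the taxonomy; the switch covers every tag,
// so adding a variant becomes a compile error here.
const label = (e: CouncilEvent): string => {
  switch (e._tag) {
    case "AnswerChunk": return `answer:${e.model}`;
    case "AnswerDone": return `done:${e.model}`;
    case "Score": return `score:${e.judge}->${e.model}`;
    case "WinnerChunk": return `winner:${e.model}`;
  }
};
```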
More from betalyra/effect-uai
effect-uai
Use when building AI agents with effect-uai (Effect-based primitives for agent loops, items/turns, tools, streaming, structured output, multi-provider). Covers the design philosophy, the core primitives, provider wiring (OpenAI Responses, Anthropic, Google Gemini), and a catalog of recipe patterns (retry, multi-model fallback, tool approval, auto-compaction, streaming SSE/JSONL, etc.) the user can compose into their own agent.
effect-uai-agentic-loop
Use when the user wants a long-lived chat agent that pulls user messages from a queue, debounces typing bursts into one batch, and only checks for new input between cleanly-finished turns. Mid-task tool exchanges run uninterrupted; new messages are buffered until the next clean turn boundary.
effect-uai-basic-usage
Use when the user wants the canonical effect-uai agent loop — stream a model turn, run any tools the model asks for, append outputs, continue until the model produces a final answer. The starting shape every other recipe is a variation of.
effect-uai-model-retry
Use when the user wants to retry transient model failures (rate limits, transport hiccups, timeouts) with exponential backoff, while letting non-retryable failures (content filtered, auth, invalid request, context length) propagate immediately. Inline pipeline with no helper service.
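The retryable/non-retryable split that recipe describes can be sketched with promises. The error kinds and the `withRetry` helper below are illustrative stand-ins, not effect-uai's error types or combinators; the recipe itself builds this inline from Effect's retry pipeline.

```typescript
// Hypothetical error classification; the recipe uses the provider's real error types.
type ModelError = {
  kind:
    | "RateLimited" | "Transport" | "Timeout"          // transient: retry
    | "ContentFiltered" | "Auth" | "InvalidRequest"    // permanent: propagate
    | "ContextLength";
};

const RETRYABLE = new Set(["RateLimited", "Transport", "Timeout"]);

// Retry transient failures with exponential backoff; rethrow everything else
// immediately, and rethrow once attempts are exhausted.
async function withRetry<A>(
  run: () => Promise<A>,
  maxAttempts = 4,
  baseMs = 100,
): Promise<A> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await run();
    } catch (err) {
      const e = err as ModelError;
      if (!RETRYABLE.has(e.kind) || attempt + 1 >= maxAttempts) throw err;
      await new Promise((r) => setTimeout(r, baseMs * 2 ** attempt)); // 100ms, 200ms, 400ms...
    }
  }
}
```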
effect-uai-pause-resume
Use when the user wants to soft-pause the agent loop between turns (no provider call held open) and resume later — e.g. cooldown after rate-limit, manual UI pause button, scheduled gating. State threads through naturally; resume picks up exactly where pause left off, no checkpointing needed.
effect-uai-auto-compaction
Use when the user is worried about long conversations exceeding the context window or the input-token budget — summarize earlier history into one item once a turn / token threshold is crossed, keep the last few items verbatim, then continue. The compaction step is just another streamTurn; history is just state.
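The compaction step that recipe describes reduces to a small pure function. The item shape, the `summarize` stub, and the threshold parameters below are illustrative assumptions; in the recipe the summary is produced by another streamTurn rather than a local function.

```typescript
// Hypothetical conversation item; the recipe uses effect-uai's real item type.
type Item = { role: string; text: string };

// Stand-in for a summarization turn against the model.
const summarize = (items: Item[]): Item => ({
  role: "system",
  text: `Summary of ${items.length} earlier items`,
});

// Once history exceeds maxItems, fold everything except the last
// keepVerbatim items into a single summary item and continue.
function compact(history: Item[], maxItems = 20, keepVerbatim = 4): Item[] {
  if (history.length <= maxItems) return history;
  const head = history.slice(0, history.length - keepVerbatim);
  const tail = history.slice(history.length - keepVerbatim);
  return [summarize(head), ...tail];
}
```

Because history is just state threaded through the loop, compaction is an ordinary transformation applied between turns, not a special mode of the agent.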