effect-uai-pause-resume
Soft pause/resume of an in-flight agent loop using Latch. The body
waits on the latch before each iteration; closing it pauses the loop
(no new streamTurn is initiated, no HTTP connection held), opening
it resumes. State threads through the loop naturally, so resume picks
up exactly where pause left off — no checkpoint to write.
Reach for this when the user says any of:
- "Pause the agent for X seconds and resume"
- "Manual pause button between turns"
- "Cool down between iterations"
The mechanism
One Latch.await at the top of the loop body is the entire pause.
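In effect-uai this latch would be Effect's `Latch` (from `Effect.makeLatch`), with the real turn driven by `streamTurn`. To keep the sketch below dependency-free and runnable, a minimal Promise-based latch stands in for Effect's, and `runTurn` is a hypothetical stand-in for the model turn — the shape of the loop is the point, not the names:

```typescript
// Minimal Promise-based latch mimicking the open/close/await shape of
// Effect's Latch. Awaiting while open resolves immediately; awaiting
// while closed blocks until open() is called.
class Latch {
  private isOpen: boolean;
  private waiters: Array<() => void> = [];
  constructor(open = true) { this.isOpen = open; }
  close(): void { this.isOpen = false; }
  open(): void {
    this.isOpen = true;
    for (const wake of this.waiters.splice(0)) wake();
  }
  await(): Promise<void> {
    if (this.isOpen) return Promise.resolve();
    return new Promise((resolve) => this.waiters.push(resolve));
  }
}

// Hypothetical stand-in for streamTurn: one "turn" appends one item
// to the running state, which threads through the loop.
const runTurn = async (state: number[]): Promise<number[]> =>
  [...state, state.length];

// The loop: one latch.await at the top of the body is the entire pause.
// While the latch is closed, no new turn starts and nothing is held open;
// state simply sits in the local variable until the latch reopens.
async function agentLoop(latch: Latch, turns: number): Promise<number[]> {
  let state: number[] = [];
  for (let i = 0; i < turns; i++) {
    await latch.await();          // soft pause point between turns
    state = await runTurn(state); // resume picks up exactly here
  }
  return state;
}

// Usage: close to pause, open to resume — no checkpoint is written.
const latch = new Latch(true);
const done = agentLoop(latch, 3);
latch.close();                       // pause before the next turn begins
setTimeout(() => latch.open(), 50);  // resume shortly after
done.then((state) => console.log(state));
```

Because state is just a local variable threaded through the loop, pausing costs nothing and resuming needs no restore step; the only coordination is the latch itself.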
More from betalyra/effect-uai
effect-uai
Use when building AI agents with effect-uai (Effect-based primitives for agent loops, items/turns, tools, streaming, structured output, multi-provider). Covers the design philosophy, the core primitives, provider wiring (OpenAI Responses, Anthropic, Google Gemini), and a catalog of recipe patterns (retry, multi-model fallback, tool approval, auto-compaction, streaming SSE/JSONL, etc.) the user can compose into their own agent.
effect-uai-agentic-loop
Use when the user wants a long-lived chat agent that pulls user messages from a queue, debounces typing bursts into one batch, and only checks for new input between cleanly-finished turns. Mid-task tool exchanges run uninterrupted; new messages are buffered until the next clean turn boundary.
effect-uai-basic-usage
Use when the user wants the canonical effect-uai agent loop — stream a model turn, run any tools the model asks for, append outputs, continue until the model produces a final answer. The starting shape every other recipe is a variation of.
effect-uai-model-retry
Use when the user wants to retry transient model failures (rate limits, transport hiccups, timeouts) with exponential backoff, while letting non-retryable failures (content filtered, auth, invalid request, context length) propagate immediately. Inline pipeline with no helper service.
effect-uai-model-council
Use when the user wants three (or more) models to answer the same question, score each other's answers (no self-judging), and emit a winner — e.g. consensus voting, audit, automated quality picking. Pure Stream composition; no Queue, no Deferred, no manual forks.
effect-uai-auto-compaction
Use when the user is worried about long conversations exceeding the context window or the input-token budget — summarize earlier history into one item once a turn / token threshold is crossed, keep the last few items verbatim, then continue. The compaction step is just another streamTurn; history is just state.