# Handoff Prompt Generator
Generate a prompt that another agent can execute without guessing at missing context.
## Choose the handoff mode
- Use a shared-workspace handoff when the receiving agent can access the same repo, files, and artifacts.
- Use a fresh-context handoff when the receiving agent starts cold, in another session, or on another platform.
- Ask for the target model family if it is not implied by the user's request. If it still cannot be determined, draft a vendor-neutral base prompt and flag any missing model-specific adjustments.
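The decision procedure above can be sketched as a small helper. This is a minimal illustration of the checklist, not part of the skill itself; the function and return shape are assumptions for demonstration.

```python
def choose_handoff_mode(shared_workspace, model_family=None):
    """Pick a handoff mode following the checklist above.

    shared_workspace: True if the receiving agent can access the same
    repo, files, and artifacts. model_family: e.g. "claude", or None
    when the target model is not yet known.
    """
    mode = "shared-workspace" if shared_workspace else "fresh-context"
    caveats = []
    if model_family is None:
        # Unknown target: stay vendor-neutral and flag the gaps.
        caveats.append(
            "vendor-neutral base prompt; mark missing "
            "model-specific adjustments"
        )
    return mode, caveats
```

For example, a handoff to a cold-start agent on an unknown platform would yield `("fresh-context", [...])` with the vendor-neutral caveat attached.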
## Read one model reference
Read only the reference that matches the receiving model:
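The one-reference-per-model rule amounts to a single lookup. The filenames below are hypothetical placeholders for illustration; the skill's actual reference paths are not shown in this document.

```python
# Hypothetical reference filenames -- illustration only, not the
# skill's real layout.
MODEL_REFERENCES = {
    "claude": "references/claude.md",
    "gpt": "references/gpt.md",
    "gemini": "references/gemini.md",
}

def reference_for(model_family):
    """Return the single reference matching the receiving model,
    or None when no model-specific reference exists."""
    return MODEL_REFERENCES.get(model_family.lower())
```

Reading only the matching reference keeps the generated handoff prompt free of instructions written for a different model family.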