Distill

Overview

All subagent dispatches are disk-mediated. See shared/dispatch-convention.md for the full protocol.

Convert heavy document formats into token-efficient representations (Markdown, CSV) for LLM consumption. The core deliverable is the .digest.md: a structure-aware compression at 20-30% of the original token count.
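The 20-30% target can be verified with a simple ratio check. A minimal sketch, using a whitespace-word count as a stand-in for real tokens (actual counts depend on the model's tokenizer; the function names here are illustrative, not part of the skill):

```python
def approx_tokens(text: str) -> int:
    # Crude proxy: one token per whitespace-separated word.
    # Real token counts depend on the model's tokenizer.
    return len(text.split())

def within_digest_target(original: str, digest: str,
                         low: float = 0.20, high: float = 0.30) -> bool:
    # The skill targets a digest at 20-30% of the original token count.
    ratio = approx_tokens(digest) / approx_tokens(original)
    return low <= ratio <= high
```

A digest that lands outside the band is either under-compressed (over 30%) or likely dropping structure (under 20%).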

Skill type: Rigid — follow exactly, no shortcuts.

Models:

  • PDF structuring agent: Sonnet
  • Digest agent: Sonnet
  • Orchestrator: runs on whatever model the session uses

Announce at start: "I'm using the distill skill to convert documents to token-efficient formats."

Invocation API

Repository: raddue/crucible (10 GitHub stars, 2 installs; first seen Apr 10, 2026)