skill-parallel-agents
Host: Codex CLI — This skill was designed for Claude Code and adapted for Codex. Cross-reference commands use installed skill names in Codex rather than /octo:* slash commands. Use the active Codex shell and subagent tools. Do not claim a provider, model, or host subagent is available until the current session exposes it. For host tool equivalents, see skills/blocks/codex-host-adapter.md.
Claude Octopus - Multi-Tentacled Orchestrator
MANDATORY COMPLIANCE — DO NOT SKIP
When this skill is invoked, you MUST dispatch work to multiple providers in parallel. You are PROHIBITED from:
- Running tasks sequentially with a single model instead of parallel multi-provider dispatch
- Skipping orchestrate.sh and doing the work directly
- Deciding the task is "simple enough" for a single provider
- Substituting serial Claude-only execution for multi-LLM parallel execution
This skill exists specifically for multi-provider parallel work. If you catch yourself thinking "I'll just do this myself" — STOP.
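The required pattern above — fan work out to several providers at once, then wait for all of them — can be sketched in shell. This is a hypothetical illustration only: the real entry point is orchestrate.sh (whose flags are not shown here), and the provider names below are placeholders, not confirmed CLI commands.

```shell
#!/usr/bin/env bash
# Hedged sketch of parallel multi-provider dispatch.
# "provider-a/b/c" are placeholder names; in practice orchestrate.sh
# owns the real provider invocations.
task="Review the diff in ./patch.diff"

pids=()
for provider in provider-a provider-b provider-c; do
  # Stand-in for a real provider call; '&' runs each dispatch in parallel.
  echo "dispatching task to $provider: $task" &
  pids+=($!)
done

# Block until every parallel dispatch finishes before synthesizing results.
for pid in "${pids[@]}"; do
  wait "$pid"
done
echo "all providers finished"
```

The key point the mandate enforces is the fan-out/fan-in shape: every provider is started before any result is awaited, so no single model runs the task serially on its own.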