# Codebase Scan (`scan`)

Audit the codebase against another skill's criteria using a parallel agent team.

## Workflow

### 1. Parse args & load skill
Extract the skill name from the args passed to this skill.
- If no skill name is provided, list the available skills in `.claude/skills/` and ask the user which one to scan against.
- If the skill doesn't exist, list the available skills and tell the user.
- Read `.claude/skills/<name>/SKILL.md` plus any files in the `references/` and `rules/` subdirectories.
- Distill the skill's content into a numbered criteria checklist: a flat list of concrete, testable rules labeled C1, C2, C3, etc. Each criterion should be a single sentence describing what to check for.
- If the skill has no evaluable code criteria (e.g., workflow-only skills like `why` that don't define code patterns or rules), tell the user it's not scannable and stop.
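The loading-and-distilling step above can be sketched in Python. This is a minimal, hypothetical sketch, not part of the skill itself: the `.claude/skills` layout and the `C1, C2, ...` labels come from the workflow above, but the function names and the idea of receiving pre-distilled rules as a plain list are assumptions for illustration.

```python
# Hypothetical sketch of step 1. Assumes the .claude/skills/<name>/ layout
# described above; function names are illustrative, not the skill's API.
from pathlib import Path


def load_skill_text(skills_root: str, name: str) -> str:
    """Concatenate SKILL.md plus any files in references/ and rules/."""
    skill_dir = Path(skills_root) / name
    if not skill_dir.is_dir():
        available = sorted(p.name for p in Path(skills_root).iterdir() if p.is_dir())
        raise FileNotFoundError(f"skill {name!r} not found; available: {available}")
    parts = [(skill_dir / "SKILL.md").read_text()]
    for sub in ("references", "rules"):
        subdir = skill_dir / sub
        if subdir.is_dir():
            # Include every file under the subdirectory, in a stable order.
            parts += [f.read_text() for f in sorted(subdir.rglob("*")) if f.is_file()]
    return "\n\n".join(parts)


def label_criteria(rules: list[str]) -> list[str]:
    """Label already-distilled one-sentence rules as C1, C2, C3, ..."""
    return [f"C{i}: {rule}" for i, rule in enumerate(rules, start=1)]
```

The distillation itself (turning prose into one-sentence testable rules) is an agent task; `label_criteria` only shows the numbering convention, e.g. `label_criteria(["No print statements"])` yields `["C1: No print statements"]`.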
### 2. Discover relevant files

Use the skill's criteria to infer the file scope:
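One way to sketch criteria-driven file scoping: scan the criteria text for language or framework keywords and map them to glob patterns. The keyword-to-glob table below is purely an assumption for illustration; the skill does not define this mapping.

```python
# Hypothetical sketch of step 2. The SCOPE_HINTS mapping is illustrative;
# a real scan would derive scope from the actual criteria.
from pathlib import Path

# Example keyword -> glob mapping a scan might derive from criteria text.
SCOPE_HINTS = {
    "typescript": "**/*.ts",
    "python": "**/*.py",
    "react": "**/*.tsx",
}


def infer_globs(criteria: list[str]) -> list[str]:
    """Pick glob patterns whose keyword appears in any criterion."""
    text = " ".join(criteria).lower()
    return [glob for kw, glob in SCOPE_HINTS.items() if kw in text]


def discover_files(root: str, criteria: list[str]) -> list[Path]:
    """Collect the files matched by the inferred globs (all files if none matched)."""
    globs = infer_globs(criteria) or ["**/*"]
    seen: set[Path] = set()
    for pattern in globs:
        seen.update(p for p in Path(root).glob(pattern) if p.is_file())
    return sorted(seen)
```

Falling back to `**/*` when no hint matches keeps the scan conservative: better to hand the agent team too many files than to silently miss the relevant ones.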