Prompt Repetition

Problem Being Solved

LLMs are trained as Causal Language Models, where each token attends only to previous tokens. This leads to:

  1. Context-Question Problem: while processing the context, the model has not yet seen the question, so it cannot focus on the relevant parts
  2. Options-First MCQ Problem: when answer choices appear before the question, the model reads the options without knowing what they are meant to answer
  3. Position/Index Problem: attention to position-specific information weakens in long lists, making it hard to retrieve "item N"

Prompt repetition enables the second pass to reference the entire first pass, effectively mimicking some benefits of bidirectional attention.
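The idea above can be sketched as a simple prompt-building helper. This is a minimal illustration, not the skill's actual implementation; the function name, transition sentence, and formatting are hypothetical choices:

```python
def repeat_prompt(context: str, question: str) -> str:
    """Build a two-pass prompt in which the context and question appear twice.

    Tokens in the second copy can attend to the entire first copy, so by the
    time the model re-reads the context it already knows the question.
    """
    first_pass = f"{context}\n\nQuestion: {question}"
    # The transition sentence is a hypothetical phrasing; any explicit
    # "I will repeat" marker serves the same purpose.
    return (
        f"{first_pass}\n\n"
        "The context and question are repeated below.\n\n"
        f"{first_pass}"
    )
```

The key property is simply that every token of the context and question occurs twice, with the full first pass preceding the second.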


When to use this skill

  • Lightweight models: claude-haiku, gemini-flash, gpt-4o-mini, and similar small models benefit most
  • Options-First MCQ: multiple-choice prompts where the answer choices appear before the question
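For the options-first MCQ case, the same trick applies to the choices themselves: repeating the option block after the question lets the model re-read each choice with the question in scope. A minimal sketch, with hypothetical function and label formatting:

```python
def repeat_mcq(question: str, options: list[str]) -> str:
    """Build an options-first MCQ prompt where choices and question repeat.

    On the second pass the model re-reads every labeled option while
    already knowing the question, mimicking bidirectional attention.
    """
    # Label options A, B, C, ... (formatting is an illustrative choice).
    labeled = "\n".join(f"{chr(65 + i)}. {opt}" for i, opt in enumerate(options))
    block = f"{labeled}\n\nQuestion: {question}"
    return f"{block}\n\nRepeating the choices and question:\n\n{block}"
```

For example, `repeat_mcq("What is the capital of France?", ["Paris", "London", "Berlin"])` yields each labeled option twice, with the question following both copies of the option list.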