# Entropy

## When to Use

Use this skill when working on entropy problems in information theory.
## Decision Tree

### Shannon Entropy
- H(X) = -sum p(x) log2 p(x)
- Maximum for uniform distribution: H_max = log2(n)
- Minimum = 0 for deterministic (one outcome certain)
- Use `scipy.stats.entropy(p, base=2)` for discrete distributions
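A minimal sketch of these cases in Python, using `scipy.stats.entropy`; the example distributions are illustrative:

```python
import numpy as np
from scipy.stats import entropy

# Uniform over 4 outcomes: H reaches its maximum, log2(4) = 2 bits
p_uniform = np.array([0.25, 0.25, 0.25, 0.25])
print(entropy(p_uniform, base=2))  # → 2.0

# Deterministic (one outcome certain): H = 0
p_det = np.array([1.0, 0.0, 0.0, 0.0])
print(entropy(p_det, base=2))  # → 0.0

# A biased coin: somewhere between 0 and 1 bit
p_coin = np.array([0.9, 0.1])
print(entropy(p_coin, base=2))
```

`scipy.stats.entropy` normalizes its input if it does not sum to 1, and treats `0 * log 0` as 0, which is why the deterministic case returns exactly 0.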
### Entropy Properties
- Non-negative: H(X) >= 0
- Concave in p
- Chain rule: H(X,Y) = H(X) + H(Y|X)
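The chain rule can be checked numerically. A small sketch with a hypothetical 2×2 joint distribution (the values are illustrative, not from the source):

```python
import numpy as np

# Hypothetical joint distribution p(x, y) over two binary variables
pxy = np.array([[0.4, 0.1],
                [0.2, 0.3]])

def H(p):
    """Shannon entropy in bits; terms with p = 0 contribute 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

px = pxy.sum(axis=1)  # marginal p(x)

# H(Y|X) = sum_x p(x) * H(Y | X = x), where p(y | x) = p(x, y) / p(x)
H_Y_given_X = sum(px[i] * H(pxy[i] / px[i]) for i in range(len(px)))

# Chain rule: H(X, Y) = H(X) + H(Y|X)
assert np.isclose(H(pxy.ravel()), H(px) + H_Y_given_X)
```

The same helper also confirms non-negativity: `H(p) >= 0` for any valid distribution, with equality only in the deterministic case.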