Entropy

When to Use

Use this skill when working on entropy problems in information theory.

Decision Tree

  1. Shannon Entropy

    • H(X) = -sum p(x) log2 p(x)
    • Maximum for uniform distribution: H_max = log2(n)
    • Minimum H = 0 for a deterministic distribution (one outcome is certain)
    • Use scipy.stats.entropy(p, base=2) for discrete distributions
  2. Entropy Properties

    • Non-negative: H(X) >= 0
    • Concave in p
    • Chain rule: H(X,Y) = H(X) + H(Y|X)
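The Shannon entropy formulas above can be sketched in a few lines. This is a minimal illustration assuming NumPy and SciPy are installed; the helper name shannon_entropy is ours, not part of any library:

```python
import numpy as np
from scipy.stats import entropy

def shannon_entropy(p):
    """H(X) = -sum p(x) log2 p(x), in bits; 0 * log(0) is treated as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                  # drop zero-probability outcomes
    return -float(np.sum(p * np.log2(p))) + 0.0   # + 0.0 normalizes -0.0 to 0.0

uniform = [0.25, 0.25, 0.25, 0.25]  # n = 4 equally likely outcomes
certain = [1.0, 0.0, 0.0, 0.0]      # deterministic: one outcome is certain

print(shannon_entropy(uniform))     # maximum: log2(4) = 2.0 bits
print(shannon_entropy(certain))     # minimum: 0.0 bits
print(entropy([0.5, 0.25, 0.125, 0.125], base=2))  # scipy agrees: ~1.75 bits
```

scipy.stats.entropy normalizes its input to sum to 1, so it is convenient when you have raw counts rather than probabilities.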
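The three properties can be checked numerically. The distributions p, q and the joint table pxy below are made-up examples for illustration only:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits; zero-probability cells contribute 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Non-negativity and concavity on two example distributions
p, q, lam = np.array([0.9, 0.1]), np.array([0.2, 0.8]), 0.3
assert H(p) >= 0 and H(q) >= 0                                       # H(X) >= 0
assert H(lam * p + (1 - lam) * q) >= lam * H(p) + (1 - lam) * H(q)   # concave in p

# Chain rule on a made-up joint distribution p(x, y): rows = X, cols = Y
pxy = np.array([[0.25, 0.25],
                [0.40, 0.10]])
px = pxy.sum(axis=1)                                  # marginal p(x)
# H(Y|X) = sum_x p(x) * H(Y | X = x)
h_y_given_x = sum(w * H(row / w) for w, row in zip(px, pxy))
assert np.isclose(H(pxy), H(px) + h_y_given_x)        # H(X,Y) = H(X) + H(Y|X)
print(H(pxy), H(px) + h_y_given_x)
```

Each conditional entropy H(Y | X = x) is just the entropy of the renormalized row of the joint table, weighted by the marginal probability of that row.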