Math and science::INF ML AI
Entropy of an ensemble
The entropy of an ensemble, \( X = (x, A_x, P_x) \), is defined to be the average Shannon information content over all outcomes:
\[ H(X) = \sum_{x \in A_x} P(x) \log_2 \frac{1}{P(x)} \]
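The definition can be checked numerically. A minimal Python sketch (the helper name `entropy` and the biased-coin distribution are illustrative, not from the card):

```python
import math

def entropy(p):
    # H(X) = sum over outcomes of P(x) * log2(1/P(x)), in bits.
    # Outcomes with probability 0 contribute 0 (the limit p*log(1/p) -> 0).
    return sum(pi * math.log2(1 / pi) for pi in p if pi > 0)

# Illustrative ensemble: a biased coin with P(heads) = 0.9.
print(entropy([0.9, 0.1]))  # ≈ 0.469 bits
```

A fair coin gives exactly 1 bit, and a certain outcome gives 0 bits, matching the properties listed below.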
Properties of entropy:
- \( H(X) \geq 0 \), with equality iff \( P(x) = 1 \) for exactly one outcome \( x \).
- Entropy is maximized if the outcomes are equiprobable (uniform distribution over \( A_x \)).
- The entropy is at most the logarithm of the number of outcomes.
The last two points can be expressed as:
\[ H(X) \leq \log_2 |A_x| \text{, with equality iff } P(x) = \frac{1}{|A_x|} \text{ for all } x \in A_x \]
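Both bounds can be spot-checked numerically. A small Python sketch (the `entropy` helper is repeated so the block is self-contained; the random distributions are illustrative):

```python
import math
import random

def entropy(p):
    # H(X) in bits; 0 * log(1/0) treated as 0.
    return sum(pi * math.log2(1 / pi) for pi in p if pi > 0)

n = 4
uniform = [1 / n] * n
# Equality case: the uniform distribution attains H(X) = log2 |A_x|.
assert abs(entropy(uniform) - math.log2(n)) < 1e-12

# Random distributions over n outcomes all satisfy 0 <= H(X) <= log2 n.
random.seed(0)
for _ in range(1000):
    w = [random.random() for _ in range(n)]
    p = [x / sum(w) for x in w]
    h = entropy(p)
    assert 0 <= h <= math.log2(n) + 1e-12
print("bounds hold")
```

This is only a numerical sanity check; the actual proof (e.g. via Jensen's inequality) is what belongs on the back of the card.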
Proof on the back side.