Entropy of an ensemble
The entropy of an ensemble $X$, with outcomes $x$ drawn from an alphabet $\mathcal{A}_X$ with probabilities $P(x)$, is:

$$H(X) = \sum_{x \in \mathcal{A}_X} P(x) \log_2 \frac{1}{P(x)}$$
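As a concrete illustration, here is a minimal Python sketch of this definition (the function name `entropy` and the example distribution are my own; terms with $P(x) = 0$ are skipped, following the convention that $0 \log \frac{1}{0} = 0$):

```python
import math

def entropy(probs):
    """Entropy H(X) in bits: sum of P(x) * log2(1/P(x)) over outcomes.

    Terms with P(x) == 0 contribute nothing, by the convention 0*log(1/0) = 0.
    """
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Example: a biased coin with P(heads) = 0.9, P(tails) = 0.1.
print(entropy([0.9, 0.1]))  # ~0.469 bits, below the 1 bit of a fair coin
```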
Properties of entropy:
- $H(X) \ge 0$, with equality iff some outcome has probability 1.
- Entropy is maximized if the probability of outcomes is uniform.
- The entropy is at most the log of the number of outcomes.
The last two points can be expressed as:

$$H(X) \le \log_2 |\mathcal{A}_X|, \quad \text{with equality iff } P(x) = \frac{1}{|\mathcal{A}_X|} \text{ for all } x.$$
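A quick numerical check of this bound (a sketch; the example distributions are my own):

```python
import math

def entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Over a 4-outcome alphabet, log2 |A_X| = 2 bits.
for probs in ([0.25, 0.25, 0.25, 0.25],  # uniform: hits the bound exactly
              [0.7, 0.1, 0.1, 0.1],      # skewed: strictly below the bound
              [1.0, 0.0, 0.0, 0.0]):     # deterministic: H = 0
    print(probs, entropy(probs))
```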
Proof on the back side.
Proof that $H(X) \le \log_2 |\mathcal{A}_X|$:

First note that $\log_2$ is concave, so by Jensen's inequality, for a concave function $f$ and a random variable $u$:

$$\mathbb{E}[f(u)] \le f(\mathbb{E}[u]),$$

with equality iff $u$ is constant.

Now then:

$$H(X) = \mathbb{E}\left[\log_2 \frac{1}{P(x)}\right] \le \log_2 \mathbb{E}\left[\frac{1}{P(x)}\right] = \log_2 \sum_{x \in \mathcal{A}_X} P(x) \cdot \frac{1}{P(x)} = \log_2 |\mathcal{A}_X|.$$

Where $u = 1/P(x)$, and equality holds iff $1/P(x)$ is constant, i.e. iff $P$ is uniform.
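The inequality can also be sanity-checked numerically over random distributions (a sketch; all names and the choice of $|\mathcal{A}_X| = 8$ are illustrative):

```python
import math
import random

def entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

random.seed(0)
n = 8  # |A_X| = 8, so log2 |A_X| = 3 bits
for _ in range(5):
    # Draw a random distribution over n outcomes by normalizing random weights.
    w = [random.random() for _ in range(n)]
    probs = [x / sum(w) for x in w]
    h = entropy(probs)
    assert h <= math.log2(n) + 1e-12  # H(X) <= log2 |A_X|
    print(f"H = {h:.3f} bits <= {math.log2(n):.3f}")
```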

Perspectives on entropy:
- If $X$ is a random variable, we can create another random variable, say $Y$, by applying a function to the outcomes: $Y = f(X)$. What if this function depended not on the value of the outcome, but on the probability of the outcome? As there is a function $P$ that maps outcomes to probabilities, we could create a random variable $f(P(X))$. For Shannon information content and entropy, we create the variable $h(X) = \log_2 \frac{1}{P(X)}$. The entropy is the expected value of this random variable: $H(X) = \mathbb{E}[h(X)]$.
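This perspective translates directly into code. A minimal sketch (the outcome labels and probabilities are my own examples): compute the information content $h(x)$ of each outcome, then take its expectation under $P$.

```python
import math

# Distribution over outcomes (example values).
P = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Shannon information content of each outcome: h(x) = log2(1/P(x)).
h = {x: math.log2(1 / p) for x, p in P.items()}

# Entropy is the expectation of h under P: H(X) = sum over x of P(x) * h(x).
H = sum(P[x] * h[x] for x in P)
print(h)  # {'a': 1.0, 'b': 2.0, 'c': 3.0, 'd': 3.0}
print(H)  # 1.75 bits
```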