Math and science::INF ML AI

Entropy of an ensemble

The entropy of an ensemble, X = (x, A_X, P_X), is defined to be the average Shannon information content over all outcomes:

[H(X)=? ]

Properties of entropy:

  • [H(X)? ] with equality iff [...].
  • Entropy is maximized if [something about the outcomes is true].
  • The entropy is at most [some operation applied to] the number of outcomes.

The last two points can be expressed as:

[H(X)?, with equality iff    ? ]

Proof on the back side.
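As a numerical aid for checking the definition and properties above, here is a minimal sketch of Shannon entropy in bits (the function name and example distributions are illustrative, not part of the card):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the average information content
    of the outcomes. Terms with p = 0 contribute 0 by convention."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair coin: two equally likely outcomes.
print(entropy([0.5, 0.5]))   # 1.0 bit

# A biased coin: lower entropy than the fair coin.
print(entropy([0.9, 0.1]))

# A certain outcome: zero entropy.
print(entropy([1.0]))        # 0.0
```

Comparing the uniform case against skewed distributions over the same number of outcomes is a quick way to test the maximization property stated above.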