Joint entropy of two random variables

For two ensembles $X = (x, \mathcal{A}_X, \mathcal{P}_X)$ and $Y = (y, \mathcal{A}_Y, \mathcal{P}_Y)$, where there may be dependence between $\mathcal{P}_X$ and $\mathcal{P}_Y$, the joint entropy of $X, Y$ is:

$$H(X,Y) = \sum_{x \in \mathcal{A}_X} \sum_{y \in \mathcal{A}_Y} P(x,y) \log \frac{1}{P(x,y)}$$
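As a minimal sketch of this sum in code (the joint distribution below is a made-up example, not from the source), the joint entropy can be computed directly from a joint probability table:

```python
import math

def joint_entropy(P):
    """Joint entropy H(X, Y) in bits for a joint distribution given as a
    dict {(x, y): probability}; zero-probability outcomes contribute
    nothing to the sum, matching the convention 0 log(1/0) = 0."""
    return sum(p * math.log2(1 / p) for p in P.values() if p > 0)

# Hypothetical joint distribution over A_X = {0, 1}, A_Y = {'a', 'b'}
P_xy = {(0, 'a'): 0.5, (0, 'b'): 0.25, (1, 'a'): 0.125, (1, 'b'): 0.125}
print(joint_entropy(P_xy))  # 1.75 bits
```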

Entropy is additive for independent random variables: if $P(x,y) = P(x)P(y)$ for all $x, y$, then $H(X,Y) = H(X) + H(Y)$.

Proof

$$\begin{aligned}
H(X,Y) &= \sum_{x \in \mathcal{A}_X} \sum_{y \in \mathcal{A}_Y} P(x)P(y) \log \frac{1}{P(x)P(y)} \\
&= \sum_{x \in \mathcal{A}_X} \sum_{y \in \mathcal{A}_Y} P(x)P(y) \log \frac{1}{P(x)} + \sum_{x \in \mathcal{A}_X} \sum_{y \in \mathcal{A}_Y} P(x)P(y) \log \frac{1}{P(y)} \\
&= \sum_{x \in \mathcal{A}_X} P(x) \log \frac{1}{P(x)} + \sum_{y \in \mathcal{A}_Y} P(y) \log \frac{1}{P(y)} \\
&= H(X) + H(Y)
\end{aligned}$$

The third line follows because the first double sum's log term is independent of $y$ and the second's is independent of $x$, so the inner sums collapse via $\sum_{y \in \mathcal{A}_Y} P(y) = \sum_{x \in \mathcal{A}_X} P(x) = 1$.
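As a quick numerical sanity check of the result (a minimal sketch; the marginals below are invented for illustration), we can build an independent joint distribution from two marginals and confirm the entropies agree:

```python
import math

def entropy(P):
    """Shannon entropy in bits of a distribution {outcome: probability}."""
    return sum(p * math.log2(1 / p) for p in P.values() if p > 0)

# Hypothetical marginal distributions P_X and P_Y
P_x = {0: 0.25, 1: 0.75}
P_y = {'a': 0.5, 'b': 0.5}

# Independent joint distribution: P(x, y) = P(x) * P(y)
P_joint = {(x, y): px * py for x, px in P_x.items() for y, py in P_y.items()}

print(entropy(P_joint))             # H(X, Y) ~= 1.811 bits
print(entropy(P_x) + entropy(P_y))  # H(X) + H(Y), equal as proved above
```

For a dependent joint distribution the two quantities generally differ, with $H(X,Y) \le H(X) + H(Y)$; equality characterizes independence.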