Chain rule of entropy

by Gabin Paul Jacques Leroy

Hello,

Let X, Y, Z be discrete random variables on a probability space (\Omega,\mathcal{F},\Bbb{P}). Is it then true that H(X,Y,Z)=H(X)+H(Y,Z|X)? I have two ideas for proving this result. The first uses the chain rule, H(X_1,X_2,X_3)=H(X_1)+H(X_2|X_1)+H(X_3|X_2,X_1); since (I am not sure of this) H(X_2|X_1)+H(X_3|X_2,X_1)=H(X_2,X_3|X_1), the result follows. The second uses the definition of conditional entropy:
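The step marked as uncertain can be checked directly from the definitions (a sketch, using the same p-notation as in the computation below, and the factorization p(x_2,x_3|x_1)=p(x_2|x_1)p(x_3|x_2,x_1)):

H(X_2,X_3|X_1)=-\sum_{x_1,x_2,x_3}p(x_1,x_2,x_3)\log[p(x_2,x_3|x_1)]

=-\sum_{x_1,x_2,x_3}p(x_1,x_2,x_3)\log[p(x_2|x_1)]-\sum_{x_1,x_2,x_3}p(x_1,x_2,x_3)\log[p(x_3|x_2,x_1)]

=H(X_2|X_1)+H(X_3|X_2,X_1)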

 H(X)+H(Y,Z|X)=-\sum_xp(x)\log[p(x)]-\sum_{x,y,z}p(x,y,z)\log[p(y,z | x)]

=-\sum_{x,y,z}p(x,y,z)\log[p(x)]-\sum_{x,y,z}p(x,y,z)\log[p(y,z | x)]

=-\sum_{x,y,z}p(x,y,z)\log[\underbrace{p(x)p(y,z | x)}_{p(x,y,z)}]=H(X,Y,Z)
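As a quick numerical sanity check of the identity (not a proof), one can compare both sides on a small joint distribution; the distribution below is made up purely for illustration:

```python
import math
from itertools import product

# Hypothetical joint distribution p(x,y,z) over {0,1}^3 (weights sum to 1)
weights = [0.10, 0.05, 0.15, 0.10, 0.20, 0.05, 0.25, 0.10]
p = dict(zip(product([0, 1], repeat=3), weights))

# Marginal p(x) = sum over y,z of p(x,y,z)
px = {}
for (x, y, z), w in p.items():
    px[x] = px.get(x, 0.0) + w

def H_joint():
    # H(X,Y,Z) = -sum p(x,y,z) log p(x,y,z)
    return -sum(w * math.log2(w) for w in p.values() if w > 0)

def H_X():
    # H(X) = -sum p(x) log p(x)
    return -sum(w * math.log2(w) for w in px.values() if w > 0)

def H_YZ_given_X():
    # H(Y,Z|X) = -sum p(x,y,z) log p(y,z|x), with p(y,z|x) = p(x,y,z)/p(x)
    return -sum(w * math.log2(w / px[x]) for (x, y, z), w in p.items() if w > 0)

# The two sides agree up to floating-point error
print(H_joint(), H_X() + H_YZ_given_X())
```

The check uses that p(y,z|x)=p(x,y,z)/p(x), exactly the factorization under the brace above.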

Are my two "proofs" correct? Best regards, GL