For random variables $X$ and $Y$, the conditional entropy of $Y$ given $X$ is defined as $$H(Y|X) = - \sum_{x, y} p(x, y) \log p(y|x) = \sum_x p(x) H(Y|X=x)$$ where $$H(Y|X=x) = - \sum_y p(y|x) \log p(y|x)$$
Note that $p(x, y) = p(x) \times p(y|x)$ can be used to prove the above equality.
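One way to see it: substituting $p(x, y) = p(x) \times p(y|x)$ and splitting the sum over $x$ and $y$ gives $$- \sum_{x, y} p(x, y) \log p(y|x) = - \sum_x p(x) \sum_y p(y|x) \log p(y|x) = \sum_x p(x) H(Y|X=x).$$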
Similarly, for $H(Y|X, Z)$, we write $$H(Y|X, Z) = \sum_z p(z) H(Y|X, Z=z)$$ where $$H(Y|X, Z=z) = - \sum_{x, y} p(x, y|z) \log p(y|x, z)$$
This is Definition 2.15 in Chapter 2 of the book *Information Theory and Network Coding*.
I am confused about how to prove the corresponding equality for $H(Y|X, Z)$.
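As a quick numerical sanity check (not a proof), the two sides can be compared on a random joint distribution. A minimal sketch in Python with numpy, taking $H(Y|X, Z) = - \sum_{x, y, z} p(x, y, z) \log p(y|x, z)$ as the direct analogue of the first definition above (the alphabet sizes below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary alphabet sizes for X, Y, Z.
nx, ny, nz = 3, 4, 2

# Random joint distribution p(x, y, z).
p_xyz = rng.random((nx, ny, nz))
p_xyz /= p_xyz.sum()

# Direct form: H(Y|X,Z) = -sum_{x,y,z} p(x,y,z) log p(y|x,z).
p_xz = p_xyz.sum(axis=1, keepdims=True)      # p(x,z), kept broadcastable over y
p_y_given_xz = p_xyz / p_xz                  # p(y|x,z)
lhs = -(p_xyz * np.log(p_y_given_xz)).sum()

# Book's form: H(Y|X,Z) = sum_z p(z) H(Y|X, Z=z),
# with H(Y|X, Z=z) = -sum_{x,y} p(x,y|z) log p(y|x,z).
p_z = p_xyz.sum(axis=(0, 1))                 # p(z)
rhs = 0.0
for z in range(nz):
    p_xy_given_z = p_xyz[:, :, z] / p_z[z]   # p(x,y|z)
    h_given_z = -(p_xy_given_z * np.log(p_y_given_xz[:, :, z])).sum()
    rhs += p_z[z] * h_given_z

print(lhs, rhs)  # the two values agree up to floating-point error
```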
Check this answer. – cyborg_ Mar 25 '20 at 13:08

(1) $H(Y|X,Z)=H(X, Y|Z)-H(X|Z)$ ([proof](https://math.stackexchange.com/q/2676703)) and (2) $I(X;Y|Z) = H(X|Z)-H(X|Y,Z)$.
From (1), with $X$ and $Y$ interchanged, we get $H(X|Y,Z)=H(Y, X|Z)-H(Y|Z)$.
So $I(X;Y|Z) = H(X|Z)+H(Y|Z)-H(Y, X|Z)$, and hence $H(Y|X,Z)=H(X,Y|Z)-I(X;Y|Z)+H(Y|Z)-H(Y,X|Z)=H(Y|Z)-I(X;Y|Z)\leq H(Y|Z)$, because $H(X,Y|Z)=H(Y,X|Z)$ by symmetry and $I(X;Y|Z)\geq 0$. Am I correct in my proof?
– Joe Mar 25 '20 at 18:26
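The chain in the last comment, $H(Y|X,Z) = H(Y|Z) - I(X;Y|Z) \leq H(Y|Z)$, can also be checked numerically in the same way; a self-contained sketch (again assuming numpy and an arbitrary random joint distribution):

```python
import numpy as np

rng = np.random.default_rng(0)
p_xyz = rng.random((3, 4, 2))     # arbitrary alphabet sizes for X, Y, Z
p_xyz /= p_xyz.sum()              # random joint distribution p(x, y, z)

p_z  = p_xyz.sum(axis=(0, 1))     # p(z)
p_xz = p_xyz.sum(axis=1)          # p(x, z)
p_yz = p_xyz.sum(axis=0)          # p(y, z)

# H(Y|Z) = -sum p(y,z) log p(y|z)
h_y_given_z = -(p_yz * np.log(p_yz / p_z)).sum()

# H(Y|X,Z) = -sum p(x,y,z) log p(y|x,z)
h_y_given_xz = -(p_xyz * np.log(p_xyz / p_xz[:, None, :])).sum()

# I(X;Y|Z) = sum p(x,y,z) log [ p(x,y,z) p(z) / (p(x,z) p(y,z)) ]
i_xy_given_z = (p_xyz * np.log(p_xyz * p_z / (p_xz[:, None, :] * p_yz))).sum()

# H(Y|X,Z) = H(Y|Z) - I(X;Y|Z), and I(X;Y|Z) >= 0, so H(Y|X,Z) <= H(Y|Z).
print(h_y_given_xz, h_y_given_z - i_xy_given_z, i_xy_given_z >= 0)
```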