I'm writing a paper on Information Theory and I can't get my head around this task:
I'd like to prove that the Shannon entropy function $H$ attains its maximum value when all of the probabilities $P_S=\{P(x_1), P(x_2), \ldots, P(x_n)\}$ are equal, i.e. when $P(x_i)=\frac{1}{n}$ for every $i$.
The $H$ function is defined like this:
$$ H(S) = -\sum_{i=1}^{\operatorname{card}(S)} P(x_i)\,\log_2 P(x_i) $$
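Concretely, if I understand the claim correctly, it says that, writing $n = \operatorname{card}(S)$,
$$ H(S) \le \log_2 n, $$
with equality exactly when $P(x_i) = \frac{1}{n}$ for all $i$; plugging the uniform distribution into the definition indeed gives
$$ \sum_{i=1}^{n} \frac{1}{n}\left(-\log_2 \frac{1}{n}\right) = \log_2 n. $$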
I was able to prove this for $\operatorname{card}(S) \le 2$, but I could not find a technique that works for general $\operatorname{card}(S) = n$.
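For the two-event case, writing $P(x_1) = p$ and $P(x_2) = 1 - p$, an elementary single-variable argument is enough:
$$ H(p) = -p\log_2 p - (1-p)\log_2(1-p), \qquad \frac{dH}{dp} = \log_2\frac{1-p}{p} = 0 \iff p = \frac{1}{2}, $$
and $\frac{d^2H}{dp^2} < 0$, so $p = \frac{1}{2}$ is the unique maximum. I don't see how to extend this kind of argument to $n$ variables.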
I think a possible approach would be a proof by induction on $\operatorname{card}(S)$ (the number of outcomes in $S$).
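In other words, the base case would be $\operatorname{card}(S) = 2$ as above, and the inductive step would have to show that if
$$ H(q_1, \ldots, q_{n-1}) \le \log_2(n-1) $$
holds for every probability distribution $(q_1, \ldots, q_{n-1})$ on $n-1$ outcomes (with equality only for the uniform one), then the corresponding statement holds for $n$ outcomes as well; it is this step that I cannot see how to carry out.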