I'm trying to figure out how much information is lost in going from the individual face values of $n$ dice to their total score.
If I roll $n$ dice I can add up their face values to find the total sum; however, from the total sum alone I generally can't tell which die showed which face value (except for the extremes of all 1's or all 6's). For example, if the sum of 3 dice is 11, then knowing only the 11 I can't tell exactly how much each die contributed to the sum. So there seems to be a loss of information in going from knowing the faces of the dice to just knowing the total score.
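To make that collapse concrete, here is a small Python sketch (purely illustrative, assuming fair six-sided dice) that counts how many ordered outcomes of 3 dice produce the total 11:

```python
from itertools import product

# All ordered outcomes (d1, d2, d3) of 3 fair six-sided dice whose total is 11.
outcomes = [faces for faces in product(range(1, 7), repeat=3) if sum(faces) == 11]
print(len(outcomes))  # 27 of the 6**3 = 216 outcomes collapse to the single sum 11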
Now, from A Webb's answer here it is simple to find the amount of information in $n$ dice rolls:
$$-\log_2\left(\left(\frac{1}{6}\right)^{n}\right) = -n\,\log_2\left(\frac{1}{6}\right) \approx 2.58\,n$$ bits of information. Since the faces follow a uniform distribution and all face probabilities are equal, the information content of each individual outcome equals the average, so the entropy (average information) of each die roll is $\log_2 6 \approx 2.58$ bits. Because the rolls are independent, these entropies add, giving $n \log_2 6$ bits for $n$ dice.
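As a quick numerical check of this formula, a short Python sketch (again assuming fair six-sided dice; the value of `n` is just an example):

```python
import math

# Entropy of one fair die: H = -sum_i p_i * log2(p_i) = log2(6) ≈ 2.585 bits.
h_die = -sum((1 / 6) * math.log2(1 / 6) for _ in range(6))
print(h_die)        # ≈ 2.585 bits per die

n = 3               # example number of dice
print(n * h_die)    # entropy of n independent rolls ≈ 7.755 bits
```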
Next, since the expected value of a sum of random variables is the sum of their expected values, and the expected value of each die is $\frac{1+2+3+4+5+6}{6} = \frac{21}{6} = 3.5$, the expected value of the sum of $n$ dice is $3.5\,n$. However, I am not sure where this gets me (if anywhere) in finding the entropy of the sum of scores, so that I can compare it to the entropy of the $n$ individual dice.
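To at least see the numbers, here is a minimal Python sketch (my own attempt, not from A Webb's answer) that builds the exact distribution of the total by convolving the single-die distribution $n$ times and then computes its Shannon entropy; `sum_distribution` and `entropy_bits` are just illustrative helper names:

```python
import math

def sum_distribution(n):
    """Exact distribution of the total of n fair six-sided dice,
    built by repeatedly convolving with the single-die distribution."""
    dist = {0: 1.0}
    for _ in range(n):
        new = {}
        for total, p in dist.items():
            for face in range(1, 7):
                new[total + face] = new.get(total + face, 0.0) + p / 6
        dist = new
    return dist

def entropy_bits(dist):
    """Shannon entropy, in bits, of a distribution given as {value: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

n = 3
h_faces = n * math.log2(6)                 # entropy of the full outcome ≈ 7.755 bits
h_sum = entropy_bits(sum_distribution(n))  # entropy of the total alone
print(h_faces, h_sum, h_faces - h_sum)     # the difference is the information lost
```

If I understand it correctly, since the total is a deterministic function of the faces, the difference $n \log_2 6 - H(\text{sum})$ printed above should be exactly the conditional entropy of the faces given the total, i.e., the information lost.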