"Uncertainty", "unpredictability", "randomness", "information content" have all been used to describe what differential entropy measures. The first two definitions rein in unrequited comparisons with variance and volatility. To me, "randomness" resonates the best, having a stats background, but I struggle to convey the meaning of the last one especially.
If entropy measures the information content of a random variable's statistical distribution, how can "information content" be put in layman's terms for an audience from a different scientific discipline, one that doesn't understand bits or any computer-science jargon and doesn't deal with senders and receivers, given that a low-entropy variable is said to be "more informative" than a high-entropy variable? Maybe what I'm really looking for is a synonym to use in place of "information".
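
To make the "low entropy = more informative" phrasing concrete, here is a minimal sketch of the kind of comparison I have in mind, using scipy to compute the differential entropy of a narrow versus a wide Gaussian (the scales 0.1 and 10 are just illustrative choices of my own):

```python
# Minimal sketch: differential entropy of a concentrated vs. a spread-out Gaussian.
from scipy.stats import norm

narrow = norm(loc=0, scale=0.1)   # outcomes tightly concentrated around 0
wide   = norm(loc=0, scale=10.0)  # outcomes spread over a wide range

# .entropy() returns differential entropy in nats: 0.5 * ln(2 * pi * e * sigma^2)
print(narrow.entropy())  # ~ -0.88 nats: low entropy, the outcome is nearly pinned down
print(wide.entropy())    # ~  3.72 nats: high entropy, the outcome could be almost anywhere
```

The low-entropy distribution is the one whose outcome I can predict well before observing it, which is roughly the sense of "more informative" I'd like to express without invoking bits.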