I have a totally random signal source that looks like a typical normal distribution. I've included an image, as I like pictures:
The source has a mean of 0 and a standard deviation of 1. It's analogue and therefore of infinite resolution. Very typical.
I then sample the signal with a Tricorder and record the raw data. Clearly, since the original signal is random, irreducible information entropy accrues at some rate, and that rate will be smaller (in bits) than the raw data rate. My particular Tricorder model has a sample-resolution setting, so I can record at anything from 1 bit/sample up to, say, 48 bits/sample. You might think of this as a quantization setting.
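In case it helps, here is a minimal sketch of what I mean by "recording at b bits/sample". Everything in it is my own illustration rather than anything the Tricorder actually does: I assume a uniform quantizer clipped at ±4 standard deviations, and the `quantize` helper is just a name I made up.

```python
# Sketch only: uniform b-bit quantization of an N(0, 1) source,
# clipped at +/-4 standard deviations (an assumption, not a spec).
import numpy as np

def quantize(samples, bits, clip=4.0):
    """Map analogue samples to integer codes 0 .. 2**bits - 1."""
    levels = 2 ** bits
    clipped = np.clip(samples, -clip, clip)
    codes = np.floor((clipped + clip) / (2 * clip) * levels).astype(int)
    return np.minimum(codes, levels - 1)   # keep the top edge inside the range

rng = np.random.default_rng(0)
analogue = rng.standard_normal(1_000_000)  # the "infinite resolution" source
raw = quantize(analogue, bits=10)          # a 10-bit "Tricorder" recording
```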
The crux of my question is: what is the relationship between the recorded entropy rate and the level of quantization? I'm hoping for either a formula or an example calculation, such as 4.5 bits/sample at 10-bit quantization.
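For concreteness, this is the kind of back-of-envelope calculation I'm hoping an answer would confirm or correct. It assumes "recorded entropy" means the Shannon entropy of the quantizer's output distribution, uses the same uniform ±4σ quantizer as in the sketch above, and compares against the fine-quantization approximation H ≈ h(X) − log2(Δ), where h(X) = ½·log2(2πe) ≈ 2.05 bits is the differential entropy of N(0, 1) and Δ is the step size.

```python
# Sketch only: Shannon entropy of a uniformly quantized N(0, 1) source,
# with bin probabilities taken from the normal CDF (clip at +/-4 sigma assumed).
import numpy as np
from scipy.stats import norm

def quantized_entropy_bits(bits, clip=4.0):
    edges = np.linspace(-clip, clip, 2 ** bits + 1)
    p = np.diff(norm.cdf(edges))
    p[0] += norm.cdf(-clip)   # fold the clipped tails into the edge bins
    p[-1] += norm.sf(clip)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

for b in (1, 4, 10):
    step = 2 * 4.0 / 2 ** b
    approx = 0.5 * np.log2(2 * np.pi * np.e) - np.log2(step)
    print(f"{b:2d}-bit quantization: H = {quantized_entropy_bits(b):.2f} bits/sample "
          f"(fine-step approximation: {approx:.2f})")
```

If that framing is right, the relationship is roughly H ≈ b + 2.05 − log2(range): once the quantizer is reasonably fine, each extra bit of resolution adds about one bit of recorded entropy per sample, while at very coarse settings the approximation breaks down and the entropy is simply capped at b bits.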
There is (perhaps) a similar question at Compressing normally distributed data, but I'm not sure, and it doesn't really deal with sample bit depth in the same way. I'm developing the argument that real-world entropy is generated by the observer, not by the underlying process.
PS. Read "analogue-to-digital converter" for "Tricorder".
