Let us take as an example a stationary process $X(t)$ whose autocovariance decays as a simple exponential: $$ \langle X(\Delta t) X(0) \rangle = \sigma_X^2 \exp(-\Delta t/\tau), $$ where $\langle {\cdot} \rangle$ denotes an ensemble average. Intuitively, from a physical standpoint, I would interpret this as saying that $X$ forgets its initial state on a characteristic timescale $\tau$.
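For concreteness, here is a minimal numerical sketch. It takes the Ornstein-Uhlenbeck process as one standard zero-mean stationary Gaussian process with exactly this exponential autocovariance; all parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: the Ornstein-Uhlenbeck (OU) process is one concrete
# zero-mean stationary Gaussian process with exactly the exponential
# autocovariance above. Parameters (tau, sigma_x, dt, n_steps) are
# illustrative, not from the original post.
rng = np.random.default_rng(0)

tau, sigma_x = 1.0, 1.0      # correlation time and stationary std
dt, n_steps = 0.01, 200_000  # integration step and trajectory length

# Exact OU update: X(t+dt) = a*X(t) + noise, preserving stationarity.
a = np.exp(-dt / tau)
noise_std = sigma_x * np.sqrt(1.0 - a**2)

x = np.empty(n_steps)
x[0] = rng.normal(0.0, sigma_x)  # start in the stationary state
for i in range(1, n_steps):
    x[i] = a * x[i - 1] + noise_std * rng.normal()

# Empirical autocovariance at a few lags vs. sigma_x^2 * exp(-lag*dt/tau)
for lag in (0, 50, 100, 200):
    emp = np.mean(x[: -lag or None] * x[lag:])
    theory = sigma_x**2 * np.exp(-lag * dt / tau)
    print(f"lag = {lag*dt:4.1f}  empirical = {emp:+.3f}  theory = {theory:+.3f}")
```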
In other words, $X$ gradually loses information about its initial state. Is there a way to formalize this statement? What would be the proper definition of information in this case, and how would its rate of change be connected to the autocorrelation function?
UPD: I have the following take on the problem.
For such a stationary process, we can formulate the loss of information as our inability to tell what the initial value $X(0)$ was, given that we know the current value $X(\Delta t)$. For very small $\Delta t$, i.e. $\Delta t \ll \tau$, we can assert that $X(0) \approx X(\Delta t)$. However, for $\Delta t \gg \tau$ we have no idea what $X(0)$ was, even though we know $X(\Delta t)$.
So this notion of loss of information is connected to $\langle [X(\Delta t) - X(0)]^2 \rangle$, which, upon expanding the square and using stationarity ($\langle X(\Delta t)^2 \rangle = \langle X(0)^2 \rangle = \sigma_X^2$), is related to the autocovariance: $$ \langle [X(\Delta t) - X(0)]^2 \rangle = 2 \sigma_X^2 - 2 \langle X(\Delta t) X(0) \rangle. $$
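As a quick sanity check, here is a hedged numerical sketch of this identity. It draws pairs $(X(0), X(\Delta t))$ from a jointly Gaussian distribution with the stated autocovariance; Gaussianity is only a convenient sampling assumption here, since the identity itself requires only zero mean and stationarity.

```python
import numpy as np

# Sketch checking <[X(dt) - X(0)]^2> = 2*sigma_x^2 - 2*<X(dt) X(0)>
# for pairs with correlation rho = exp(-dt/tau). Gaussian sampling is
# just a convenient way to realize the given autocovariance; all
# parameter values are illustrative assumptions.
rng = np.random.default_rng(1)

tau, sigma_x, dt = 1.0, 1.0, 0.7
rho = np.exp(-dt / tau)

# Draw (X(0), X(dt)) from the two-time distribution implied by the
# stationary autocovariance.
cov = sigma_x**2 * np.array([[1.0, rho], [rho, 1.0]])
x0, xdt = rng.multivariate_normal([0.0, 0.0], cov, size=500_000).T

msd_empirical = np.mean((xdt - x0) ** 2)
msd_theory = 2 * sigma_x**2 * (1.0 - rho)
print(f"empirical = {msd_empirical:.4f}, theory = {msd_theory:.4f}")
```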
The quantity $\langle [X(\Delta t) - X(0)]^2 \rangle$ could be taken as a proxy for differential entropy, though I am not sure with respect to which PDF (see this MSE answer: https://math.stackexchange.com/a/651213/14862).
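If one is additionally willing to assume that $X$ is Gaussian (as for an Ornstein-Uhlenbeck process), this proxy can be made exact. For jointly Gaussian $(X(0), X(\Delta t))$ with correlation $\rho = e^{-\Delta t/\tau}$, the conditional differential entropy is $h(X(0) \mid X(\Delta t)) = \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma_X^2 (1-\rho^2)\right)$ and the mutual information is $I(X(0); X(\Delta t)) = -\tfrac{1}{2}\ln(1-\rho^2)$, which decays to zero on the timescale $\tau$. The sketch below merely tabulates these formulas; the function names and parameter values are mine for illustration.

```python
import numpy as np

# Hedged sketch, valid only under the added Gaussianity assumption
# (e.g. an Ornstein-Uhlenbeck process). With rho = exp(-dt/tau):
#   h(X(0) | X(dt)) = 0.5 * ln(2*pi*e * sigma_x^2 * (1 - rho^2))
#   I(X(0); X(dt))  = -0.5 * ln(1 - rho^2)
# Parameter values are illustrative.

tau, sigma_x = 1.0, 1.0

def mutual_information(dt, tau=tau):
    """Mutual information (nats) between X(0) and X(dt), Gaussian case."""
    rho = np.exp(-dt / tau)
    return -0.5 * np.log1p(-rho**2)

def conditional_entropy(dt, tau=tau, sigma_x=sigma_x):
    """Differential entropy (nats) of X(0) given X(dt), Gaussian case."""
    rho = np.exp(-dt / tau)
    return 0.5 * np.log(2 * np.pi * np.e * sigma_x**2 * (1 - rho**2))

for dt in (0.01, 0.1, 1.0, 3.0, 10.0):
    print(f"dt/tau = {dt/tau:5.2f}  I = {mutual_information(dt):7.3f} nats  "
          f"h(X0|Xdt) = {conditional_entropy(dt):+7.3f} nats")
```

In this Gaussian picture, the conditional variance $\sigma_X^2(1-\rho^2)$ plays the same role as the mean squared difference $2\sigma_X^2(1-\rho)$ above: both grow from zero and saturate on the scale $\tau$, and the mutual information correspondingly drops to zero, which seems to be one natural formalization of "losing information about the initial state".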