
Let $I$ denote the unit interval and $\mu$ the Lebesgue measure on it. Let $S:I\to I$ be the map defined by $S(x)=2x \pmod{1}$. It is known that for any measurable subsets $A$ and $B$ of $I$ we have $$ \lim_{n\to \infty} \mu(S^{-n}A\cap B) = \mu(A)\mu(B). $$
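
Here is a quick numerical check of this limit (a sketch of my own, not part of the original question), taking $A$ and $B$ to be intervals; the helper names and the particular intervals are illustrative choices. It uses the fact that $S^{-n}[a,b)$ is the disjoint union of the $2^n$ intervals $[(a+k)/2^n,(b+k)/2^n)$, $k=0,\dots,2^n-1$, so $\mu(S^{-n}A\cap B)$ can be computed exactly:

```python
# Illustrative sketch: mu(S^{-n}A ∩ B) -> mu(A) mu(B) for S(x) = 2x mod 1,
# with A and B intervals. Names and interval choices are my own.

def overlap(lo1, hi1, lo2, hi2):
    """Length of the intersection of [lo1, hi1) and [lo2, hi2)."""
    return max(0.0, min(hi1, hi2) - max(lo1, lo2))

def mu_preimage_intersection(a, b, c, d, n):
    """mu(S^{-n}[a, b) ∩ [c, d)), computed exactly from the 2^n preimage pieces."""
    scale = 2 ** n
    return sum(overlap((a + k) / scale, (b + k) / scale, c, d)
               for k in range(scale))

A = (0.0, 0.3)   # illustrative choice of A, mu(A) = 0.3
B = (0.5, 0.7)   # illustrative choice of B, mu(B) = 0.2
for n in (0, 1, 2, 5, 10, 15):
    print(n, mu_preimage_intersection(*A, *B, n))   # tends to mu(A)*mu(B) = 0.06
```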

So if we fix a subset $A$, we see that $A$ gets "mixed out" in the interval by backwards iterates of $S$.

I would like to say that $A$ had a certain entropy to begin with, and that the entropy of $S^{-n}(A)$ increases as $n$ increases.

It's like what we read in not-so-rigorous thermodynamics: start with a box with a partition. On one side of the partition there is a gas, and on the other side a vacuum. Once the partition is removed, the gas takes up all the available space, and the entropy of the gas (whatever that means) increases in the process.

So my question is: Is there a rigorous formulation of the notion of entropy I was trying to hint at in my example? Also, can the notions of ergodicity and mixing be recast in the language of entropy?

Thank you.

  • Do you know what the entropy of a partition is (in the context of ergodic theory)? It gives exactly what you describe for any set $A$ whose preimages generate the $\sigma$-algebra. But it is only an analogy to what you describe, since there we are talking about equilibrium thermodynamics. – John B Nov 07 '18 at 00:45
  • I just read the definition of the entropy of a partition of a probability space. If $\mathcal A =\{A_1, \ldots, A_n\}$ is a partition of a probability space $(X, \mathcal F, \mu)$, then the entropy of $\mathcal A$ is $\sum_i -\mu(A_i)\log_2\mu(A_i)$ (a small numerical illustration for the map $S$ appears after these comments). Though I don't know the motivation behind this definition. Can you articulate your thoughts in an answer? Thanks. – caffeinemachine Nov 07 '18 at 09:43
  • It's only a simple application of Sinai's generator theorem. – John B Nov 07 '18 at 10:41
  • I found the statement of Sinai's generator theorem here: http://web.stanford.edu/~tonyfeng/ergodic_theory.pdf (Theorem 10.21). But this theorem is for invertible measure preserving transformations. How do I use it in the situation at hand? – caffeinemachine Nov 08 '18 at 12:44
  • Nothing changes: for a noninvertible transformation it is also true that if you have a one-sided generator the entropy is equal to the entropy of the generator. Actually the following is also true: for an invertible transformation the theorem in the notes that you mention (Theorem 10.21) certainly holds but then the entropy of the partition is zero (this is a simple exercise). That is why for invertible transformations we really want to consider two-sided generators, in which case again the entropy is that of the generator (but not necessarily zero). – John B Nov 08 '18 at 14:30
  • PS: In other words, an invertible transformation (always $\bmod 0$) has a one-sided generator (again $\bmod 0$, which is all that matters) if and only if the transformation has entropy zero. – John B Nov 08 '18 at 14:32
  • So I think the statement that you want me to use is the following: Let $T:X\to X$ be a measure preserving transformation of a probability space $(X, \mathcal F, \mu)$. Let $\mathcal A\subseteq \mathcal F$ be a finite $\sigma$-algebra such that $\bigvee_{i=0}^\infty T^{-i}\mathcal A=\mathcal F \pmod{\mu}$. Then $h(T)= h(T, \mathcal A)$, where $h$ is the entropy. – caffeinemachine Nov 08 '18 at 14:55
  • I take the above for granted. Still, I do not see how this can help me with my question. How do I relate strong mixing with entropy using the above statement? Thanks. – caffeinemachine Nov 08 '18 at 14:57
  • Yes, you can replace "finite $\sigma$-algebra" by "finite partition", but it gives the same thing. In strongly mixing systems you can vary $B$; you need to think about that. – John B Nov 08 '18 at 16:38
  • @caffeinemachine you should not say "this is because the map $S$ is strong mixing". what you wrote is the definition of strong mixing. I'm sorry; I can't help myself, I have to edit that out. – mathworker21 Apr 28 '20 at 23:26
  • @caffeinemachine the answer to your second question is nearly certainly "no". The answer to your first question I'd say is also "no" (there are of course rigorous formulations of entropy, but I don't think they reflect your intuition; your intuition is just about strong mixing). – mathworker21 Apr 28 '20 at 23:30
  • @mathworker21 Okay. Thanks. Perhaps then I should abandon this. However, JohnB seemed to suggest that there is something here. I still don't see why he mentioned Sinai's generator theorem. – caffeinemachine Apr 28 '20 at 23:34
  • @caffeinemachine the way I think about entropy is that it reflects uncertainty of the present if you have knowledge of the past. for Bernoulli systems, knowing $x_{-1},x_{-2},\dots$ gives you absolutely no information about $x_0$, so the entropy is maximal (to be clear, the partition is partitioning all sequences based on their value at index $0$, and knowing the past means knowing which partition element $T^{-i}x$ is in, i.e. knowing $x_{-i}$). But strong mixing is just something becoming completely spread out over time. I could be wrong, but they seem/are pretty different. – mathworker21 Apr 28 '20 at 23:42
  • @mathworker21 Thanks. Can you write an answer so that I can accept it? – caffeinemachine Apr 29 '20 at 00:10
  • @caffeinemachine I really think you should ask JohnB as well; I'm really not an expert. Your question is good. When writing my answer, I realized I couldn't answer some very basic questions. Like, if a system is mixing, must it have positive entropy? If an ergodic system has positive entropy, must it be mixing? I'm pretty sure both answers are "no", but the issue is that I barely know any examples. I think the only example of mixing I know is Bernoulli shifts, and the only example of positive entropy I know is Bernoulli shifts. I learned ergodic theory a while ago and didn't learn too many examples. – mathworker21 Apr 29 '20 at 10:38
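
The entropy of a partition and the generator statement discussed in the comments can be illustrated concretely on the map $S$ from the question. The following sketch is my own (the helper name `partition_entropy` and the choice of the dyadic partition are mine): for $P = \{[0,\frac12),[\frac12,1)\}$, the join $P \vee S^{-1}P \vee \dots \vee S^{-(n-1)}P$ is exactly the partition into the $2^n$ dyadic intervals of length $2^{-n}$, and the iterated preimages of $P$ generate the Borel $\sigma$-algebra $\bmod\ \mu$, so the generator statement gives $h(S) = h(S,P) = \log 2$.

```python
# Illustrative sketch: entropy of a partition, and H(join)/n for the doubling map.
from math import log2

def partition_entropy(masses):
    """H(A) = -sum_i mu(A_i) * log2(mu(A_i)) for a partition with the given masses."""
    return -sum(m * log2(m) for m in masses if m > 0)

# Dyadic partition P = {[0,1/2), [1/2,1)}: entropy 1 bit.
print(partition_entropy([0.5, 0.5]))

# The n-fold join for S consists of 2^n intervals of measure 2^{-n},
# so H(join)/n stays equal to 1 bit (= log 2 in natural-log units);
# its limit is h(S, P).
for n in (1, 2, 5, 10):
    masses = [2.0 ** -n] * (2 ** n)
    print(n, partition_entropy(masses) / n)
```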

1 Answer


Entropy reflects the uncertainty of the present given knowledge of the past. Consider, for instance, $\{0,1\}^\mathbb{Z}$ with the $(\frac{1}{2},\frac{1}{2})$ Bernoulli measure and the partition $P := \{\{x : x_0 = 0\},\{x : x_0 = 1\}\}$. Then the join of $T^{-1}P,\dots,T^{-i}P$ separates $x$'s according to $(x_{-i},\dots,x_{-1})$. This is the "past", and it gives absolutely no information about which element of $P$ the point $x$ is in (since $x_0$ is completely independent of $x_{-i},\dots,x_{-1}$). Therefore, the entropy is as large as possible: $\log 2$.
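
To spell this out (my own addition, using the standard conditional-entropy formula for $h(T,P)$):
$$ h(T,P) \;=\; \lim_{n\to\infty} H\Big(P \,\Big|\, \bigvee_{i=1}^{n} T^{-i}P\Big) \;=\; H(P) \;=\; -\tfrac{1}{2}\log\tfrac{1}{2} - \tfrac{1}{2}\log\tfrac{1}{2} \;=\; \log 2, $$
because $x_0$ is independent of $x_{-n},\dots,x_{-1}$, so conditioning on the past changes nothing. Since $P$ is a generating partition for the shift, the Kolmogorov–Sinai theorem then gives $h(T) = h(T,P) = \log 2$.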

Strong mixing means that sets get spread out over time.

One kind of stupid difference between mixing and entropy is that mixing is relative to the total system. For example, if you take the disjoint union of two systems of positive entropy, the combined system still has positive entropy, but it is not mixing (it is not even ergodic). If you impose ergodicity and then ask about mixing and entropy, then I think neither one implies the other.
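
To make that concrete, here is one standard instance (my own illustration, not part of the original answer). Take two copies $(X_1,\mu_1,T_1)$ and $(X_2,\mu_2,T_2)$ of the $(\frac12,\frac12)$ Bernoulli shift, and let $T$ act on the disjoint union $X = X_1 \sqcup X_2$ with $\mu = \frac12\mu_1 + \frac12\mu_2$. Since entropy is affine in the invariant measure,
$$ h_\mu(T) \;=\; \tfrac12\,h_{\mu_1}(T_1) + \tfrac12\,h_{\mu_2}(T_2) \;=\; \log 2 \;>\; 0, $$
but for any $A\subseteq X_1$ and $B\subseteq X_2$ of positive measure, $\mu(T^{-n}A\cap B)=0$ for every $n$ while $\mu(A)\mu(B)>0$. So $T$ has positive entropy yet is neither ergodic nor mixing.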

I knew I had written more about entropy before; I just tracked it down, and it was on one of your questions! See here.

mathworker21