
Let $X$ be a random variable, and let $\hat{\mu}_X$ be its characteristic function.

Suppose that $|\hat{\mu}_X(u)| = |\hat{\mu}_X(v)| = 1$ for some $u,v \in \mathbb{R}^*$ with $uv^{-1} \notin \mathbb{Q}$. We want to show that $X$ is a.s. constant.

My solution: if $|\hat{\mu}_X(u)| = 1$, it is not difficult to show that the image of $uX$ (on a set of full measure) lies in $\theta + 2 \pi \mathbb{Z}$ for some $\theta \in \mathbb{R}$. Hence, the image of $X$ almost surely lies in:

$$ \bigg( \frac{\theta_1}{u} + \frac{2 \pi}{u} \mathbb{Z} \bigg) \cap \bigg( \frac{\theta_2}{v} + \frac{2 \pi}{v} \mathbb{Z} \bigg)$$

for some $\theta_1, \theta_2 \in \mathbb{R}$. This intersection contains at most one point, by the conditions on $u,v$, so $X$ is almost surely constant.
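To spell out the "at most one point" claim (a short check using only $uv^{-1} \notin \mathbb{Q}$): if $x$ and $y$ both lie in the intersection, then $x - y$ lies in $\frac{2\pi}{u}\mathbb{Z} \cap \frac{2\pi}{v}\mathbb{Z}$, i.e.

$$ x - y = \frac{2 \pi m}{u} = \frac{2 \pi n}{v} \quad \text{for some } m, n \in \mathbb{Z}.$$

If $m \neq 0$, then also $n \neq 0$, and dividing gives $\frac{u}{v} = \frac{m}{n} \in \mathbb{Q}$, a contradiction. Hence $m = n = 0$ and $x = y$.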

Question: Is this proof correct? I'm slightly unsure because the question gives the hint to consider "an independent copy of $X$", so maybe an argument like this is expected, but I don't see how it works here, and if the above is correct, then it is surely simpler. I was also wondering if it is relevant that $\langle u, v \rangle \le (\mathbb{R}, +)$ is dense, but I couldn't think of how that might be applied either.

I would be interested to see any other arguments which can be used to solve this question.

legionwhale
  • 2,505
  • I believe a long time ago I solved a similar problem in the book of Breiman, and yes, I recall my solution being similar to yours. The "independent copy" argument is nicer and cleaner, but at the time (during college) I still couldn't grasp the idea of an independent copy well. – William M. May 16 '23 at 15:47
  • @WilliamM. Do you recall how it works? I think I understand the argument in the link I gave, but I can't immediately see how it could be applied here. – legionwhale May 16 '23 at 16:42
  • If $Z = X - X',$ then the characteristic function of $Z$ should be equal to $1$ around zero and therefore $Z = 0$ in distribution. – William M. May 16 '23 at 22:14
  • @WilliamM. Thanks for the help. The key ingredient that I was missing was the continuity of $\hat{\mu}$ to extend the constancy from a dense set to an interval around $0$. I've posted an answer. If you have the time, I'd be grateful for confirmation that all is in order. – legionwhale May 18 '23 at 22:35

1 Answer


We first prove a lemma.

Lemma: Let $X$ be such that $|\hat{\mu}_X(u)| = 1$ for some $u \neq 0$. Then $u$ is a period of $|\hat{\mu}_X|$.

Proof

Note that we have the inequality: $$|\mathbb{E} [e^{itX}]| \le \mathbb{E}[|e^{itX}|] = 1$$

In order for equality to hold at $t = u$, we must have:

$$e^{iuX} = e^{i \theta} \;\; \text{a.s.}$$

for some $\theta \in \mathbb{R}$. This is the equality case of the triangle inequality for integrals: writing $\mathbb{E}[e^{iuX}] = e^{i\theta}$, we get $\mathbb{E}[\operatorname{Re}\, e^{i(uX - \theta)}] = 1$ with integrand at most $1$, which forces $e^{i(uX - \theta)} = 1$ a.s. We deduce that:

$$uX(\omega) \in \theta + 2 \pi \mathbb{Z} \;\; \text{a.s.}$$

Now observe that since $X$ is, in fact, effectively discrete (its law is supported on the lattice $(\theta + 2 \pi \mathbb{Z})/u$), we can write:

$$\hat{\mu}_X(t) = \mathbb{E}[e^{itX}] = \sum_{k \in \mathbb{Z}} \exp\bigg(it\,\frac{\theta + 2 k\pi}{u}\bigg)\, \mathbb{P}(uX = \theta + 2 k\pi) $$

$$\implies \hat{\mu}_X(t + u) = e^{i \theta}\hat{\mu}_X(t)$$

$$ \implies |\hat{\mu}_X(t)| = |\hat{\mu}_X(t+u)|$$

The result follows. $\square$
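As a numerical sanity check (not part of the proof), the lemma can be illustrated on a hypothetical lattice law: if $X$ is supported on $(\theta + 2\pi\mathbb{Z})/u$, then $|\hat{\mu}_X(u)| = 1$ and $|\hat{\mu}_X|$ is $u$-periodic. A minimal sketch in Python, with $u$, $\theta$, the lattice indices, and the weights chosen arbitrarily for illustration:

```python
import numpy as np

# Hypothetical lattice law: X supported on (theta + 2*pi*Z)/u,
# so that e^{iuX} = e^{i theta} holds surely, giving |phi_X(u)| = 1.
u, theta = 2.0, 0.7
ks = np.array([-2, 0, 1, 5])         # lattice indices k
ps = np.array([0.1, 0.4, 0.3, 0.2])  # P(uX = theta + 2*k*pi), summing to 1
support = (theta + 2 * np.pi * ks) / u

def phi(t):
    """Characteristic function E[e^{itX}] of the discrete law above."""
    return np.sum(ps * np.exp(1j * t * support))

# |phi_X(u)| = 1, and u is a period of |phi_X|, as the lemma asserts.
ts = np.linspace(-5.0, 5.0, 101)
max_err = max(abs(abs(phi(t + u)) - abs(phi(t))) for t in ts)
print(abs(phi(u)))  # 1.0 (up to rounding)
print(max_err)      # ~ 0
```

The check works because $\hat{\mu}_X(t+u) = e^{i\theta}\hat{\mu}_X(t)$ holds exactly for such a law, so the periodicity error is only floating-point noise.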

Proposition: Let $X$ be a random variable, and let $\hat{\mu}_X$ be its characteristic function. Suppose that $|\hat{\mu}_X(u)| = |\hat{\mu}_X(v)| = 1$ for some $u,v \in \mathbb{R}^*$ with $uv^{-1} \notin \mathbb{Q}$. Then $X$ is a.s. constant.

Proof

By the conditions on $u,v$, the subgroup $\langle u,v \rangle = u\mathbb{Z} + v\mathbb{Z} \le (\mathbb{R},+)$ is dense in $\mathbb{R}$ (a subgroup of $(\mathbb{R},+)$ is either cyclic or dense, and $uv^{-1} \notin \mathbb{Q}$ rules out the cyclic case). Consider the random variable $Z := X - X'$, where $X'$ is an independent copy of $X$. Then:

$$\hat{\mu}_Z(t) = \mathbb{E}[e^{itZ}] = \mathbb{E}[e^{it(X-X')}] \stackrel{\text{indep.}}{=} \mathbb{E}[e^{itX}] \cdot \mathbb{E}[e^{-itX'}] = \hat{\mu}_X(t)\, \overline{\hat{\mu}_X(t)} = |\hat{\mu}_X(t)|^2$$
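The identity $\hat{\mu}_Z = |\hat{\mu}_X|^2$ can be checked numerically for a small discrete law (the values and weights below are arbitrary illustrations), by computing $\hat{\mu}_Z$ directly from the product law of $(X, X')$:

```python
import numpy as np

# An arbitrary small discrete law for X; X' is an independent copy.
xs = np.array([-1.0, 0.5, 2.0])
ps = np.array([0.2, 0.5, 0.3])

def phi_X(t):
    """Characteristic function of X."""
    return np.sum(ps * np.exp(1j * t * xs))

def phi_Z(t):
    """Characteristic function of Z = X - X', via the product law of (X, X')."""
    joint = np.outer(ps, ps)           # P(X = x_j, X' = x_k)
    diffs = xs[:, None] - xs[None, :]  # x_j - x_k
    return np.sum(joint * np.exp(1j * t * diffs))

ts = np.linspace(-3.0, 3.0, 61)
max_err = max(abs(phi_Z(t) - abs(phi_X(t)) ** 2) for t in ts)
print(max_err)  # ~ 0: phi_Z = |phi_X|^2 pointwise
```

In particular $\hat{\mu}_Z$ is real and nonnegative, which is exactly what lets the argument below turn $|\hat{\mu}_X| = 1$ on a dense set into $\hat{\mu}_Z = 1$ there.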

By the lemma, both $u$ and $v$ are periods of $|\hat{\mu}_X|$, so $|\hat{\mu}_X| = 1$ at every point of the dense subgroup $u\mathbb{Z} + v\mathbb{Z}$; hence $\hat{\mu}_Z = 1$ on a dense subset of $\mathbb{R}$. But note that, as the Fourier transform of a probability measure, $\hat{\mu}_Z$ is continuous! Thus, we must have $\hat{\mu}_Z \equiv 1$.

Since characteristic functions uniquely characterise the laws of their underlying random variables, and $\hat{\mu}_Z \equiv 1$ is the characteristic function of the point mass at $0$, we deduce that $Z = 0$ a.s. In particular, $X = X'$ a.s.

To conclude, we consider:

$$\mathbb{P}(X \le c) = \mathbb{P}(X \le c,\, X' \le c)$$

$$\stackrel{\text{indep.}}{=} \mathbb{P}(X \le c) \cdot \mathbb{P}(X' \le c)$$

$$ = \mathbb{P}(X \le c)^2$$

$$ \implies \mathbb{P}(X \le c) = 0 \text{ or } \mathbb{P}(X \le c) = 1$$

This holds for all $c \in \mathbb{R}$.

Hence, let $c^* := \inf \{c : \mathbb{P}(X \le c) = 1 \}$, which is finite since $\mathbb{P}(X \le c) \to 1$ as $c \to \infty$ and $\to 0$ as $c \to -\infty$. By right-continuity of the CDF, $\mathbb{P}(X \le c^*) = 1$, while $\mathbb{P}(X \le c) = 0$ for every $c < c^*$. We see that $X = c^*$ a.s. $\square$

legionwhale
  • 2,505
  • 'We have the inequality $|E[e^{it X}]|\leq 1$. In order for the equality to hold we must have $e^{it X}\equiv e^{i\theta}$'. Could you give some detail? Shouldn't it be $E[e^{it X}] \equiv e^{i\theta}$? – Snoop May 18 '23 at 22:54
  • @Snoop We can extend the equality to $e^{itX}$ by virtue of the fact that we are integrating over a probability space. I can expand on this a little in an edit. I did miss an "almost surely" though. – legionwhale May 18 '23 at 22:56
  • I think it would be alright to add detail. Also, a remark: CFs characterise the laws of rvs, not the rvs themselves. – Snoop May 18 '23 at 22:59
  • @Snoop Thanks for pointing that out. I have quite limited experience with these concepts. – legionwhale May 18 '23 at 23:03
  • You may ask a question about it separately, then fill the detail in for this answer. – Snoop May 18 '23 at 23:17
  • @Snoop I have asked it as a question, if you would like to answer. – legionwhale May 18 '23 at 23:35
  • I upvoted it (don't know how I would approach that right now) – Snoop May 18 '23 at 23:41
  • @Snoop Turns out to be quite simple. I got very close while thinking about it. I need to get more sleep, it seems. Thanks for the observation. – legionwhale May 18 '23 at 23:46