
For some time I've been curious about large-number behaviour for i.i.d. sequences of non-integrable random variables. A couple of basic-seeming questions:

Let $(X_n)_{n \geq 1}$ be an i.i.d. sequence of real-valued random variables such that $\mathbb{E}[X_n^+]=\mathbb{E}[X_n^-]=\infty$.

(1) Is it the case that for any $M>0$, $$ \frac{1}{N}\left|\left\{1 \leq n \leq N : \left| \frac{1}{n} \sum_{i=1}^n X_i \right| > M \right\} \right| \,\to\, 1 \ \ \textrm{as } N \to \infty $$ almost surely?

(2) Suppose additionally that $X_n$ and $-X_n$ have the same law. Does it follow that $$ \lim_{M \to \infty} \limsup_{N \to \infty} \frac{1}{N}\left|\left\{1 \leq n \leq N : \frac{1}{n} \sum_{i=1}^n X_i > M \right\} \right| \, > \, 0 $$ almost surely?

(This question is, in some sense, a more advanced version of aspects of the Strong Law of Large Numbers for an i.i.d. sequence whose integral does not exist.)
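(A quick numerical illustration of the quantity in (1), not a proof: the standard Cauchy distribution is symmetric about $0$ and satisfies $\mathbb{E}[X^+]=\mathbb{E}[X^-]=\infty$, so one can simulate a sample path and compute the empirical proportion of times $n \leq N$ with $|\frac{1}{n}S_n| > M$. The sample size and the values of $M$ below are arbitrary choices; a single finite path cannot settle an almost-sure statement.)

```python
# Exploratory sketch, not a proof: for a standard Cauchy i.i.d. sequence
# (E[X^+] = E[X^-] = infinity, symmetric about 0), compute the empirical
# proportion (1/N)|{n <= N : |S_n / n| > M}| appearing in question (1).
import numpy as np

rng = np.random.default_rng(0)  # fixed seed, arbitrary choice
N = 10**6                       # sample-path length, arbitrary choice
X = rng.standard_cauchy(N)
avg = np.cumsum(X) / np.arange(1, N + 1)  # running averages S_n / n

for M in (1.0, 10.0, 100.0):
    prop = np.mean(np.abs(avg) > M)  # empirical proportion for this path
    print(f"M = {M:6.1f}: proportion of n <= N with |S_n/n| > M is {prop:.4f}")
```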

  • Do you think that $X_n = \frac{1}{k}$ with probability $\frac{3/\pi^2}{k^2}$ and $\frac{-1}{k}$ with prob $\frac{3/\pi^2}{k^2}$ for each $k \ge 1$ is a counterexample to (1)? I think it is. – mathworker21 Feb 09 '20 at 06:23
  • Are you able to describe vaguely the intuition behind your counter-example? Your counter-example will certainly have $\liminf_{N\to\infty}\frac{S_N}N = -\infty$ and $\limsup_{N\to\infty}\frac{S_N}N = +\infty$ almost surely (where $S_N=\sum_{i=1}^N X_i$), by the Kesten result in the answer to https://math.stackexchange.com/questions/1831101/ combined with the symmetry of your example. So you would need to have some reason to believe that when $\frac{S_N}N$ attains very large-magnitude values, it is then "statistically relatively quick to go back in the other direction". – Julian Newman Feb 10 '20 at 14:01
  • I actually don't have to believe it is statistically relatively quick to go back in the other direction. It is perfectly possible (and indeed, I think, the reality) that $\frac{S_N}{N}$ will be arbitrarily positively large sometimes and arbitrarily negatively large sometimes and take a long time to switch between the two, but also be $0$ sometimes and stay near enough to $0$ for a relatively long time. Let me know if you believe it is a counter-example. If you agree, then I'll try to get a proof. If you still disagree, I'd be happy to hear why. – mathworker21 Feb 10 '20 at 14:20
  • @mathworker21 My crude intuition was that for any fixed $M>0$, as $R \to \infty$ the "typical" proportion of time that a downcrossing of the process $\left(\frac{S_N}{N}\right)_{N \geq 1}$ from $R$ to $-R$ spends in $[-M,M]$ will tend to $0$, simply because $[-M,M]$ is a very small interval compared to $[-R,R]$ for sufficiently large $R$. This is why I expected that if $\left(\frac{S_N}{N}\right)_{N \geq 1}$ almost surely reaches both arbitrarily large positive values and arbitrarily large negative values as $N \to \infty$, then the overall asymptotic proportion of time spent in $[-M,M]$ tends to $0$. – Julian Newman Feb 21 '20 at 19:08
  • You're not measuring at the right spot. You need to measure the proportion of time spent in $[-M,M]$ right after it leaves $[-M,M]$. I think the proportion will be very high at such measurements, but maybe I'm wrong; though, it definitely is a possibility that you seem to not be acknowledging. In any event, it seems that we should let the math settle our debate. I'll try to write up a proof soon. – mathworker21 Feb 21 '20 at 19:23
  • @mathworker21 However, I recently found out that there exist probability distributions without a well-defined mean such that the Cesàro averages converge in probability to $0$. In fact, if I recall correctly, if the distribution is symmetric about $0$ and has the property that $n\mathbb{P}(X>n)\to 0$ as $n\to\infty$ then this holds. Do you have some kind of result along these lines in mind? But even if your example (where I assume you mean "$X_n=k$" in place of "$X_n=\frac{1}k$") is not a counterexample, I guess other examples with convergence in probability to $0$ are counterexamples to (1). – Julian Newman Feb 21 '20 at 19:24
  • I did mean "$X_n = k$", sorry. All I had in mind was my specific example, due to the reasons I gave (that I still feel very strongly about). I'm not sure why examples with convergence in probability to $0$ are counterexamples to (1). For question (2), since you have the limsup, I think the answer is yes – mathworker21 Feb 21 '20 at 19:28
  • @mathworker21 I never actually understood your intuition behind why it will spend a sufficiently long time near $0$, only that it might spend a sufficiently long time near $0$. But let me try to explain more clearly the intuition that I was trying to express earlier. Define recursively the increasing sequence $T_n\in\mathbb{N}$ as follows: $T_1$ is the first time $N$ at which $\frac{S_N}N\geq 1$. Then, for even $n\geq 2$, $T_n$ is the first time $N\geq T_{n-1}$ at which $\frac{S_N}N \leq -n$; and for odd $n\geq 3$, $T_n$ is the first time $N\geq T_{n-1}$ at which $\frac{S_N}N\geq n$. – Julian Newman Feb 21 '20 at 19:59
  • Then, since $[-M,M]$ is very small compared to $[-n,n+1]$ or $[-(n+1),n]$ for large $n$, my intuition was that as $n\to\infty$ the proportion of times $N\in[T_n,T_{n+1}]$ for which $\frac{S_N}N$ is in $[-M,M]$ converges (in some sense) to $0$. If this is true, with the sense of convergence being almost sure convergence, then (1) is clearly satisfied, without needing any further bounds on the time spent in $[-M,M]$ immediately after leaving $[-M,M]$. – Julian Newman Feb 21 '20 at 19:59
  • I'm objecting to "Then, since $[-M,M]$ is very small compared to $[-n,n+1]$ or $[-(n+1),n]$ for large $n$, my intuition was that as $n\to\infty$ the proportion of times $N\in[T_n,T_{n+1}]$ for which $\frac{S_N}{N}$ is in $[-M,M]$ converges (in some sense) to $0$". The reason I object is that the part of $[T_n,T_{n+1}]$ with $\frac{S_N}{N} \in [0,M]$ should be the vast majority of $[T_n,T_{n+1}]$, since $S_N = 0$ at some time and then you need $S_N \ge MN$, where $N$ is HUGE. I'll just write up a rigorous proof soon. I don't think you're appreciating how long it takes for $S_N$ to go from $0$ to $NM$. – mathworker21 Feb 21 '20 at 20:36
  • @mathworker21 ??? In the interval $[T_n,T_{n+1}]$ (with $n$ even), I agree that the time taken to get from $S_N=0$ to $S_N \geq MN$ is HUGE. But since $\frac{n}{T_n}$ is very small, the time taken to get from $S_N=-nN$ to $S_N=(-n+M)N$ is SIMILARLY HUGE, as is the time to get from $S_N=(-n+M)N$ to $S_N=(-n+2M)N$, and from $S_N=(-n+2M)N$ to $S_N=(-n+3M)N$, and so on...! Or am I getting something confused? – Julian Newman Feb 21 '20 at 23:14
  • I thought it wouldn't be similarly huge. I thought it would be huge but much less huge. I'm busy right now; I'll figure out the truth later. At least we found our source of disagreement (I'm not too strongly attached to my view, but at least I got it across). – mathworker21 Feb 22 '20 at 00:02
  • Ok, let's say $M=1$ (and recall the example we discussed above). Suppose $N$ is such that $\sum_{i=1}^N X_i = 0$. Let $N'$ be the smallest integer greater than $N$ with $|\sum_{i=1}^{N'} X_i| \ge N'$. The question is whether $N' = N+o(N)$. If it is not, then we have a counterexample to (1), since $\frac{1}{N'}\left|\left\{1 \le n \le N' : \left|\frac{1}{n}\sum_{i \le n} X_i\right| > 1\right\}\right| \le \frac{N}{N'}$. Do you think $N' = N+o(N)$? – mathworker21 Apr 02 '20 at 22:06
  • @mathworker21 Sorry for the slow reply. Okay, I now see that you may well be right. I realise my previous assertion that "If ... then (1) is clearly satisfied" (Feb 21 at 19:59) is actually complete rubbish. – Julian Newman Apr 17 '20 at 16:20
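(For readers who want to experiment with the example discussed in the comments, $\mathbb{P}(X=k)=\mathbb{P}(X=-k)=\frac{3/\pi^2}{k^2}$ for $k \geq 1$: note that $|X|$ then has the Zipf$(2)$ law $\mathbb{P}(|X|=k)=\frac{6/\pi^2}{k^2}=\frac{k^{-2}}{\zeta(2)}$, which NumPy can sample directly, with an independent uniform random sign. A purely exploratory sketch; a finite sample path cannot confirm or refute (1), but it shows how one might probe the claim numerically.)

```python
# Exploratory sketch of the example from the comments:
# P(X = k) = P(X = -k) = (3/pi^2)/k^2 for k >= 1, so |X| ~ Zipf(2)
# (P(|X| = k) = k^{-2}/zeta(2) = (6/pi^2)/k^2) with a random sign.
import numpy as np

rng = np.random.default_rng(1)  # fixed seed, arbitrary choice
N = 10**6                       # sample-path length, arbitrary choice
X = rng.zipf(2, size=N) * rng.choice([-1, 1], size=N)
avg = np.cumsum(X) / np.arange(1, N + 1)  # running averages S_n / n

M = 1.0
prop = np.mean(np.abs(avg) > M)  # empirical proportion for this path
print(f"proportion of n <= {N} with |S_n/n| > {M}: {prop:.4f}")
```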

0 Answers