
The central limit theorem states that (under some conditions) the sample mean $y_N = \frac{1}{N} \sum_{k=1}^N X_k$ is approximately normally distributed with mean $\mu$ and variance $\sigma^2/N$. So if I want to study the probability of deviations from the mean, I get

$$P(y_N) \sim e^{-\frac{N(y_N-\mu)^2}{2\sigma^2}} \sim e^{-NI(y)}$$

where $I(y_N)$ increases as $|y_N-\mu|$ increases. Isn't this the same as a large deviations principle?

I read (in the answers to this post) that this reasoning is wrong because the CLT describes $O(\sqrt{N})$ deviations of the sum from $N\mu$, not $O(N)$ deviations, but studying the CLT proof I don't see why. I'd be glad if someone could point to the part of the classic proof (using the characteristic function) where this restriction comes from.
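To make the discrepancy concrete, here is a small numerical sketch of my own (not from any answer, and using Bernoulli variables only as a convenient example): for i.i.d. Bernoulli($p$) variables, Cramér's theorem gives the exact rate function $I(y) = y\log\frac{y}{p} + (1-y)\log\frac{1-y}{1-p}$, whereas the CLT-style guess above is the quadratic $Q(y) = \frac{(y-p)^2}{2p(1-p)}$. The two agree to second order near the mean but differ for $O(1)$ deviations.

```python
import math

def cramer_rate(y, p=0.5):
    """Exact large-deviations rate function for Bernoulli(p) sample means."""
    return y * math.log(y / p) + (1 - y) * math.log((1 - y) / (1 - p))

def clt_quadratic(y, p=0.5):
    """Quadratic rate suggested by plugging the CLT Gaussian into the exponent."""
    return (y - p) ** 2 / (2 * p * (1 - p))

# Near the mean (y = 0.51) the two almost coincide; far from it (y = 0.9)
# the quadratic underestimates the true decay rate.
for y in (0.51, 0.6, 0.9):
    print(f"y={y}: I(y)={cramer_rate(y):.5f}, Q(y)={clt_quadratic(y):.5f}")
```

This matches the claim that the Gaussian approximation is only trustworthy for $O(\sqrt{N})$ deviations: $Q$ is just the second-order Taylor expansion of $I$ around $\mu = p$.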

frps
  • I'm not sure what opinion you read stating this is wrong, but they're clearly very-closely related. One way to link the two is through the theory of random signals/"cylinder sets"; you can view the asymptotic equipartition property as an alternate "representation" of the same idea as the central limit theorem. – user3716267 Aug 13 '24 at 14:55
  • Hi, welcome to Math SE. Which large deviations principle do you have in mind? This one, for example? – J.G. Aug 13 '24 at 15:43
  • @J.G. Yes, but even without referring to the precise form of the rate function, I mean the general result that $P(y) \sim e^{-N I(y)}$. – frps Aug 13 '24 at 15:53

0 Answers