
From the Berry–Esseen theorem I know that $$\sup_{x\in\mathbb R}|P(B_n \le x)-\Phi(x)|\in O\left(\frac 1{\sqrt n}\right),$$ where $B_n$ has the standardized binomial distribution and $\Phi$ is the CDF of the standard normal distribution. I can prove this for $x\approx 0$ with Stirling's formula and a proof similar to the one shown here.
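As a sanity check (my own addition, not part of the original argument), here is a minimal numerical sketch, assuming NumPy and SciPy are available, that estimates $\sqrt n\,\sup_{x\in\mathbb R}|P(B_n \le x)-\Phi(x)|$; the product staying bounded as $n$ grows is consistent with the $O\left(\frac 1{\sqrt n}\right)$ rate:

```python
# Numerical sketch: estimate sqrt(n) * sup_x |P(B_n <= x) - Phi(x)| for the
# standardized Binomial(n, p). The choice p = 0.3 is arbitrary.
import numpy as np
from scipy.stats import binom, norm

p = 0.3
for n in [10, 100, 1000, 10000]:
    k = np.arange(n + 1)                         # numbers of successes
    x = (k - n * p) / np.sqrt(n * p * (1 - p))   # standardized atoms of B_n
    F = binom.cdf(k, n, p)                       # P(B_n <= x) at each atom
    # The binomial CDF is a step function, so the supremum over all real x
    # is attained either at an atom or as a left limit just below an atom.
    F_left = np.concatenate(([0.0], F[:-1]))     # left limits of the CDF
    err = np.maximum(np.abs(F - norm.cdf(x)), np.abs(F_left - norm.cdf(x))).max()
    print(n, np.sqrt(n) * err)
```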

Unfortunately, Stirling's approximation becomes worse the bigger $|x|$ is, so I can only show that

$$\sup_{|x| \le c}|P(B_n \le x)-\Phi(x)|\in O\left(\frac 1{\sqrt n}\right)$$ for a fixed $c > 0$.

My question: Is there a good proof that $\sup_{|x| > c}|P(B_n \le x)-\Phi(x)|\in O\left(\frac 1{\sqrt n}\right)$? How can I estimate the error in approximating the tail probability of the binomial distribution by that of the normal distribution?

Note: Chebyshev's inequality does not provide the right convergence speed. As zhoraster shows in his answer to Finding an error estimation for the De Moivre–Laplace theorem, Chernoff's inequality does not provide the right convergence speed either. That's why I am looking for another method. I would also like to use simpler and more direct approximations than Stein's method, which is used in the proof of the Berry–Esseen theorem.
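To spell out why Chebyshev is too weak (my own one-line illustration, not from the linked answer): since $B_n$ and the standard normal both have mean $0$ and variance $1$, Chebyshev's inequality only gives, for $x > 0$,

$$|P(B_n \le -x)-\Phi(-x)| \le \max\bigl(P(|B_n| \ge x),\,\Phi(-x)\bigr) \le \frac 1{x^2},$$

a bound that does not depend on $n$ at all, so at a fixed $x = c$ it yields no decay in $n$, let alone the rate $O\left(\frac 1{\sqrt n}\right)$.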

Update: I have asked this question again on MathOverflow, see https://mathoverflow.net/questions/220030/normal-approximation-of-tail-probability-in-binomial-distribution

  • The question does not seem to be very different from the other one. Anyway, I wanted to make a different comment: it is enough to estimate $\sup_{c<|x|<c\sqrt{\log n}}\cdots$. I would recommend trying to write the estimates from Balázs and Tóth's paper using more terms in the Taylor formula for $\log(1+x)$. Then it might happen (and why not?) that the cubic terms somehow cancel out, or at least some big portion of them. – zhoraster Sep 21 '15 at 19:21
  • You are right, an answer to Point of maximal error in the normal approximation of the binomial distribution implies an answer to this question... Thanks also for your suggestion! I will try to follow it... – Stephan Kulla Sep 21 '15 at 19:29
  • By the way, in your pictures, the next term of the Edgeworth expansion is clearly visible (especially in the first one). – zhoraster Sep 21 '15 at 19:42
  • @zhoraster: Thanks for your comment, which gives me an idea: Can the Edgeworth expansion be used to find a proof of Point of maximal error in the normal approximation of the binomial distribution? Unfortunately I do not know much about these expansions, but I will study this topic over the next few days... – Stephan Kulla Sep 22 '15 at 11:00
  • Highly unlikely... I believe that writing additional terms of the log expansion will lead you to something similar. – zhoraster Sep 22 '15 at 11:56
  • A lot of what's been going on here goes over my head, but it seems to me that if you can run your estimate for linearly growing $c$, then you should be able to manage the "very large" $c$ with Cramér's theorem. – Ian Oct 05 '15 at 10:26
  • @Ian: That's right. Unfortunately I only managed to prove $\sup_{|x| \le c}|P(B_n \le x)-\Phi(x)|\in O\left(\frac 1{\sqrt n}\right)$ for fixed $c > 0$. When $c$ grows I lose convergence speed... – Stephan Kulla Oct 05 '15 at 10:36
  • Sorry, I neglected to notice that you have standardized. That means that you only need to get your estimate for $c$ on the order of $\sqrt{n}$ before Cramér's theorem kicks in. (When $B_n$ deviates from $0$ by $d$, the underlying binomial deviates from $np$ by $d\sqrt{np(1-p)}$.) What happens on this order? Still problems? (Also I see now that zhoraster has pointed out something much stronger than I am saying already...) – Ian Oct 05 '15 at 11:04
  • @Ian: With any needed growth I will lose convergence speed (when I only use Stirling's approximation in my proof). – Stephan Kulla Oct 05 '15 at 20:01

1 Answer


See the answer by Iosif Pinelis on MathOverflow.