
After this post, I started wondering about possible approximations of the infinite product $$A_p=\prod _{k=p+1}^{\infty } \cos \left(\frac{p \,\pi}{2 k}\right)\tag 1$$ where $p$ is an integer. As far as I can see, there is no closed-form expression.

So, as I did in the linked question, I used Bhaskara I's approximation (valid for $-\frac \pi 2 \leq x\leq\frac \pi 2$) $$\cos(x) \simeq\frac{\pi ^2-4x^2}{\pi ^2+x^2}\implies \cos\left(\frac{p\,\pi}{2k}\right)\simeq\frac{4 \left(k^2-p^2\right)}{4 k^2+p^2}$$ and then computed $$B_p=\prod _{k=p+1}^{\infty }\frac{4 \left(k^2-p^2\right)}{4 k^2+p^2}=\frac{\Gamma \left( p+1-i\frac p 2\right)\,\, \Gamma \left( p+1+i\frac p 2\right)}{(2p)!}\tag 2$$ which does not seem to be a bad approximation at all (see the table below).
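As a quick numerical cross-check of $(1)$ against $(2)$, here is a minimal sketch (assuming the mpmath library; the helper names `A` and `B` are mine, not standard):

```python
# Cross-check of the infinite product (1) against the closed form (2).
from mpmath import mp, cos, pi, gamma, factorial, nprod, log

mp.dps = 30  # working precision

def A(p):
    # Infinite product (1), evaluated by mpmath's extrapolated product
    return nprod(lambda k: cos(p * pi / (2 * k)), [p + 1, mp.inf])

def B(p):
    # Closed form (2) coming from Bhaskara I's approximation
    return gamma(p + 1 - 0.5j * p) * gamma(p + 1 + 0.5j * p) / factorial(2 * p)

for p in (1, 5, 10, 50):
    # prints p, log A_p (exact) and log B_p -- cf. the table below
    print(p, log(A(p)), log(B(p).real))
```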

Using Stirling's approximation and Taylor series for large values of $p$, I ended up with $$\log(B_p)=\frac 12\log \left(\frac{5\pi}{4}\right)-\left(\log \left(\frac{16}{5}\right)+\cot ^{-1}(2)\right)p+\frac 12 \log(p)+\frac{11}{120 p}+O\left(\frac{1}{p^3}\right)\tag 3$$ It is interesting to notice that $$\log \left(\frac{16}{5}\right)+\cot ^{-1}(2)\approx 1.62680$$ which is not so far from $\frac \pi 2$ ($3.56$% relative difference), and this already raises questions (at least for me).
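A small sketch evaluating the expansion $(3)$ against $\log(B_p)$ (same assumptions as above, reusing the `B` helper):

```python
# Expansion (3) vs. log(B_p) for a few values of p.
from mpmath import log, acot, mpf, pi

def logB_asym(p):
    return (log(5 * pi / 4) / 2
            - (log(mpf(16) / 5) + acot(2)) * p
            + log(p) / 2
            + mpf(11) / (120 * p))

for p in (2, 10, 50):
    print(p, log(B(p).real), logB_asym(p))
```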

It is also interesting to see how close the numbers are on a logarithmic scale; the columns below give $\log(B_p)$ from $(2)$, its expansion $(3)$ and the exact $\log(A_p)$ from $(1)$. $$\left( \begin{array}{cccc} p & (2) & (3) & (1) \\ 1 & -0.85190 & -0.85120 & -0.84448 \\ 2 & -2.17732 & -2.17725 & -2.16333 \\ 3 & -3.61661 & -3.61660 & -3.59727 \\ 4 & -5.10720 & -5.10719 & -5.08299 \\ 5 & -6.62701 & -6.62700 & -6.59815 \\ 6 & -8.16570 & -8.16570 & -8.13233 \\ 7 & -9.71760 & -9.71760 & -9.67979 \\ 8 & -11.2793 & -11.2793 & -11.2371 \\ 9 & -12.8485 & -12.8485 & -12.8019 \\ 10 & -14.4236 & -14.4236 & -14.3727 \\ 20 & -30.3496 & -30.3496 & -30.2557 \\ 30 & -46.4164 & -46.4164 & -46.2798 \\ 40 & -62.5413 & -62.5413 & -62.3621 \\ 50 & -78.6981 & -78.6981 & -78.4764 \end{array} \right)$$

Just out of curiosity, for $1 \leq p \leq 50$, I adjusted the parameters of the model $$\log(A_p)=a+b\,p+c\log(p)+\frac d p$$ and obtained a really good fit $$\begin{array}{clclclclc} \text{} & \text{Estimate} & \text{Standard Error} & \text{Confidence Interval} \\ a & +0.69204 & 0.00006 & \{+0.69193,+0.69216\} \\ b & -1.62255 & 0.00000 & \{-1.62256,-1.62255\} \\ c & +0.50040 & 0.00003 & \{+0.50035,+0.50045\} \\ d & +0.08598 & 0.00007 & \{+0.08583,+0.08613\} \\ \end{array}$$ while in $(3)$ the coefficients are $(+0.68394,-1.62680,+0.50000,+0.09167)$, that is to say, very close.
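For reproducibility, here is a sketch of how such a fit could be done (assuming NumPy/SciPy and the `A(p)` helper from the first sketch; this is my illustration, not the tool actually used):

```python
# Least-squares fit of log(A_p) = a + b*p + c*log(p) + d/p for 1 <= p <= 50.
import numpy as np
from scipy.optimize import curve_fit
from mpmath import log  # A(p) is assumed to be defined as in the first sketch

def model(p, a, b, c, d):
    return a + b * p + c * np.log(p) + d / p

ps = np.arange(1, 51, dtype=float)
logA = np.array([float(log(A(int(p)))) for p in ps])  # exact log A_p values

params, cov = curve_fit(model, ps, logA)
print("a, b, c, d       =", params)
print("standard errors  =", np.sqrt(np.diag(cov)))
```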

For sure, all of this shows the high quality of the approximation of $\cos(x)$, but it raises questions about the value of $A_p$. If we truncate $(3)$ to $O\left(\frac{1}{p}\right)$, we should have $$A_p\sim \sqrt{\frac{5\pi p} 4 }\,e^{-\alpha \pi p}$$ with $\alpha \approx \frac 12$.

I wonder whether we could find another, better approximation of $A_p$, even at the price of more complex functions. Any idea would be welcome.

  • I haven't worked out all the details yet, but using this inequality and because $0<\frac{p \pi}{2k}<\frac{\pi}{2}$ for $k\geq p+1$, we have $$\prod\limits_{k=p+1}^{\infty}\cos{\frac{p\pi}{2k}}\leq \frac{1}{e^{\frac{\pi^2 \cdot p^2}{8}\sum\limits_{k=p+1}^{\infty}\frac{1}{k^2}}}= \frac{1}{e^{\frac{\pi^2 \cdot p^2}{8}\left(\zeta(2)-\sum\limits_{k=1}^p\frac{1}{k^2}\right)}}=\cdots$$ and $$\sum\limits_{n\leq x}\frac{1}{n^s}=\frac{x^{1-s}}{1-s}+\zeta(s)+O(x^{-s})$$ for $s>0, s\ne 1$. – rtybase Apr 22 '19 at 11:42
  • ... which leads to $\approx \frac{1}{e^{\alpha \pi p}}$ rather than $\approx \frac{\sqrt{p}}{e^{\alpha \pi p}}$ – rtybase Apr 22 '19 at 11:42
  • @rtybase. This is interesting! Thanks for that. But how could I explain the highly significant coefficient $c$ from the quick-and-dirty regression? Cheers :-) – Claude Leibovici Apr 22 '19 at 12:16
  • @rtybase. Based on my own answer, using the asymptotics of the gamma functions, I indeed do not see how $\log(p)$ could appear, so I agree with you. The problem is that, using $1 \leq p \leq 200$, this $\log(p)$ term seems to be crucial. Any idea? Cheers :-) – Claude Leibovici Apr 24 '19 at 08:51
  • That's an interesting finding, indeed. Nope, no idea, but it's in line with the Taylor expansion of $\frac{1}{2}$ you mentioned, I believe for small $p$'s (or, function-wise, for $x$ around $0$)? – rtybase Apr 24 '19 at 21:57

3 Answers


Considering that Bhaskara I's approximation $$\cos(x) \simeq\frac{\pi ^2-4x^2}{\pi ^2+x^2}\qquad \text{for} \qquad -\frac \pi 2 \leq x\leq\frac \pi 2$$ looks like a Padé approximant, I had the feeling that I could use either pure Padé approximants built at $x=0$, that is to say $$f_n=\frac {1+\sum_{m=1}^n a_m x^{2m}}{1+\sum_{m=1}^n b_m x^{2m}}$$ or, maybe better, $$g_n=\left(1-\frac{2x}{\pi }\right) \left(1+\frac{2x}{ \pi }\right)\frac {1+\sum_{m=1}^{n-1} c_m x^{2m}}{1+\sum_{m=1}^n d_m x^{2m}}$$ the second rational fraction being the Padé approximant of $\frac{\cos (x)}{\left(1-\frac{2x}{ \pi }\right) \left(1+\frac{2x}{ \pi }\right)}$; $g_n$ was chosen as potentially better than $f_n$ because, in the product, the calculations start closer and closer to $\frac \pi 2$. To confirm the validity of this choice, I computed $$I_n=\int_0^{\frac \pi 2} \big(\cos(x)-f_n\big)^2\,dx \qquad \text{and}\qquad J_n=\int_0^{\frac \pi 2} \big(\cos(x)-g_n\big)^2\,dx$$ The table below clearly shows the superiority of $g_n$. $$\left( \begin{array}{ccc} n & I_n & J_n \\ 1 & 7.086 \times 10^{-5} & 7.962 \times 10^{-6} \\ 2 & 6.754 \times 10^{-11} & 1.909 \times 10^{-12} \\ 3 & 2.375 \times 10^{-18} & 3.014 \times 10^{-20} \end{array} \right)$$ while $$\int_0^{\frac \pi 2} \Big(\cos(x)-\frac{\pi ^2-4x^2}{\pi ^2+x^2}\Big)^2\,dx =1.489\times 10^{-6}$$
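Here is a sketch (assuming mpmath; the construction and all names are my reading of the above, not code taken from the answer) of how $f_n$, $g_n$ and the error integrals $I_n$, $J_n$ could be computed:

```python
# Pade approximants f_n and g_n of cos(x) and the L2 errors I_n, J_n on [0, pi/2].
from mpmath import mp, mpf, pi, cos, factorial, pade, polyval, quad

mp.dps = 30
n = 2  # order of the approximant

# Exact Taylor coefficients of cos(x): 1, 0, -1/2!, 0, 1/4!, ...
def cos_coeffs(m):
    return [(-1) ** (j // 2) / factorial(j) if j % 2 == 0 else mpf(0)
            for j in range(m)]

# f_n: [2n/2n] Pade approximant of cos at x = 0
pf, qf = pade(cos_coeffs(4 * n + 1), 2 * n, 2 * n)
f = lambda x: polyval(pf[::-1], x) / polyval(qf[::-1], x)

# g_n: (1 - (2x/pi)^2) times the [2n-2/2n] Pade approximant of
# h(x) = cos(x)/(1 - (2x/pi)^2); the series of h is the cos series
# multiplied by the geometric series in (2x/pi)^2
m = 4 * n - 1
c = cos_coeffs(m)
geo = [(2 / pi) ** j if j % 2 == 0 else mpf(0) for j in range(m)]
h = [sum(c[i] * geo[j - i] for i in range(j + 1)) for j in range(m)]
pg, qg = pade(h, 2 * n - 2, 2 * n)
g = lambda x: (1 - (2 * x / pi) ** 2) * polyval(pg[::-1], x) / polyval(qg[::-1], x)

I_n = quad(lambda x: (cos(x) - f(x)) ** 2, [0, pi / 2])
J_n = quad(lambda x: (cos(x) - g(x)) ** 2, [0, pi / 2])
print(I_n, J_n)  # should reproduce the n = 2 row of the table above
```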

Now, replacing $x$ by $\frac {p \pi}{2k}$ gives $$\cos \left(\frac{\pi p}{2 k}\right)\simeq\frac {P_{m}(k^2)}{Q_{m}(k^2)}$$ where $P_m$ and $Q_m$ are polynomials, homogeneous in $k$ and $p$. $P_m$ shows only one real root and $Q_m$ only complex roots. In the end, limiting ourselves to $n=2$, we can write $$\cos \left(\frac{\pi p}{2 k}\right)\simeq\frac{(k-r_1 p)\,(k+r_1p)\,(k-r_2p)\,(k+r_2p)} {(k-s_1 p)\,(k+s_1p)\,(k-s_2p)\,(k+s_2 p) }$$ and if $$A_p=\prod _{k=p+1}^{\infty } \cos \left(\frac{p \,\pi}{2 k}\right)$$ then $$A_p\sim\frac{\Gamma [1+(1-s_1) p]\, \Gamma [1+(1+s_1) p]\, \Gamma [1+(1-s_2) p]\, \Gamma [1+(1+s_2) p]}{\Gamma [1+(1-r_1) p]\, \Gamma [1+(1+r_1) p]\,\Gamma [1+(1-r_2) p]\, \Gamma [1+(1+r_2) p]}$$
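A sketch of this gamma-ratio evaluation (assuming mpmath and the `pg`, `qg` coefficients of $g_2$ from the previous snippet; the mapping of a root $x_0$ of the approximant to the constant $\frac{\pi}{2x_0}$ is my reading of the substitution $x=\frac{p\pi}{2k}$):

```python
# Evaluate the gamma-ratio approximation of A_p from the zeros/poles of g_2.
from mpmath import polyroots, loggamma, pi

# zeros of g_2 in x (the prefactor contributes +/- pi/2), poles of g_2 in x
num_roots = [pi / 2, -pi / 2] + polyroots(pg[::-1])
den_roots = polyroots(qg[::-1])

# a root x0 corresponds to k = p*pi/(2*x0), i.e. to the constant pi/(2*x0)
r = [pi / (2 * x0) for x0 in num_roots]
s = [pi / (2 * x0) for x0 in den_roots]

def logA_gamma(p):
    val = sum(loggamma(1 + (1 - si) * p) for si in s) \
        - sum(loggamma(1 + (1 - ri) * p) for ri in r)
    return val.real  # the product over conjugate pairs is real

for p in (1, 10, 50):
    print(p, logA_gamma(p))  # compare with the "using g_n" column below
```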

This leads to the following results (for the logarithms) $$\left( \begin{array}{cccc} p & \text{using } f_n & \text{using } g_n& \text{exact} \\ 1 & -0.844481 & -0.844481 & -0.844481 \\ 2 & -2.163326 & -2.163329 & -2.163327 \\ 3 & -3.597265 & -3.597275 & -3.597270 \\ 4 & -5.082976 & -5.082997 & -5.082989 \\ 5 & -6.598132 & -6.598168 & -6.598155 \\ 6 & -8.132292 & -8.132346 & -8.132328 \\ 7 & -9.679735 & -9.679810 & -9.679787 \\ 8 & -11.23699 & -11.23708 & -11.23706 \\ 9 & -12.80178 & -12.80190 & -12.80187 \\ 10 & -14.37255 & -14.37270 & -14.37266 \\ 20 & -30.25528 & -30.25578 & -30.25570 \\ 30 & -46.27897 & -46.27993 & -46.27979 \\ 40 & -62.36080 & -62.36228 & -62.36208 \\ 50 & -78.47457 & -78.47661 & -78.47636 \end{array} \right)$$ which looks much better than in my post.


As per the comments, but with references and links. Using this inequality and because $0<\frac{p \pi}{2k}<\frac{\pi}{2}$ for $k\geq p+1$, we have $$\color{red}{\prod\limits_{k=p+1}^{\infty}\cos{\frac{p\pi}{2k}}\leq} \frac{1}{e^{\frac{\pi^2 \cdot p^2}{8}\sum\limits_{k=p+1}^{\infty}\frac{1}{k^2}}}= \frac{1}{e^{\frac{\pi^2 \cdot p^2}{8}\left(\zeta(2)-\sum\limits_{k=1}^p\frac{1}{k^2}\right)}}=\cdots \tag{1}$$ and (e.g. Apostol's "Introduction to Analytic Number Theory", page 55) $$\sum\limits_{n\leq x}\frac{1}{n^s}=\frac{x^{1-s}}{1-s}+\zeta(s)+O(x^{-s}) \tag{2}$$ for $s>0, s\ne 1$, leading to $$\sum\limits_{n\leq p}\frac{1}{n^2}=-\frac{1}{p}+\zeta(2)+O\left(\frac{1}{p^2}\right) \tag{3}$$ and, substituting in $(1)$ $$\cdots=\frac{1}{e^{\frac{\pi^2 \cdot p^2}{8}\left(\frac{1}{p}-O\left(\frac{1}{p^2}\right)\right)}}= \frac{1}{e^{\frac{\pi^2 \cdot p}{8}-\frac{\pi^2 \cdot O(1)}{8}}}= \color{red}{\frac{e^{\frac{\pi^2}{8}\cdot O(1)}}{e^{\frac{\pi^2}{8}\cdot p}}}= \frac{O(1)}{e^{\frac{\pi^2}{8}\cdot p}}$$

This leads to $\sim \frac{1}{e^{\alpha \pi p}}$ rather than $\sim \frac{\sqrt{p}}{e^{\alpha \pi p}}$.
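A quick numerical check of the bound (assuming mpmath and the `A(p)` helper from the first sketch above; I take the linked inequality to be $\cos x\leq e^{-x^2/2}$ on $\left(0,\frac{\pi}{2}\right)$, which is what $(1)$ uses):

```python
# Check that the exact product stays below the bound (1) for a few p.
from mpmath import exp, zeta, nsum, pi

def upper_bound(p):
    tail = zeta(2) - nsum(lambda k: 1 / k ** 2, [1, p])  # sum_{k > p} 1/k^2
    return exp(-pi ** 2 * p ** 2 / 8 * tail)

for p in (1, 5, 20):
    print(p, A(p), upper_bound(p), A(p) <= upper_bound(p))
```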


However, this doesn't seem to be very important for large $p$, simply because (I will use $-b$ to have $b=1.62255>0$): $$A_p=e^{\ln{A_p}}= e^{a-bp+c\ln{p}+\frac{d}{p}}= e^{a}\cdot e^{\frac{d}{p}} \cdot \frac{p^c}{e^{bp}}= O(1)\cdot \frac{1}{e^{(b-c)p}}\cdot \frac{p^c}{e^{cp}}\leq ...$$ because $\frac{p^c}{e^{cp}}\rightarrow 0, p\rightarrow\infty$ $$...\leq \frac{O(1)}{e^{(b-c)p}}$$ But $\frac{\pi^2}{8}> b-c$, thus, in a way $$\frac{O(1)}{e^{\frac{\pi^2}{8}\cdot p}}<\frac{O(1)}{e^{(b-c)p}}$$ from some large enough $p$ onwards.

rtybase

One can recognize a Riemann sum. The limit $$ A:=\color{blue}{\lim_{p\to\infty}\frac{\ln A_p}{p}}=\lim_{p\to\infty}\frac{1}{p}\sum_{k=p+1}^{\infty}\ln\cos\frac{p\pi}{2k}=\color{blue}{\int_{1}^{\infty}\ln\cos\frac{\pi}{2x}\,dx} $$ exists. Numerically, $A=-1.62254367352281126916452953\cdots$
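A quick numerical evaluation of this constant (a sketch assuming mpmath; the integrand has an integrable logarithmic singularity at $x=1$):

```python
# Numerical value of A = int_1^inf ln cos(pi/(2x)) dx.
from mpmath import mp, quad, log, cos, pi, inf

mp.dps = 30
A_const = quad(lambda x: log(cos(pi / (2 * x))), [1, 2, inf])
print(A_const)  # should be close to -1.62254367352...
```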

The Euler-Maclaurin approach gives more details. For the first order, \begin{align} \ln A_p&=\left[\int_{p}^{\infty}-\int_{p}^{p+1}\right]\ln\cos\frac{p\pi}{2x}\,dx+\frac{1}{2}\ln\cos\frac{p\pi}{2(p+1)}+R_p \\&=Ap+\frac{\ln p}{2}-\int_{0}^{1}f_p(x)\,dx+\frac{1}{2}f_p(1)+R_p, \end{align} where $f_p(x)=\ln\left(p\cos\dfrac{p\pi}{2(p+x)}\right)\underset{p\to\infty}{\longrightarrow}\ln\dfrac{x\pi}{2}$, and \begin{align} R_p&=\frac{p\pi}{2}\int_{p+1}^{\infty}\left(\{x\}-\frac{1}{2}\right)\tan\frac{p\pi}{2x}\,\frac{dx}{x^2}&&\color{gray}{[x=n+(1-y)/2,\ -1\leqslant y\leqslant 1]} \\&=-\frac{p\pi}{2}\sum_{n=1}^{\infty}\int_{-1}^{1}yg_{p,n}(y)\,dy&&\color{gray}{\left[g_{p,n}(y)=\frac{\tan\dfrac{p\pi}{2(p+n)+1-y}}{\big(2(p+n)+1-y\big)^2}\right]} \\&=-\frac{p\pi}{2}\sum_{n=1}^{\infty}\int_{0}^{1}x\big(g_{p,n}(x)-g_{p,n}(-x)\big)\,dx&&\underset{p\to\infty}{\longrightarrow}\quad -R, \end{align} where (terms are nonnegative, so we can take the limit termwise) $$R=\int_{0}^{1}\sum_{n=1}^{\infty}\frac{x^2}{(2n+1)^2-x^2}\,dx=\int_{0}^{1}x\left(\frac{\pi}{4}\tan\frac{x\pi}{2}-\frac{x}{1-x^2}\right)\,dx.$$ Omitting the computations, $R=1-\dfrac{\ln 2\pi}{2}$, and finally $$\ln A_p=Ap+\dfrac{1}{2}\ln 4p+o(1).$$
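And a short check of this final asymptotic form (assuming mpmath, the `A(p)` helper from the first sketch and `A_const` from the snippet above):

```python
# ln A_p versus A*p + (1/2) ln(4p) for a few values of p.
from mpmath import log

for p in (10, 50, 200):
    print(p, log(A(p)), A_const * p + log(4 * p) / 2)
```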

metamorphy