
This question is stimulated by the previous two questions, here and here. We are interested in studying the following special case of the Fox–Wright function \begin{align} \Psi_{1,1} \left[ \begin{array}{l} (1/k,2/k) \\ (1/2,1)\end{array} ; -x^2\right], \quad x\in \mathbb{R},\ k\in (1,\infty). \end{align}

Here the Fox–Wright function $\Psi_{1,1} \left[ \begin{array}{l} (a,A) \\ (b,B)\end{array} ; z\right]$ is defined as

\begin{align} \Psi_{1,1} \left[ \begin{array}{l} (a,A) \\ (b,B)\end{array} ; z\right] =\sum_{n=0}^\infty \frac{\Gamma(a+An)}{\Gamma(b+Bn)} \frac{z^n}{n!}. \end{align}

The reason this is interesting is the following Fourier cosine transform representation: \begin{align} \Psi_{1,1} \left[ \begin{array}{l} (1/k,2/k) \\ (1/2,1)\end{array} ; -x^2\right]= \int_0^\infty \cos(xt) e^{-t^k} dt \end{align} see this question for more details.

Goal: We are interested in finding conditions on $k$ such that $ \Psi_{1,1} \left[ \begin{array}{l} (1/k,2/k) \\ (1/2,1)\end{array} ; -x^2\right]$ has no real zeros.

The current conjecture is that there are no zeros for $k \le 2$ and at least one zero for $k>2$. Thanks for any ideas you might have.

Edit: Perhaps there is a connection to the notion of stable distributions in probability. See comments below.

Boby

  • Given the hypergeometric behaviour of the involved series, maybe it is possible to apply https://en.wikipedia.org/wiki/Sturm%27s_theorem. – Jack D'Aurizio Aug 04 '16 at 17:22
  • Thanks. You always have great ideas. Cheers. I'll let you know if I have something. – Boby Aug 04 '16 at 17:47
  • I just read through the theorem, but I don't see the connection. Could you say more about the idea you had? – Boby Aug 04 '16 at 17:59
  • my idea is to take a suitable truncation of the power series and compute the discriminant of such a polynomial through a resultant or other techniques. With a bit of luck, the positivity/negativity of the discriminant just depends on $k$ and should tell us if there are real zeroes or not. – Jack D'Aurizio Aug 04 '16 at 18:06
  • Notice that $$ \int_0^\infty \cos(xt) e^{-t^k} \, \rm{d} t = \frac12 \int_{-\infty}^\infty e^{itx} e^{-|t|^k} \, \rm{d} t. $$ Your conjecture corresponds to a theorem of Paul Lévy about the characteristic functions of stable distributions (that only exist for $k \le 2$). – sometempname Aug 10 '16 at 15:24
  • @sometempname Can you point me to this result? And why is it stable only for $k\le2$? – Boby Aug 10 '16 at 15:32
  • The basic reason is that If $\varphi(t)$ is the characteristic function of some random variable $X$, and $\varphi^{\prime\prime}(0)$ is finite, then $E(X^2) = - \varphi^{\prime\prime}(0)$. For $k>2$, the second derivative of $e^{-|t|^k}$ at $t=0$ is zero, so it is not a characteristic function. An actual proof is longer. – sometempname Aug 10 '16 at 16:27
  • @sometempname When you say $\varphi(t)$, do you mean $\varphi(t)=e^{-|t|^k}$ or $\varphi(t)=\int_0^\infty \cos(xt)e^{-t^k}\, \rm{d} t$? I think in this case the characteristic function is the latter and not $e^{-|t|^k}$, but I might be wrong. Anyway, if you could give more details or point me to some reference, that would be great. – Boby Aug 10 '16 at 17:44
  • The characteristic function is $\varphi(t) = e^{-|t|^k}$. You can recover the density of $X$ using the integral $$ \frac{1}{2\pi} \int_{-\infty}^\infty e^{-i t x} \varphi(t) \, \rm{d} t,$$ which is equivalent to your integral up to a constant. See for example "Inversion formulas" in the wikipedia entry for characteristic functions. I'm not sure about a good reference for this exact question. Usually it is part of the more general theory of stable distributions, which you can find in many probability books. – sometempname Aug 10 '16 at 21:23
  • @sometempname I would really appreciate it if you put these ideas in an answer. If nobody else answers, the bounty is yours. – Boby Aug 10 '16 at 23:18
  • I guess Noam Elkies' answer to my question settles the question: http://math.stackexchange.com/questions/1847213/fourier-cosine-transforms-of-schwartz-functions-and-the-fejer-riesz-theorem/1898671#1898671 – Jack D'Aurizio Aug 21 '16 at 08:17

1 Answer


I'll sketch an answer.

I use the notation: $$ \psi_k(z) = \sum_{n=0}^\infty \frac{\Gamma(\frac{1}{k} + \frac{2n}{k})}{\Gamma(\frac12 + n)} \cdot \frac{z^n}{n!}, $$ and $$ \phi_k(z) = \int_{0}^\infty \cos(z t)e^{-t^k} \, \rm{d} t. $$

First of all, the correct identity seems to be: $$ \phi_k(x) = \frac{\sqrt{\pi}}{k} \psi_k\left(-\left(\frac{x}{2}\right)^2\right).$$
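This corrected identity is easy to check numerically: compare a partial sum of the series with a direct quadrature of the cosine integral. A rough sketch (plain trapezoid rule; the cutoff `upper`, the step count, and the truncation level are arbitrary choices, not tuned):

```python
import math

def psi_k(z, k, terms=60):
    """Partial sum of psi_k(z) = sum_n Gamma(1/k + 2n/k)/Gamma(1/2 + n) * z^n/n!."""
    total, factorial = 0.0, 1.0
    for n in range(terms):
        total += math.gamma(1/k + 2*n/k) / math.gamma(0.5 + n) * z**n / factorial
        factorial *= n + 1
    return total

def phi_k(x, k, upper=40.0, steps=100_000):
    """Trapezoid approximation of int_0^upper cos(x t) exp(-t^k) dt."""
    h = upper / steps
    total = 0.5 * (1.0 + math.cos(x * upper) * math.exp(-upper**k))
    for i in range(1, steps):
        t = i * h
        total += math.cos(x * t) * math.exp(-t**k)
    return h * total

# The two sides of the identity, for a sample point k = 1.5, x = 1:
k, x = 1.5, 1.0
lhs = phi_k(x, k)
rhs = math.sqrt(math.pi) / k * psi_k(-(x / 2)**2, k)
```

The same quadrature also reproduces the closed forms for $k=1,2$ listed below the identity.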

Elementary cases ($\phi_k > 0$): $$k=1\rm{:}\,\, \phi_1(x) = \frac{1}{1+x^2}, \quad k=2\rm{:}\,\, \phi_2(x) = \frac{\sqrt{\pi}}{2} e^{-\frac{x^2}{4}}. $$

The case $k>2$:

The function $e^{-t^k}$ is, for $k>2$, at least twice differentiable at $t=0$ from the right, so integrating by parts twice gives $\phi_k(x) = O(x^{-2})$. Thus $\phi_k$ is integrable (it is clearly bounded). Since $\cos(z) = \frac12 (e^{iz} + e^{-iz})$, we can write $$ \phi_k(z) = \frac12 \int_{-\infty}^\infty e^{izt} e^{-|t|^k} \, \rm{d} t.$$ We might think of $\frac{1}{\pi} \phi_k(x)$ as the "density" of a random variable $X$ whose characteristic function is $\varphi_k(t) = e^{-|t|^k}$. If $\varphi_k(t)$ were a characteristic function, then, since it is at least twice differentiable at $t=0$ with $\varphi_k^{(2)}(0)=0$, we would have $\mathbb{E} [X^2] = -\varphi_k^{(2)} (0) = 0$. Then $X=0$ almost surely and its characteristic function would be identically $1$, which $\varphi_k$ is not; this contradiction shows that $\frac{1}{\pi} \phi_k$ is not a density. Since its total integral equals $1$, it must take negative values, and being continuous with $\phi_k(0) > 0$, it has a real zero.
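One can see this failure concretely: a crude numerical scan of the integral (plain trapezoid rule; the cutoff and step sizes are arbitrary choices) shows that $\phi_4$ changes sign between $x = 3$ and $x = 4$:

```python
import math

def phi_k(x, k, upper=6.0, steps=20_000):
    """Trapezoid approximation of int_0^upper cos(x t) exp(-t^k) dt."""
    h = upper / steps
    total = 0.5 * (1.0 + math.cos(x * upper) * math.exp(-upper**k))
    for i in range(1, steps):
        t = i * h
        total += math.cos(x * t) * math.exp(-t**k)
    return h * total

# Scan phi_4 on [0, 6]: phi_4(0) = Gamma(5/4) > 0, yet a sign change
# (hence a real zero) shows up between x = 3 and x = 4.
xs = [0.1 * i for i in range(61)]
vals = [phi_k(x, 4.0) for x in xs]
has_zero = any(a * b < 0 for a, b in zip(vals, vals[1:]))
```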

The case $k<2$:

I'm not sure if any easy proof is available. An indirect way to prove $\phi_k > 0$ is to show $$ \varphi_k(t) = \int_0^\infty e^{-t^2 y} \, \rm{d} \mu_k(y) ,\quad t \in \mathbb{R}, \tag{1} \label{eq:sample} $$ for some non-negative Radon measure $\mu_k$. Then we have, $$ \phi_k(x) = \frac12 \int_{-\infty}^\infty e^{i x t} \varphi_k(t) \, \rm{d} t = \frac12 \int_0^\infty \left[ \int_{-\infty}^\infty e^{i x t - t^2 y} \, \rm{d} t \right] \, \rm{d} \mu_k(y) = \frac{\sqrt{\pi}}{2} \int_0^\infty y^{-\frac12} e^{-\frac{x^2}{4y}} \, \rm{d} \mu_k(y). $$ However, it is not straightforward to find the representation \eqref{eq:sample}. The details can be found in the paper "Stable laws of probability and completely monotone functions" by S. Bochner.
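For $k=1$ the representation \eqref{eq:sample} is explicit and classical (this particular formula is a standard fact about the one-sided stable law of index $\frac12$, not taken from the text above): $$ e^{-|t|} = \int_0^\infty e^{-t^2 y} \, \frac{1}{2\sqrt{\pi}} \, y^{-3/2} e^{-\frac{1}{4y}} \, \rm{d} y. $$ A quick numerical confirmation in Python (the quadrature cutoff and step count are arbitrary choices):

```python
import math

def mixture(t, upper=40.0, steps=400_000):
    """Riemann-sum approximation of the Gaussian mixture representing e^{-|t|}.

    Integrand: exp(-t^2 y) * (2 sqrt(pi))^{-1} * y^{-3/2} * exp(-1/(4y)).
    Starts at i = 1 since the integrand vanishes as y -> 0+.
    """
    h = upper / steps
    total = 0.0
    for i in range(1, steps + 1):
        y = i * h
        total += math.exp(-t * t * y - 0.25 / y) * y ** -1.5
    return total * h / (2.0 * math.sqrt(math.pi))
```

Plugging this mixing measure into the displayed formula for $\phi_k$ and substituting $u = 1/y$ reproduces $\phi_1(x) = \frac{1}{1+x^2}$.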

sometempname
  • Why can you think of $\frac{1}{\pi} \phi_k(x)$ as a density? It can be negative. – Boby Aug 19 '16 at 15:02
  • By the properties of the Fourier transform, the total integral of $\frac{1}{\pi} \phi_k(x)$ is $1$. In the case $k > 2$, you assume that it is a density and reach a contradiction (so it must be negative somewhere). – sometempname Aug 19 '16 at 17:18
  • Question: if a function integrates to one and cannot be a pdf, does this imply that the function is negative somewhere? I never saw this argument before. I hope you don't mind if I ask you a few more questions; I can be slow at times. – Boby Aug 19 '16 at 18:53
  • Also, for the part $k>2$: at what point do we use the assumption $k>2$? – Boby Aug 19 '16 at 19:00
  • I think I got that part: for $k \le 2$ the second derivative at zero is not zero. – Boby Aug 19 '16 at 19:04
  • If the total integral is $1$ and the function is non-negative, then it is a density by definition (and you can find a random variable $X$ with this density). In case $k < 2$, the second derivative of $\varphi_k(t)$ does not exist at $t=0$. It is known that the random variables with density $\frac{1}{\pi} \phi_k(x)$ do not have a second moment (this is basically a proof). – sometempname Aug 19 '16 at 19:10
  • Great. Thanks. A lot. – Boby Aug 19 '16 at 19:17
  • Do you think zeros of this function can be determined exactly or is it hopeless? – Boby Aug 19 '16 at 19:17
  • Sorry, I missed your bounty. I started a new one and will award you that. – Boby Aug 19 '16 at 19:18
  • I would be surprised if one can find an analytic expression for the zeros (for any $k > 2$). For integer $k$s you can probably find complicated expressions for the functions $\phi_k$ in terms of hypergeometric functions. Thanks for the bounty. – sometempname Aug 19 '16 at 19:48
  • But would we say that zeros should appear periodically? For example, if $t_0$ is a zero, then so is $t_1=t_0+2\pi$? – Boby Aug 23 '16 at 18:13
  • Based on numerical evidence in the cases $k=4,6$, I don't think so. – sometempname Aug 23 '16 at 19:53
  • Do you have access to the pdf of this paper "Stable laws of probability and completely monotone functions" by S. Bochner ? Thanks – Boby Aug 23 '16 at 20:29
  • I think you'll find more information in a book about stable distributions. – sometempname Aug 23 '16 at 20:45
  • Any reference you can recommend? – Boby Aug 23 '16 at 21:23
  • I'm not sure. What exactly are you looking for? – sometempname Aug 24 '16 at 06:54
  • More details on the part of the proof for $k<2$; specifically, the construction of the measure. – Boby Aug 24 '16 at 11:04
  • The construction is not explicit, it is based on closure properties of completely monotone functions. – sometempname Aug 24 '16 at 18:02
  • Yes, but I would like to see it or have a reference I can look up. – Boby Aug 24 '16 at 18:41