Is $ f_1(x,v) = \sum_{n=0}^\infty {x^n \over (n!)^v } > 0 $ for all real $x$ and $0<v<1$ ?
Let's start simple and take the case $v = 1/2$. Writing $f(x)=f_1(x,1/2)=\sum_n a_n x^n$ with $a_n=(n!)^{-1/2}$, the ratio of consecutive Taylor coefficients satisfies $a_{n-1}/a_n = \sqrt n$. Probably the simplest way to prove positivity is now to write $\int_0^1(1-t^n)\left(\log\frac 1t\right)^{-3/2}\,\frac{dt}t=c\sqrt n$ with some fixed positive $c$ (notice that the integral converges and the integrand is positive(!); to get the $\sqrt n$, make the change of variable $t^n\to t$). We conclude (by termwise integration and the identity $x^n t^n = (xt)^n$) that $\int_0^1 (f(x)-f(xt))\left(\log\frac 1t\right)^{-3/2}\,\frac{dt}t=cxf(x)$.

If $x$ is the largest zero of $f$ (which must be negative), then plugging it in gives $0$ on the right, while on the left $f(x)-f(xt)=-f(xt)<0$ for $0<t<1$ (since $xt$ lies strictly between the largest zero and $0$, where $f>0$), so the left side is strictly negative, which is a clear contradiction. Thus crossing the $x$-axis is not possible. Of course, there is nothing special about $1/2$: any power $v$ between $0$ and $1$ works just as well, because the analogous integral still converges.
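For what it's worth, here is a quick numerical sanity check of the two identities for $v=1/2$. This is a minimal sketch with Python/mpmath; the helper name `f_half`, the truncation at 200 terms, and the rewriting of both integrals in the variable $u=\log(1/t)$ (which makes the endpoints friendlier for numerical quadrature) are my own choices.

```python
from mpmath import mp, mpf, sqrt, exp, pi, quad, inf

mp.dps = 30  # working precision

def f_half(x, N=200):
    """Truncated series f_1(x, 1/2) = sum_{n<N} x^n / sqrt(n!)."""
    s, fact = mpf(0), mpf(1)
    for n in range(N):
        s += mpf(x)**n / sqrt(fact)
        fact *= n + 1
    return s

c = 2 * sqrt(pi)  # the n = 1 integral: c = Gamma(1/2) / (1/2)

# Both integrals below are written in the variable u = log(1/t), which is the
# same integral taken over (0, infinity).

# Check: int_0^1 (1 - t^n) (log 1/t)^(-3/2) dt/t = c * sqrt(n)
n = 7
lhs = quad(lambda u: (1 - exp(-n*u)) * u**mpf(-1.5), [0, inf])
print(lhs, c * sqrt(n))          # should agree

# Check: int_0^1 (f(x) - f(x t)) (log 1/t)^(-3/2) dt/t = c * x * f(x)
x = mpf(-3)
fx = f_half(x)
lhs = quad(lambda u: (fx - f_half(x*exp(-u))) * u**mpf(-1.5), [0, inf])
print(lhs, c * x * fx)           # should agree
```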
For instance, for $v=1/4$ one has $\int_0^1(1-t^n)\left(\log\frac 1t\right)^{-5/4}\,\frac{dt}t= 4 \Gamma(3/4)\, n^{1/4}$ and, with $f(x)=f_1(x,1/4)$ and $c=4\Gamma(3/4)$, $\int_0^1 (f(x)-f(xt))\left(\log\frac 1t\right)^{-5/4}\,\frac{dt}t=cxf(x)$.
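If I compute correctly, for general $0<v<1$ the change of variable $t=e^{-u}$ followed by $u\to u/n$ (or, equivalently, the substitution $t^n\to t$ used above) gives, after one integration by parts,
$$\int_0^1(1-t^n)\left(\log\tfrac 1t\right)^{-1-v}\,\frac{dt}t=\int_0^\infty\left(1-e^{-nu}\right)u^{-1-v}\,du=n^v\int_0^\infty\left(1-e^{-u}\right)u^{-1-v}\,du=\frac{\Gamma(1-v)}{v}\,n^v,$$
so the constant is $c_v=\Gamma(1-v)/v$, which indeed gives $2\sqrt\pi$ for $v=1/2$ and $4\Gamma(3/4)$ for $v=1/4$.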
This method cries out for a generalization and a deeper understanding.
Notice also the following property (*): $f_1(x,v)\to 0$ as $x\to-\infty$, and the function is strictly increasing in $x$.
Also (partly because of that), the function is estimated (and proven) to go to zero at rate $O(|x|^{-v})$ as $x\to-\infty$ (not so easy to prove?) and to go to $+\infty$ at rate $O(\exp(x^{1/v}))$ as $x\to+\infty$.
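A rough way to probe the claimed decay numerically (again only a sketch with mpmath; the truncation heuristic and the sample points are mine, and the printed ratio is suggestive, not a proof):

```python
from mpmath import mp, mpf, factorial

def f1(x, v, dps=150):
    """Truncated series f_1(x, v) = sum_{n<N} x^n / (n!)^v, summed with enough
    precision to survive the near-cancellation of the alternating terms."""
    mp.dps = dps
    x, v = mpf(x), mpf(v)
    N = int(3 * abs(x) ** (1 / float(v))) + 50   # terms peak near n ~ |x|^(1/v)
    return sum(x**n / factorial(n)**v for n in range(N))

# If the O(|x|^{-v}) guess is right, the last column should stay bounded.
v = 0.5
for x, digits in ((-5, 60), (-10, 120), (-20, 300), (-40, 600)):
    y = f1(x, v, dps=digits)
    print(x, y, y * abs(mpf(x))**v)
```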
Notice also that proving $ f_2(x,v) = \sum_{n=0}^\infty \frac{x^{2n}}{((2n)!)^v} > \sum_{n=0}^\infty \frac{x^{2n+1}}{((2n+1)!)^v} $ for all real $x$ is a statement equivalent to the one above (spelled out below); it is similar to proving $\cosh(x) > \sinh(x)$ (the case $v=1$), and (*) then follows by adding the asymptotic estimates. (This idea belongs to the "fake function theory" posted in 2014 at the tetration forum and developed by Tom Raes and Sheldon Levenstein, with lots of room for improvement, for those who care. The integral transform also occurred there in an equivalent form, as an integral from $1$ to infinity, which is just a substitution.)
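To spell out the equivalence (using only that the first sum is even in $x$ and the second is odd):
$$f_1(-x,v)=\sum_{n=0}^\infty\frac{x^{2n}}{((2n)!)^v}-\sum_{n=0}^\infty\frac{x^{2n+1}}{((2n+1)!)^v}=f_2(x,v)-\bigl(f_1(x,v)-f_2(x,v)\bigr),$$
so $f_1(\cdot,v)>0$ on the whole real line exactly when the even part $f_2(x,v)$ exceeds the odd part $f_1(x,v)-f_2(x,v)$ for all real $x$.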
So how does one generalize all of this, in particular the integral method? (The integral method is called infinite descent by some, for obvious reasons.)
Are all the derivatives of $f_1(x,v)$ with respect to $x$ positive for all real $x$? That would really be analogous to $\exp(x)$.
I wondered what other functions are always positive and go to $0$ at $-\infty$.
For instance, a subset of the Mittag-Leffler functions:
MAIN QUESTION:
Is $ f_3(x,v) = \sum_{n=0}^\infty {x^n \over \Gamma(v n +1) } > 0 $ for all real $x$ and $0<v<1$ ?
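The same direct-summation sketch adapts to $f_3$ (again with my own truncation heuristic; for small $v$ or large $|x|$ the number of terms and the needed precision blow up quickly, so this is only a crude probe of the sign):

```python
from mpmath import mp, mpf, gamma

def f3(x, v, dps=150):
    """Truncated series f_3(x, v) = sum_{n<N} x^n / Gamma(v*n + 1)."""
    mp.dps = dps
    x, v = mpf(x), mpf(v)
    N = int(3 * abs(x) ** (1 / float(v)) / float(v)) + 50   # peak near n ~ |x|^(1/v) / v
    return sum(x**n / gamma(v * n + 1) for n in range(N))

# Probe the sign on the negative real axis for a few v in (0, 1).
for v in (0.5, 0.75):
    for x in (-2, -5, -10):
        print(v, x, f3(x, v))
```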
So much to learn.