In my master's project I encounter the function $$f_\varepsilon(x) = \ln\left(\frac{x^{1 + a} + \varepsilon^\beta}{\lambda(x)^2 + \varepsilon^2}\right)$$ for $a$ close to zero, $\beta \in (1, 2)$, and $$\lambda(x) = \frac{x}{|\ln(x)|^2}.$$ I aim to bound the quantity $$g_\varepsilon(x) = \bigg|\frac{1}{f_{\varepsilon}(x)} - \frac{1}{f_{0}(x)}\bigg|$$ on an interval $[0, x_0]$, $x_0 \ll 1$, where $f_0$ is $f_\varepsilon$ with $\varepsilon = 0$. Letting $x \to 0$, we get $$g_\varepsilon (0) = \frac{1}{(2 - \beta)|\ln \varepsilon|},$$ so I expect a bound of the form $$g_\varepsilon(x) \le \frac{C}{(2 - \beta)|\ln \varepsilon|}$$ with $C > 0$ independent of $\varepsilon$. I tried to bound $|\ln(\varepsilon)|\,g_\varepsilon(x)$ by such a constant, but I couldn't prove rigorously that the bound is independent of $\varepsilon$. Any help would be greatly appreciated.
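For what it's worth, here is a minimal version of my numerical experiment on $|\ln \varepsilon|\, g_\varepsilon(x)$ (the values $a = 0.01$, $\beta = 1.5$, $x_0 = 0.01$ and the range of $\varepsilon$ are my illustrative choices, not forced by the problem):

```python
import numpy as np

# Illustrative parameters (my choice for the experiment, not forced by the problem).
a, beta, x0 = 0.01, 1.5, 0.01

def lam(x):
    return x / np.log(x)**2

def f(x, eps):
    return np.log((x**(1 + a) + eps**beta) / (lam(x)**2 + eps**2))

def g(x, eps):
    return np.abs(1 / f(x, eps) - 1 / f(x, 0.0))

# Grid on (0, x0]; sup of |ln(eps)| * g over the grid, for several small eps.
x = np.logspace(-8, np.log10(x0), 400)
worst = max(abs(np.log(eps)) * g(x, eps).max()
            for eps in [1e-2, 1e-3, 1e-4, 1e-5, 1e-6])
print(worst, 1 / (2 - beta))  # in my runs, worst stays below 1/(2 - beta)
```

This is what makes me believe a bound $g_\varepsilon(x) \le \frac{C}{(2-\beta)|\ln \varepsilon|}$ should hold, even though the plot is of course not a proof.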
-
https://math.stackexchange.com/questions/4638242/prove-or-disprove-that-fx-leq-0/4638959#4638959 Not sure if it helps – Barackouda Jul 03 '23 at 07:07
-
I guess you also want $C$ not to depend on $x$, but what about $a$ and $\beta$? Can $C$ depend on them? – Cactus Jul 03 '23 at 07:48
-
Yes, $C$ may (and probably will) depend on $a$ and $\beta$ – Falcon Jul 03 '23 at 16:03
-
Are you sure that there is a constant $C'$ such that $g_\varepsilon(x)\lt C'$? It seems to me that for some $x_0$, no such constant $C'$ exists. – mathlove Jul 07 '23 at 13:09
-
The existence of such a constant seems ok to me, as both $1/f_0$ and $1/f_\varepsilon$ are continuous functions defined on a bounded interval $[0,x_0]$. The question is rather: do we have $g_\varepsilon \le C/|\ln \varepsilon|$? When I plot it numerically and let the parameter $\varepsilon$ vary it seems reasonable, but I couldn't prove it. – Falcon Jul 07 '23 at 13:39
1 Answer
The existence of such a constant seems ok for me as both $1/f_0$ and $1/f_{\varepsilon}$ are continuous functions defined on a bounded interval $[0,x_0]$.
I think that you need to consider the case where there is $x_1$ satisfying both $f_0(x_1)=0$ and $0\lt x_1\lt x_0$.
If there is $x_1$ satisfying both $f_0(x_1)=0$ and $0\lt x_1\lt x_0$, then since $f_{\varepsilon}(x_1)$ is a non-zero constant, we get $\displaystyle\lim_{x\to x_1}g_{\varepsilon}(x)=\infty$.
Example :
If $a,x_1,x_0$ satisfy $$a=1+\frac{200}{41}\ln\bigg(\frac{41}{50}\bigg)\approx 0.032,\qquad x_1=e^{-41/50}\approx 0.440$$ and $x_1\lt x_0$, then we get $f_0(x_1)=0$. Since $f_{\varepsilon}(x_1)$ is a non-zero constant, we get $\displaystyle\lim_{x\to x_1}g_{\varepsilon}(x)=\infty$.
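A quick numerical check of this example (plain Python, with the values of $a$ and $x_1$ above):

```python
import math

# The values from the example above.
a = 1 + (200 / 41) * math.log(41 / 50)   # approx. 0.032
x1 = math.exp(-41 / 50)                  # approx. 0.440

# f_0(x1) = ln( x1^{1+a} / lambda(x1)^2 )
lam = x1 / math.log(x1)**2
f0 = math.log(x1**(1 + a) / lam**2)
print(f0)  # zero up to floating-point error
```

Analytically, $f_0(x_1) = (a-1)\ln x_1 + 4\ln|\ln x_1| = 4\ln\frac{50}{41} + 4\ln\frac{41}{50} = 0$, which the computation confirms numerically.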
Added :
We have
$$g_\varepsilon(x) = \bigg|\frac{1}{f_{\varepsilon}(x)} - \frac{1}{f_{0}(x)}\bigg|\le \bigg|\frac{1}{f_{\varepsilon}(x)} \bigg|+\bigg|\frac{1}{f_{0}(x)}\bigg|\tag1$$
Since $\bigg(\dfrac{x^{1+a}}{\lambda^2}\bigg)'=\dfrac{4-(1-a)\ln x}{x^{2-a}}(\ln x)^3\lt 0$ for $0\lt x\lt 1$, with $\displaystyle\lim_{x\to 0^+}\dfrac{x^{1+a}}{\lambda^2}=\infty$, we see that $\dfrac{1}{f_0(x)}$ is positive and increasing, so $$\bigg|\frac{1}{f_0(x)}\bigg|\le \frac{1}{f_0(x_0)}\tag2$$
Next, we may suppose that $\dfrac{x^{1+a}}{\lambda^2}\ge \varepsilon^{\beta-2}$. Since also $\dfrac{\varepsilon^{\beta}}{\varepsilon^2}=\varepsilon^{\beta-2}$, the mediant inequality gives $\dfrac{x^{1+a}+\varepsilon^{\beta}}{\lambda^2+\varepsilon^2}\ge \varepsilon^{\beta-2}$, so $f_{\varepsilon}(x)\ge \ln(\varepsilon^{\beta-2})\gt 0$ (assuming $0\lt \varepsilon\lt 1$), hence $0\lt\dfrac{1}{f_{\varepsilon}(x)}\le\dfrac{1}{\ln(\varepsilon^{\beta-2})}$, and therefore $$\bigg|\frac{1}{f_{\varepsilon}(x)} \bigg|\le\frac{1}{\ln(\varepsilon^{\beta-2})}\tag3$$
It follows from $(1)$, $(2)$ and $(3)$ that $$g_\varepsilon(x) \le \frac{1}{\ln(\varepsilon^{\beta-2})}+ \frac{1}{f_0(x_0)}$$
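This final inequality can be checked numerically on the region where the supposition $\frac{x^{1+a}}{\lambda^2}\ge \varepsilon^{\beta-2}$ holds (the parameter values below are illustrative choices, for which the supposition holds on all of $(0, x_0]$):

```python
import numpy as np

# Illustrative parameters; for these values the supposition
# x^{1+a} / lambda^2 >= eps^{beta-2} holds on all of (0, x0].
a, beta, eps, x0 = 0.01, 1.5, 1e-3, 0.01

def lam(x):
    return x / np.log(x)**2

def f(x, e):
    return np.log((x**(1 + a) + e**beta) / (lam(x)**2 + e**2))

x = np.logspace(-8, np.log10(x0), 400)
assert (x**(1 + a) / lam(x)**2 >= eps**(beta - 2)).all()  # the supposition

g = np.abs(1 / f(x, eps) - 1 / f(x, 0.0))
bound = 1 / np.log(eps**(beta - 2)) + 1 / f(x0, 0.0)
print(g.max(), bound)  # g stays below the bound on this grid
```

Of course this only illustrates the inequality for one set of parameters; the proof above is what establishes it.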
-
What I meant by "a certain interval $[0,x_0]$, $x_0 \ll 1$" is that $x_0$ can be chosen arbitrarily small. The only thing I want is to prove the inequality on a connected interval $[0, x_0]$. Therefore I can choose $x_0$ so that the root of $f_0(x)$ is not contained in this interval. – Falcon Jul 09 '23 at 22:03
-
@Falcon : I added an inequality. Although I've spent a lot of time looking for a better bound, I couldn't get a bound of the form $\dfrac{C}{(2 - \beta)|\ln \varepsilon|}$. – mathlove Jul 10 '23 at 08:49
-
Thank you for the time spent on this question! However, I am not convinced by your inequality $$\frac{1}{f_\varepsilon} \le\frac{1}{\ln(\varepsilon^{\beta - 2})} = \frac{1}{(2-\beta) |\ln(\varepsilon)|}.$$ Indeed, the rhs depends only on $\varepsilon$, not on $x$, so as $\varepsilon \to 0$ it converges to $0$, while the lhs doesn't, since $$\lim_{\varepsilon \to 0}\frac{1}{f_\varepsilon(x)} = \frac{1}{f_0(x)} \neq 0\quad \text{for }x \neq 0.$$ – Falcon Jul 10 '23 at 16:22
-
@Falcon : Hmm, maybe $(1)$ is not a good start. Have you tried using the mean value theorem? There is $c$ such that $0\lt c\lt\varepsilon$ and $\frac{1}{f_{\varepsilon}(x)}-\frac{1}{f_0(x)}=\varepsilon F(c)$ where $F(\varepsilon)=\frac{\partial}{\partial \varepsilon}(\frac{1}{f_{\varepsilon}(x)})$. If I'm not mistaken, I got $$g_{\varepsilon}(x)=\frac{\varepsilon|\beta c^{\beta-1}(\lambda^2+c^2)-2c(x^{1+a}+c^{\beta})|}{\ln^2(\frac{x^{1+a}+c^{\beta}}{\lambda^2+c^2})(x^{1+a}+c^{\beta})(\lambda^2+c^2)}$$ although I'm not sure if this helps. – mathlove Jul 10 '23 at 18:18
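The closed form for $F(\varepsilon)=\frac{\partial}{\partial \varepsilon}\big(\frac{1}{f_\varepsilon(x)}\big)$ used in the comment above can be sanity-checked against a central finite difference (the evaluation point below is chosen arbitrarily, just for illustration):

```python
import math

# Arbitrary illustrative point (a, beta, x, eps).
a, beta, x, eps = 0.01, 1.5, 0.01, 1e-3
lam2 = (x / math.log(x)**2)**2  # lambda(x)^2

def inv_f(e):
    return 1 / math.log((x**(1 + a) + e**beta) / (lam2 + e**2))

# Closed form: d/d(eps) of 1/f_eps, as in the comment (without the absolute value).
num = beta * eps**(beta - 1) * (lam2 + eps**2) - 2 * eps * (x**(1 + a) + eps**beta)
den = math.log((x**(1 + a) + eps**beta) / (lam2 + eps**2))**2 \
      * (x**(1 + a) + eps**beta) * (lam2 + eps**2)
closed = -num / den

# Central finite difference in eps.
h = 1e-9
fd = (inv_f(eps + h) - inv_f(eps - h)) / (2 * h)
print(closed, fd)  # the two values should agree to several digits
```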
-
No, I didn't think of using that; it may indeed help, I will think about it. Another thing: when I plotted $g_\varepsilon(x)$, it seemed to be a strictly decreasing function for every small value of the parameter $\varepsilon$. This suggests that $$g_\varepsilon(x) \le g_\varepsilon(0) = \frac{1}{(2 -\beta)|\ln(\varepsilon)|}.$$ However, the derivative of $g_\varepsilon(x)$ has such an ugly form that I couldn't prove rigorously that it is strictly decreasing. – Falcon Jul 10 '23 at 21:21