
I have an old (1991) contest math problem that I am stuck on. The problem is:

Let $f(x)$ be a function such that $f(1)=1$ and, for $x\geq 1$,

$$ f'(x) = \frac{1}{x^2 + f^2(x)} $$

Prove that $\lim_{x\to\infty} f(x)$ exists and is less than $1 + \frac{\pi}{4}$.

I tried a trig substitution approach (letting $x=r\cos(\theta)$, $f(x) = r \sin(\theta)$), but I got stuck when I tried to do the change of variables. I also thought about finding a general formula for the $n$-th derivative $f^{(n)}(x)$ and computing the Taylor series around $x=1$, but I got stuck around the 3rd derivative and couldn't see a pattern.

Any tips on making progress would be welcome! Thanks.

1 Answer


Hint:

Since $x\ge 1$, we have $x^2+f^2(x) \ge 1 > 0$, so $f'(x)>0$.

So $f$ is increasing, and hence $f(x)\ge f(1) = 1$ for all $x\ge 1$.

Substituting this back into the differential equation, we get $$x^2+f^2(x)\ge x^2 + 1\quad\Rightarrow\quad f'(x)=\frac{1}{x^2+f^2(x)}\le\frac{1}{x^2+1}$$

and then integrate using the fundamental theorem of calculus:

$$f(x)=f(1)+\int_1^x f'(t)\,dt\le 1+\int_1^x\frac{dt}{t^2+1}=1+\arctan x-\frac{\pi}{4}<1+\frac{\pi}{4}.$$

So $f$ is increasing and bounded above, hence $\lim_{x\to\infty} f(x)$ exists. For the strict inequality in the limit, note that $f(t)>1$ for $t>1$, so the bound $f'(t)\le\frac{1}{t^2+1}$ is strict there, and therefore $\lim_{x\to\infty}f(x)=1+\int_1^\infty f'(t)\,dt<1+\frac{\pi}{4}$.
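Not part of the proof, but if you want to see the limit emerge numerically, a quick sanity check is easy. The sketch below assumes SciPy; the endpoint $10^6$ and the solver tolerances are arbitrary choices, and the solver output merely stands in for the true solution.

```python
# Numerical sanity check (not a proof): integrate f'(x) = 1/(x^2 + f(x)^2)
# from f(1) = 1 and compare the result against the bound 1 + pi/4.
# Assumes SciPy; the endpoint 1e6 and tolerances are arbitrary choices.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(x, y):
    # y[0] plays the role of f(x)
    return [1.0 / (x**2 + y[0]**2)]

sol = solve_ivp(rhs, (1.0, 1e6), [1.0], rtol=1e-10, atol=1e-12)

print("f(1e6)   ~", sol.y[0, -1])
print("1 + pi/4 =", 1 + np.pi / 4)
```

Since $f'(x)\le 1/x^2$, the tail beyond $x=10^6$ contributes less than $10^{-6}$, so the printed value is a good proxy for the limit and should sit comfortably below $1+\frac{\pi}{4}\approx 1.785$.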

bFur4list