Suppose we have a differentiable function $f : \mathbb{R}^+ \to \mathbb{R}$. It seems intuitive to me that if $\lim_{x \to 0} f(x)$ exists, then $\lim_{x \to 0} x f'(x) = 0$ (all limits here being one-sided, from above). I suspect that for merely differentiable functions there may be pathological counterexamples to this, but at least for analytic functions (where $0$ may lie on the boundary of the domain of analyticity) it should be true.
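As a quick numerical sanity check (the test functions below are just examples I picked, nothing canonical), one can tabulate $x f'(x)$ for a well-behaved case and for the kind of oscillatory case I worry about:

```python
import numpy as np

# Well-behaved case: f(x) = 1 + sqrt(x) is analytic on (0, inf) with a
# branch point at 0, and lim_{x -> 0+} f(x) = 1 exists.
# Then f'(x) = 1/(2 sqrt(x)), so x f'(x) = sqrt(x)/2 -> 0.
fp_nice = lambda x: 0.5 / np.sqrt(x)

# Oscillatory candidate: f(x) = x sin(1/x) also has a limit (0) at 0, but
# f'(x) = sin(1/x) - cos(1/x)/x, so x f'(x) = x sin(1/x) - cos(1/x),
# which keeps oscillating between roughly -1 and 1 as x -> 0+.
fp_osc = lambda x: np.sin(1.0 / x) - np.cos(1.0 / x) / x

for x in [1e-1, 1e-3, 1e-5, 1e-7]:
    print(f"x = {x:.0e}:  x f'(x) = {x * fp_nice(x):+.3e} (nice), "
          f"{x * fp_osc(x):+.3e} (oscillatory)")
```

The first column decays like $\sqrt{x}$, while the second never settles down, which is exactly the kind of pathology I suspect above.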
In the analytic case, I can give an argument that would convince a typical physicist like myself: $f$ cannot have an essential singularity or a pole at $0$, because then $\lim_{x \to 0} f(x)$ would not exist. So, since the limit exists, $f$ scales like $f(x) = c + O(x^\alpha)$ as $x \to 0$ for some constant $c$ and some $\alpha > 0$, and differentiating this scaling form gives $x f'(x) = O(x^\alpha) \to 0$.
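To spell out that last step (under the additional assumption, which I am happy to make here, that the expansion has a genuine leading correction $a x^\alpha$ with $a \neq 0$ and can be differentiated term by term, as a convergent power series can):

$$f(x) = c + a x^{\alpha} + o(x^{\alpha}) \;\Longrightarrow\; f'(x) = a\alpha\, x^{\alpha - 1} + o(x^{\alpha - 1}) \;\Longrightarrow\; x f'(x) = a\alpha\, x^{\alpha} + o(x^{\alpha}) \longrightarrow 0.$$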
Two questions:
1. Is this true, and if so, how generally does it hold?
2. Is there a simpler proof that does not rely on scaling arguments? This seems like the kind of problem one would typically attack with a fairly general theorem such as L'Hospital's rule. Another strategy is to write $xf' = (xf)' - f$, which follows from the product rule $(xf)' = f + xf'$, but this just shifts the burden to showing that $\lim_{x \to 0}[xf(x)]' = \lim_{x \to 0} f(x)$. I am guessing that the solution is totally obvious and I am just not seeing it.
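For what it's worth, here is how far my naive attempt with L'Hospital's rule got (a sketch, assuming $f$ is twice differentiable, which it certainly is in the analytic case): writing

$$x f'(x) = \frac{f'(x)}{1/x}$$

puts the limit in the $\cdot/\infty$ form the rule covers as $x \to 0^+$, but differentiating numerator and denominator only trades it for

$$\lim_{x \to 0} \frac{f''(x)}{-1/x^2} = -\lim_{x \to 0} x^2 f''(x),$$

which is the same kind of expression one derivative higher.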