This isn't an exercise for uni or anything like that, just something that's been bothering me a bit and that I can't seem to find useful information about on the web.
When talking about real-valued scalar functions, we know that Newton's method is guaranteed to converge to a root $s$ of $f(x)$ if the initial value $x_0$ is sufficiently close to the root; concretely, if on the interval $(s-r,\,s+r)$, where $r=|s-x_0|$, the derivative $f'(x)$ is never zero (in particular $f'(s)\neq 0$).
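To make the scalar iteration $x_{n+1}=x_n-f(x_n)/f'(x_n)$ concrete, here is a minimal sketch; the choice $f(x)=x^2-2$ (root $s=\sqrt 2$) is my own illustrative example, not part of the question:

```python
# Minimal sketch of scalar Newton's method.
# f(x) = x^2 - 2 is an illustrative choice only; its root is sqrt(2),
# and f'(x) = 2x is nonzero on any interval around sqrt(2) that
# stays away from 0.
def newton_1d(f, df, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        x = x - f(x) / df(x)  # x_{n+1} = x_n - f(x_n)/f'(x_n)
        if abs(f(x)) < tol:
            break
    return x

print(newton_1d(lambda x: x**2 - 2, lambda x: 2*x, x0=1.0))
# ~1.4142135623730951
```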
My question is: can we extend that criterion to higher dimensions?
Let $f:\mathbb R^{n} \to \mathbb R^{n}$ be a differentiable function with continuous partial derivatives. Let $s\in \mathbb R^{n}$ be such that $f(s)=0$, let $x_0 \in \mathbb R^{n}$, and set $r=\|s-x_0\|$.
Assume the Jacobian of $f$ is invertible everywhere on the ball of radius $r$ centered at $s$. Prove or disprove that Newton's method converges to $s$ when started from the initial value $x_0$.
Reminder:
Newton's method is the iteration $x_{n+1}=x_n-J^{-1}(x_n)\,f(x_n)$, where $J(x)$ denotes the Jacobian of $f$ at $x$.
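For reference, a direct translation of that iteration into code might look like the sketch below; solving the linear system $J(x_n)\,d=f(x_n)$ stands in for forming $J^{-1}(x_n)$ explicitly, and the test system $f(x,y)=(x^2+y^2-1,\;x-y)$ is a hypothetical example of my own, not from the question:

```python
import numpy as np

# Sketch of multivariate Newton's method: x_{n+1} = x_n - J^{-1}(x_n) f(x_n).
# We solve J(x_n) d = f(x_n) rather than inverting J explicitly.
def newton_nd(f, jac, x0, tol=1e-12, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        d = np.linalg.solve(jac(x), f(x))
        x = x - d
        if np.linalg.norm(f(x)) < tol:
            break
    return x

# Illustrative system: root at (1/sqrt(2), 1/sqrt(2)); the Jacobian
# [[2x, 2y], [1, -1]] has determinant -2(x + y), nonzero near the root.
f = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [1.0, -1.0]])
print(newton_nd(f, J, x0=[1.0, 0.5]))  # ~ [0.70710678, 0.70710678]
```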