9

Consider a polynomial $p : \mathbb R \to \mathbb R$ with $p'(x) > 0$ for all $x \in \mathbb R$. The function $p$ has exactly one real zero. Will the Newton method

$$x_{n+1} = x_n - \frac{p(x_n)}{p'(x_n)}$$

converge for all $x_0 \in \mathbb R$?

Intuitively I think there still might be a counterexample - but I couldn't find one, so is it maybe possible that it indeed converges for every $x_0$?

flawr
  • 16,931
  • Wlog let the zero be at $0$. If there is a $p$ such that the method does not converge, then there must exist $x$ such that $p(x)/p'(x) \geq 2x$. Wlog we can choose $x=1$, and we get: if there is a $p$ such that it does not converge, then $p(1)/p'(1) \geq 2$. But I haven't yet been able to show that under the given assumptions $p(1)/p'(1) \geq 2$ can never be satisfied. – sebastian Jul 01 '19 at 08:07
  • @flawr The conditions you mention do not prevent the formation of periodic orbits (and the consequent divergence). If you want to guarantee convergence, you must impose additional conditions, like convexity/concavity, and even this does not guarantee convergence for all $x_0$. – PierreCarre Jul 01 '19 at 09:53

1 Answer

12

No, not necessarily. In particular, consider the polynomial $$p(x) = \frac{7}{2}x - \frac{5}{2}x^3 + x^5.$$ Note that $$p'(x) = \frac{7}{2} - \frac{15}{2}x^2 + 5x^4,$$ a quadratic in $x^2$ with positive leading coefficient and negative discriminant $-\frac{55}{4}$, and hence strictly positive everywhere. This means $p$ is strictly increasing, as required.

Take an initial iterate $x_0 = 1$. Then, \begin{align*} x_1 &= 1 - \frac{p(1)}{p'(1)} = 1 - \frac{2}{1} = -1 \\ x_2 &= -1 - \frac{p(-1)}{p'(-1)} = -1 - \frac{-2}{1} = 1, \end{align*} and so the iterates repeat.
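The cycle is easy to verify numerically. Here is a quick Python sketch (my own check, not part of the original answer):

```python
# Newton's method on p(x) = (7/2)x - (5/2)x^3 + x^5, starting at x0 = 1.
# The iterates form the 2-cycle 1 -> -1 -> 1 -> ...
def p(x):
    return 3.5 * x - 2.5 * x**3 + x**5

def dp(x):
    return 3.5 - 7.5 * x**2 + 5 * x**4

x = 1.0
iterates = []
for _ in range(6):
    x = x - p(x) / dp(x)
    iterates.append(x)
print(iterates)  # [-1.0, 1.0, -1.0, 1.0, -1.0, 1.0]
```

The coefficients $7/2$ and $5/2$ are exactly representable in binary floating point, so the cycle is exact here, not just approximate.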


Method

It's not difficult to form a cycle with Newton's method. I wanted a polynomial that passes through $(-1, -2)$ and $(1, 2)$, both with derivative $1$. Following the tangent at $(-1, -2)$ to the $x$-axis will yield $x = 1$, so if we take $x_n = -1$, then $x_{n+1} = 1$. Following the tangent at $(1, 2)$ yields an $x$-intercept of $x = -1$, so $x_{n+2} = -1$, and so the iterates repeat.

I tried a degree $5$ polynomial. Let our polynomial be $$p(x) = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + a_4 x^4 + a_5 x^5.$$ Our restrictions turn into \begin{align*} p(-1) = a_0 - a_1 + a_2 - a_3 + a_4 - a_5 &= -2 \\ p'(-1) = a_1 - 2a_2 + 3a_3 - 4a_4 + 5a_5 &= 1 \\ p(1) = a_0 + a_1 + a_2 + a_3 + a_4 + a_5 &= 2 \\ p'(1) = a_1 + 2a_2 + 3a_3 + 4a_4 + 5a_5 &= 1. \end{align*} Solving this, we get a general solution in terms of parameters $s$ and $t$: $$p(x) = \frac{5}{2}x - \frac{1}{2}x^3 + t(1 - 2x^2 + x^4) + s(x - 2x^3 + x^5).$$ In particular, I needed to choose $s$ and $t$ so that the derivative \begin{align*} p'(x) &= \frac{5}{2} - \frac{3}{2}x^2 + t(4x^3 - 4x) + s(1 - 6x^2 + 5x^4) > 0 \end{align*} for all $x$. Clearly, we require $s > 0$. I also decided to choose $t = 0$; it may not have been necessary, but it made things simpler. Now $p'(x)$ is a quadratic in $x^2$: $$p'(x) = \left(\frac{5}{2} + s\right) - \left(\frac{3}{2} + 6s\right)x^2 + 5sx^4.$$ I wanted the discriminant to be negative, which is to say $$\left(\frac{3}{2} + 6s\right)^2 - 20s\left(\frac{5}{2} + s\right) < 0.$$ Choosing $s = 1$ did the trick, and gave the previously presented polynomial.
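The algebra above can be double-checked numerically. The following Python sketch (helper names are mine) verifies that the two-parameter family satisfies all four interpolation conditions, and that $s = 1$, $t = 0$ makes the discriminant of $p'$ (as a quadratic in $x^2$) negative:

```python
# Check (my own sketch) that the two-parameter family
#   p(x) = (5/2)x - (1/2)x^3 + t(1 - 2x^2 + x^4) + s(x - 2x^3 + x^5)
# satisfies p(+-1) = +-2 and p'(+-1) = 1 for any s, t.
def p(x, s, t):
    return 2.5 * x - 0.5 * x**3 + t * (1 - 2 * x**2 + x**4) + s * (x - 2 * x**3 + x**5)

def dp(x, s, t):
    return 2.5 - 1.5 * x**2 + t * (4 * x**3 - 4 * x) + s * (1 - 6 * x**2 + 5 * x**4)

for s, t in [(1, 0), (2, 0.5), (0.3, -0.1)]:
    assert abs(p(1, s, t) - 2) < 1e-12
    assert abs(p(-1, s, t) + 2) < 1e-12
    assert abs(dp(1, s, t) - 1) < 1e-12
    assert abs(dp(-1, s, t) - 1) < 1e-12

# Discriminant of p'(x) as a quadratic in x^2, with t = 0:
def disc(s):
    return (1.5 + 6 * s) ** 2 - 20 * s * (2.5 + s)

print(disc(1))  # -13.75, i.e. -55/4, as claimed
```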

Theo Bendit
  • 53,568
  • 2
  • Curiously enough, the periodic orbits that appear in Newton's method are often unstable and break down due to round-off errors. How wonderful is it to have a method that converges due to errors? – PierreCarre Jul 01 '19 at 09:48
  • @TheoBendit Thanks a lot for your answer as well as the excellent explanation of the construction! – flawr Jul 01 '19 at 12:41
  • @PierreCarre That is interesting, so far when studying the theory I only ever found the counterexamples that are based on periodicity - do you know whether adding "errors" has been studied for actual use? – flawr Jul 01 '19 at 12:44
  • @flawr I came across this curiosity in the framework of discrete dynamical systems, where Newton's method is often used as an example of an iteration map. To my knowledge, this has not been used outside this context. – PierreCarre Jul 01 '19 at 12:59
  • @flawr In this example you can check that the orbit is in fact unstable, starting with $x_0=1+\varepsilon$ you will get convergence to zero. – PierreCarre Jul 01 '19 at 13:07
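PierreCarre's instability remark can also be checked numerically. A short Python sketch (my own, not from the thread): starting exactly at $x_0 = 1$ stays on the cycle, while starting at $x_0 = 1 + 10^{-6}$ escapes it.

```python
# Instability of the 2-cycle (my own sketch): perturb x0 slightly
# and iterate Newton's method on p(x) = (7/2)x - (5/2)x^3 + x^5.
def p(x):
    return 3.5 * x - 2.5 * x**3 + x**5

def dp(x):
    return 3.5 - 7.5 * x**2 + 5 * x**4

def newton(x0, steps=200):
    x = x0
    for _ in range(steps):
        x = x - p(x) / dp(x)
    return x

print(newton(1.0))         # 1.0 -- an even number of steps returns to the cycle point
print(newton(1.0 + 1e-6))  # the perturbed orbit leaves the cycle
```

Near the cycle the Newton map has derivative $p(1)\,p''(1)/p'(1)^2 = 10$ at $x = \pm 1$, so a perturbation is amplified roughly tenfold per step, which is why the orbit escapes so quickly.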