7

We've got a function $f : \Bbb R \to \Bbb R$ defined by $f(x) = x^3 - 9$. Let $x^*$ be its root, which means $f(x^*) = 0$. We want to find an approximation of $x^*$ using Newton's method. There are two questions I don't know how to answer:

  1. We choose an initial guess $x_0 = 2.5$. Does the method converge linearly for this $x_0$? Quadratically? And, most of all, I'm interested in how to find the range of $x_0$ for which the method converges. How can I do that?

  2. Can we show that $|x_3 - x^*| < 2^{-15}$? Is there a possibility to do this without computing $x_3$, which is not that easy without a calculator?

Anne
  • 1,567

2 Answers

7

I'm not an expert on Newton's Method, but I spent some time thinking about it last year while teaching an honors calculus course. As I understand it, the basic philosophy behind convergence of Newton's method is:

  1. Although one can find counterexamples to convergence by "making trouble", under reasonably mild hypotheses one can show there is $\delta > 0$ such that if $x_0 \in [c-\delta,c+\delta]$, then the sequence of Newton iterates converges to the true root $c$ "quadratically". One can certainly work out an explicit value of $\delta$: see below.

  2. On the other hand, if one starts with an $x_0$ which is "too far away" from $c$, then the question of whether the Newton's method iterates converge to $c$ can become arbitrarily complicated and delicate. (As in some sense it must: e.g. maybe there are other roots!) For instance if you look at $z \mapsto z^3-1$ (in the complex plane, which is not our setup), you get fractal basins of convergence.

Here is a result from my honors calculus notes which gives you an explicit value of $\delta$. $\newcommand{\ra}{\rightarrow}$ $\newcommand{\N}{\mathbb{N}}$ $\renewcommand{\R}{\mathbb{R}}$

Theorem Let $f: I \ra \R$ be twice differentiable, and let $x^* \in I^{\circ}$ be a root of $f$.
a) Suppose there are real numbers $\delta ,A ,B > 0$ such that for all $x \in [x^*-\delta,x^*+\delta]$, $|f'(x)| \geq A$ and $|f''(x)| \leq B$. For any $x_0 \in [x^*-\delta,x^*+\delta]$, let $\{x_n\}$ be the Newton's Method sequence with initial value $x_0$. Then for all $n \in \N$, $$ |x_{n+1} - x^*| \leq \frac{B}{A} |x_n - x^*|^2. $$ b) If $f''$ is continuous on $I$ and $f'(x^*) \neq 0$, then there are indeed $\delta,A,B > 0$ as in the statement of part a), so that the above inequality holds for all $x_0 \in [x^*-\delta,x^*+\delta]$.

You can try computing such a $\delta$ in your situation and check whether it is large enough so that $|3^{\frac{2}{3}}-2.5| \leq \delta$: I haven't tried this myself.
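For a concrete check, here is a short numerical sketch of the theorem's bound for the question's $f(x) = x^3 - 9$, whose root is $x^* = 9^{1/3} = 3^{2/3}$. The choice $\delta = 0.5$ is illustrative (it makes the interval contain $x_0 = 2.5$); since $f'(x) = 3x^2$ and $f''(x) = 6x$ are both increasing and positive on the interval, $A$ and $B$ can be read off at the endpoints.

```python
# Sketch: checking the bound |x_{n+1} - x*| <= (B/A) |x_n - x*|^2
# for f(x) = x^3 - 9, with the (illustrative) choice delta = 0.5.
x_star = 9 ** (1 / 3)          # the true root, 3^(2/3) ~ 2.08008
delta = 0.5                    # interval [x* - 0.5, x* + 0.5] contains x_0 = 2.5

# f'(x) = 3x^2 and f''(x) = 6x are increasing on the interval, so the
# extrema sit at the endpoints.
A = 3 * (x_star - delta) ** 2  # min |f'| on the interval
B = 6 * (x_star + delta)       # max |f''| on the interval
C = B / A                      # the constant in the theorem, ~ 2.07

x = 2.5
for n in range(3):
    err = abs(x - x_star)
    x = x - (x**3 - 9) / (3 * x**2)      # one Newton step
    # the theorem's quadratic-error bound holds at every step:
    assert abs(x - x_star) <= C * err**2

print(abs(x - x_star) < 2 ** -15)  # True: the actual error of x_3
```

Note that the bound itself, iterated from $|x_0 - x^*| \approx 0.42$ with $C \approx 2.07$, is too pessimistic to prove $|x_3 - x^*| < 2^{-15}$ on its own; the actual iterates do much better, as the final line shows.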

Pete L. Clark
  • 100,402
  • @Anne You may go through Pete L. Clark's notes for getting some information about computing $\delta$. – Srijan Apr 18 '13 at 14:38
  • Related to this, you may enjoy the following paper: Cubics, Chaos and Newton's Method, Thomas Dence, The Mathematical Gazette, 81 (492), (1997), 403-408. http://www.jstor.org/stable/3619617 – Andrés E. Caicedo Apr 18 '13 at 14:55
  • Thank you! And do you have any idea what about my question, part b) ? – Anne Apr 18 '13 at 15:29
  • @Anne: the main inequality given above, when it applies, will give you an upper bound for the distance of $x_n$ from the true root $x^*$. I haven't checked whether it applies in your case, and if so whether it gives $|x_3 - x^*| < 2^{-15}$. Rather, when $\delta$ is small enough, the inequality will give you an explicit value of $n$ such that $|x_n - x^*| < \epsilon$, for any $\epsilon$ you care to choose. Whether $n = 3$ works for $\epsilon = 2^{-15}$ is a very specific question: you may have to resort to calculation to answer it. – Pete L. Clark Apr 18 '13 at 17:54
3

As we know, Newton's Method is an algorithm for solving $g(x) = 0$. Given an initial $x_0$, we compute a sequence of points defined via \begin{equation} x_{n+1} = x_{n} - \dfrac {g(x_n)}{g'(x_n)} \end{equation} Your first question is: for what values of the initial approximation $x_0$ does Newton's method converge quadratically? Here is the result on the convergence of Newton's method that you would like to read.
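The iteration above can be sketched in a few lines; this is a minimal illustration (the function name, tolerance, and iteration cap are my own choices, not prescribed):

```python
# Minimal sketch of Newton's Method for g(x) = 0; the derivative is
# passed in explicitly.
def newton(g, g_prime, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = g(x) / g_prime(x)
        x -= step
        if abs(step) < tol:   # stop once the step is negligibly small
            return x
    raise RuntimeError("Newton's method did not converge")

# The question's function: g(x) = x^3 - 9, starting from x_0 = 2.5.
root = newton(lambda x: x**3 - 9, lambda x: 3 * x**2, 2.5)
print(root)  # approximately 9^(1/3) = 2.0800838...
```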

Let $g$ be twice continuously differentiable on the interval $(a, b)$, and let $r \in (a,b)$ be a root of $g$, i.e. $g(r) = 0$, with $g'(r)\neq 0$. Then there exists $\delta > 0$ such that Newton's Method converges whenever it is started in the interval $[r - \delta, r + \delta]$. In this case, the sequence converges quadratically.

A case where Newton's Method fails to converge quadratically: consider $g(x) = x^2$. Will Newton's Method converge quadratically to the root $x = 0$? The answer is no, because $x = 0$ is a multiple root. Note that in Newton's Method, if the root being sought has multiplicity greater than one, the convergence rate is merely linear (errors are reduced by a constant factor at each step) unless special steps are taken (you would like to read This).
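You can see the linear rate directly: for $g(x) = x^2$ a Newton step is $x - x^2/(2x) = x/2$, so the error is exactly halved each iteration rather than squared. A quick sketch:

```python
# Sketch: for g(x) = x^2 (double root at 0), each Newton step
# x - x^2/(2x) simplifies to x/2, so errors shrink by a constant
# factor of 1/2 per step -- linear, not quadratic, convergence.
x = 1.0
errors = []
for _ in range(5):
    x = x - x**2 / (2 * x)   # one Newton step; equals x / 2
    errors.append(abs(x))    # distance from the root 0

ratios = [errors[i + 1] / errors[i] for i in range(len(errors) - 1)]
print(ratios)  # every ratio is 0.5: the constant factor of linear convergence
```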

If you believe your function has a "simple" root (that is, $g(r) = 0$ and $g'(r) \neq 0$), and you have a reasonable starting guess $x_0$, then Newton's Method is the method of choice. In practice, we may not know whether we have a simple root, and we may not have a very good starting estimate. In that case, you may use the Bisection method to get started, then switch to Newton's Method once you're close.
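The bisection-then-Newton idea above might be sketched like this (the bracket $[0, 4]$ and the switch tolerance are illustrative choices, not prescribed values):

```python
# Hedged sketch of "bisection to get close, then Newton to finish".
def bisect_then_newton(g, g_prime, a, b, switch_tol=0.1, tol=1e-12):
    # Bisection phase: shrink a sign-changing bracket until it is small.
    assert g(a) * g(b) < 0, "need a sign change on [a, b]"
    while b - a > switch_tol:
        m = (a + b) / 2
        if g(a) * g(m) <= 0:
            b = m
        else:
            a = m
    # Newton phase: polish the midpoint with a few Newton steps.
    x = (a + b) / 2
    for _ in range(50):
        step = g(x) / g_prime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

root = bisect_then_newton(lambda x: x**3 - 9, lambda x: 3 * x**2, 0.0, 4.0)
print(root)  # approximately 9^(1/3) = 2.0800838...
```

The point of the design is that bisection is slow but guaranteed (given a sign change), while Newton is fast but only locally reliable; the hybrid gets both properties.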

Srijan
  • 12,862
  • 1
    Ok. I think I wasn't precise in my question - if I am given $x_0$, how do I check whether the method converges for that $x_0$? That it lies in a correct interval $[r - \delta, r + \delta]$, without all those sophisticated calculations (like in the hint below)? Is there a possibility to find this $\delta$? – Anne Apr 18 '13 at 13:10