### Newton’s method

Truth is much too complicated to allow anything but approximations.

— John Von Neumann

#### Finding a solution with geometry

Newton’s method is another numerical method for solving an equation *f*(*x*)=0. It is based on the geometry of a curve, using the tangent lines to the curve. As such, it requires calculus, in particular differentiation.

Roughly, the idea of Newton’s method is as follows. We seek a solution to *f*(*x*)=0. That is, we want to find the red dotted point in the picture below.

We start with an initial guess *x*_{1}. We calculate *f*(*x*_{1}). If *f*(*x*_{1})=0, we are very lucky, and have a solution. But most likely *f*(*x*_{1}) is not zero. Let *f*(*x*_{1})=*y*_{1}, as shown.

We now try for a better guess. How do we find that better guess? The trick of Newton’s method is to draw the tangent line to the graph *y*=*f*(*x*) at the point (*x*_{1},*y*_{1}). See below.

This tangent line is a good linear approximation to *f*(*x*) near *x*_{1}, so our next guess *x*_{2} is the point where the tangent line intersects the *x*-axis, as shown above.

We then proceed using the same method. We calculate *y*_{2}=*f*(*x*_{2}); if it is zero, we’re finished. If not, then we draw the tangent line to *y*=*f*(*x*) at (*x*_{2},*y*_{2}), and our next guess *x*_{3} is the point where this tangent line intersects the *x*-axis. See below.

In the figure shown, *x*_{1},*x*_{2},*x*_{3} rapidly approach the red solution point!

Continuing in this way, we find points *x*_{1},*x*_{2},*x*_{3},*x*_{4},… approximating a solution. This method for finding a solution is Newton’s method.

As we’ll see, Newton’s method can be a very efficient method to approximate a solution to an equation — when it works.

#### The key calculation

As our introduction above just showed, the key calculation in each step of Newton’s method is to find where the tangent line to *y*=*f*(*x*) at the point (*x*_{1},*y*_{1}) intersects the *x*-axis.

Let’s find this *x*-intercept. The tangent line we are looking for passes through the point (*x*_{1},*y*_{1}) and has gradient *f*′(*x*_{1}). Recall that, given the gradient *m* of a line, and a point (*x*_{0},*y*_{0}) on it, the line has equation

*y* − *y*_{0} = *m*(*x* − *x*_{0}).

Substituting *m*=*f*′(*x*_{1}) and the point (*x*_{1},*y*_{1}), then setting *y*=0 and solving for *x*, the *x*-intercept is

*x* = *x*_{1} − *f*(*x*_{1})/*f*′(*x*_{1}).

#### Newton’s method

Let *f*:R→R be a differentiable function. We seek a solution of *f*(*x*)=0, starting from an initial estimate *x*=*x*_{1}. At the *n*th step, given *x*_{n}, compute the next approximation *x*_{n+1} by

*x*_{n+1} = *x*_{n} − *f*(*x*_{n})/*f*′(*x*_{n}).
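The algorithm above translates directly into a few lines of code. Here is a minimal sketch in Python; the names `newton` and `f_prime` are our own, not from the text:

```python
def newton(f, f_prime, x1, steps=10):
    """Approximate a solution of f(x) = 0 by Newton's method.

    Applies x_{n+1} = x_n - f(x_n)/f'(x_n) for the given number of steps.
    """
    x = x1
    for _ in range(steps):
        fx = f(x)
        if fx == 0:
            return x  # exact solution found: further iterates would not move
        x = x - fx / f_prime(x)
    return x

# Solve x^2 - 2 = 0 starting from x1 = 2: the iterates approach sqrt(2).
root = newton(lambda x: x**2 - 2, lambda x: 2*x, 2.0)
```

The number of steps is fixed here for simplicity; a practical version would instead stop once successive iterates agree to the desired accuracy.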

Some comments about this algorithm:

- Often, Newton’s method works extremely well, and the *x*_{n} converge rapidly to a solution. However, it’s important to note that Newton’s method does not always work. Several things can go wrong, as we will see shortly.
- Note that if *f*(*x*_{n})=0, so that *x*_{n} is an exact solution of *f*(*x*)=0, then the algorithm gives *x*_{n+1}=*x*_{n}, and in fact all of *x*_{n},*x*_{n+1},*x*_{n+2},*x*_{n+3},… will be equal. If you have an exact solution, Newton’s method will stay on that solution!
- While the bisection method only requires *f* to be continuous, Newton’s method requires the function *f* to be differentiable. This is necessary for *f* to have a tangent line.

#### Using Newton’s method

As we did with the bisection method, let’s approximate some well-known constants.

#### Example

Use Newton’s method to find an approximate solution to *x*^{2}−2=0, starting from an initial estimate *x*_{1}=2. After 4 iterations, we arrive at *x*_{5} ≈ 1.41421356237, which agrees with √2 to 11 decimal places.

Newton’s method in the above example is much faster than the bisection algorithm! In only 4 iterations we have 11 decimal places of accuracy! The following table illustrates how many decimal places of accuracy we have in each *x*_{n}.

| *n* | *x*_{n} | decimal places of accuracy |
|---|---|---|
| 1 | 2 | 0 |
| 2 | 1.5 | 1 |
| 3 | 1.41666… | 2 |
| 4 | 1.4142156… | 5 |
| 5 | 1.41421356237… | 11 |

The number of decimal places of accuracy approximately doubles with each iteration!
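We can check this doubling numerically. A short Python experiment (our own, not part of the text) tracks the error |*x*_{n} − √2| at each step:

```python
import math

f = lambda x: x**2 - 2
fp = lambda x: 2*x

x = 2.0
errors = []
for _ in range(4):
    x = x - f(x) / fp(x)          # one Newton step
    errors.append(abs(x - math.sqrt(2)))

# Each error is roughly the square of the previous one, so the number
# of correct decimal places approximately doubles per iteration.
```

Printing `errors` shows them shrinking from about 0.09 to about 10^{-12} in just four steps, which is the quadratic convergence that Exercise 10 investigates.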

**Exercise 10** (Harder.) This problem investigates this doubling of accuracy.

**Example**

Find an approximation to *π* by using Newton’s method to solve sin *x*=0 for 3 iterations, starting from *x*_{1}=3. To how many decimal places is the approximate solution accurate?
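A quick way to check the answer is to run the iteration in Python (a sketch of our own). For f(x) = sin x we have f′(x) = cos x, so each Newton step is x − sin x / cos x, i.e. x − tan x:

```python
import math

x = 3.0                                  # initial estimate x1 = 3
for _ in range(3):
    x = x - math.sin(x) / math.cos(x)    # Newton step for f(x) = sin x

# x is now an extremely accurate approximation of pi
```

After just three iterations the approximation agrees with π to well over 10 decimal places.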


**Exercise 11**

Find all solutions to the equation cos*x*=*x* to 9 decimal places using Newton’s method. Compare the convergence to what you obtained with the bisection method in exercise 5.

#### What can go wrong

While Newton’s method can give fantastically good approximations to a solution, several things can go wrong. We now examine some of this less fortunate behaviour.

The first problem is that the *x*_{n} may not even be defined! This problem occurs whenever *f*′(*x*_{n})=0. If *x*_{n} is a stationary point of *f*, then Newton’s method attempts to divide by zero — and fails.
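In code, this failure shows up as a literal division by zero. A defensive implementation (our own sketch) checks the derivative before dividing:

```python
def newton_step(f, f_prime, x):
    """One Newton step, failing loudly at a stationary point."""
    d = f_prime(x)
    if d == 0:
        raise ZeroDivisionError("f'(x) = 0: tangent is horizontal, no x-intercept")
    return x - f(x) / d

# For f(x) = x^2 - 2, the stationary point is x = 0: starting there fails,
# while any other starting point gives a well-defined next iterate.
```

In floating-point arithmetic one would usually also reject derivatives that are merely very small, for the reasons discussed below.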

The next example illustrates that even when the “approximate solutions” *x _{n}* exist, they may not “approximate” a solution very well.

The third problem shows that the *x*_{n} can get further and further away from a solution!

**Example**

Find an approximate solution to arctan *x*=0, using Newton’s method starting at *x*_{1}=3.

The above example may seem a little silly, since finding an exact solution to arctan *x*=0 is not difficult: the solution is *x*=0! But similar problems will happen when applying Newton’s method to any curve with a similar shape. The fact that *f*′(*x*_{n}) is close to zero means the tangent line is close to horizontal, and so may travel far away to arrive at its *x*-intercept *x*_{n+1}.
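We can watch this divergence happen in Python (our own sketch). For f(x) = arctan x we have f′(x) = 1/(1+x²), so each Newton step is x − arctan(x)·(1+x²):

```python
import math

x = 3.0
iterates = [x]
for _ in range(5):
    x = x - math.atan(x) * (1 + x * x)   # Newton step for f(x) = arctan x
    iterates.append(x)

# The iterates grow in magnitude at every step,
# moving further and further from the solution x = 0.
```

Starting from 3, the iterates jump to roughly −9.5, then past 100, and keep growing without bound.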

#### Sensitive dependence on initial conditions

Sometimes a change in the initial conditions can be ever so slight, yet lead to a radically different outcome in Newton’s method.

Consider solving *f*(*x*)=2*x*−3*x*^{2}+*x*^{3}=0. It’s possible to factorise the cubic, *f*(*x*)=*x*(*x*−1)(*x*−2), so the solutions are just *x*=0,1 and 2.

So there is no simple transition between those values of *x*_{1} which lead to the solution *x*=1 and those which lead to *x*=0. The behaviour of Newton’s method depends extremely sensitively on the initial estimate *x*_{1}. Indeed, we have tried only a handful of values for *x*_{1}, and have only seen the tip of the iceberg of this behaviour.
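The sensitivity is easy to observe numerically. The sketch below (our own; the starting values 1.57 and 1.58 are chosen near a critical point of the cubic) runs Newton’s method on f(x) = x³ − 3x² + 2x from two nearby starting points:

```python
def newton_cubic(x, steps=60):
    """Newton's method for f(x) = x^3 - 3x^2 + 2x = x(x-1)(x-2)."""
    for _ in range(steps):
        x = x - (x**3 - 3*x**2 + 2*x) / (3*x**2 - 6*x + 2)
    return x

# Two starting values that differ by only 0.01...
r1 = newton_cubic(1.57)
r2 = newton_cubic(1.58)
# ...converge to different roots of the cubic.
```

Near a critical point the derivative in the denominator is tiny, so the first Newton step flings the two nearby starting values to opposite far ends of the axis, after which they settle on different roots.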

In the Links Forward section we examine this behaviour further, showing some pictures of this sensitive dependence on initial conditions, which is indicative of mathematical chaos. Pictures illustrating the behaviour of Newton’s method show the intricate detail of a fractal.

#### Getting Newton’s method to work

In general, it is difficult to state precisely when Newton’s method will provide good approximations to a solution of *f*(*x*)=0. But we can make the following notes, summarising our previous observations.

- If *f*′(*x*_{n})=0, then Newton’s method immediately fails, as it attempts to divide by zero.
- The “approximations” *x*_{1},*x*_{2},*x*_{3},… can cycle, rather than converging to a solution.
- The “approximate solutions” *x*_{1},*x*_{2},*x*_{3},… can even diverge to infinity. This problem happens when *f*′(*x*_{n}) is small, but not zero.
- The behaviour of Newton’s method can depend extremely sensitively on the choice of *x*_{1}, and lead to chaotic dynamics.

However, a useful rule of thumb is as follows:

- Suppose you can sketch a graph of *y*=*f*(*x*), you can see the approximate location of a solution, and *f*′(*x*) is not too small near that solution. Then taking *x*_{1} near that solution, and away from other solutions or critical points, Newton’s method will tend to produce *x*_{n} which converge rapidly to the solution.

### How To Use Newton’s Method

The idea behind Newton’s Method is to start with an **initial guess** which is reasonably close to the true root *(solution)*, and then to use the **tangent line** to obtain another x-intercept that is even better than our initial guess or starting point.

Let’s look at this conceptually to make sense of what is happening.

Assume we want to find the root (i.e., x-intercept) for f(x). This means we want to find *a* in the picture below.

So, we begin with an initial guess (as noted by UC Davis) that is relatively close to point *a*, which is indicated as point *b*, substitute *b* into f(x), and see if it equals zero. If it does, we’re done. However, more than likely this will not be the case, and we will need to try for a better guess.

*How?*

By using our knowledge of **linear approximation**. If we can create a tangent line to the graph at point *b*, then we can find another point even closer to *a* that also intersects the x-axis, at point *c*, as seen below.

And we keep using this process until we can find two successive values that agree to the desired decimal place.

In other words, until we finally converge on the value of *a*.

This iterative process is called Newton’s Method!

### Newton’s Method Formula

And to help with our calculations, we can use the following formula:

x_{n+1} = x_n − f(x_n) / f′(x_n)

For example, applying this formula to f(x) = x^{7} − 1000, after six iterations we find the seventh root of 1000 to be approximately 2.6826958.
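That computation is easy to reproduce in Python (our own sketch; the starting guess x = 3 is an assumption, chosen close to the true root). For f(x) = x⁷ − 1000 the derivative is f′(x) = 7x⁶:

```python
x = 3.0   # assumed starting guess (close to the true root)
for _ in range(6):
    x = x - (x**7 - 1000) / (7 * x**6)   # Newton step for f(x) = x^7 - 1000

# x now approximates the seventh root of 1000, about 2.6826958
```

After six iterations the result has converged to full floating-point precision.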

And that’s all you have to do.

**Example 1** Use Newton’s Method to determine an approximation to the solution to cos x = x that lies in the interval [0,2]. Find the approximation to six decimal places.

First note that we weren’t given an initial guess. We were, however, given an interval in which to look. We will use this to get our initial guess. As noted above, the general rule of thumb in these cases is to take the initial approximation to be the midpoint of the interval. So, we’ll use x_{0}=1 as our initial guess.

Next, recall that we **must** have the function in the form f(x)=0. Therefore, we first rewrite the equation as

cos x − x = 0

We can now write down the general formula for Newton’s Method. Doing this will often simplify the work a little, so it’s generally not a bad idea to do it:

x_{n+1} = x_n − (cos x_n − x_n) / (−sin x_n − 1)

At this point we should point out that the phrase “six decimal places” does not mean just get x_{1} to six decimal places and then stop. Instead it means that we continue until two successive approximations agree to six decimal places.

Given that stopping condition we clearly need to go at least one step farther.

Alright, we’re making progress. We’ve got the approximation to 1 decimal place. Let’s do another one, leaving the details of the computation to you.

And now we’ve got two approximations that agree to 9 decimal places and so we can stop. We will assume that the solution is approximately x_{4}=0.7390851332.
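The whole computation can be reproduced with a few lines of Python (our own sketch), using the stopping rule above: iterate until two successive approximations agree to the desired number of decimal places:

```python
import math

x = 1.0                            # initial guess: midpoint of [0, 2]
prev = float("inf")
while abs(x - prev) >= 1e-9:       # stop when successive iterates agree
    prev = x
    # Newton step for f(x) = cos x - x, f'(x) = -sin x - 1
    x = x - (math.cos(x) - x) / (-math.sin(x) - 1)

# x now approximates the solution 0.7390851332...
```

Only a handful of iterations are needed before successive values agree to nine decimal places.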

In this last example we saw that we didn’t have to do too many computations in order for Newton’s Method to give us an approximation in the desired range of accuracy. This will not always be the case. Sometimes it will take many iterations through the process to get to the desired accuracy and on occasion it can fail completely.

The following example is a little silly but it makes the point about the method failing.

So, we need to be a little careful with Newton’s method. It will usually quickly find an approximation to an equation. However, there are times when it will take a lot of work or when it won’t work at all.
