Newton’s Method, Other Recursive Methods, and the Modified Fixed Point Method


Page 1: Newton’s Method

Newton’s Method

Other Recursive Methods

Modified Fixed Point Method

Page 2: Newton’s Method

Tangential Roots

The consistent problem we have encountered with both the Bisection Method and the Regula-Falsi Method is that of a tangential root: to apply either method, the function f(x) must change sign across the root (i.e. f(a)f(b) < 0). We want to be able to address this problem and still find a root even when f(x) only touches the x-axis without crossing it.

Modified Fixed Point Algorithm

The fixed point algorithm can be modified to find roots instead of fixed points. This comes from a simple algebraic fact: if h(x) = f(x) + x and w0 is a fixed point of h(x), then w0 is a root of f(x).

h(w0) = w0          (w0 is a fixed point of h(x))

h(w0) = f(w0) + w0  (definition of h(x))

f(w0) + w0 = w0     (substitute)

f(w0) = 0           (w0 is a root of f(x))
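
As an illustration of this recursion, here is a minimal Python sketch (added here, not part of the original slides; the names fixed_point_root, tol, and max_iter are assumptions):

```python
def fixed_point_root(f, x0, tol=1e-6, max_iter=50):
    """Modified fixed point method: iterate h(x) = f(x) + x from x0.

    A fixed point of h is a root of f, since h(w0) = w0 forces f(w0) = 0.
    """
    h = lambda x: f(x) + x          # build h(x) from f(x) as above
    x = x0
    for _ in range(max_iter):
        x_new = h(x)
        if abs(x_new - x) < tol:    # halt when successive terms agree
            return x_new
        x = x_new
    return x                        # may not have converged
```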

Page 3: Newton’s Method

Below we show how the modified fixed point algorithm can be applied to find roots. We do this twice for the same equation, 6 + x - x² = 0, using two different choices of h(x).

f(x) = 6 + x - x² = 0,  x0 = 1

h(x) = (6 + x - x²)/x + x = (x + 6)/x

n    xn         h(xn)
0    1          7
1    7          1.85714
2    1.85714    4.23077
3    4.23077    2.41818

f(x) = 6 + x - x² = 0,  x0 = 1

h(x) = (6 + x - x²) + x = 6 + 2x - x²

n    xn       h(xn)
0    1        7
1    7        -29
2    -29      -893
3    -893     -799229
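
For readers who want to check these numbers, here is a small hypothetical Python snippet (not part of the slides) that reproduces both tables by iterating each h(x) from x0 = 1:

```python
def iterate(h, x0, steps=4):
    """Print n, xn, h(xn) for the recursion xn+1 = h(xn)."""
    x = x0
    for n in range(steps):
        print(n, x, h(x))
        x = h(x)

iterate(lambda x: (x + 6) / x, 1)        # dividing by x first: 7, 1.85714, 4.23077, ...
iterate(lambda x: 6 + 2*x - x**2, 1)     # h(x) = f(x) + x directly: 7, -29, -893, ...
```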

Page 4: Newton’s Method

The problem with the modified fixed point method is that it is computationally unstable for many functions and initial values; that is, the sequence it generates does not converge. Notice that in the previous example, even though we were trying to compute the same root (x = 3) in both cases, only the first case, where f(x) was divided by x before adding x, converged; the direct choice h(x) = f(x) + x diverged.

Newton’s Method

The idea for this method is to use f(x) to build another function h(x) that will generate a recursive sequence that converges to the root just like the modified fixed point method.

The idea is to repeatedly follow the tangent line at the current point on the graph down to the x-axis and take the x value there as the next approximation of the root. In other words, h(x) is the x-intercept of the tangent line to f(x) at x.

[Figure: graph of f(x) near the root, with the tangent line at (xn, f(xn)) meeting the x-axis at h(xn) = xn+1.]

Page 5: Newton’s Method

To get xn+1 from xn, we write the equation of the tangent line at xn, plug in the point (xn+1, 0), and solve for xn+1.

y = f(xn) + f'(xn)(x - xn)        (equation of tangent at xn)

0 = f(xn) + f'(xn)(xn+1 - xn)     (substitute in (xn+1, 0))

xn+1 = xn - f(xn)/f'(xn)          (solve)

h(x) = x - f(x)/f'(x)             (this is the h(x))

The equations above give the recursively defined sequence for xn that is used for Newton’s Method. The halting condition is usually the standard Cauchy error, |xn+1 - xn| < tolerance.
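
A minimal Python sketch of this recursion (an added illustration, not from the slides; newton, df, tol, and max_iter are assumed names, and the caller supplies the derivative df):

```python
def newton(f, df, x0, tol=1e-6, max_iter=50):
    """Newton's Method: iterate xn+1 = xn - f(xn)/f'(xn).

    Halts on the standard Cauchy error |xn+1 - xn| < tol.
    """
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / df(x)    # x-intercept of the tangent line at x
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```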

Page 6: Newton’s Method

f(x) = x² - 2 = 0,  x0 = 1

h(x) = x - (x² - 2)/(2x)

n    xn         h(xn)
0    1          1.5
1    1.5        1.41667
2    1.41667    1.41422
3    1.41422    1.41421

f(x) = x³ - 2x - 3 = 0,  x0 = 1

h(x) = x - (x³ - 2x - 3)/(3x² - 2)

n    xn         h(xn)
0    1          5
1    5          3.46575
2    3.46575    2.53422
3    2.53422    2.059

The tables above show two examples of Newton’s Method applied.
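
As a small, hypothetical check (not in the original slides), both tables can be reproduced directly from the recursion xn+1 = xn - f(xn)/f'(xn):

```python
# Newton iterations for the two examples above, starting from x0 = 1
examples = [
    lambda x: x - (x**2 - 2) / (2 * x),               # f(x) = x^2 - 2
    lambda x: x - (x**3 - 2*x - 3) / (3 * x**2 - 2),  # f(x) = x^3 - 2x - 3
]
for h in examples:
    x = 1.0
    for n in range(4):
        print(n, x, h(x))
        x = h(x)
```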

Page 7: Newton’s Method

Problems With Newton’s Method

The number of iterations required cannot be determined before the algorithm begins.

The algorithm will not work if f(x) is not differentiable.

The algorithm will halt with a division by zero (terminating the program if this is not checked for) whenever a horizontal tangent line is encountered, i.e. f'(xn) = 0; a guarded sketch follows this list.

Newton’s method will sometimes find an extraneous root.
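
One possible way to guard against these issues in code (an illustrative sketch, not part of the slides) is to cap the number of iterations and test the derivative before dividing:

```python
def newton_guarded(f, df, x0, tol=1e-6, max_iter=50, eps=1e-12):
    """Newton's Method with basic safeguards.

    Stops after max_iter steps (the number of iterations needed is not
    known in advance) and refuses to divide by a near-zero derivative
    (a horizontal tangent line).
    """
    x = x0
    for _ in range(max_iter):
        d = df(x)
        if abs(d) < eps:
            raise ZeroDivisionError("horizontal tangent at x = %r" % x)
        x_new = x - f(x) / d
        if abs(x_new - x) < tol:    # standard Cauchy error
            return x_new
        x = x_new
    raise RuntimeError("no convergence within max_iter iterations")
```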