
Page 1: Optimization Part II

Optimization Part II

G.Anuradha

Page 2: Optimization Part II

Review of previous lecture: Steepest Descent

Choose the next step so that the function decreases:

$$F(\mathbf{x}_{k+1}) < F(\mathbf{x}_k)$$

For small changes in $\mathbf{x}$ we can approximate $F(\mathbf{x})$:

$$F(\mathbf{x}_{k+1}) = F(\mathbf{x}_k + \Delta\mathbf{x}_k) \approx F(\mathbf{x}_k) + \mathbf{g}_k^T \Delta\mathbf{x}_k$$

where

$$\mathbf{g}_k \equiv \nabla F(\mathbf{x})\big|_{\mathbf{x}=\mathbf{x}_k}$$

If we want the function to decrease:

$$\mathbf{g}_k^T \Delta\mathbf{x}_k = \alpha_k \mathbf{g}_k^T \mathbf{p}_k < 0$$

We can maximize the decrease by choosing:

$$\mathbf{p}_k = -\mathbf{g}_k$$

$$\mathbf{x}_{k+1} = \mathbf{x}_k - \alpha_k \mathbf{g}_k$$

Page 3: Optimization Part II

Example

$$F(\mathbf{x}) = x_1^2 + 2x_1x_2 + 2x_2^2 + x_1, \qquad \mathbf{x}_0 = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix}$$

$$\nabla F(\mathbf{x}) = \begin{bmatrix} \dfrac{\partial F}{\partial x_1} \\[4pt] \dfrac{\partial F}{\partial x_2} \end{bmatrix} = \begin{bmatrix} 2x_1 + 2x_2 + 1 \\ 2x_1 + 4x_2 \end{bmatrix}, \qquad \mathbf{g}_0 = \nabla F(\mathbf{x})\big|_{\mathbf{x}=\mathbf{x}_0} = \begin{bmatrix} 3 \\ 3 \end{bmatrix}$$

With $\alpha = 0.1$:

$$\mathbf{x}_1 = \mathbf{x}_0 - \alpha\mathbf{g}_0 = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix} - 0.1\begin{bmatrix} 3 \\ 3 \end{bmatrix} = \begin{bmatrix} 0.2 \\ 0.2 \end{bmatrix}$$

$$\mathbf{x}_2 = \mathbf{x}_1 - \alpha\mathbf{g}_1 = \begin{bmatrix} 0.2 \\ 0.2 \end{bmatrix} - 0.1\begin{bmatrix} 1.8 \\ 1.2 \end{bmatrix} = \begin{bmatrix} 0.02 \\ 0.08 \end{bmatrix}$$
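To make the iteration concrete, here is a minimal NumPy sketch (added here, not part of the original slides) that reproduces the two steps above for this F(x) with α = 0.1:

```python
import numpy as np

def grad_F(x):
    """Gradient of F(x) = x1^2 + 2*x1*x2 + 2*x2^2 + x1."""
    x1, x2 = x
    return np.array([2*x1 + 2*x2 + 1, 2*x1 + 4*x2])

alpha = 0.1
x = np.array([0.5, 0.5])          # x0
for k in range(2):
    g = grad_F(x)                 # g_k = gradient at the current point
    x = x - alpha * g             # steepest-descent update
    print(k + 1, x)               # approx [0.2, 0.2], then approx [0.02, 0.08]
```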

Page 4: Optimization Part II

Plot: steepest-descent trajectory for the example on the contours of F(x), with x1 and x2 ranging from -2 to 2.

Page 5: Optimization Part II
Page 6: Optimization Part II
Page 7: Optimization Part II

Necessary and sufficient conditions for a function of a single variable
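In brief (summarized here; assuming $f$ is twice differentiable at $x^*$), the standard conditions are:

$$f'(x^*) = 0 \quad \text{(necessary for a local minimum)}$$

$$f'(x^*) = 0 \ \text{and}\ f''(x^*) > 0 \quad \text{(sufficient for a strict local minimum)}$$

If $f''(x^*) = 0$, higher-order derivatives must be examined.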

Page 8: Optimization Part II
Page 9: Optimization Part II

Functions with two variables: necessary conditions and sufficient conditions
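In brief (summarized here; assuming $F$ is twice differentiable at $\mathbf{x}^*$): the necessary condition is that both partial derivatives vanish, $\partial F/\partial x_1 = \partial F/\partial x_2 = 0$; the sufficient condition is that, in addition, the Hessian is positive definite, i.e.

$$\frac{\partial^2 F}{\partial x_1^2} > 0 \quad \text{and} \quad \frac{\partial^2 F}{\partial x_1^2}\,\frac{\partial^2 F}{\partial x_2^2} - \left(\frac{\partial^2 F}{\partial x_1 \partial x_2}\right)^2 > 0.$$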

Page 10: Optimization Part II

Stationary Points
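In short, a stationary point is a point $\mathbf{x}^*$ where the gradient vanishes, $\nabla F(\mathbf{x}^*) = \mathbf{0}$; the Hessian there determines whether it is a minimum, a maximum, or a saddle point.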

Page 11: Optimization Part II

Effect of learning rate

$$\mathbf{x}_{k+1} = \mathbf{x}_k - \alpha_k \mathbf{g}_k$$

The larger the learning rate, the more oscillatory the trajectory becomes; this can make the algorithm unstable. For quadratic functions an upper limit on the learning rate can be derived.
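As a rough numerical illustration (a sketch added here, not from the slides), running the update above on the earlier example F(x) = x1² + 2·x1·x2 + 2·x2² + x1 with a small and a large learning rate shows convergence versus oscillatory divergence:

```python
import numpy as np

def grad_F(x):
    # Gradient of F(x) = x1^2 + 2*x1*x2 + 2*x2^2 + x1
    return np.array([2*x[0] + 2*x[1] + 1, 2*x[0] + 4*x[1]])

for alpha in (0.1, 0.45):             # below vs. above the stability limit (~0.38 here)
    x = np.array([0.5, 0.5])
    for _ in range(20):
        x = x - alpha * grad_F(x)     # steepest-descent update
    print(alpha, x, np.linalg.norm(grad_F(x)))  # small gradient norm vs. growing iterates
```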

Page 12: Optimization Part II

Stable Learning Rates (Quadratic)

$$F(\mathbf{x}) = \frac{1}{2}\mathbf{x}^T A \mathbf{x} + \mathbf{d}^T\mathbf{x} + c$$

$$\nabla F(\mathbf{x}) = A\mathbf{x} + \mathbf{d}$$

$$\mathbf{x}_{k+1} = \mathbf{x}_k - \alpha\mathbf{g}_k = \mathbf{x}_k - \alpha(A\mathbf{x}_k + \mathbf{d}) \qquad\Rightarrow\qquad \mathbf{x}_{k+1} = [I - \alpha A]\mathbf{x}_k - \alpha\mathbf{d}$$

Stability is determined by the eigenvalues of the matrix $[I - \alpha A]$:

$$[I - \alpha A]\mathbf{z}_i = \mathbf{z}_i - \alpha A\mathbf{z}_i = \mathbf{z}_i - \alpha\lambda_i\mathbf{z}_i = (1 - \alpha\lambda_i)\mathbf{z}_i$$

($\lambda_i$ is an eigenvalue of $A$, with eigenvector $\mathbf{z}_i$.)

Stability requirement:

$$|1 - \alpha\lambda_i| < 1 \qquad\Rightarrow\qquad \alpha < \frac{2}{\lambda_i} \qquad\Rightarrow\qquad \alpha < \frac{2}{\lambda_{max}}$$

Page 13: Optimization Part II

Example

$$A = \begin{bmatrix} 2 & 2 \\ 2 & 4 \end{bmatrix}, \qquad \lambda_1 = 0.764,\ \mathbf{z}_1 = \begin{bmatrix} 0.851 \\ -0.526 \end{bmatrix}, \qquad \lambda_2 = 5.24,\ \mathbf{z}_2 = \begin{bmatrix} 0.526 \\ 0.851 \end{bmatrix}$$

$$\alpha < \frac{2}{\lambda_{max}} = \frac{2}{5.24} = 0.38$$

Plots: trajectories on contours over $-2 \le x_1, x_2 \le 2$ for $\alpha = 0.37$ (just below the limit) and $\alpha = 0.39$ (just above the limit).
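A small NumPy check of these numbers (added here as a sketch, not part of the original slides):

```python
import numpy as np

A = np.array([[2.0, 2.0],
              [2.0, 4.0]])
eigvals, eigvecs = np.linalg.eigh(A)   # A is symmetric, so eigh is appropriate
print(eigvals)                         # approx [0.764, 5.236]
print(2 / eigvals.max())               # stable learning-rate limit, approx 0.38
```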

Page 14: Optimization Part II

Newton’s Method

$$F(\mathbf{x}_{k+1}) = F(\mathbf{x}_k + \Delta\mathbf{x}_k) \approx F(\mathbf{x}_k) + \mathbf{g}_k^T\Delta\mathbf{x}_k + \frac{1}{2}\Delta\mathbf{x}_k^T A_k \Delta\mathbf{x}_k$$

Take the gradient of this second-order approximation and set it equal to zero to find the stationary point:

$$\mathbf{g}_k + A_k\Delta\mathbf{x}_k = 0 \qquad\Rightarrow\qquad \Delta\mathbf{x}_k = -A_k^{-1}\mathbf{g}_k$$

$$\mathbf{x}_{k+1} = \mathbf{x}_k - A_k^{-1}\mathbf{g}_k$$

Page 15: Optimization Part II

Example

$$F(\mathbf{x}) = x_1^2 + 2x_1x_2 + 2x_2^2 + x_1, \qquad \mathbf{x}_0 = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix}$$

$$\nabla F(\mathbf{x}) = \begin{bmatrix} \dfrac{\partial F}{\partial x_1} \\[4pt] \dfrac{\partial F}{\partial x_2} \end{bmatrix} = \begin{bmatrix} 2x_1 + 2x_2 + 1 \\ 2x_1 + 4x_2 \end{bmatrix}, \qquad \mathbf{g}_0 = \nabla F(\mathbf{x})\big|_{\mathbf{x}=\mathbf{x}_0} = \begin{bmatrix} 3 \\ 3 \end{bmatrix}, \qquad A = \begin{bmatrix} 2 & 2 \\ 2 & 4 \end{bmatrix}$$

$$\mathbf{x}_1 = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix} - \begin{bmatrix} 2 & 2 \\ 2 & 4 \end{bmatrix}^{-1}\begin{bmatrix} 3 \\ 3 \end{bmatrix} = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix} - \begin{bmatrix} 1 & -0.5 \\ -0.5 & 0.5 \end{bmatrix}\begin{bmatrix} 3 \\ 3 \end{bmatrix} = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix} - \begin{bmatrix} 1.5 \\ 0 \end{bmatrix} = \begin{bmatrix} -1 \\ 0.5 \end{bmatrix}$$
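A quick NumPy check of this single Newton step (a sketch added here; the slides do not include code):

```python
import numpy as np

A  = np.array([[2.0, 2.0],
               [2.0, 4.0]])            # Hessian of F (constant for a quadratic)
x0 = np.array([0.5, 0.5])
g0 = np.array([2*x0[0] + 2*x0[1] + 1,  # gradient of F at x0
               2*x0[0] + 4*x0[1]])
x1 = x0 - np.linalg.solve(A, g0)       # Newton step: x1 = x0 - A^{-1} g0
print(x1)                              # [-1.   0.5], the minimum reached in one step
```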

Page 16: Optimization Part II

Plot: Newton’s method trajectory on contours over $-2 \le x_1, x_2 \le 2$; for this quadratic function the minimum at $\mathbf{x} = [-1,\ 0.5]^T$ is reached in a single step.

Page 17: Optimization Part II

• Line minimization methods and their stopping criteria (used to find the minimum along a search direction):
  – Initial bracketing
  – Line searches
    • Newton’s method
    • Secant method
    • Sectioning method

Page 18: Optimization Part II

Initial Bracketing

• Helps in finding the interval that contains the relative minimum

• The assumed minimum must first be bracketed within a starting interval

• Two schemes are used for this purpose; a simple forward-stepping sketch is shown below
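The slides do not specify the two schemes. As a rough illustration only, a common forward-stepping scheme expands the step until the function starts increasing, which brackets a minimum between the outermost points evaluated:

```python
def bracket_minimum(f, x0=0.0, step=0.1, expand=2.0, max_iter=50):
    """Return (a, b) bracketing a local minimum of f by stepping forward
    with growing steps until f starts to increase.
    An illustrative scheme; not necessarily one of the two in the slides."""
    a, fa = x0, f(x0)
    b, fb = x0 + step, f(x0 + step)
    if fb > fa:                       # wrong direction: search the other way
        a, b, fa, fb = b, a, fb, fa
        step = -step
    for _ in range(max_iter):
        step *= expand
        c, fc = b + step, f(b + step)
        if fc > fb:                   # function turned upward: minimum is bracketed
            return (a, c) if a < c else (c, a)
        a, fa = b, fb
        b, fb = c, fc
    raise RuntimeError("no bracket found")

# Example: bracket the minimum of f(x) = (x - 1.3)^2
print(bracket_minimum(lambda x: (x - 1.3)**2))
```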

Page 19: Optimization Part II
Page 20: Optimization Part II

Sectioning methods
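Sectioning methods shrink a bracketing interval by evaluating the function at interior points; the golden-section search is the best-known example. A minimal sketch (added here for illustration, assuming the usual golden-section scheme):

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Golden-section search: shrink [a, b] until it is narrower than tol,
    keeping the interior points in the golden ratio so that only one new
    function evaluation is needed per iteration."""
    inv_phi = (math.sqrt(5) - 1) / 2      # about 0.618
    c = b - inv_phi * (b - a)             # lower interior point
    d = a + inv_phi * (b - a)             # upper interior point
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:                       # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                             # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return (a + b) / 2

# Example: minimum of (x - 1.3)^2 on the bracket [0, 3]
print(golden_section(lambda x: (x - 1.3)**2, 0.0, 3.0))   # approx 1.3
```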