2.3 Conjugate Gradient Methods

The conjugate gradient (CG) method is an iterative method, but if rounding errors in the computation are ignored, the CG algorithm converges to the exact solution of the linear system in only finitely many steps.
Outline
• Background
• Steepest Descent
• Conjugate Gradient
1 Background
• The min(max) problem: \(\min_x f(x)\)
• But we learned in calculus how to solve that kind of question!

Motivation - "real world" problem
• Connectivity shapes (Isenburg, Gumhold, Gotsman)
• What do we get only from C without geometry?

\(\text{mesh} = \{\, C = (V, E),\ \text{geometry} \,\}\)
Motivation- “real world” problem
• First we introduce error functionals and then try to minimize them:

\(E_s(x) = \sum_{(i,j)\in E} \|x_i - x_j\|^2, \qquad x \in \mathbb{R}^{3n}\)

\(L(x_i) = \frac{1}{d_i} \sum_{(i,j)\in E} x_j \;-\; x_i\)

\(E_r(x) = \sum_{i=1}^{n} \|L(x_i)\|^2\)
Motivation - "real world" problem

Then we minimize:

\(E(C) = \arg\min_{x\in\mathbb{R}^{3n}} \big[\, \lambda E_s(x) + (1-\lambda) E_r(x) \,\big]\)

This is a high-dimensional non-linear problem. The conjugate gradient method, built on the ideas we will see next, is perhaps the most popular optimization technique for it.
Directional Derivatives: Along the Axes…

First, the one-dimensional derivatives along the coordinate axes:

\(\dfrac{\partial f(x,y)}{\partial x}, \qquad \dfrac{\partial f(x,y)}{\partial y}\)

Directional Derivatives: In a general direction…

\(\dfrac{\partial f(x,y)}{\partial v}, \qquad v \in \mathbb{R}^2,\ \|v\| = 1\)
The Gradient: Definition in the plane \(\mathbb{R}^2\)

\(f : \mathbb{R}^2 \to \mathbb{R}, \qquad \nabla f(x,y) := \left( \dfrac{\partial f}{\partial x},\ \dfrac{\partial f}{\partial y} \right)\)

The Gradient: Definition in \(\mathbb{R}^n\)

\(f : \mathbb{R}^n \to \mathbb{R}, \qquad \nabla f(x_1,\dots,x_n) := \left( \dfrac{\partial f}{\partial x_1},\ \dots,\ \dfrac{\partial f}{\partial x_n} \right)\)
Basic idea

Modern optimization methods: a method to solve the quadratic minimization problem

\(\min_{x\in\mathbb{R}^n} f(x) = \min_{x\in\mathbb{R}^n} \left\{ \tfrac{1}{2}\langle Ax, x\rangle - \langle b, x\rangle \right\}\)

(A is symmetric and positive definite)
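As a quick numerical check (the matrix A and vector b below are made-up examples, not from the slides): since \(\nabla f(x) = Ax - b\), the minimizer of this quadratic is exactly the solution of \(Ax = b\).

```python
import numpy as np

# Hypothetical small SPD example (not from the slides).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

def f(x):
    # f(x) = 1/2 <Ax, x> - <b, x>
    return 0.5 * x @ (A @ x) - b @ x

# Setting the gradient Ax - b to zero gives the minimizer.
x_star = np.linalg.solve(A, b)

# f is no smaller at nearby perturbed points than at x_star.
rng = np.random.default_rng(0)
perturbed_ok = all(f(x_star) <= f(x_star + 0.1 * rng.standard_normal(2))
                   for _ in range(10))
```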
2 Steepest Descent

(1) Idea: at each point, take the correction direction to be the negative gradient at that point, i.e. the direction of steepest descent; hence the name steepest descent method.

(2) Iteration formulas: choose an arbitrary initial vector \(x_0\); then for \(k = 0, 1, 2, \dots\):

\(p_k = r_k = b - A x_k\)

\(\alpha_k = \dfrac{\langle r_k, p_k \rangle}{\langle A p_k, p_k \rangle}\)

\(x_{k+1} = x_k + \alpha_k p_k\)

Here \(p_k = r_k = -\operatorname{grad} f(x)\big|_{x = x_k}\).
Steepest Descent
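The update formulas above can be sketched in Python (numpy assumed available; the test matrix is a hypothetical example, not from the slides):

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, max_iter=10_000):
    """Steepest descent for an SPD system Ax = b,
    i.e. minimizing f(x) = 1/2 <Ax, x> - <b, x>."""
    x = x0.astype(float)
    for k in range(max_iter):
        r = b - A @ x                        # p_k = r_k = b - A x_k (negative gradient)
        if np.linalg.norm(r) < tol:
            break
        alpha = (r @ r) / (r @ (A @ r))      # alpha_k = <r_k, p_k> / <A p_k, p_k>
        x = x + alpha * r                    # x_{k+1} = x_k + alpha_k p_k
    return x, k

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x, iters = steepest_descent(A, b, np.zeros(2))
```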
3 Conjugate Gradient Method
Modern optimization methods: "conjugate direction" methods. A method to solve the quadratic minimization problem

\(\min_{x\in\mathbb{R}^n} \left\{ \tfrac{1}{2}\langle Ax, x\rangle - \langle b, x\rangle \right\}\)

(A is symmetric and positive definite)
Conjugate Gradient
• Originally aimed to solve linear problems: \(\min_{x\in\mathbb{R}^n} \|Ax - b\|^2\)
• Later extended to general functions, on the rationale that a quadratic approximation to a function near its minimum is quite accurate.
Conjugate Gradient
The basic idea: decompose the n-dimensional quadratic problem into n one-dimensional problems. This is done by exploring the function in "conjugate directions".

Definition: A-conjugate vectors:

\(\{u_i\}_{i=1}^{n} \subset \mathbb{R}^n, \qquad \langle u_i, A u_j \rangle = 0 \ \ \text{for } i \neq j\)
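A small illustration of the definition (the matrix is a made-up SPD example): the eigenvectors of a symmetric A are mutually orthogonal, so they are A-conjugate, since \(\langle u_i, A u_j\rangle = \lambda_j \langle u_i, u_j\rangle = 0\) for \(i \neq j\).

```python
import numpy as np

# Hypothetical SPD matrix (not from the slides).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Eigenvectors of a symmetric matrix are orthogonal, hence A-conjugate.
_, U = np.linalg.eigh(A)
u0, u1 = U[:, 0], U[:, 1]
conj = u0 @ (A @ u1)   # should vanish (up to rounding)
```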
Conjugate Gradient
• If there is an A-conjugate basis then:
• N problems in 1-dimension (simple smiling quadratic)• The global minimizer is calculated sequentially
starting from x0:
0 j jj
x x p
0 0
12
212
( ) : , ,
( ) ( ) , ,j j j jjj
x x Ax b x
x x p Ap b Ax p
1 , ( 0, 1, ..., 1)k k k kx x p k n
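The sequential scheme above can be sketched as standard CG in Python (a hedged sketch; numpy and the small test system are assumptions, not from the slides):

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10):
    """CG for an SPD system Ax = b; the successive search directions p_k are
    A-conjugate, so in exact arithmetic it stops after at most n steps."""
    x = x0.astype(float)
    r = b - A @ x
    p = r.copy()
    rr = r @ r
    for _ in range(len(b)):
        Ap = A @ p
        alpha = rr / (p @ Ap)      # exact 1-D minimization along p
        x = x + alpha * p          # x_{k+1} = x_k + alpha_k p_k
        r = r - alpha * Ap
        rr_new = r @ r
        if np.sqrt(rr_new) < tol:
            break
        beta = rr_new / rr         # makes the next p A-conjugate to the previous ones
        p = r + beta * p
        rr = rr_new
    return x

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])        # hypothetical SPD example
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b, np.zeros(2))
```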
4 Comparison of Conjugate Gradient and Steepest Descent:
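A numerical sketch of the comparison (the ill-conditioned test problem is synthetic, an assumption for illustration): on a badly conditioned SPD system, steepest descent needs many iterations, while CG converges in roughly n steps.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
# Synthetic ill-conditioned SPD matrix: eigenvalues spread from 1 to 1000.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.linspace(1.0, 1000.0, n)) @ Q.T
b = rng.standard_normal(n)
tol = 1e-8

# Steepest descent: each step uses the negative gradient.
x = np.zeros(n)
sd_iters = 0
r = b - A @ x
while np.linalg.norm(r) > tol and sd_iters < 100_000:
    alpha = (r @ r) / (r @ (A @ r))
    x = x + alpha * r
    r = b - A @ x
    sd_iters += 1

# Conjugate gradient: A-conjugate search directions.
x = np.zeros(n)
cg_iters = 0
r = b - A @ x
p = r.copy()
rr = r @ r
while np.sqrt(rr) > tol and cg_iters < 10 * n:
    Ap = A @ p
    alpha = rr / (p @ Ap)
    x = x + alpha * p
    r = r - alpha * Ap
    rr_new = r @ r
    beta = rr_new / rr
    p = r + beta * p
    rr = rr_new
    cg_iters += 1
```

On this instance steepest descent typically needs thousands of iterations, while CG finishes in a few dozen at most.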