
Page 1: The Averaged Gradient Technique in Sensitivity Analysis

The Averaged Gradient Technique in Sensitivity Analysis

Jan Chleboun

Department of Mathematics, Faculty of Civil Engineering, Prague

Dolní Maxov, June 1–6, 2008

1 / 20

Page 2: The Averaged Gradient Technique in Sensitivity Analysis

Outline

Averaged (a.k.a. Recovered) Gradient
  Idea
  Properties

Sensitivity Analysis
  Motivation
  The Core of the Sensitivity Formulae
  Observations

Comments on Matlab and PDE Toolbox

2 / 20

Page 3: The Averaged Gradient Technique in Sensitivity Analysis

Averaged (a.k.a. Recovered) Gradient

Idea (in 2D): Let (u^a_x, u^a_y) be an approximation of ∇u. By processing (u^a_x, u^a_y), we wish to obtain (ū^a_x, ū^a_y), a better (more accurate) approximation of ∇u.

Application: In FEM, ‖∇u − (u_{h,x}, u_{h,y})‖_{L²} ≤ Ch, where u_h is a FE approximation to a BVP solution u. With the recovered gradient it can be ‖∇u − (ū_{h,x}, ū_{h,y})‖_{L²} ≤ Ch².

Various methods exist. We will use the method proposed in

I. Hlaváček, M. Křížek, V. Pištora: How to recover the gradient of linear elements on nonuniform triangulations, Appl. Math. 41 (1996), 241–267.

3 / 20

Page 4: The Averaged Gradient Technique in Sensitivity Analysis

Recovered Gradient: Technique

[Figure: a node Z with neighbouring nodes A1, A2, B1, B2, shown for two mesh configurations.]

a_i = (A_i − Z)_i,   b_i = (B_i − Z)_i,   i = 1, 2 (…, d)

The weighted averaged gradient of a function v at node Z:

  (G_h v(Z))_i = α_i v(A_i) − (α_i + β_i) v(Z) + β_i v(B_i),

where

  α_i = b_i / (a_i (b_i − a_i)),   β_i = a_i / (b_i (a_i − b_i)).
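A minimal numerical sketch (Python, illustrative names) of this one-directional weighted average; a and b play the roles of a_i and b_i. The formula reproduces the exact derivative at Z for any quadratic function, even with unequal offsets, which is the mechanism behind the second-order accuracy discussed next.

def recovered_derivative(v_A, v_Z, v_B, a, b):
    """Weighted average (G_h v(Z))_i from the values of v at A_i, Z, B_i,
    with offsets a = (A_i - Z)_i and b = (B_i - Z)_i."""
    alpha = b / (a * (b - a))
    beta = a / (b * (a - b))
    return alpha * v_A - (alpha + beta) * v_Z + beta * v_B

# Check on a quadratic: v(t) = 3 + 2t - 5t^2, so v'(0) = 2,
# with unequal offsets a = -0.10, b = 0.17.
v = lambda t: 3.0 + 2.0 * t - 5.0 * t**2
a, b = -0.10, 0.17
print(recovered_derivative(v(a), v(0.0), v(b), a, b))  # prints ~2.0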

4 / 20

Page 5: The Averaged Gradient Technique in Sensitivity Analysis

Properties I

Assumptions:
Ω ⊂ R² (R³), a bounded domain with a polygonal Lipschitz boundary.
F, a strongly regular family of triangulations {T_h}_{h→0}, i.e.,

  ∃κ > 0 ∀T_h ∈ F ∀K ∈ T_h: κh ≤ ϱ_K,

where ϱ_K is the radius of a ball B_K such that B_K ⊂ K.

Theorem: Let q ∈ (1, ∞) and v ∈ W^3_q(Ω). Then there exists a constant C > 0 such that

  ‖∇v − G_h v‖_{0,q} ≤ C h² |v|_{3,q}

for any T_h ∈ F.

5 / 20

Page 6: The Averaged Gradient Technique in Sensitivity Analysis

Properties II

An elliptic BVP:

  −div(λ∇u) = f in Ω,
  u = 0 on ∂Ω,

where λ is a matrix comprising sufficiently smooth functions (λ ∈ [W^2_{2+ε}(Ω)]^{2×2}).

It is assumed that u ∈ W^3_q(Ω), q > 2.

Let us consider u_h, the finite element approximation to u (piecewise linear basis functions).

Is G_h u_h close to ∇u?

6 / 20

Page 7: The Averaged Gradient Technique in Sensitivity Analysis

Properties II

Assumptions: The family F = {T_h}_{h→0} is generated by smooth distortions of uniform triangulations of square grids.

(Then the linear interpolation of u is sufficiently close to u.)

Ω_0, a domain whose closure lies in Ω.

Theorem: ‖∇u − G_h u_h‖_{0,2,Ω_0} ≤ C(u) h².

7 / 20

Page 8: The Averaged Gradient Technique in Sensitivity Analysis

Sensitivity Analysis: Motivation

The weak formulation of the (simplified) BVP:

  ∫_Ω λ(x) ∇u · ∇v dx = ∫_Ω f v dx   ∀v ∈ H^1_0(Ω),  u ∈ H^1_0(Ω).

A unique solution u(λ; ·).

Criterion-functional: Ψ(λ) = Φ(u(λ))

Example (Parameter Identification)
A function w is given. The goal is to find λ that minimizes

  Ψ(λ) = ∫_Ω (w(x) − u(λ; x))² dx.

Analogous settings appear in optimal design problems, the worst scenario method, or PDE/ODE-constrained optimization.
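Once the problem is discretized, the criterion value itself is cheap to evaluate. A minimal sketch (Python/NumPy; u_h, w_h and M are assumed inputs, namely the nodal vectors of the FE solution and of the interpolated data w, and the P1 mass matrix of the triangulation):

import numpy as np

def criterion_value(u_h, w_h, M):
    """Psi(lambda) ~ integral over Omega of (w - u(lambda))^2 dx,
    evaluated as r^T M r with r = w_h - u_h; this is exact when both
    w_h and u_h are piecewise linear on the same mesh."""
    r = np.asarray(w_h) - np.asarray(u_h)
    return float(r @ (M @ r))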

8 / 20

Page 9: The Averaged Gradient Technique in Sensitivity Analysis

Sensitivity Analysis

In the search for

  arg min_{λ∈U_ad} Ψ(λ),

where U_ad is an admissible set of parameters, the derivative Ψ′ of Ψ w.r.t. λ is beneficial.

Remark: In discretized problems (FEM and a discretization of U_ad), Ψ(λ) is a function of several variables λ1, ..., λn and Ψ′ becomes ∇Ψ.

Let us focus on a particular approach to the Ψ′ calculation: the formula for Ψ′ is derived from the "continuous" weak problem, and then approximated by the solutions of discretized problems.

9 / 20

Page 10: The Averaged Gradient Technique in Sensitivity Analysis

In our problem, let

  λ(x1, x2) = λ1 + λ2 x1 + λ3 x2 + λ4 x1² + λ5 x1 x2 + λ6 x2².

Then, for instance,

  ∂Ψ/∂λ4 (λ) = − ∫_Ω x1² ∇u(x) · ∇z(x) dx,

where z ∈ H^1_0(Ω) solves the adjoint equation

  ∫_Ω λ(x) ∇z · ∇v dx = ∫_Ω 2(u − w) v dx   ∀v ∈ H^1_0(Ω)

(the right-hand side equals the derivative of Φ(u) w.r.t. u).
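A minimal sketch (Python/NumPy, assumed data layout and names) of how this entry of ∇Ψ might be approximated from element-wise gradients: the factor x1² is taken at the triangle centroids and the products are summed with the element areas, i.e. one-point quadrature per triangle. The per-element gradients can be the piecewise constant ∇u_h, ∇z_h or their recovered counterparts evaluated at the centroids.

import numpy as np

def dPsi_dlambda4(centroids, areas, grad_u, grad_z):
    """Approximate -integral over Omega of x1^2 (grad u . grad z) dx
    by centroid quadrature.

    centroids: (n_tri, 2) array of triangle centroids
    areas:     (n_tri,)  array of triangle areas
    grad_u:    (n_tri, 2) gradient of u_h on each triangle
    grad_z:    (n_tri, 2) gradient of the adjoint z_h on each triangle
    """
    x1 = centroids[:, 0]
    dots = np.einsum("ij,ij->i", grad_u, grad_z)  # grad u . grad z per triangle
    return -float(np.sum(areas * x1**2 * dots))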

10 / 20

Page 12: The Averaged Gradient Technique in Sensitivity Analysis

Generally (not only in our example!), Ψ′ is evaluated through the integration of an expression containing ∇u and ∇z. Since the accuracy of Ψ′ depends on the accuracy of the approximations of ∇u and ∇z, the idea of recovering ∇u_h and ∇z_h, the FEM-computed gradients, emerged.

The idea failed.

The gradient of Ψ computed by means of the recovered gradient technique is not better than the gradient computed directly.

11 / 20

Page 14: The Averaged Gradient Technique in Sensitivity Analysis

Why?

Reasons:

1. A bug in the code?

2. A "boundary layer", where the averaging is inaccurate.

12 / 20

Page 15: The Averaged Gradient Technique in Sensitivity Analysis

Interpolation features of G_h, uniform mesh

[Figure: two surface plots over the unit square. Left: Euclidean norm of the difference in the gradients, pwc − exact (at centroids), values up to about 0.045. Right: Euclidean norm of the difference in the gradients, rpwc − exact (at nodes), values of order 10⁻¹⁶.]

Interpolated function: xy(1 − x)(1 − y).
Euclidean norm of the difference between the calculated and the exact gradient: piecewise constant gradient (left), recovered gradient (right).
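The right-hand panel can be reproduced in spirit with a short script (Python/NumPy, assumed uniform tensor-product mesh): on a uniform grid the nodal recovery reduces to central differences in each coordinate, and since xy(1 − x)(1 − y) is quadratic in each variable, the recovered gradient at interior nodes matches the exact gradient up to rounding, i.e. errors on the 10⁻¹⁶ scale.

import numpy as np

# Test function and its exact gradient.
v  = lambda x, y: x * y * (1 - x) * (1 - y)
vx = lambda x, y: (1 - 2 * x) * y * (1 - y)
vy = lambda x, y: x * (1 - x) * (1 - 2 * y)

n = 16
h = 1.0 / n
xs = np.linspace(0.0, 1.0, n + 1)
X, Y = np.meshgrid(xs, xs, indexing="ij")
V = v(X, Y)

# Recovered gradient at interior nodes: with a_i = -h, b_i = h the weighted
# average becomes the central difference in each coordinate direction.
Gx = (V[2:, 1:-1] - V[:-2, 1:-1]) / (2 * h)
Gy = (V[1:-1, 2:] - V[1:-1, :-2]) / (2 * h)

err = np.hypot(Gx - vx(X[1:-1, 1:-1], Y[1:-1, 1:-1]),
               Gy - vy(X[1:-1, 1:-1], Y[1:-1, 1:-1]))
print(err.max())   # ~1e-16, as in the right-hand panel above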

13 / 20

Page 16: The Averaged Gradient Technique in Sensitivity Analysis

Interpolation features of G_h, distorted mesh

[Figure: two surface plots over the unit square. Left: Euclidean norm of the difference in the gradients, pwc − exact (at centroids), values up to about 0.04. Right: Euclidean norm of the difference in the gradients, rpwc − exact (at nodes), values up to about 0.05.]

Interpolated function: xy(1 − x)(1 − y).
Euclidean norm of the difference between the calculated and the exact gradient: piecewise constant gradient (left), recovered gradient (right).

14 / 20

Page 17: The Averaged Gradient Technique in Sensitivity Analysis

FEM features of G_h, uniform mesh

[Figure: two surface plots over the unit square. Left: Euclidean norm of the difference in the gradients, pwc FEM − exact (at centroids), values up to about 0.045. Right: Euclidean norm of the difference in the gradients, rpwc FEM − exact (at nodes), values of order 10⁻³.]

FEM solution of the Poisson equation with the exact solution xy(1 − x)(1 − y).
Euclidean norm of the difference between the calculated and the exact gradient: piecewise constant gradient (left), recovered gradient (right).

15 / 20

Page 18: The Averaged Gradient Technique in Sensitivity Analysis

FEM features of G_h, distorted mesh

[Figure: two surface plots over the unit square. Left: Euclidean norm of the difference in the gradients, pwc FEM − exact (at centroids), values up to about 0.04. Right: Euclidean norm of the difference in the gradients, rpwc FEM − exact (at nodes), values up to about 0.05.]

FEM solution of the Poisson equation with the exact solution xy(1 − x)(1 − y).
Euclidean norm of the difference between the calculated and the exact gradient: piecewise constant gradient (left), recovered gradient (right).

16 / 20

Page 19: The Averaged Gradient Technique in Sensitivity Analysis

Gradient of the adjoint state, uniform mesh

[Figure: two surface plots over the unit square. Left: Euclidean norm of the adjoint gradient (FEM, at centroids). Right: Euclidean norm of the recovered adjoint gradient (FEM, at nodes). Values up to about 0.9 in both panels.]

The Euclidean norm of the piecewise constant (left) and recovered (right) gradient of the adjoint solution.

Observations: A boundary layer exists where (a) the recovered gradient of ∇u_h (and ∇z_h) is inaccurate, and (b) the amount of sensitivity information stored in ∇z_h is substantial.
Conclusion: The performance of ∇u_h · ∇z_h is poor even for recovered gradients.

17 / 20

Page 20: The Averaged Gradient Technique in Sensitivity Analysis

Work in progress

• Exact integration in the FEM code.
• Focus not on ∇u_h · ∇z_h in the sensitivity formula, but on special criterion-functionals such as

    Ψ(λ) = ∫_{Ω_0} (w,x1 − u,x1)² + (w,x2 − u,x2)² dx,

  where u solves the Poisson equation, w is a given function, and Ω_0 ⊂ Ω.

18 / 20

Page 21: The Averaged Gradient Technique in Sensitivity Analysis

Matlab and PDE Toolbox

Two errors discovered in R2007b and R2008a (R2007a too? Not in R2006a):

• The format of the boundary condition matrix "b" as used in the past and as described in the PDE Toolbox help system or handbook causes an error message and the breakdown of the code. Will be fixed in R2008b. DIY fixing is easy.

• The PDE Toolbox GUI (pdetool) does not solve BVPs with a homogeneous Dirichlet boundary condition. It delivers a solution that is nonzero along the boundary. Will be fixed?

19 / 20

Page 22: The Averaged Gradient Technique in Sensitivity Analysis

Acknowledgement: I thank Dr. Tomáš Vejchodský for his routine for numerical integration.

Thank you for your attention

20 / 20