Modern Optimization Techniques

3. Equality Constrained Optimization / 3.1. Duality

Lars Schmidt-Thieme

Information Systems and Machine Learning Lab (ISMLL)
Institute for Computer Science
University of Hildesheim, Germany


Syllabus

Mon. 28.10. (0) 0. Overview

1. Theory
Mon. 4.11. (1) 1. Convex Sets and Functions

2. Unconstrained Optimization
Mon. 11.11. (2) 2.1 Gradient Descent
Mon. 18.11. (3) 2.2 Stochastic Gradient Descent
Mon. 25.11. (4) 2.3 Newton’s Method
Mon. 2.12. (5) 2.4 Quasi-Newton Methods
Mon. 9.12. (6) 2.5 Subgradient Methods
Mon. 16.12. (7) 2.6 Coordinate Descent

— Christmas Break —

3. Equality Constrained Optimization
Mon. 6.1. (8) 3.1 Duality
Mon. 13.1. (9) 3.2 Methods

4. Inequality Constrained Optimization
Mon. 20.1. (10) 4.1 Primal Methods
Mon. 27.1. (11) 4.2 Barrier and Penalty Methods
Mon. 3.2. (12) 4.3 Cutting Plane Methods


Outline

1. Constrained Optimization

2. Duality

3. Karush-Kuhn-Tucker Conditions


1. Constrained Optimization


Constrained Optimization Problems

A constrained optimization problem has the form:

minimize f(x)
subject to g_p(x) = 0, p = 1, …, P
           h_q(x) ≤ 0, q = 1, …, Q

where:
- f: R^N → R is called the objective or cost function,
- g_1, …, g_P: R^N → R are called equality constraints,
- h_1, …, h_Q: R^N → R are called inequality constraints,
- a feasible, optimal x* exists.
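The standard form above can be encoded directly as plain Python functions; a minimal sketch with a made-up problem (objective x1² + x2², one equality and one inequality constraint) and a feasibility check:

```python
import numpy as np

# Hypothetical example problem in standard form:
#   minimize    f(x)  = x1^2 + x2^2
#   subject to  g1(x) = x1 + x2 - 1 = 0     (equality constraint)
#               h1(x) = -x1 <= 0            (inequality constraint: x1 >= 0)
f = lambda x: x[0]**2 + x[1]**2
gs = [lambda x: x[0] + x[1] - 1.0]   # equality constraints g_p(x) = 0
hs = [lambda x: -x[0]]               # inequality constraints h_q(x) <= 0

def is_feasible(x, tol=1e-9):
    """Check g_p(x) = 0 (up to tol) and h_q(x) <= 0 for all p, q."""
    return (all(abs(g(x)) <= tol for g in gs) and
            all(h(x) <= tol for h in hs))

x = np.array([0.5, 0.5])
print(is_feasible(x), f(x))   # True 0.5: a feasible point and its objective value
```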


Convex Constrained Optimization Problems

A constrained optimization problem

minimize f(x)
subject to g_p(x) = 0, p = 1, …, P
           h_q(x) ≤ 0, q = 1, …, Q

is convex iff:
- f, the objective function, is convex,
- g_1, …, g_P, the equality constraint functions, are affine: g_p(x) = a_p^T x − b_p, and
- h_1, …, h_Q, the inequality constraint functions, are convex.

A convex constrained optimization problem thus has the form:

minimize f(x)
subject to a_p^T x − b_p = 0, p = 1, …, P
           h_q(x) ≤ 0, q = 1, …, Q


Linear Programming

A convex problem with an
- affine objective and
- affine constraints
is called a Linear Program (LP).

Standard form LP:

minimize c^T x
subject to a_p^T x = b_p, p = 1, …, P
           x ≥ 0

Inequality form LP:

minimize c^T x
subject to a_q^T x ≤ b_q, q = 1, …, Q

- No analytical solution
- There are specialized algorithms available
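One of the specialized algorithms alluded to above is available off the shelf as `scipy.optimize.linprog`; a minimal sketch on a made-up standard-form LP (the data c, A_eq, b_eq are invented for illustration):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical standard-form LP:
#   minimize   c^T x
#   subject to A_eq x = b_eq,  x >= 0
c = np.array([1.0, 2.0, 0.0])
A_eq = np.array([[1.0, 1.0, 1.0]])   # one equality constraint: x1 + x2 + x3 = 1
b_eq = np.array([1.0])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 3)
print(res.x)   # all weight goes to the coordinate with the smallest cost
```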


Quadratic Programming

A convex problem with
- a quadratic objective and
- affine constraints
is called a Quadratic Program (QP).

Inequality form QP:

minimize (1/2) x^T C x + c^T x
subject to a_q^T x ≤ b_q, q = 1, …, Q

where:
- C ≻ 0 is positive definite, or
- C = 0, a special case: linear programs.
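A small inequality-form QP can be handed to a general-purpose solver; a sketch using `scipy.optimize.minimize` (SLSQP) on made-up data, where the solution x = (0.5, 1.5) can also be verified by hand:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical inequality-form QP:
#   minimize   (1/2) x^T C x + c^T x,  with C = 2*I positive definite
#   subject to a^T x <= b,             with a = (1, 1), b = 2
C = 2.0 * np.eye(2)
c = np.array([-2.0, -4.0])
a, b = np.array([1.0, 1.0]), 2.0

obj = lambda x: 0.5 * x @ C @ x + c @ x
# SLSQP expects inequality constraints as fun(x) >= 0,
# so a^T x <= b is rewritten as b - a^T x >= 0.
cons = [{"type": "ineq", "fun": lambda x: b - a @ x}]

res = minimize(obj, x0=np.zeros(2), method="SLSQP", constraints=cons)
print(res.x)   # approximately [0.5, 1.5]; the constraint is active at the optimum
```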


Example: Maximum Margin Separating Hyperplanes

[Figure: two classes of points in the (a_1, a_2) plane; the separating hyperplane x^T a + x_0 = 0 together with the margin hyperplanes x^T a + x_0 = 1 and x^T a + x_0 = −1; the margin width is 2/‖x‖ and the offset is x_0/‖x‖.]



Example: Support Vector Machines

If the instances are not completely separable, we can allow some of them to be on the wrong side of the decision boundary.

The closer the “wrong” points are to the boundary, the better (modeled by slack variables ξ_n).

minimize (1/2) ‖x‖² + γ ∑_{n=1}^N ξ_n
subject to y_n(a_0 + x^T a_n) ≥ 1 − ξ_n, n = 1, …, N
           ξ_n ≥ 0, n = 1, …, N
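Replacing the slack variables ξ_n by the equivalent hinge loss max(0, 1 − y_n(a_0 + x^T a_n)) turns this QP into an unconstrained problem that plain subgradient descent can handle; a sketch on made-up toy data (not the slides' method, just one way to minimize the same objective):

```python
import numpy as np

# Made-up 2-class toy data: instances a_n (rows of A) with labels y_n in {-1, +1}
A = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
gamma = 1.0        # slack penalty gamma from the QP above
x = np.zeros(2)    # weight vector (the slides' x)
a0 = 0.0           # bias (the slides' a_0)

# Subgradient descent on (1/2)||x||^2 + gamma * sum_n max(0, 1 - y_n(a0 + x^T a_n))
for _ in range(2000):
    margins = y * (a0 + A @ x)
    viol = margins < 1.0            # points violating the margin
    grad_x = x - gamma * (y[viol, None] * A[viol]).sum(axis=0)
    grad_a0 = -gamma * y[viol].sum()
    x -= 0.01 * grad_x
    a0 -= 0.01 * grad_a0

print(np.sign(a0 + A @ x))   # recovers the labels y on this separable toy set
```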


2. Duality


Lagrangian

Given a constrained optimization problem in the standard form:

minimize f(x)
subject to g_p(x) = 0, p = 1, …, P
           h_q(x) ≤ 0, q = 1, …, Q

we can combine
- the objective function f and
- the constraints g_p and h_q
into a joint function called the primal Lagrangian:

f(x) + ∑_{p=1}^P ν_p g_p(x) + ∑_{q=1}^Q λ_q h_q(x)


Primal Lagrangian

The primal Lagrangian of a constrained optimization problem is a function

L: R^N × R^P × R^Q → R
L(x, ν, λ) := f(x) + ∑_{p=1}^P ν_p g_p(x) + ∑_{q=1}^Q λ_q h_q(x)

where:
- ν_p and λ_q are called Lagrange multipliers,
- ν_p is the Lagrange multiplier associated with the constraint g_p(x) = 0,
- λ_q is the Lagrange multiplier associated with the constraint h_q(x) ≤ 0.
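Evaluating the primal Lagrangian numerically is mechanical once f, the g_p, and the h_q are given as functions; a sketch with made-up problem data:

```python
import numpy as np

def lagrangian(f, gs, hs, x, nu, lam):
    """Primal Lagrangian L(x, nu, lam) = f(x) + sum_p nu_p g_p(x) + sum_q lam_q h_q(x)."""
    return (f(x)
            + sum(nu_p * g(x) for nu_p, g in zip(nu, gs))
            + sum(lam_q * h(x) for lam_q, h in zip(lam, hs)))

# Hypothetical problem: minimize x1^2 + x2^2  s.t.  x1 + x2 - 1 = 0,  -x1 <= 0
f = lambda x: x[0]**2 + x[1]**2
gs = [lambda x: x[0] + x[1] - 1.0]
hs = [lambda x: -x[0]]

x = np.array([0.5, 0.5])   # a feasible point: g(x) = 0, h(x) < 0
print(lagrangian(f, gs, hs, x, nu=[3.0], lam=[0.0]))   # 0.5: equals f(x), since g(x) = 0
```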


Dual Lagrangian

Let D be the domain of the problem. The dual Lagrangian of a constrained optimization problem is the function g: R^P × R^Q → R:

g(ν, λ) := inf_{x ∈ D} L(x, ν, λ)
         = inf_{x ∈ D} ( f(x) + ∑_{p=1}^P ν_p g_p(x) + ∑_{q=1}^Q λ_q h_q(x) )

- g is concave,
  - as a pointwise infimum of functions that are affine in (ν, λ),
- for non-negative λ, g is a lower bound on f(x*):

g(ν, λ) ≤ f(x*) for λ ≥ 0


Note: From here onwards, g denotes the dual Lagrangian, not the equality constraints anymore.


Dual Lagrangian / Proof

Proof of the lower bound property of

g(ν, λ) := inf_{x ∈ D} L(x, ν, λ) = inf_{x ∈ D} ( f(x) + ∑_{p=1}^P ν_p g_p(x) + ∑_{q=1}^Q λ_q h_q(x) )

For any feasible x we have:
- g_p(x) = 0
- h_q(x) ≤ 0

thus, with λ ≥ 0:

f(x) ≥ L(x, ν, λ) ≥ inf_{x′ ∈ D} L(x′, ν, λ) = g(ν, λ)

Minimizing over all feasible x, we get f(x*) ≥ g(ν, λ).


Least-norm solution of linear equations

minimize x^T x
subject to Ax = b

- Lagrangian: L(x, ν) = x^T x + ν^T (Ax − b)
- Dual Lagrangian: minimize L over x:

∇_x L(x, ν) = 2x + A^T ν = 0  ⟹  x = −(1/2) A^T ν

- Substituting x in L, we get g:

g(ν) = −(1/4) ν^T A A^T ν − b^T ν
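This example can be checked numerically (with made-up A and b): maximizing g gives ν = −2(AAᵀ)⁻¹b, substituting back recovers the least-norm solution x = Aᵀ(AAᵀ)⁻¹b, and the primal and dual values coincide:

```python
import numpy as np

# Made-up underdetermined system Ax = b (A has full row rank)
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([1.0, 2.0])
M = A @ A.T                          # AA^T, invertible for full row rank

nu = -2.0 * np.linalg.solve(M, b)    # maximizer of g(nu) = -(1/4) nu^T M nu - b^T nu
x = -0.5 * A.T @ nu                  # primal point recovered from the dual

f_val = x @ x                                # primal objective x^T x
g_val = -0.25 * nu @ M @ nu - b @ nu         # dual objective g(nu)

print(np.allclose(A @ x, b), np.isclose(f_val, g_val))   # True True: feasible, zero gap
```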


The Dual Problem

Once we know how to compute the dual, we are interested in computing the best lower bound on f(x*):

maximize g(ν, λ)
subject to λ ≥ 0

where:
- this is a convex optimization problem (g is concave),
- the maximal value of g is denoted d*.


Weak and Strong Duality

Let p* be the optimal value of f and d* be the optimal value of g.

Weak duality: d* ≤ p*
- always holds
- can be useful to find informative lower bounds for difficult problems

Strong duality: d* = p*
- does not always hold
- but holds for a range of convex problems
- properties that guarantee strong duality are called constraint qualifications


Slater’s Condition / Strict Feasibility

If the primal problem

minimize f(x)
subject to Ax = b
           h_q(x) ≤ 0, q = 1, …, Q

is:
- convex and
- strictly feasible, i.e.

∃x : Ax = b and h_q(x) < 0, q = 1, …, Q

then strong duality holds for this problem.


Duality Gap

How close is the value of the dual Lagrangian to the primal objective?

Given a primal feasible x and a dual feasible (ν, λ), the duality gap is defined as:

f(x) − g(ν, λ)

Since g(ν, λ) is a lower bound on f(x*):

f(x) − f(x*) ≤ f(x) − g(ν, λ)

If the duality gap is zero, then x is primal optimal.

- This is a useful stopping criterion: if f(x) − g(ν, λ) ≤ ε, then we are sure that f(x) − f(x*) ≤ ε.
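The stopping criterion can be demonstrated on the least-norm example (made-up data): for any feasible x and any ν, the computable gap f(x) − g(ν) upper-bounds the unknown suboptimality f(x) − f(x*):

```python
import numpy as np

# Least-norm problem:  minimize x^T x  subject to  Ax = b  (made-up data)
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([1.0, 2.0])
M = A @ A.T

f = lambda x: x @ x
g = lambda nu: -0.25 * nu @ M @ nu - b @ nu    # dual Lagrangian of this problem

x_star = A.T @ np.linalg.solve(M, b)           # true least-norm solution
null_dir = np.array([2.0, -1.0, 1.0])          # A @ null_dir = 0
x = x_star + 0.3 * null_dir                    # a suboptimal but feasible primal point
nu = -2.0 * np.linalg.solve(M, b) + 0.1        # a slightly perturbed dual point

gap = f(x) - g(nu)            # computable without knowing x_star
subopt = f(x) - f(x_star)     # true suboptimality (normally unknown)
print(gap >= subopt >= 0.0)   # True: the duality gap certifies the suboptimality
```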


3. Karush-Kuhn-Tucker Conditions


Consequences of Strong Duality

Assume strong duality:
- let x* be primal optimal and
- let (ν*, λ*) be dual optimal.

Then:

f(x*) = g(ν*, λ*)                      (strong duality)
      = inf_{x ∈ D} L(x, ν*, λ*)
      ≤ L(x*, ν*, λ*)
      ≤ f(x*)                          (λ* ≥ 0 and x* feasible, so the constraint terms are ≤ 0)

hence

L(x*, ν*, λ*) = inf_{x ∈ D} L(x, ν*, λ*) = f(x*)


Consequences of Strong Duality I: Stationarity

Assume strong duality:
- let x* be primal optimal and
- let (ν*, λ*) be dual optimal.

L(x*, ν*, λ*) = inf_{x ∈ D} L(x, ν*, λ*)

i.e., x* minimizes L(x, ν*, λ*) and thus

∇_x L(x*, ν*, λ*) = ∇f(x*) + ∑_{p=1}^P ν*_p ∇g_p(x*) + ∑_{q=1}^Q λ*_q ∇h_q(x*) = 0

- This condition is called stationarity.


Note: the g_p again denote the equality constraints, not the dual Lagrangian.


Consequences of Strong Duality II: Complementary Slackness

Assume strong duality:
- let x* be primal optimal and
- let (ν*, λ*) be dual optimal.

L(x*, ν*, λ*) = f(x*) + ∑_{p=1}^P ν*_p g_p(x*) + ∑_{q=1}^Q λ*_q h_q(x*) = f(x*)

Since g_p(x*) = 0 and each term λ*_q h_q(x*) is ≤ 0, all constraint terms must vanish individually. This is called complementary slackness:

λ*_q h_q(x*) = 0, q = 1, …, Q

which means that
- if λ*_q > 0, then h_q(x*) = 0,
- if h_q(x*) < 0, then λ*_q = 0.


Karush-Kuhn-Tucker (KKT) Conditions

The following conditions on x, ν, λ are called the KKT conditions:

1. primal feasibility: g_p(x) = 0 and h_q(x) ≤ 0, ∀p, q
2. dual feasibility: λ ≥ 0
3. complementary slackness: λ_q h_q(x) = 0, ∀q
4. stationarity: ∇f(x) + ∑_{p=1}^P ν_p ∇g_p(x) + ∑_{q=1}^Q λ_q ∇h_q(x) = 0

If strong duality holds and x, ν, λ are optimal, then they must satisfy the KKT conditions.

Conversely, for a strongly dual problem, if x, ν, λ satisfy the KKT conditions, then x is the primal solution and (ν, λ) is the dual solution.
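Checking the four conditions for a concrete candidate pair is mechanical; a minimal sketch on a made-up one-dimensional problem, minimize (x − 2)² subject to x ≤ 1, whose solution is x = 1 with multiplier λ = 2:

```python
# Hypothetical 1-D problem: f(x) = (x - 2)^2, one inequality constraint h(x) = x - 1 <= 0
f_grad = lambda x: 2.0 * (x - 2.0)   # df/dx
h = lambda x: x - 1.0
h_grad = lambda x: 1.0               # dh/dx

def kkt_satisfied(x, lam, tol=1e-9):
    """Check the four KKT conditions (no equality constraints in this problem)."""
    primal_feasible = h(x) <= tol                          # 1. h(x) <= 0
    dual_feasible = lam >= -tol                            # 2. lambda >= 0
    comp_slack = abs(lam * h(x)) <= tol                    # 3. lambda * h(x) = 0
    stationary = abs(f_grad(x) + lam * h_grad(x)) <= tol   # 4. grad_x L = 0
    return primal_feasible and dual_feasible and comp_slack and stationary

print(kkt_satisfied(1.0, 2.0))   # True: the primal/dual solution
print(kkt_satisfied(2.0, 0.0))   # False: the unconstrained minimizer is infeasible
```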


Karush-Kuhn-Tucker (KKT) Conditions

Theorem (Karush-Kuhn-Tucker).
For a strongly dual problem, if x, λ, ν satisfy the KKT conditions:

1. primal feasibility: g_p(x) = 0 and h_q(x) ≤ 0, ∀p, q
2. dual feasibility: λ ≥ 0
3. complementary slackness: λ_q h_q(x) = 0, ∀q
4. stationarity: ∇f(x) + ∑_{p=1}^P ν_p ∇g_p(x) + ∑_{q=1}^Q λ_q ∇h_q(x) = 0

then x is the primal solution and (ν, λ) is the dual solution.


Karush-Kuhn-Tucker (KKT) Conditions / Proof

Proof:

g(λ, ν) = inf_{x′ ∈ D} ( f(x′) + ∑_{p=1}^P ν_p g_p(x′) + ∑_{q=1}^Q λ_q h_q(x′) )

        = f(x) + ∑_{p=1}^P ν_p g_p(x) + ∑_{q=1}^Q λ_q h_q(x)   (by 4., stationarity: x minimizes L(·, ν, λ))

        = f(x)   (by 1., primal feasibility, and 3., complementary slackness)

i.e., the duality gap is 0, and thus x and (λ, ν) are optimal.


Summary

- The primal Lagrangian combines objective and constraints linearly:
  - the constraint weights are called multipliers,
  - the multipliers are viewed as additional variables,
  - inequality multipliers are ≥ 0.
- The dual Lagrangian g is the pointwise infimum of the primal Lagrangian over the primal variables x:
  - a lower bound for f(x*),
  - the difference f(x) − g(ν, λ) is called the duality gap.
- Dual problem: maximizing the dual Lagrangian
  - = finding the best lower bound,
  - a convex problem,
  - solves the primal problem under strong duality (duality gap = 0).
- Constraint qualifications guarantee strong duality for a problem,
  - e.g., Slater’s condition: existence of a strictly feasible point.


Summary (2/2)

- Karush-Kuhn-Tucker (KKT) conditions for (x, ν, λ):
  1. primal feasibility
  2. dual feasibility
  3. complementary slackness
  4. stationarity
- KKT is a necessary condition for a primal/dual solution.
- If a problem is strongly dual, KKT is also a sufficient condition for a primal/dual solution.


Further Readings

- [Boyd and Vandenberghe, 2004, ch. 5]
- The proof that Slater’s condition is sufficient for strong duality can be found in [Boyd and Vandenberghe, 2004, ch. 5.3.2].


References

Stephen Boyd and Lieven Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
