
Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems


Course given at "Journées de géométrie algorithmique", JGA 2013.


Page 1: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Gabriel Peyré

www.numerical-tours.com

Low Complexity Regularization of Inverse Problems

Course #1 Inverse Problems

Page 2: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Overview of the Course

• Course #1: Inverse Problems

• Course #2: Recovery Guarantees

• Course #3: Proximal Splitting Methods

Page 3: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Overview

• Inverse Problems

• Compressed Sensing

• Sparsity and L1 Regularization

Page 4: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Inverse Problems

Recovering x₀ ∈ ℝ^N from noisy observations y = Φx₀ + w ∈ ℝ^P.

Page 5: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Inverse Problems

Recovering x₀ ∈ ℝ^N from noisy observations y = Φx₀ + w ∈ ℝ^P.

Examples: inpainting, super-resolution, . . .

Page 6: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Inverse Problems in Medical Imaging

Tomography: Φx = (p_{θ_k})_{1 ≤ k ≤ K} (projections along K angles θ_k).

Page 7: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Inverse Problems in Medical Imaging

Tomography: Φx = (p_{θ_k})_{1 ≤ k ≤ K}.

Magnetic resonance imaging (MRI): Φx = (f̂(ω))_{ω ∈ Ω}.

Page 8: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Inverse Problems in Medical Imaging

Tomography: Φx = (p_{θ_k})_{1 ≤ k ≤ K}.

Magnetic resonance imaging (MRI): Φx = (f̂(ω))_{ω ∈ Ω}.

Other examples: MEG, EEG, . . .

Page 9: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Inverse Problem Regularization

Observations: y = Φx₀ + w ∈ ℝ^P.

Estimator: x(y) depends only on the observations y and a parameter λ.

Page 10: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Inverse Problem Regularization

Observations: y = Φx₀ + w ∈ ℝ^P.

Estimator: x(y) depends only on the observations y and a parameter λ.

Example: variational methods

x(y) ∈ argmin_{x ∈ ℝ^N} ½‖y − Φx‖² + λ J(x)

where ½‖y − Φx‖² is the data-fidelity term and J(x) the regularity term.
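As a hedged illustration of the variational estimator, take the quadratic regularizer J(x) = ½‖x‖² (Tikhonov), for which the minimizer has a closed form x(y) = (ΦᵀΦ + λI)⁻¹Φᵀy; the sparse priors studied later have no such closed form. The operator, sizes, and λ below are made-up test values.

```python
import numpy as np

def tikhonov_estimator(Phi, y, lam):
    """Closed-form minimizer of 1/2 ||y - Phi x||^2 + lam/2 ||x||^2."""
    N = Phi.shape[1]
    # Normal equations: (Phi^T Phi + lam Id) x = Phi^T y
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(N), Phi.T @ y)

rng = np.random.default_rng(0)
Phi = rng.standard_normal((20, 10))             # forward operator (illustrative)
x0 = rng.standard_normal(10)                    # ground-truth coefficients
y = Phi @ x0 + 0.01 * rng.standard_normal(20)   # noisy observations y = Phi x0 + w
x_hat = tikhonov_estimator(Phi, y, lam=0.1)
print(np.linalg.norm(x_hat - x0))               # recovery error
```

The solve enforces the first-order optimality condition Φᵀ(Φx − y) + λx = 0 exactly.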

Page 11: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Inverse Problem Regularization

Observations: y = Φx₀ + w ∈ ℝ^P.

Estimator: x(y) depends only on the observations y and a parameter λ.

Example: variational methods

x(y) ∈ argmin_{x ∈ ℝ^N} ½‖y − Φx‖² + λ J(x)

where ½‖y − Φx‖² is the data-fidelity term and J(x) the regularity term.

Choice of λ: tradeoff between the noise level ‖w‖ and the regularity J(x₀) of x₀.

Page 12: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Inverse Problem Regularization

Observations: y = Φx₀ + w ∈ ℝ^P.

Estimator: x(y) depends only on the observations y and a parameter λ.

Example: variational methods

x(y) ∈ argmin_{x ∈ ℝ^N} ½‖y − Φx‖² + λ J(x)

where ½‖y − Φx‖² is the data-fidelity term and J(x) the regularity term.

Choice of λ: tradeoff between the noise level ‖w‖ and the regularity J(x₀) of x₀.

No noise: as λ → 0⁺, the estimator minimizes x⋆ ∈ argmin_{x ∈ ℝ^Q, Φx = y} J(x).

Page 13: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Inverse Problem Regularization

Observations: y = Φx₀ + w ∈ ℝ^P.

Estimator: x(y) depends only on the observations y and a parameter λ.

Example: variational methods

x(y) ∈ argmin_{x ∈ ℝ^N} ½‖y − Φx‖² + λ J(x)

where ½‖y − Φx‖² is the data-fidelity term and J(x) the regularity term.

Choice of λ: tradeoff between the noise level ‖w‖ and the regularity J(x₀) of x₀.

No noise: as λ → 0⁺, the estimator minimizes x⋆ ∈ argmin_{x ∈ ℝ^Q, Φx = y} J(x).

This course: performance analysis; fast computational schemes.

Page 14: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Overview

• Inverse Problems

• Compressed Sensing

• Sparsity and L1 Regularization

Page 15: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Compressed Sensing [Rice Univ.]

Page 16: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Compressed Sensing [Rice Univ.]

P measurements ≪ N micro-mirrors: y[i] = ⟨x₀, φᵢ⟩.
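The acquisition y[i] = ⟨x₀, φᵢ⟩ amounts to a matrix–vector product whose rows φᵢ are the mirror patterns. A minimal numerical sketch (the ±1 patterns and the sizes below are illustrative assumptions, not the actual Rice hardware):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 1024, 128                    # N micro-mirrors, P << N measurements
x0 = np.zeros(N)                    # sparse scene: 10 nonzero entries
x0[rng.choice(N, 10, replace=False)] = rng.standard_normal(10)
# Random +/-1 mirror patterns as rows phi_i (normalized)
Phi = rng.choice([-1.0, 1.0], size=(P, N)) / np.sqrt(P)
y = Phi @ x0                        # y[i] = <x0, phi_i>
print(y.shape)                      # (128,)
```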

Page 17: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Compressed Sensing [Rice Univ.]

P measurements ≪ N micro-mirrors: y[i] = ⟨x₀, φᵢ⟩.

[Figure: reconstructions at P/N = 1, P/N = 0.16, P/N = 0.02]

Page 18: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

CS Acquisition Model

CS is about designing hardware: input signals f̃ ∈ L²(ℝ²).

Physical hardware resolution limit: target resolution f ∈ ℝ^N.

CS hardware: x̃₀ ∈ L² → x₀ ∈ ℝ^N (micro-mirrors array resolution) → y ∈ ℝ^P.

Page 19: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

CS Acquisition Model

CS is about designing hardware: input signals f̃ ∈ L²(ℝ²).

Physical hardware resolution limit: target resolution f ∈ ℝ^N.

CS hardware: x̃₀ ∈ L² → x₀ ∈ ℝ^N (micro-mirrors array resolution) → y ∈ ℝ^P.

Operator Φ: each measurement y[i] is one row of Φ applied to x₀.

Page 20: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Overview

• Inverse Problems

• Compressed Sensing

• Sparsity and L1 Regularization

Page 21: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Redundant Dictionaries

Dictionary Ψ = (ψₘ)ₘ ∈ ℝ^{Q×N}, N ≥ Q.

Page 22: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Redundant Dictionaries

Dictionary Ψ = (ψₘ)ₘ ∈ ℝ^{Q×N}, N ≥ Q.

Fourier: ψₘ = e^{i⟨ωₘ, ·⟩}, frequency ωₘ.

Page 23: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Redundant Dictionaries

Dictionary Ψ = (ψₘ)ₘ ∈ ℝ^{Q×N}, N ≥ Q.

Fourier: ψₘ = e^{i⟨ωₘ, ·⟩}, frequency ωₘ.

Wavelets: ψₘ = ψ(2⁻ʲ R_θ x − n), m = (j, θ, n): scale j, orientation θ, position n.

Page 24: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Redundant Dictionaries

Dictionary Ψ = (ψₘ)ₘ ∈ ℝ^{Q×N}, N ≥ Q.

Fourier: ψₘ = e^{i⟨ωₘ, ·⟩}, frequency ωₘ.

Wavelets: ψₘ = ψ(2⁻ʲ R_θ x − n), m = (j, θ, n): scale j, orientation θ, position n.

DCT, curvelets, bandlets, . . .

Page 25: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Redundant Dictionaries

Dictionary Ψ = (ψₘ)ₘ ∈ ℝ^{Q×N}, N ≥ Q.

Fourier: ψₘ = e^{i⟨ωₘ, ·⟩}, frequency ωₘ.

Wavelets: ψₘ = ψ(2⁻ʲ R_θ x − n), m = (j, θ, n): scale j, orientation θ, position n.

DCT, curvelets, bandlets, . . .

Synthesis: f = Σₘ xₘ ψₘ = Ψx (coefficients x ↦ image f = Ψx).
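The synthesis f = Ψx can be sketched with a random dictionary; the Gaussian unit-norm atoms below are an illustrative assumption (a real dictionary would be Fourier, wavelets, etc.):

```python
import numpy as np

rng = np.random.default_rng(0)
Q, N = 64, 128                          # redundant dictionary: N >= Q atoms
Psi = rng.standard_normal((Q, N))
Psi /= np.linalg.norm(Psi, axis=0)      # unit-norm atoms psi_m as columns
x = np.zeros(N)
x[[3, 40, 97]] = [1.0, -2.0, 0.5]       # sparse coefficient vector
f = Psi @ x                             # synthesis: f = sum_m x_m psi_m
# The matrix product agrees with the explicit sum over active atoms:
assert np.allclose(f, sum(x[m] * Psi[:, m] for m in np.flatnonzero(x)))
```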

Page 26: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Sparse Priors

Ideal sparsity: for most m, xₘ = 0.

J₀(x) = #{m : xₘ ≠ 0}

Page 27: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Sparse Priors

Ideal sparsity: for most m, xₘ = 0.

J₀(x) = #{m : xₘ ≠ 0}

Sparse approximation: f = Ψx where x ∈ argmin_{x ∈ ℝ^N} ‖f₀ − Ψx‖² + T² J₀(x).

Page 28: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Sparse Priors

Ideal sparsity: for most m, xₘ = 0.

J₀(x) = #{m : xₘ ≠ 0}

Sparse approximation: f = Ψx where x ∈ argmin_{x ∈ ℝ^N} ‖f₀ − Ψx‖² + T² J₀(x).

Orthogonal Ψ (ΨΨ* = Ψ*Ψ = Id_N): the solution is hard thresholding,

xₘ = ⟨f₀, ψₘ⟩ if |⟨f₀, ψₘ⟩| > T, 0 otherwise,  i.e.  f = Ψ ∘ S_T ∘ Ψ*(f₀).
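In the orthogonal case, f = Ψ ∘ S_T ∘ Ψ*(f₀) is a one-liner once a basis is available. In this sketch a random orthogonal matrix from a QR factorization stands in for a wavelet basis (an assumption for illustration):

```python
import numpy as np

def hard_threshold(c, T):
    """S_T: keep coefficients with |c_m| > T, zero out the rest."""
    return np.where(np.abs(c) > T, c, 0.0)

rng = np.random.default_rng(0)
N = 32
# Random orthogonal basis: Psi.T @ Psi = Id_N (stand-in for wavelets)
Psi, _ = np.linalg.qr(rng.standard_normal((N, N)))
f0 = rng.standard_normal(N)
T = 0.5
f = Psi @ hard_threshold(Psi.T @ f0, T)   # f = Psi o S_T o Psi* (f0)
```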

Page 29: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Sparse Priors

Ideal sparsity: for most m, xₘ = 0.

J₀(x) = #{m : xₘ ≠ 0}

Sparse approximation: f = Ψx where x ∈ argmin_{x ∈ ℝ^N} ‖f₀ − Ψx‖² + T² J₀(x).

Orthogonal Ψ (ΨΨ* = Ψ*Ψ = Id_N): the solution is hard thresholding,

xₘ = ⟨f₀, ψₘ⟩ if |⟨f₀, ψₘ⟩| > T, 0 otherwise,  i.e.  f = Ψ ∘ S_T ∘ Ψ*(f₀).

Non-orthogonal Ψ: the ℓ⁰ problem is NP-hard.

Page 30: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Convex Relaxation: L1 Prior

Image x = (x₁, x₂) with 2 pixels:

J₀(x) = #{m : xₘ ≠ 0}:
J₀(x) = 0 ⇔ null image.
J₀(x) = 1 ⇔ sparse image.
J₀(x) = 2 ⇔ non-sparse image.

[Figure: ℓ^q ball in the (x₁, x₂) plane, q = 0]

Page 31: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Convex Relaxation: L1 Prior

Image x = (x₁, x₂) with 2 pixels:

J₀(x) = #{m : xₘ ≠ 0}:
J₀(x) = 0 ⇔ null image.
J₀(x) = 1 ⇔ sparse image.
J₀(x) = 2 ⇔ non-sparse image.

ℓ^q priors (convex for q ≥ 1): J_q(x) = Σₘ |xₘ|^q.

[Figure: ℓ^q balls in the (x₁, x₂) plane for q = 0, 1/2, 1, 3/2, 2]

Page 32: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Convex Relaxation: L1 Prior

Image x = (x₁, x₂) with 2 pixels:

J₀(x) = #{m : xₘ ≠ 0}:
J₀(x) = 0 ⇔ null image.
J₀(x) = 1 ⇔ sparse image.
J₀(x) = 2 ⇔ non-sparse image.

ℓ^q priors (convex for q ≥ 1): J_q(x) = Σₘ |xₘ|^q.

Sparse ℓ¹ prior: J₁(x) = Σₘ |xₘ|.

[Figure: ℓ^q balls in the (x₁, x₂) plane for q = 0, 1/2, 1, 3/2, 2]
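The two-pixel discussion can be checked numerically. The sketch below (example values are mine) evaluates J_q on a sparse and a non-sparse image with the same ℓ¹ mass: ℓ⁰ prefers the sparse one, ℓ² prefers the dense one, and ℓ¹ sits at the convex boundary:

```python
import numpy as np

def Jq(x, q):
    """l^q prior J_q(x) = sum_m |x_m|^q; q = 0 counts the nonzero entries."""
    x = np.asarray(x, dtype=float)
    return np.count_nonzero(x) if q == 0 else float(np.sum(np.abs(x) ** q))

x_sparse = [1.0, 0.0]   # sparse 2-pixel image
x_dense = [0.5, 0.5]    # non-sparse image with the same l^1 mass
print(Jq(x_sparse, 0), Jq(x_dense, 0))   # 1 2
print(Jq(x_sparse, 1), Jq(x_dense, 1))   # 1.0 1.0
print(Jq(x_sparse, 2), Jq(x_dense, 2))   # 1.0 0.5
```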

Page 33: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

L1 Regularization

Coefficients x₀ ∈ ℝ^N.

Page 34: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

L1 Regularization

Coefficients x₀ ∈ ℝ^N →(Ψ)→ image f₀ = Ψx₀ ∈ ℝ^Q.

Page 35: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

L1 Regularization

Coefficients x₀ ∈ ℝ^N →(Ψ)→ image f₀ = Ψx₀ ∈ ℝ^Q →(K, noise w)→ observations y = Kf₀ + w ∈ ℝ^P.

Page 36: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

L1 Regularization

Coefficients x₀ ∈ ℝ^N →(Ψ)→ image f₀ = Ψx₀ ∈ ℝ^Q →(K, noise w)→ observations y = Kf₀ + w ∈ ℝ^P.

Φ = K Ψ ∈ ℝ^{P×N}.

Page 37: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

L1 Regularization

Coefficients x₀ ∈ ℝ^N →(Ψ)→ image f₀ = Ψx₀ ∈ ℝ^Q →(K, noise w)→ observations y = Kf₀ + w ∈ ℝ^P.

Φ = K Ψ ∈ ℝ^{P×N}.

Sparse recovery: f⋆ = Ψx⋆ where x⋆ solves

min_{x ∈ ℝ^N} ½‖y − Φx‖² + λ‖x‖₁

(data fidelity + ℓ¹ regularization).

Page 38: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Noiseless Sparse Regularization

Noiseless measurements: y = Φx₀.

x⋆ ∈ argmin_{Φx = y} Σₘ |xₘ|

[Figure: the ℓ¹ ball meeting the affine set Φx = y at the solution x⋆]

Page 39: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Noiseless Sparse Regularization

Noiseless measurements: y = Φx₀.

ℓ¹: x⋆ ∈ argmin_{Φx = y} Σₘ |xₘ|    vs.    ℓ²: x⋆ ∈ argmin_{Φx = y} Σₘ |xₘ|²

[Figure: the ℓ¹ ball meets Φx = y at a sparse x⋆; the ℓ² ball does not]

Page 40: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Noiseless Sparse Regularization

Noiseless measurements: y = Φx₀.

ℓ¹: x⋆ ∈ argmin_{Φx = y} Σₘ |xₘ|    vs.    ℓ²: x⋆ ∈ argmin_{Φx = y} Σₘ |xₘ|²

Convex linear program. Interior points, cf. [Chen, Donoho, Saunders] "basis pursuit"; Douglas-Rachford splitting, see [Combettes, Pesquet].
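The basis-pursuit problem is indeed a linear program: splitting x = u − v with u, v ≥ 0 turns min Σₘ|xₘ| s.t. Φx = y into min 1ᵀ(u+v) s.t. Φu − Φv = y. A sketch assuming SciPy's `linprog` is available (problem sizes are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, P = 40, 20
x0 = np.zeros(N)
x0[rng.choice(N, 3, replace=False)] = [1.0, -2.0, 1.5]   # sparse signal
Phi = rng.standard_normal((P, N)) / np.sqrt(P)
y = Phi @ x0                                              # noiseless measurements

# LP: min sum(u + v)  s.t.  Phi u - Phi v = y,  u, v >= 0;  then x = u - v
c = np.ones(2 * N)
A_eq = np.hstack([Phi, -Phi])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_star = res.x[:N] - res.x[N:]
print(np.linalg.norm(x_star - x0))   # recovery error (typically ~0 at this sparsity)
```

The split works because at the optimum u and v are never both positive in the same coordinate, so uₘ + vₘ = |xₘ|.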

Page 41: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Noisy Sparse Regularization

Noisy measurements: y = Φx₀ + w.

x⋆ ∈ argmin_{x ∈ ℝ^Q} ½‖y − Φx‖² + λ‖x‖₁   (data fidelity + regularization)

Page 42: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Noisy Sparse Regularization

Noisy measurements: y = Φx₀ + w.

x⋆ ∈ argmin_{x ∈ ℝ^Q} ½‖y − Φx‖² + λ‖x‖₁   (data fidelity + regularization)

Equivalent constrained form: x⋆ ∈ argmin_{‖Φx − y‖ ≤ ε} ‖x‖₁, with an equivalence λ ↔ ε.

Page 43: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Noisy Sparse Regularization

Noisy measurements: y = Φx₀ + w.

x⋆ ∈ argmin_{x ∈ ℝ^Q} ½‖y − Φx‖² + λ‖x‖₁   (data fidelity + regularization)

Equivalent constrained form: x⋆ ∈ argmin_{‖Φx − y‖ ≤ ε} ‖x‖₁, with an equivalence λ ↔ ε.

Algorithms: iterative soft thresholding (forward-backward splitting), see [Daubechies et al.], [Pesquet et al.]; Nesterov multi-step schemes.
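A minimal iterative-soft-thresholding sketch: a gradient step on the data fidelity followed by the soft-thresholding proximal step, with step size τ = 1/‖Φ‖². The test problem below is made up for illustration:

```python
import numpy as np

def soft_threshold(x, t):
    """Prox of t*||.||_1: shrink each entry toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(Phi, y, lam, n_iter=500):
    """Iterative soft thresholding for min_x 1/2 ||y - Phi x||^2 + lam ||x||_1."""
    tau = 1.0 / np.linalg.norm(Phi, 2) ** 2   # step size <= 1 / ||Phi||^2
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        # Forward (gradient) step on the fidelity, backward (prox) step on l1
        x = soft_threshold(x + tau * Phi.T @ (y - Phi @ x), tau * lam)
    return x

rng = np.random.default_rng(0)
Phi = rng.standard_normal((30, 60)) / np.sqrt(30)
x0 = np.zeros(60)
x0[[5, 20, 45]] = [2.0, -1.5, 1.0]                # sparse ground truth
y = Phi @ x0 + 0.01 * rng.standard_normal(30)     # noisy measurements
x_star = ista(Phi, y, lam=0.02)
```

With τ ≤ 1/‖Φ‖² each iteration decreases the objective, which is what the scheme's convergence analysis relies on.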

Page 44: Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

Inpainting Problem

Measurements: y = Kf₀ + w, where

(Kf)ᵢ = 0 if i ∈ Ω, fᵢ if i ∉ Ω.
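The masking operator K can be sketched directly; the mask density and image size below are illustrative assumptions:

```python
import numpy as np

def inpainting_operator(f, mask):
    """(Kf)_i = 0 if i in Omega (mask True), f_i otherwise."""
    return np.where(mask, 0.0, f)

rng = np.random.default_rng(0)
f0 = rng.standard_normal((8, 8))     # image
mask = rng.random((8, 8)) < 0.3      # Omega: roughly 30% missing pixels
y = inpainting_operator(f0, mask)    # observed image, zeroed on Omega
assert np.all(y[mask] == 0) and np.all(y[~mask] == f0[~mask])
```

K is a diagonal projector (K² = K), which is why inpainting is a prototypical ill-posed linear inverse problem: the pixels in Ω are simply lost.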