Compressive Sensing and Beyond. Sohail Bahmani, Georgia Tech.

Source: mlsp.cs.cmu.edu/courses/fall2014/lectures/slides/Sohail... · 2014-10-21


Page 1

Compressive Sensing and Beyond

Sohail Bahmani

Georgia Tech.

Page 2

Page 3

Compressed Sensing

Signal Processing

Page 4

Signal Models

Page 5

The Sampling Theorem

Any signal with bandwidth 𝐵 can be recovered from its samples collected uniformly at a rate no less than 𝑓s = 2𝐵.

Classics: bandlimited

Claude Shannon Harry Nyquist

$$x(t) = \sum_{n\in\mathbb{Z}} x_n \,\operatorname{sinc}(f_s t - n), \qquad x_n = \langle x(t),\, \operatorname{sinc}(f_s t - n)\rangle$$

[Figure: the sinc interpolation kernel plotted on −3 ≤ t ≤ 3.]
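As a quick illustration, here is a minimal NumPy sketch of the interpolation formula above; the toy bandlimited signal, the truncation of the sum to finitely many samples, and the sampling rate are illustrative assumptions, not taken from the slides.

    import numpy as np

    fs = 2.0                                   # sampling rate, assumed to satisfy fs >= 2B
    n = np.arange(-50, 51)                     # finite window of sample indices (truncates the ideal sum)
    x_samples = np.sinc(0.6 * n / fs)          # toy bandlimited signal (B = 0.3) sampled at t = n / fs
    t = np.linspace(-5.0, 5.0, 1001)

    # Shannon interpolation: x(t) = sum_n x[n] * sinc(fs * t - n)
    x_rec = np.sum(x_samples[:, None] * np.sinc(fs * t[None, :] - n[:, None]), axis=0)
    print(np.max(np.abs(x_rec - np.sinc(0.6 * t))))   # small, apart from truncation effects near the edges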

Page 6

Short-time Fourier transform:
$$X(\tau,\omega) = \langle x(t),\, w(t-\tau)\, e^{\mathrm{j}\omega t}\rangle, \qquad x(t) = \frac{1}{2\pi}\int_{\mathbb{R}}\int_{\mathbb{R}} X(\tau,\omega)\, e^{\mathrm{j}\omega t}\, \mathrm{d}\omega\, \mathrm{d}\tau$$

Wavelet expansion:
$$\psi_{m,n}(t) = a^{-m/2}\,\psi\!\left(a^{-m} t - n b\right), \qquad x(t) = \sum_{m\in\mathbb{Z}}\sum_{n\in\mathbb{Z}} \langle x(t),\, \psi_{m,n}(t)\rangle\, \psi_{m,n}(t)$$

Classics: time-frequency

A. Haar, I. Daubechies, R. Coifman, D. Gabor

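For the wavelet expansion, a small sketch using the PyWavelets package; the test signal and the choice of the 'db4' wavelet with 5 levels are assumptions made for illustration.

    import numpy as np
    import pywt  # PyWavelets

    t = np.linspace(0.0, 1.0, 1024)
    x = np.sin(2 * np.pi * 5 * t) + (t > 0.5)      # toy signal: a sinusoid plus a jump

    coeffs = pywt.wavedec(x, 'db4', level=5)       # analysis: coefficients <x, psi_{m,n}>
    x_rec = pywt.waverec(coeffs, 'db4')            # synthesis: sum of coefficients times psi_{m,n}
    print(np.allclose(x, x_rec[:len(x)]))          # perfect reconstruction up to numerical error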

Page 7

Sparsity

[Figure: a signal 𝒙 and its transform coefficients 𝑭𝒙; only the few coefficients 𝑭Ω𝒙 on a small support Ω are significant, so 𝒙 is well represented by a sparse coefficient vector.]

T. Tao, E. Candès, J. Romberg, D. Donoho

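A small sketch of this sparsity model, assuming the DCT as the sparsifying transform 𝑭 and a toy two-tone signal (both hypothetical choices): keeping only the few largest transform coefficients already captures most of the signal energy.

    import numpy as np
    from scipy.fft import dct, idct

    t = np.arange(256) / 256
    x = np.cos(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)   # smooth toy signal

    c = dct(x, norm='ortho')                     # transform coefficients (the role of F x)
    Omega = np.argsort(np.abs(c))[-10:]          # support of the 10 largest coefficients
    c_sparse = np.zeros_like(c)
    c_sparse[Omega] = c[Omega]                   # keep only the coefficients on Omega
    x_approx = idct(c_sparse, norm='ortho')
    print(np.linalg.norm(x - x_approx) / np.linalg.norm(x))   # small relative error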

Page 8

Structured sparsity

Discrete Wavelet Transform

Tree Structured Models Graph Structured Models

Protein-Protein Interactions Gene Regulatory Network

Couch Lab, George Mason University, http://osf1.gmu.edu/~rcouch/protprot.jpg

Page 9

Other applications

minimum order linear system realization

low-rank matrix completion

low-dimensional Euclidean embedding

Low rank

B. Recht, M. Fazel, P. Parrilo

Model for Images


Page 10

Compressive Sensing

Page 11

Linear inverse problems

$$\boldsymbol{A}\boldsymbol{x} = \boldsymbol{y}, \qquad \boldsymbol{x} = \boldsymbol{A}^{\dagger}\boldsymbol{y}$$

[Figure: the measurement model $\boldsymbol{y} = \boldsymbol{A}\boldsymbol{x}^\star$ with a wide $m \times n$ matrix $\boldsymbol{A}$.]

$m < n$: no unique inverse, but …


Page 12

$$\operatorname*{argmin}_{\boldsymbol{x}} \; \|\boldsymbol{x}\|_0 \quad \text{subject to} \quad \boldsymbol{A}\boldsymbol{x} = \boldsymbol{y}$$

$$\|\boldsymbol{x}\|_0 = |\operatorname{supp}(\boldsymbol{x})|, \qquad \operatorname{supp}(\boldsymbol{x}) = \{\, i : x_i \neq 0 \,\}$$

Generally NP-hard!

Tractable for many interesting 𝑨

Sparse solutions

[Figure: two sparse vectors 𝒙₁ and 𝒙₂ and their measurements 𝑨𝒙₁, 𝑨𝒙₂.]

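To see why the ℓ0 program is combinatorial, here is a hedged sketch that solves it by exhaustive enumeration of supports; it is only feasible for very small problems and is meant to illustrate the point, not to serve as a practical solver. The toy dimensions are arbitrary.

    import numpy as np
    from itertools import combinations

    def l0_min(A, y, tol=1e-9):
        """Exhaustive l0 'solver': try supports of growing size until A x = y is satisfied."""
        m, n = A.shape
        for s in range(n + 1):
            for S in map(list, combinations(range(n), s)):
                x = np.zeros(n)
                if s > 0:
                    sol, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
                    x[S] = sol
                if np.linalg.norm(A @ x - y) <= tol:
                    return x                      # first feasible x has minimal support size
        return None

    rng = np.random.default_rng(0)
    A = rng.standard_normal((6, 10))              # m = 6 measurements, n = 10 unknowns
    x_true = np.zeros(10)
    x_true[[2, 7]] = [1.0, -2.0]                  # a 2-sparse target
    y = A @ x_true
    print(np.allclose(l0_min(A, y), x_true))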

Page 13

Many sufficient conditions proposed

Nullspace property, Incoherence, Restricted Eigenvalue, …

Restricted Isometry Property

$$(1 - \delta_k)\,\|\boldsymbol{x}\|_2^2 \;\le\; \|\boldsymbol{A}\boldsymbol{x}\|_2^2 \;\le\; (1 + \delta_k)\,\|\boldsymbol{x}\|_2^2, \qquad \forall\, \boldsymbol{x} : \|\boldsymbol{x}\|_0 \le k$$

Specialize 𝑨

[Figure: under the RIP, the map 𝒙 ↦ 𝑨𝒙 preserves the geometry of sparse vectors 𝒙₁, 𝒙₂ up to a factor 1 ± 𝛿.]


Page 14

Randomness comes to the rescue: random matrices can exhibit the RIP.

Random matrices with iid entries

Gaussian, Rademacher (symmetric Bernoulli), Uniform

Structured random matrices

Random partial Fourier matrices

Random circulant matrices

RIP holds with high probability if $m \gtrsim k \log^{\gamma} n$

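The RIP is hard to certify for a given matrix, but its concentration behaviour is easy to probe empirically. The sketch below samples random k-sparse unit vectors and records ‖𝑨𝒙‖₂² for a Gaussian matrix with the usual 1/√m scaling; the dimensions are arbitrary choices, and this gives an empirical feel for the RIP constant rather than a certificate.

    import numpy as np

    rng = np.random.default_rng(0)
    m, n, k = 80, 256, 5
    A = rng.standard_normal((m, n)) / np.sqrt(m)      # iid Gaussian with 1/sqrt(m) scaling

    ratios = []
    for _ in range(2000):
        S = rng.choice(n, size=k, replace=False)      # random k-sparse support
        x = np.zeros(n)
        x[S] = rng.standard_normal(k)
        x /= np.linalg.norm(x)                        # unit-norm k-sparse vector
        ratios.append(np.linalg.norm(A @ x) ** 2)     # should concentrate in [1 - delta_k, 1 + delta_k]
    print(min(ratios), max(ratios))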

Page 15

$$\operatorname*{argmin}_{\boldsymbol{x}} \; \|\boldsymbol{x}\|_1 \quad \text{subject to} \quad \boldsymbol{A}\boldsymbol{x} = \boldsymbol{y}, \qquad \|\boldsymbol{x}\|_1 = \sum_i |x_i|$$

Theorem [Candès]. If $\boldsymbol{A}$ satisfies the $\delta_{2k}$-RIP with $\delta_{2k} < \sqrt{2} - 1$, then BP recovers any $k$-sparse target exactly.

Basis Pursuit (ℓ𝟏-minimization)


Page 16

Basis Pursuit (ℓ𝟏-minimization)

[Figure: a length-1000 sparse signal and its ℓ1 reconstruction (value vs. index), computed with CVX.]

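The slide's demo was produced with CVX; an analogous sketch in Python with CVXPY (a stand-in for CVX, on synthetic data with arbitrary dimensions) would look roughly like this.

    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    m, n, k = 60, 200, 8
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    y = A @ x_true

    x = cp.Variable(n)
    problem = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == y])   # Basis Pursuit
    problem.solve()
    print(np.linalg.norm(x.value - x_true))          # essentially exact recovery for this regime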

Page 17

OMP, StOMP, IHT, CoSaMP, SP, …

Iteratively estimate the support and the values on the support

Many of them have RIP-based convergence guarantees

CoSaMP (Needell and Tropp)

Proxy vector $\boldsymbol{z} = \boldsymbol{A}^{\mathrm T}(\boldsymbol{A}\boldsymbol{x} - \boldsymbol{y})$

$\mathcal{Z} = \operatorname{supp}(\boldsymbol{z}_{2k})$

$\boldsymbol{b} = \operatorname*{argmin}_{\boldsymbol{x}} \|\boldsymbol{A}\boldsymbol{x} - \boldsymbol{y}\|_2^2 \;\;\text{s.t.}\;\; \boldsymbol{x}\!\mid_{\mathcal{T}^{\mathrm c}} = \boldsymbol{0}$

Converges with $\delta_{4k} \le 0.1$

Greedy algorithms

[Diagram: CoSaMP iteration. 1) proxy 𝒛; 2) 𝒵 = supp(𝒛₂ₖ); 3) 𝒯 = 𝒵 ∪ supp(𝒙); 4) least-squares estimate 𝒃 on 𝒯; 5) update 𝒙 ← 𝒃ₖ.]

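A compact NumPy sketch of the CoSaMP iteration outlined above, with the restricted least-squares step solved by numpy.linalg.lstsq; the iteration count and the synthetic test problem are illustrative choices.

    import numpy as np

    def cosamp(A, y, k, n_iter=30):
        """Sketch of the CoSaMP iteration of Needell and Tropp."""
        m, n = A.shape
        x = np.zeros(n)
        for _ in range(n_iter):
            z = A.T @ (A @ x - y)                              # 1. proxy vector
            Z = np.argsort(np.abs(z))[-2 * k:]                 # 2. 2k largest proxy entries
            T = np.union1d(Z, np.flatnonzero(x)).astype(int)   # 3. merge with the current support
            b = np.zeros(n)
            sol, *_ = np.linalg.lstsq(A[:, T], y, rcond=None)  # 4. least squares restricted to T
            b[T] = sol
            x = np.zeros(n)
            keep = np.argsort(np.abs(b))[-k:]                  # 5. prune to the k largest entries
            x[keep] = b[keep]
        return x

    rng = np.random.default_rng(0)
    m, n, k = 80, 256, 8
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    print(np.linalg.norm(cosamp(A, A @ x_true, k) - x_true))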

Page 18

Rice single-pixel camera

dim. reduction @ 20%, dim. reduction @ 40%, original object

Wakin et al


Page 19

Compressive MRI Lustig et al

[Figure: acquisition speed of CS MRI vs. traditional MRI.]


Page 20

Image super-resolution Yang et al

[Figure: original image vs. CS-based reconstruction; a sparse representation 𝜶 is used with a high-resolution dictionary (𝑫h𝜶) and a low-resolution dictionary (𝑫l𝜶).]


Page 21

Low-Rank Matrix Recovery

Page 22

Simple least squares requires 𝑛1𝑛2 measurements

The number of degrees of freedom is only $r(n_1 + n_2 - r)$

Can we close the gap?

Linear systems w/ low-rank solutions

$$y_i = \langle \boldsymbol{A}_i, \boldsymbol{X}^\star \rangle, \qquad \boldsymbol{X}^\star = \boldsymbol{U}\boldsymbol{V}^*, \quad \boldsymbol{U} \in \mathbb{R}^{n_1 \times r},\ \boldsymbol{V} \in \mathbb{R}^{n_2 \times r}$$


Page 23

$$\operatorname*{argmin}_{\boldsymbol{X}} \; \operatorname{rank}(\boldsymbol{X}) \quad \text{subject to} \quad \mathcal{A}(\boldsymbol{X}) = \boldsymbol{y}$$

If identifiable, it recovers the low-rank solution exactly

Just like ℓ0-minimization, it is generally NP-hard

Special measurement operators 𝒜 admit efficient solvers

Rank minimization


Page 24

𝑨𝑖 with iid entries exhibit low-rank RIP

Gaussian

Rademacher

Uniform

Some structured matrices

Rank-one 𝑨𝑖 = 𝒂𝑖𝒃𝑖∗

𝑨𝑖 with dependent entries

Standard RIP usually doesn’t hold, but other approaches exist

Random measurements

Universal

Instance optimal


Page 25

Measurements $\mathcal{A}(\boldsymbol{X}^\star) = \boldsymbol{y}$ of a low-rank target $\boldsymbol{X}^\star$:

$$\operatorname*{argmin}_{\boldsymbol{X}} \; \|\boldsymbol{X}\|_* \quad \text{subject to} \quad \mathcal{A}(\boldsymbol{X}) = \boldsymbol{y}, \qquad \|\boldsymbol{X}\|_* = \sum_i \sigma_i(\boldsymbol{X})$$

Theorem [Recht, Fazel, Parrilo]. If $\mathcal{A}$ obeys the $\delta_{5r}$-RIP with $\delta_{5r} < 0.1$, then nuclear-norm minimization recovers any rank-$r$ target exactly.

Nuclear-norm minimization

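A hedged CVXPY sketch of nuclear-norm minimization from Gaussian measurements ⟨𝑨ᵢ, 𝑿⟩; the problem sizes are small illustrative choices, and the generic convex solver is only practical at this toy scale.

    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    n1, n2, r, m = 15, 15, 2, 150
    X_true = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))   # rank-r target U V^*
    A_ops = rng.standard_normal((m, n1, n2))                               # Gaussian measurement matrices A_i
    y = np.einsum('kij,ij->k', A_ops, X_true)                              # y_i = <A_i, X*>

    X = cp.Variable((n1, n2))
    constraints = [cp.sum(cp.multiply(A_ops[i], X)) == y[i] for i in range(m)]
    problem = cp.Problem(cp.Minimize(cp.normNuc(X)), constraints)
    problem.solve()
    print(np.linalg.norm(X.value - X_true) / np.linalg.norm(X_true))       # small relative error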

Page 26

Theorem [Candès and Recht]. If $\boldsymbol{M}$ is a rank-$r$ matrix from the "random orthogonal model", then w.h.p. nuclear-norm minimization recovers $\boldsymbol{M}$ exactly from $\mathcal{O}(r\, n^{5/4} \log n)$ uniformly observed entries, where $n = n_1 \vee n_2$.

Matrix completion

Alice   4  5  ?  3  ?  …  ?  1
Bob     ?  2  ?  4  5  …  ?  ?
  ⋮
Yvon    5  ?  4  3  2  …  2  ?
Zelda   3  2  ?  ?  5  …  2  2

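A toy matrix-completion sketch in CVXPY, assuming roughly half of the entries are observed uniformly at random; the sizes, rank, and observation probability are illustrative assumptions.

    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    n1, n2, r = 30, 30, 2
    M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))    # low-rank "ratings" matrix
    mask = (rng.random((n1, n2)) < 0.5).astype(float)                  # ~50% uniformly observed entries

    X = cp.Variable((n1, n2))
    problem = cp.Problem(cp.Minimize(cp.normNuc(X)),
                         [cp.multiply(mask, X) == cp.multiply(mask, M)])   # agree on observed entries
    problem.solve()
    print(np.linalg.norm(X.value - M) / np.linalg.norm(M))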

Page 27

Background Modeling

Ordinary PCA is sensitive to noise and outliers

$$\boldsymbol{M} = \boldsymbol{L} + \boldsymbol{S}$$

$$\operatorname*{argmin}_{\boldsymbol{L},\,\boldsymbol{S}} \; \|\boldsymbol{L}\|_* + \lambda \|\boldsymbol{S}\|_1 \quad \text{subject to} \quad P_{\Omega_{\mathrm{obs}}}(\boldsymbol{L} + \boldsymbol{S}) = \boldsymbol{Y}$$

Robust PCA Candès et al

low-rank component sparse component

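A minimal CVXPY sketch of the Robust PCA program above, assuming all entries are observed (so the constraint reduces to 𝑳 + 𝑺 = 𝑴) and using the λ = 1/√max(n₁, n₂) weighting that is common in the RPCA literature; the synthetic low-rank-plus-sparse data is an illustrative stand-in for background/foreground frames.

    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    n1, n2, r = 30, 30, 2
    L_true = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))   # low-rank part (static background)
    S_true = np.zeros((n1, n2))
    outliers = rng.random((n1, n2)) < 0.05
    S_true[outliers] = 5.0 * rng.standard_normal(outliers.sum())           # sparse part (foreground / outliers)
    M = L_true + S_true

    L = cp.Variable((n1, n2))
    S = cp.Variable((n1, n2))
    lam = 1.0 / np.sqrt(max(n1, n2))
    problem = cp.Problem(cp.Minimize(cp.normNuc(L) + lam * cp.sum(cp.abs(S))),
                         [L + S == M])
    problem.solve()
    print(np.linalg.norm(L.value - L_true) / np.linalg.norm(L_true))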

Page 28

Compressive multiplexing Ahmed and Romberg

$$\Omega \sim R\,(M + W)\,\log^3(MW)$$

$$x_m(t) = \sum_{\omega=-B}^{B} \alpha_m(\omega)\, e^{\mathrm{j}2\pi\omega t}, \qquad W = 2B + 1$$

Page 29

Nonlinear CS

Page 30

Squared error isn’t always appropriate

example: non-Gaussian noise models

$$\operatorname*{argmin}_{\boldsymbol{x}} \; f(\boldsymbol{x}) \quad \text{subject to} \quad \|\boldsymbol{x}\|_0 \le k$$

It’s challenging in its general form

Objectives with certain properties allow approximations

convex relaxation: ℓ1-regularization

greedy methods

Sparsity-constrained minimization


Page 31

Sparse Logistic Regression

$$\operatorname*{argmin}_{\boldsymbol{x}} \; \sum_{i=1}^{m} \Big[ \log\!\big(1 + e^{\boldsymbol{a}_i^{\mathrm T}\boldsymbol{x}}\big) - y_i\, \boldsymbol{a}_i^{\mathrm T}\boldsymbol{x} \Big] \quad \text{subject to} \quad \|\boldsymbol{x}\|_0 \le k, \;\; \|\boldsymbol{x}\|_2 \le 1$$

Gene classification

DNA Microarray

𝒙 : sparse weights for genes

𝒂𝑖 : microarray sample

𝑦𝑖 : binary sample label (healthy/diseased)

model: $p(y \mid \boldsymbol{a}; \boldsymbol{x})$

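In practice the ℓ0 constraint is often replaced by its ℓ1 convex relaxation. Below is a sketch using scikit-learn's ℓ1-penalized logistic regression on synthetic stand-in data; this is the relaxation, not the exact ℓ0/ℓ2-constrained program on the slide, and the data, sparsity level, and regularization strength C are assumptions.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    m, n, k = 100, 500, 10
    A = rng.standard_normal((m, n))                      # stand-in for microarray samples a_i
    x_true = np.zeros(n)
    x_true[:k] = 1.0                                     # a few "relevant genes"
    y = (A @ x_true + 0.5 * rng.standard_normal(m) > 0).astype(int)   # binary labels (healthy/diseased)

    clf = LogisticRegression(penalty='l1', solver='liblinear', C=0.1)   # l1-regularized logistic loss
    clf.fit(A, y)
    print(np.count_nonzero(clf.coef_), "nonzero weights selected")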

Page 32

Photon noise follows a Poisson distribution

$$p(\boldsymbol{y} \mid \boldsymbol{A}; \boldsymbol{x}) = \prod_i \frac{[\boldsymbol{A}\boldsymbol{x}]_i^{\,y_i}}{y_i!}\, e^{-[\boldsymbol{A}\boldsymbol{x}]_i}$$

Physical constraints 𝑨 ≥ 0, 𝑨𝒙 ≥ 0, 𝒙 ≥ 0

SPIRAL-TAP:
$$\operatorname*{argmin}_{\boldsymbol{x} \ge 0} \; -\log p(\boldsymbol{y} \mid \boldsymbol{A}; \boldsymbol{x}) + \tau R(\boldsymbol{x}), \qquad R(\boldsymbol{x}) = \|\boldsymbol{x}\|_1,\ \|\boldsymbol{W}^{\mathrm T}\boldsymbol{x}\|_1,\ \|\boldsymbol{x}\|_{\mathrm{TV}},\ \ldots$$

Imaging under photon noise

Harmany et al

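A small sketch of the data-fidelity term used above, i.e. the Poisson negative log-likelihood and its gradient; this is only the objective, not the SPIRAL-TAP solver itself, and the small dense nonnegative 𝑨 is a toy assumption.

    import numpy as np

    def poisson_nll(x, A, y, eps=1e-12):
        """Negative Poisson log-likelihood -log p(y | A; x), up to the constant sum(log y_i!)."""
        Ax = A @ x
        return np.sum(Ax - y * np.log(Ax + eps))

    def poisson_nll_grad(x, A, y, eps=1e-12):
        """Gradient of the negative log-likelihood: A^T (1 - y / (A x))."""
        Ax = A @ x
        return A.T @ (1.0 - y / (Ax + eps))

    rng = np.random.default_rng(0)
    A = rng.random((50, 20))                  # nonnegative sensing matrix (A >= 0)
    x_true = rng.random(20)                   # nonnegative image (x >= 0)
    y = rng.poisson(A @ x_true)               # photon counts
    print(poisson_nll(x_true, A, y))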

Page 33

Coherent diffractive imaging

* Lima et al., ESRF (www.esrf.eu)

$$f(\boldsymbol{x}) = \big\|\, |\boldsymbol{L}\boldsymbol{F}\boldsymbol{C}\boldsymbol{x}|^2 - \boldsymbol{I} \,\big\|_2^2$$

𝑰: diffracted field intensity

𝑪𝒙: real image

𝑪: shifts of a generating function; 𝑳: low-pass filter; 𝑭: Fourier transform

Szameit et al


Page 34

Generalizes CoSaMP

Proxy vector $\boldsymbol{z} = \nabla f(\boldsymbol{x})$

$\mathcal{Z} = \operatorname{supp}(\boldsymbol{z}_{2k})$

$\boldsymbol{b} = \operatorname*{argmin}_{\boldsymbol{x}} f(\boldsymbol{x}) \;\;\text{s.t.}\;\; \boldsymbol{x}\!\mid_{\mathcal{T}^{\mathrm c}} = \boldsymbol{0}$

Converges under the SRH (Stable Restricted Hessian) condition

the Hessian 𝛻2𝑓 𝒙 is well-conditioned when restricted to sparse subspaces

a generalization of the RIP

Gradient Support Pursuit

[Diagram: GraSP iteration. 1) proxy 𝒛 = ∇f(𝒙); 2) 𝒵 = supp(𝒛₂ₖ); 3) 𝒯 = 𝒵 ∪ supp(𝒙); 4) restricted minimizer 𝒃 on 𝒯; 5) update 𝒙 ← 𝒃ₖ.]

w/ Raj and Boufounos

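A hedged NumPy sketch of the GraSP iteration as listed above. The inner restricted minimization is handled here with scipy.optimize.minimize (BFGS) as a generic solver, which is a substitution for illustration rather than the paper's implementation; the sparse least-squares test problem is likewise a toy choice.

    import numpy as np
    from scipy.optimize import minimize

    def grasp(f, grad_f, n, k, n_iter=10):
        """Sketch of Gradient Support Pursuit for min f(x) s.t. ||x||_0 <= k."""
        x = np.zeros(n)
        for _ in range(n_iter):
            z = grad_f(x)                                        # 1. proxy: the gradient at x
            Z = np.argsort(np.abs(z))[-2 * k:]                   # 2. 2k largest gradient coordinates
            T = np.union1d(Z, np.flatnonzero(x)).astype(int)     # 3. merge with the current support

            def f_T(u):                                          # f restricted to the coordinates in T
                v = np.zeros(n)
                v[T] = u
                return f(v)

            b = np.zeros(n)
            b[T] = minimize(f_T, x[T], method='BFGS').x          # 4. minimize f over T (generic inner solver)
            x = np.zeros(n)
            keep = np.argsort(np.abs(b))[-k:]                    # 5. prune to the k largest entries
            x[keep] = b[keep]
        return x

    # toy use: sparse least squares, f(x) = 0.5 ||A x - y||^2
    rng = np.random.default_rng(0)
    m, n, k = 60, 200, 5
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    y = A @ x_true
    f = lambda x: 0.5 * np.linalg.norm(A @ x - y) ** 2
    grad_f = lambda x: A.T @ (A @ x - y)
    print(np.linalg.norm(grasp(f, grad_f, n, k) - x_true))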

Page 35

There’s much more …
