
Introduction to Compressed Sensing

Gitta Kutyniok

(Institut für Mathematik, Technische Universität Berlin)

Winter School on “Compressed Sensing”, TU Berlin, December 3–5, 2015

Gitta Kutyniok (TU Berlin) Introduction to Compressed Sensing Winter School 2015 1 / 40

Outline

1 Modern Data Processing
   Data Deluge
   Information Content of Data
   Why do we need Compressed Sensing?

2 Main Ideas of Compressed Sensing
   Sparsity
   Measurement Matrices
   Recovery Algorithms

3 Applications

4 This Winter School


The Age of Data

Problem of the 21st Century:

We live in a digitalized world.

Slogan: “Big Data”.

New technologies produce/sense enormous amounts of data.

Problems: Storage, Transmission, and Analysis.

“Big Data Research and Development Initiative”

Barack Obama (March 2012)


Olympic Games 2012


Better, Stronger, Faster!


Accelerating Data Deluge

Situation 2010:

1250 Billion Gigabytes generated in 2010:

# digital bits > # stars in the universe

Growing by a factor of 10 every 5 years.

(Chart: total data generated vs. available transmission bandwidth.)

Observations:

Total data generated > total storage

Increases in generation rate >> increases in communication rate


What can we do...?


Quote by Einstein

“Not everything that can be counted counts,

and not everything that counts can be counted.”

Albert Einstein


An Applied Harmonic Analysis Viewpoint

Exploit a carefully designed representation system (ψλ)λ∈Λ ⊆ H:

H ∋ f −→ (⟨f, ψλ⟩)λ∈Λ −→ ∑_{λ∈Λ} ⟨f, ψλ⟩ ψλ = f.

Desiderata:

Special features encoded in the “large” coefficients |⟨f, ψλ⟩|.

Efficient representations:

f ≈ ∑_{λ∈ΛN} ⟨f, ψλ⟩ ψλ, with #(ΛN) small.
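The k-term approximation above can be sketched numerically. This is an illustrative stand-in only: a random orthonormal basis plays the role of the representation system (ψλ), and by Parseval the ℓ2 approximation error equals the ℓ2 norm of the discarded coefficients.

```python
import numpy as np

# Sketch: keep only the k largest coefficients <f, psi_lambda> in an
# orthonormal basis. The basis and signal are illustrative choices.
rng = np.random.default_rng(0)
N, k = 64, 8
Psi, _ = np.linalg.qr(rng.standard_normal((N, N)))  # columns = basis vectors

f = Psi[:, :4] @ rng.standard_normal(4)        # f lies in a 4-dim subspace
coeffs = Psi.T @ f                             # analysis: <f, psi_lambda>
idx = np.argsort(np.abs(coeffs))[-k:]          # Lambda_N: indices of k largest
f_approx = Psi[:, idx] @ coeffs[idx]           # synthesis from k terms only

# By Parseval, the l2 error equals the l2 norm of the discarded coefficients.
tail = np.delete(coeffs, idx)
err = np.linalg.norm(f - f_approx)
print(np.isclose(err, np.linalg.norm(tail)))   # True
```

Since f here lies in a 4-dimensional subspace, the 8-term approximation is essentially exact, mirroring the “#(ΛN) small” desideratum.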

Goals:

Derive high compression by considering only the “large” coefficients.

Modification of the coefficients according to the task.



Review of Wavelets for L2(R2)

Definition (1D): Let φ ∈ L2(R) be a scaling function and ψ ∈ L2(R) be a wavelet. Then the associated wavelet system is defined by

{φ(x − m) : m ∈ Z} ∪ {2^{j/2} ψ(2^j x − m) : j ≥ 0, m ∈ Z}.

Definition (2D): A wavelet system is defined by

{φ(1)(x − m) : m ∈ Z²} ∪ {2^j ψ(i)(2^j x − m) : j ≥ 0, m ∈ Z², i = 1, 2, 3},

where
φ(1)(x) = φ(x1)φ(x2), ψ(1)(x) = φ(x1)ψ(x2),
ψ(2)(x) = ψ(x1)φ(x2), ψ(3)(x) = ψ(x1)ψ(x2).


The World is Compressible!

(Figure: an image with N pixels has only k << N large wavelet coefficients; a wideband signal with N samples has only k << N large Gabor coefficients.)


JPEG2000

(Left: compression to 1/20. Right: compression to 1/200.)


The New Paradigm for Data Processing: Sparsity!

Sparse Signals:
A signal x ∈ R^N is k-sparse if

‖x‖0 = #{non-zero coefficients} ≤ k.

Model Σk: union of k-dimensional subspaces.

Compressible Signals:
A signal x ∈ R^N is compressible if the sorted coefficients have rapid (power-law) decay.

Model: ℓp ball with p ≤ 1.

(Plot: sorted magnitudes |xi| decay rapidly; only the first k of N coefficients are significant.)

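The two signal models can be stated directly in code. The signals and the power-law exponent below are illustrative choices, not from the slides:

```python
import numpy as np

# Sparse model: ||x||_0 <= k (at most k non-zero entries).
rng = np.random.default_rng(1)
N, k = 100, 5
x_sparse = np.zeros(N)
x_sparse[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
l0 = np.count_nonzero(x_sparse)                  # ||x||_0

# Compressible model: sorted magnitudes decay like a power law i^(-q).
x_comp = rng.standard_normal(N) * np.arange(1, N + 1) ** (-2.0)
sorted_mag = np.sort(np.abs(x_comp))[::-1]       # decreasing rearrangement

print(l0 <= k)                                   # True: the model constraint
print(bool(np.all(np.diff(sorted_mag) <= 0)))    # True: magnitudes decay
```

For the compressible signal, thresholding to its k largest entries already captures most of its energy, which is what the plot on this slide depicts.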


“Not everything that can be counted counts...” (Einstein)

Classical Approach:

(Diagram: x → Sensing/Sampling (N samples) → Compression (k values) → Reconstruction → x.)

Sensing/Sampling: linear processing.

Compression: non-linear processing.

Why acquire N samples only to discard all but k pieces of data?

Fundamental Idea:

Directly acquire “compressed data”, i.e., the information content.

Take more universal measurements:

(Diagram: x → Compressed Sensing (n measurements, k < n << N) → Reconstruction → x.)



Compressed Sensing enters the Stage

‘Initial’ Papers:

E. Candès, J. Romberg, T. Tao, Stable signal recovery from incomplete and inaccurate measurements, Comm. Pure Appl. Math. 59 (2006), 1207–1223.

D. Donoho, Compressed sensing, IEEE Trans. Inform. Theory 52 (2006), 1289–1306.

Avalanche of Results (dsp.rice.edu/cs):

Approx. 2000 papers and 150 conferences so far.

Relation to the following areas:

Applied harmonic analysis.

Applied linear algebra.

Convex optimization.

Geometric functional analysis.

Random matrix theory.

Application areas: Radar, Astronomy, Biology, Seismology, Signal processing, and more.


What is Compressed Sensing...?



Compressed Sensing Problem, I

General Procedure:

Signal x ∈ R^N.

x is k-sparse.

Take n << N linear, non-adaptive measurements using a matrix A.

(Diagram: y = Ax, with A an n × N matrix and x k-sparse.)

Viewpoints:

Efficient sampling.

Dimension reduction.

Efficient representation.


Compressed Sensing Problem, II

(Diagram: y = Ax.)

Fundamental Questions:

What are suitable signal models?

When and with which accuracy can the signal be recovered?

What are suitable sensing matrices?

How can the signal be algorithmically recovered?



Fundamental Theorem of Sparse Solutions

Definition: Let A be an n × N matrix. Then spark(A) denotes the minimal number of linearly dependent columns; spark(A) ∈ [2, n + 1].

Lemma: Let A be an n × N matrix, and let k ∈ N. Then the following conditions are equivalent:

(i) For every y ∈ R^n, there exists at most one x ∈ R^N with ‖x‖0 ≤ k such that y = Ax.

(ii) k < spark(A)/2.

Sketch of Proof:

Assume y = Ax0 = Ax1 with ‖x0‖0, ‖x1‖0 ≤ k.

Then x0 − x1 ∈ N(A) with ‖x0 − x1‖0 ≤ 2k < spark(A), which forces x0 − x1 = 0.
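For tiny matrices, spark(A) can be computed by brute force, which makes the lemma concrete; the search is exponential in N, so this is a sketch for illustration only.

```python
import numpy as np
from itertools import combinations

# Brute-force spark(A): the smallest number of linearly dependent columns.
def spark(A):
    n, N = A.shape
    for size in range(1, n + 2):
        for cols in combinations(range(N), size):
            if np.linalg.matrix_rank(A[:, list(cols)]) < size:
                return size
    return n + 1  # fallback: any n+1 columns in R^n are dependent (N > n)

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])  # third column = sum of the first two
print(spark(A))                  # 3: every pair is independent, the triple is not
# By the lemma, k-sparse solutions of y = Ax are unique iff k < spark(A)/2,
# i.e. here only for k = 1.
```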



Sparsity and ℓ1

Assumption: Let A be an n × N matrix, n << N. The sought solution x0 of y = Ax0 satisfies:

‖x0‖0 = #{i : x0,i ≠ 0} is ‘small’, i.e., x0 is sparse.

Ideal: Solve...

(P0)  min_x ‖x‖0 subject to y = Ax

Basis Pursuit (Chen, Donoho, Saunders; 1998):

(P1)  min_x ‖x‖1 subject to y = Ax

−→ This can be solved by linear programming!

Meta-Result: If the solution x0 is sufficiently sparse, and A is sufficiently incoherent, then x0 can be recovered from y via (P1).
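The standard LP reformulation of (P1) can be sketched as follows: write x = u − v with u, v ≥ 0, so that ‖x‖1 = ∑(u + v) and y = A(u − v). The matrix sizes and seed below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Basis pursuit (P1) as a linear program over (u, v) >= 0 with x = u - v.
rng = np.random.default_rng(2)
n, N, k = 10, 20, 2
A = rng.standard_normal((n, N))
x0 = np.zeros(N)
x0[rng.choice(N, k, replace=False)] = np.array([1.5, -2.0])
y = A @ x0

res = linprog(c=np.ones(2 * N),
              A_eq=np.hstack([A, -A]), b_eq=y,
              bounds=[(0, None)] * (2 * N))
x_hat = res.x[:N] - res.x[N:]

print(np.linalg.norm(A @ x_hat - y) < 1e-6)                  # feasible
print(np.abs(x_hat).sum() <= np.abs(x0).sum() + 1e-6)        # l1-optimal
```

The two checks hold by LP correctness alone: the minimizer is feasible, and its ℓ1 norm cannot exceed that of the feasible point x0. Exact recovery of x0 itself is what the meta-result promises under sufficient sparsity and incoherence.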


ℓ1 promotes Sparsity!

(Figure: the affine solution set {x : y = Ax}. The ℓ2 ball (min ‖x‖2 s.t. y = Ax) touches it at a generic, dense point; the ℓ1 ball (min ‖x‖1 s.t. y = Ax) touches it at a corner, i.e., a sparse point.)



Equivalent Condition for Uniqueness of ℓ1

Reminder: spark(A) = min{k : N(A) ∩ Σk ≠ {0}}.

Definition: Let A be an n × N matrix. Then A has the null space property of order k if, for all h ∈ N(A) \ {0} and for all index sets Λ with |Λ| ≤ k,

‖1Λ h‖1 < (1/2) ‖h‖1.

Theorem (Cohen, Dahmen, DeVore; 2008): Let A be an n × N matrix, and let k ∈ N. The following are equivalent:

(i) For every y ∈ R^n, there exists at most one solution in Σk of min_x ‖x‖1 subject to y = Ax.

(ii) A satisfies the null space property of order k.
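The null space property can be verified directly for a tiny matrix. The sketch below assumes a one-dimensional null space, so checking one spanning vector suffices (the inequality is invariant under sign and scale); the matrix is the same illustrative 2 × 3 example used for spark.

```python
import numpy as np
from itertools import combinations

# Check NSP of order k: ||h restricted to Lambda||_1 < (1/2)||h||_1
# for every h in N(A) \ {0} and every |Lambda| <= k.
def has_nsp(A, k):
    _, _, Vt = np.linalg.svd(A)
    null_basis = Vt[A.shape[0]:]      # rows span N(A) (A assumed full rank)
    assert null_basis.shape[0] == 1   # this sketch handles dim N(A) = 1 only
    h = null_basis[0]
    N = A.shape[1]
    return all(np.abs(h[list(S)]).sum() < 0.5 * np.abs(h).sum()
               for size in range(1, k + 1)
               for S in combinations(range(N), size))

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])       # N(A) spanned by (1, 1, -1)
print(has_nsp(A, 1))                  # True: each |h_i| is 1/3 of ||h||_1
print(has_nsp(A, 2))                  # False: two entries carry 2/3 of ||h||_1
```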



Sufficient Condition for ‘ℓ0 = ℓ1’: Coherence

Definition: Let A = (ai)_{i=1}^N be an n × N matrix. Then its coherence µ(A) is

µ(A) = max_{i ≠ j} |⟨ai, aj⟩| / (‖ai‖2 ‖aj‖2) ∈ [ √((N − n)/(n(N − 1))), 1 ].

Theorem (Elad, Bruckstein; 2002), (Donoho, Elad; 2003): Let A be an n × N matrix, and let x0 ∈ R^N \ {0} satisfy

‖x0‖0 < (1/2) (1 + µ(A)^{−1}).

Then x0 is the unique solution of both

min_x ‖x‖0 s.t. Ax0 = Ax   and   min_x ‖x‖1 s.t. Ax0 = Ax.
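Coherence is cheap to compute, and the Welch lower bound √((N − n)/(n(N − 1))) from the definition can be checked numerically; matrix sizes here are illustrative.

```python
import numpy as np

# Coherence mu(A): largest absolute inner product between distinct
# normalized columns, read off the Gram matrix.
rng = np.random.default_rng(3)
n, N = 8, 16
A = rng.standard_normal((n, N))
An = A / np.linalg.norm(A, axis=0)        # normalize columns
G = An.T @ An
mu = np.max(np.abs(G - np.eye(N)))        # max off-diagonal |<a_i, a_j>|

welch = np.sqrt((N - n) / (n * (N - 1)))  # Welch lower bound
print(welch <= mu <= 1.0)                 # coherence lies in [welch, 1]

# The theorem then guarantees recovery up to sparsity (1 + 1/mu)/2:
k_max = int(np.floor((1 + 1 / mu) / 2 - 1e-12))
print(k_max >= 1)
```

A random Gaussian matrix typically sits well above the Welch bound; equiangular tight frames achieve it.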



Sufficient Condition for ‘ℓ0 = ℓ1’: RIP

Again Key Idea: Sparsity

Our signal is k-sparse:

(Diagram: y = Φx; since x is k-sparse, Φ acts on it effectively as an n × k matrix.)

=⇒ Design Φ so that each of its n × k submatrices has full rank!

Definition: Let A be an n × N matrix. Then A has the Restricted Isometry Property (RIP) of order k if there exists δk ∈ (0, 1) with

(1 − δk) ‖x‖2² ≤ ‖Ax‖2² ≤ (1 + δk) ‖x‖2²  for all x ∈ Σk.
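For small dimensions the RIP constant δk can be bounded by exhaustive search: over every support of size k, the extreme squared singular values of the corresponding n × k submatrix give the tightest δ in the definition. Sizes and the 1/√n scaling below are illustrative assumptions.

```python
import numpy as np
from itertools import combinations

# Exhaustive empirical bound on delta_k for a small Gaussian matrix.
rng = np.random.default_rng(4)
n, N, k = 12, 16, 2
A = rng.standard_normal((n, N)) / np.sqrt(n)  # scaling so E||Ax||^2 = ||x||^2

lo, hi = np.inf, 0.0
for S in combinations(range(N), k):
    s = np.linalg.svd(A[:, list(S)], compute_uv=False)
    lo = min(lo, s[-1] ** 2)                  # worst shrinkage over supports
    hi = max(hi, s[0] ** 2)                   # worst expansion over supports

delta_k = max(1.0 - lo, hi - 1.0)  # smallest delta satisfying the definition
print(lo > 0.0)                    # every n x k submatrix has full rank
print(delta_k > 0.0)
```

The combinatorial blow-up of this search is exactly why verifying the RIP is hard in general, as the next slide notes.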



Restricted Isometry Property (RIP)

Stable Embedding:

Φ shall preserve the geometry of the set of sparse signals:

(Figure: Φ maps sparse signals x1, x2 to Φ(x1), Φ(x2) with distances nearly preserved.)

Restricted Isometry Property:

‖x1 − x2‖ ≈ ‖Φ(x1)− Φ(x2)‖.

But this is a combinatorial NP-hard design problem!


Insight from Banach Space Theory

General Approach to RIP:

Based on work by Garnaev, Gluskin, and Kashin (1977 & 1984).

Design Φ to be a random matrix, e.g.
  - Gaussian i.i.d.
  - Bernoulli (±1) i.i.d.
  - ...

Such matrices Φ have the Restricted Isometry Property with high probability, if

n = O(k · log(N/k)) << N.


Sufficient Condition for ‘ℓ0 = ℓ1’: RIP

Theorem (Cohen, Dahmen, DeVore; 2008), (Candès; 2008): Let A be an n × N matrix which satisfies the RIP of order 2k with δ2k < √2 − 1, and let x0 ∈ R^N. Then the solution x of

min_x ‖x‖1 subject to Ax0 = Ax

satisfies

‖x0 − x‖2 ≤ C · σk(x0)1 / √k,

where σk(x0)1 is the ℓ1-error of best k-term approximation to x0, i.e.,

σk(x0)1 := inf_{y∈Σk} ‖x0 − y‖1.
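The quantity σk(x0)1 is easy to compute: the best k-term approximation keeps the k largest-magnitude entries, so the error is the ℓ1 norm of the remaining tail. The vectors below are illustrative.

```python
import numpy as np

# sigma_k(x)_1: l1 norm of everything outside the k largest-magnitude entries.
def sigma_k_l1(x, k):
    mags = np.sort(np.abs(x))          # ascending magnitudes
    return mags[:len(x) - k].sum()     # drop the k largest, sum the rest

x0 = np.array([5.0, -0.1, 3.0, 0.2, -0.05])
print(sigma_k_l1(x0, 2))               # tail beyond the two largest (~0.35)

# For an exactly k-sparse x0 the tail vanishes, so the error bound in the
# theorem gives exact recovery: sigma_k(x0)_1 = 0.
print(sigma_k_l1(np.array([1.0, 0.0, 2.0, 0.0]), 2))  # 0.0
```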


Sensing Matrices and Recovery Algorithms...


Sensing Matrices

Deterministic Matrices:

n × N Vandermonde matrix:

spark(A) = n + 1, but poorly conditioned.

n × n² equiangular tight frames (Strohmer, Heath; 2003):

µ(A) = 1/√n, then n = O(k² log N), but N = n².

n × N matrices (Bourgain, DeVore, Haupt, et al.; 2007–):

n ≳ k^{2−µ}, but µ is very small.


Sensing Matrices

Random Matrices:

n × N matrix with i.i.d. entries:

spark(A) = n + 1 with probability 1.

n × N matrix with subgaussian distribution (Candès, Donoho, et al.; 2006–): If

n = O(k log(N/k)),

then A satisfies the RIP of order 2k with probability at least

1 − 2e^{−c·n} (‘overwhelmingly high probability’).

Question: How far can we get with deterministic matrices?


Sparse Recovery Algorithms: ℓ1 Minimization

Convex Problem:

min_x ‖x‖1 subject to y = Ax

Convex problem with a conic constraint:

min_x ‖x‖1 subject to ‖Ax − y‖2² ≤ ε

−→ Specialized algorithms for Compressed Sensing!
−→ www.acm.caltech.edu/l1magic and sparselab.stanford.edu

Equivalent unconstrained version:

min_x (1/2) ‖Ax − y‖2² + λ ‖x‖1
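The unconstrained version can be attacked with iterative soft thresholding (ISTA), a basic proximal-gradient method not named on this slide but standard for this objective; data, λ, and iteration count below are illustrative. With step size 1/L, L = ‖A‖2², the objective never increases.

```python
import numpy as np

# ISTA for min_x (1/2)||Ax - y||_2^2 + lam * ||x||_1.
rng = np.random.default_rng(5)
n, N = 15, 30
A = rng.standard_normal((n, N))
x0 = np.zeros(N); x0[[3, 17]] = [2.0, -1.5]
y = A @ x0
lam, t = 0.1, 1.0 / np.linalg.norm(A, 2) ** 2   # step 1/L, L = ||A||_2^2

def objective(x):
    return 0.5 * np.sum((A @ x - y) ** 2) + lam * np.abs(x).sum()

x = np.zeros(N)
obj_start = objective(x)
for _ in range(300):
    g = x - t * A.T @ (A @ x - y)                        # gradient step
    x = np.sign(g) * np.maximum(np.abs(g) - lam * t, 0)  # soft thresholding
print(objective(x) < obj_start)   # True: monotone decrease from the zero start
```

The soft-thresholding step is exactly the proximal map of λ‖·‖1, which is what makes the ℓ1 term tractable despite being non-smooth.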


Sparse Recovery Algorithms: Greedy and Combinatorial

Greedy Algorithms:

Orthogonal Matching Pursuit

Iterative Thresholding

...

Combinatorial Algorithms:

Combinatorial group testing

Data streams

...
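Orthogonal Matching Pursuit, the first greedy method listed above, fits in a few lines: greedily pick the column most correlated with the residual, then re-fit by least squares on the current support. The test signal and matrix are illustrative; for a 1-sparse signal with normalized columns and coherence below 1, the first greedy pick is provably correct, so recovery is exact.

```python
import numpy as np

# A minimal Orthogonal Matching Pursuit sketch.
def omp(A, y, k):
    support, residual = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))  # best-matching column
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef          # re-fit, update residual
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(6)
A = rng.standard_normal((10, 25))
A /= np.linalg.norm(A, axis=0)   # normalized columns
x0 = np.zeros(25); x0[7] = 3.0   # 1-sparse signal
x_hat = omp(A, A @ x0, k=1)
print(np.allclose(x_hat, x0))    # True: exact recovery
```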


Compressed Sensing in Action...


Application Areas of Compressed Sensing

(Diagram: Compressed Sensing at the center, connected to: Imaging Sciences, Radar Technology, Communications Theory, Information Theory, Biology, Geology/Seismology, Astronomy, Optics, Business, Remote Sensing, Compression/Dimension Reduction, Medicine.)


Further Applications

Astronomy: Cosmic Microwave Background, Planck mission, ...

Communication: channel estimation, (sensor) networks, ...

Computational Biology: DNA microarrays, ...

Geophysical Data Analysis: seismic data recovery, wavefield extrapolation, ...

Photography: single-pixel camera, ...

Physics: simulation of atomic systems, quantum state tomography, ...

...


Topics of this Winter School


Winter School on “Compressed Sensing”

Topics:

Model of sparse vectors → Model of low-rank matrices (Rachel Ward)

Model of sparse vectors → Model of sparse lattice vectors (Axel Flinth)

Measurement matrices → Structured random matrices (Holger Rauhut)

Measurements → Non-linearity (Roman Vershynin and Rachel Ward)

Application/Extension: Recovery of high-dimensional functions (Massimo Fornasier)

Applications: Data separation, missing data recovery & Fourier data (Gitta Kutyniok)

Applications: Proteomics analysis & MRI (Martin Genzel & Jackie Ma)


Let’s conclude...


Conclusions

Sparsity is a natural model for signals.

Compressed Sensing: Sparse high-dimensional signals can be recovered efficiently from a small set of linear, non-adaptive measurements!

Various connections to different areas inside mathematics and across disciplines.

Examples of applications:
  - Astronomy
  - Biology
  - Communication
  - Radar
  - ...

Compressed Sensing for Future Technologies:

Great potential, but a wide-open field!


Technische Universität Berlin, Applied Functional Analysis Group

THANK YOU!

Contact:

www.math.tu-berlin.de/∼kutyniok

Code available at: www.ShearLab.org

Related Books:

Y. Eldar and G. Kutyniok, Compressed Sensing: Theory and Applications, Cambridge University Press, 2012.

S. Foucart and H. Rauhut, A Mathematical Introduction to Compressive Sensing, Birkhäuser-Springer, 2013.

