Sparsity and Compressed Sensing


DESCRIPTION

Slides of the lectures given at the summer school "Biomedical Image Analysis Summer School: Modalities, Methodologies & Clinical Research", Centrale Paris, Paris, July 9-13, 2012.


1. Sparsity and Compressed Sensing
Gabriel Peyré
www.numerical-tours.com

2. Overview
• Inverse Problems Regularization
• Sparse Synthesis Regularization
• Theoretical Recovery Guarantees
• Compressed Sensing: RIP and Polytopes
• CS Theory: Fourier Measurements
• Convex Optimization via Proximal Splitting

3.-6. Inverse Problems
Forward model: $y = K f_0 + w \in \mathbb{R}^P$, with observations $y$, operator $K : \mathbb{R}^Q \to \mathbb{R}^P$, (unknown) input $f_0 \in \mathbb{R}^Q$ and noise $w$.
• Denoising: $K = \mathrm{Id}_Q$, $P = Q$.
• Inpainting: for $\Omega$ the set of missing pixels, $P = Q - |\Omega|$ and $(Kf)(x) = 0$ if $x \in \Omega$, $(Kf)(x) = f(x)$ if $x \notin \Omega$.
• Super-resolution: $Kf = (f \star k) \downarrow_s$, $P = Q/s$.

7.-9. Inverse Problem in Medical Imaging
• Tomography: $Kf = (\langle f, p_k \rangle)_{1 \le k \le K}$, scalar products against projections $p_k$.
• Magnetic resonance imaging (MRI): $Kf = (\hat f(\omega))_{\omega \in \Omega}$, samples of the Fourier transform of $f$.
• Other examples: MEG, EEG, ...

10.-13. Inverse Problem Regularization
Noisy measurements: $y = K f_0 + w$.
Prior model: $J : \mathbb{R}^Q \to \mathbb{R}$ assigns a score to images.
$f^\star \in \operatorname{argmin}_{f \in \mathbb{R}^Q} \ \frac{1}{2} \|y - K f\|^2 + \lambda J(f)$
(first term: data fidelity; second term: regularity).
Choice of $\lambda$: a tradeoff between the noise level $\|w\|$ and the regularity $J(f_0)$ of $f_0$.
No noise: $\lambda \to 0^+$, and one minimizes $f^\star \in \operatorname{argmin}_{f \in \mathbb{R}^Q, \, K f = y} J(f)$.

14.-15. Smooth and Cartoon Priors
• Sobolev prior: $J(f) = \int \|\nabla f(x)\|^2 \, \mathrm{d}x$.
• Total variation prior: $J(f) = \int \|\nabla f(x)\| \, \mathrm{d}x = \int_{\mathbb{R}} \operatorname{length}(C_t) \, \mathrm{d}t$, where $C_t$ are the level sets of $f$.

16. Inpainting Example
[Figure: input $y = K f_0 + w$, Sobolev reconstruction, total variation reconstruction.]

17. Overview
(Outline of slide 2 repeated; next part: Sparse Synthesis Regularization.)

18.-22. Redundant Dictionaries
Dictionary: $\Phi = (\phi_m)_m \in \mathbb{R}^{Q \times N}$, $N \ge Q$.
• Fourier: $\phi_m = e^{\mathrm{i} \langle \omega_m, \cdot \rangle}$, with frequency $\omega_m$.
• Wavelets: $\phi_m = \psi(2^{-j} R_\theta x - n)$ for $m = (j, \theta, n)$: scale $j$, orientation $\theta$ (here $\theta = 1, 2$) and position $n$.
• DCT, curvelets, bandlets, ...
Synthesis: $f = \sum_m x_m \phi_m = \Phi x$ maps coefficients $x$ to the image $f = \Phi x$.

23.-26. Sparse Priors
Ideal sparsity: for most $m$, $x_m = 0$; $J_0(x) = \# \{m : x_m \ne 0\}$.
Sparse approximation: $f = \Phi x$ where $x \in \operatorname{argmin}_{x \in \mathbb{R}^N} \|f_0 - \Phi x\|^2 + T^2 J_0(x)$.
Orthogonal $\Phi$ ($\Phi \Phi^* = \Phi^* \Phi = \mathrm{Id}_N$): the solution is hard thresholding,
$x_m = \langle f_0, \phi_m \rangle$ if $|\langle f_0, \phi_m \rangle| > T$, $x_m = 0$ otherwise,
i.e. $f = \Phi S_T(\Phi^* f_0)$ with $S_T$ the hard-thresholding operator.
Non-orthogonal $\Phi$: the problem is NP-hard.
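In the orthogonal case of slide 26, the whole approximation is two transforms and a threshold. Below is a minimal sketch, assuming an orthonormal DCT in the role of $\Phi^*$; the test signal, its size and the threshold $T$ are illustrative choices, not values from the lecture.

```python
# Hard-thresholding approximation f = Phi S_T(Phi^* f0) in an
# orthonormal basis (slide 26). The DCT plays the role of Phi^*;
# the signal and threshold are made-up illustration values.
import numpy as np
from scipy.fft import dct, idct

def hard_threshold(x, T):
    """S_T: keep coefficients with |x_m| > T, set the rest to zero."""
    return np.where(np.abs(x) > T, x, 0.0)

n = 256
t = np.linspace(0.0, 1.0, n)
f0 = np.sin(2 * np.pi * 4 * t) + (t > 0.5)   # piecewise-smooth test signal

x = dct(f0, norm="ortho")        # analysis:     x = Phi^* f0
xT = hard_threshold(x, T=0.2)    # thresholding: S_T(x)
f = idct(xT, norm="ortho")       # synthesis:    f = Phi S_T(Phi^* f0)

print("nonzero coefficients:", np.count_nonzero(xT), "of", n)
print("approximation error:", np.linalg.norm(f - f0))
```

Because the basis is orthonormal, this thresholding is the exact minimizer of the slide-24 objective; for a redundant $\Phi$ no such shortcut exists, which is what motivates the convex relaxation that follows.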
27.-29. Convex Relaxation: L1 Prior
$J_0(x) = \# \{m : x_m \ne 0\}$. For an image with 2 pixels: $J_0(x) = 0$: null image; $J_0(x) = 1$: sparse image; $J_0(x) = 2$: non-sparse image.
$\ell^q$ priors: $J_q(x) = \sum_m |x_m|^q$, convex for $q \ge 1$. [Figure: unit balls of $J_q$ for $q = 0, 1/2, 1, 3/2, 2$.]
Sparse $\ell^1$ prior: $J_1(x) = \sum_m |x_m|$.

30.-34. L1 Regularization
Pipeline: coefficients $x_0 \in \mathbb{R}^N$ $\to$ image $f_0 = \Phi x_0 \in \mathbb{R}^Q$ $\to$ observations $y = K f_0 + w \in \mathbb{R}^P$. In the sequel $\Phi$ also denotes the composed matrix $K \Phi \in \mathbb{R}^{P \times N}$.
Sparse recovery: $f^\star = \Phi x^\star$ where $x^\star$ solves
$\min_{x \in \mathbb{R}^N} \ \frac{1}{2} \|y - \Phi x\|^2 + \lambda \|x\|_1$
(first term: fidelity; second term: regularization).

35.-37. Noiseless Sparse Regularization
Noiseless measurements: $y = \Phi x_0$.
$x^\star \in \operatorname{argmin}_{\Phi x = y} \sum_m |x_m|$, to be compared with the $\ell^2$ solution $x^\star \in \operatorname{argmin}_{\Phi x = y} \sum_m |x_m|^2$.
This is a convex linear program:
• interior points, cf. [Chen, Donoho, Saunders], basis pursuit;
• Douglas-Rachford splitting, see [Combettes, Pesquet].

38.-40. Noisy Sparse Regularization
Noisy measurements: $y = \Phi x_0 + w$.
$x^\star \in \operatorname{argmin}_{x \in \mathbb{R}^N} \ \frac{1}{2} \|y - \Phi x\|^2 + \lambda \|x\|_1$
(data fidelity + regularization), equivalent to the constrained problem $x^\star \in \operatorname{argmin}_{\|\Phi x - y\| \le \varepsilon} \|x\|_1$.
Algorithms (sketches follow below):
• iterative soft thresholding / forward-backward splitting, see [Daubechies et al.], [Pesquet et al.], etc.;
• Nesterov multi-step schemes.
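The iterative soft thresholding scheme just listed is short enough to state in full. A minimal numpy sketch, assuming a generic matrix $\Phi$; the random instance, step-size rule and parameter values are illustrative, not from the slides.

```python
# Iterative soft thresholding (forward-backward splitting) for
#   min_x 1/2 ||y - Phi x||^2 + lam ||x||_1   (slides 38-40).
# The synthetic instance below is illustrative.
import numpy as np

def soft_threshold(x, T):
    """Proximal operator of T ||.||_1 (soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - T, 0.0)

def ista(Phi, y, lam, n_iter=500):
    tau = 1.0 / np.linalg.norm(Phi, 2) ** 2   # step size <= 1/||Phi||^2
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        x = x - tau * Phi.T @ (Phi @ x - y)   # gradient step on fidelity
        x = soft_threshold(x, tau * lam)      # proximal step on l1 term
    return x

rng = np.random.default_rng(0)
P, N = 64, 128
Phi = rng.standard_normal((P, N)) / np.sqrt(P)
x0 = np.zeros(N)
x0[rng.choice(N, 5, replace=False)] = rng.standard_normal(5)
y = Phi @ x0 + 0.01 * rng.standard_normal(P)

x_rec = ista(Phi, y, lam=0.02)
print("recovered support:", np.flatnonzero(np.abs(x_rec) > 1e-3))
print("true support:     ", np.flatnonzero(x0))
```

The Nesterov multi-step schemes mentioned on slide 40 (e.g. FISTA) add a momentum extrapolation on top of these same two operations.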
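Slide 37 states that the noiseless problem is a convex linear program; the reduction is worth making explicit. A sketch using the standard splitting $x = u - v$ with $u, v \ge 0$, solved here with scipy.optimize.linprog rather than the interior-point or Douglas-Rachford solvers cited on the slide; the instance is illustrative.

```python
# Basis pursuit  min ||x||_1  s.t.  Phi x = y  (slides 35-37) as a
# linear program: with x = u - v, u, v >= 0, ||x||_1 = sum(u) + sum(v).
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(Phi, y):
    P, N = Phi.shape
    c = np.ones(2 * N)                # linear objective sum(u) + sum(v)
    A_eq = np.hstack([Phi, -Phi])     # equality constraint Phi (u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
    assert res.success, res.message
    return res.x[:N] - res.x[N:]      # x = u - v

rng = np.random.default_rng(1)
P, N = 32, 64
Phi = rng.standard_normal((P, N)) / np.sqrt(P)
x0 = np.zeros(N)
x0[rng.choice(N, 4, replace=False)] = 1.0
y = Phi @ x0

x_bp = basis_pursuit(Phi, y)
print("max recovery error:", np.abs(x_bp - x0).max())
```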
41.-43. Image De-blurring
Observations: $y = h \star f_0 + w$.
Sobolev regularization: $f^\star = \operatorname{argmin}_{f \in \mathbb{R}^N} \|f \star h - y\|^2 + \lambda \|\nabla f\|^2$, solved in closed form over Fourier:
$\hat f^\star(\omega) = \dfrac{\hat h(\omega)^* \, \hat y(\omega)}{|\hat h(\omega)|^2 + \lambda |\omega|^2}$
(SNR = 22.7 dB on the example).
Sparsity regularization: $\Phi$ = translation-invariant wavelets,
$f^\star = \Phi x^\star$ where $x^\star \in \operatorname{argmin}_x \ \frac{1}{2} \|h \star (\Phi x) - y\|^2 + \lambda \|x\|_1$
(SNR = 24.7 dB on the example).

44. Inpainting Problem
$(Kf)(x) = 0$ if $x \in \Omega$, $(Kf)(x) = f(x)$ if $x \notin \Omega$. Measurements: $y = K f_0 + w$.

45.-47. Image Separation
Model: $f = f_1 + f_2 + w$, with components $(f_1, f_2)$ and noise $w$.
Union dictionary: $\Phi = [\Phi_1, \Phi_2] \in \mathbb{R}^{Q \times (N_1 + N_2)}$.
Recovered components: $\tilde f_i = \Phi_i \tilde x_i$, where
$(\tilde x_1, \tilde x_2) \in \operatorname{argmin}_{x = (x_1, x_2) \in \mathbb{R}^N} \ \frac{1}{2} \|f - \Phi x\|^2 + \lambda \|x\|_1$.

48.-49. Examples of Decompositions; Cartoon+Texture Separation
[Figure-only slides.]

50. Overview
(Outline of slide 2 repeated; next part: Theoretical Recovery Guarantees.)

51.-55. Basics of Convex Analysis
Setting: $G : \mathcal{H} \to \mathbb{R} \cup \{+\infty\}$, here with $\mathcal{H} = \mathbb{R}^N$. Problem: $\min_{x \in \mathcal{H}} G(x)$.
Convexity: $\forall t \in [0, 1]$, $G(t x + (1 - t) y) \le t G(x) + (1 - t) G(y)$.
Sub-differential: $\partial G(x) = \{u \in \mathcal{H} : \forall z, \ G(z) \ge G(x) + \langle u, z - x \rangle\}$.
Example: $G(x) = |x|$ has $\partial G(0) = [-1, 1]$.
Smooth functions: if $F$ is $C^1$, then $\partial F(x) = \{\nabla F(x)\}$.
First-order condition: $x^\star \in \operatorname{argmin}_{x \in \mathcal{H}} G(x) \iff 0 \in \partial G(x^\star)$.

56.-62. L1 Regularization: First-Order Conditions
Problem $P_\lambda(y)$: $x^\star \in \operatorname{argmin}_{x \in \mathbb{R}^N} G(x) = \frac{1}{2} \|y - \Phi x\|^2 + \lambda \|x\|_1$.
$\partial G(x) = \Phi^* (\Phi x - y) + \lambda \, \partial \|\cdot\|_1(x)$, where
$\partial \|\cdot\|_1(x)_i = \operatorname{sign}(x_i)$ if $x_i \ne 0$, and $[-1, 1]$ if $x_i = 0$.
Support of the solution: $I = \{i \in \{0, \ldots, N - 1\} : x_i^\star \ne 0\}$.
Restrictions: $x_I = (x_i)_{i \in I} \in \mathbb{R}^{|I|}$ and $\Phi_I = (\phi_i)_{i \in I} \in \mathbb{R}^{P \times |I|}$.
First-order condition: $\Phi^* (\Phi x^\star - y) + \lambda s = 0$, where $s_I = \operatorname{sign}(x_I^\star)$ and $\|s_{I^c}\|_\infty \le 1$, i.e. $s_{I^c} = \frac{1}{\lambda} \Phi_{I^c}^* (y - \Phi x^\star)$.
Theorem: with $s_I = \operatorname{sign}(x_I^\star)$ as above, $\|\Phi_{I^c}^* (\Phi x^\star - y)\|_\infty \le \lambda$ $\iff$ $x^\star$ is a solution of $P_\lambda(y)$.
Theorem: if moreover $\Phi_I$ has full rank and $\|\Phi_{I^c}^* (\Phi x^\star - y)\|_\infty < \lambda$, then $x^\star$ is the unique solution of $P_\lambda(y)$.
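These first-order conditions translate directly into a numerical certificate: given any candidate $x$ (e.g. the output of the ista() sketch earlier), one can test whether it solves $P_\lambda(y)$. A minimal sketch, with an illustrative tolerance:

```python
# Numerical check of the first-order condition of slides 59-62:
# at a solution of P_lambda(y), the correlation Phi^*(y - Phi x)
# equals lam * sign(x_i) on the support I and is at most lam in
# magnitude on the complement I^c. The tolerance is illustrative.
import numpy as np

def check_l1_optimality(Phi, y, x, lam, tol=1e-4):
    corr = Phi.T @ (y - Phi @ x)   # Phi^*(y - Phi x)
    I = np.abs(x) > tol            # support I = {i : x_i != 0}
    on_support = np.allclose(corr[I], lam * np.sign(x[I]), atol=tol)
    off_support = bool(np.all(np.abs(corr[~I]) <= lam + tol))
    return on_support and off_support
```

If additionally $\Phi_I$ has full rank and the off-support bound holds strictly, the last theorem above guarantees that the candidate is the unique solution.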