Compressed Sensing Algorithms for Big Data Lecture

Learn about compressed sensing, a technique for recovering sparse signals from a small number of measurements. The lecture covers reconstruction via $\ell_1$ minimization, subgradient optimization, the exact reconstruction property, and the restricted isometry property of matrices, and shows how compressed sensing can efficiently recover sparse solutions in big-data settings.

  • Compressed Sensing
  • Algorithms
  • Big Data
  • Signal Processing
  • Optimization


Presentation Transcript


  1. CIS 700: Algorithms for Big Data. Lecture 9: Compressed Sensing. Slides at http://grigory.us/big-data-class.html. Grigory Yaroslavtsev, http://grigory.us

  2. Compressed Sensing. Given a sparse signal $x \in \mathbb{R}^n$, can we recover it from a small number of measurements? Goal: design $A \in \mathbb{R}^{m \times n}$ which allows us to recover any $k$-sparse $x \in \mathbb{R}^n$ from $Ax$. Take $A$ to be a matrix of i.i.d. Gaussians $N(0,1)$. Application: signals are usually sparse in some Fourier domain.
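As a sketch of this setup (NumPy, with illustrative dimensions not taken from the slides), a $k$-sparse signal and a Gaussian measurement matrix can be generated like this:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 100, 30, 3  # ambient dimension, measurements, sparsity

# a k-sparse signal x in R^n
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(size=k)

# measurement matrix of i.i.d. N(0, 1) entries
A = rng.normal(size=(m, n))

# we observe only the m numbers b = Ax, with m << n
b = A @ x
print(b.shape)
```

The point of the design question on this slide is that $m = 30$ measurements must suffice to pin down $x$ among all $k$-sparse vectors in $\mathbb{R}^{100}$.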

  3. Reconstruction. Reconstruction: $\min \|x\|_0$, subject to $Ax = b$. Uniqueness: if there are two $k$-sparse solutions $x_1, x_2$ with $A(x_1 - x_2) = 0$, then $A$ has $2k$ linearly dependent columns. If $m \ge 2k$ and $A$ is Gaussian, then it is unlikely to have $2k$ linearly dependent columns. However, $\|x\|_0$ is not convex, and it is NP-hard to reconstruct via $\|x\|_0$. Relax $\|x\|_0$ to $\|x\|_1$: when does this give sparse solutions? $\ell_1$ relaxation: $\min \|x\|_1$, subject to $Ax = b$.
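The $\ell_1$ relaxation is a linear program: writing $x = p - q$ with $p, q \ge 0$ turns $\min \|x\|_1$ s.t. $Ax = b$ into minimizing $\mathbf{1}^T p + \mathbf{1}^T q$ subject to $[A, -A][p; q] = b$. A minimal sketch using SciPy's `linprog` (SciPy is an assumption here, not part of the slides):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, m, k = 40, 20, 3

x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n))
b = A @ x0

# min ||x||_1 s.t. Ax = b as an LP: x = p - q with p, q >= 0,
#   minimize 1^T p + 1^T q  subject to  [A, -A][p; q] = b
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]

print(np.allclose(x_hat, x0, atol=1e-6))  # typically True at these sizes
```

Since $x_0$ itself is feasible, the LP optimum always satisfies $\|\hat{x}\|_1 \le \|x_0\|_1$; exact recovery of $x_0$ is what the rest of the lecture establishes under RIP-type conditions.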

  4. Subgradient. $\min \|x\|_1$ subject to $Ax = b$: $\|x\|_1$ is convex but not differentiable. Subgradient $\nabla f$: equal to the gradient where $f$ is differentiable; any linear lower bound where $f$ is not differentiable: for all $x_0, \Delta$, $f(x_0 + \Delta) \ge f(x_0) + (\nabla f)^T \Delta$. Subgradient for $\|x\|_1$: $(\nabla \|x\|_1)_i = \mathrm{sign}(x_i)$ if $x_i \ne 0$, and $(\nabla \|x\|_1)_i \in [-1, 1]$ if $x_i = 0$. A feasible $x$ is a minimum if some subgradient at $x$ satisfies $(\nabla f)^T \Delta \ge 0$ for all $\Delta$ such that $A\Delta = 0$. Sufficient: a subgradient $\nabla f$ such that $\nabla f = A^T w$ for some $w$, since then $(\nabla f)^T \Delta = w^T A \Delta = 0$.
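The subgradient of $\|\cdot\|_1$ and its defining inequality can be checked numerically; this is a sketch with illustrative values, not code from the lecture:

```python
import numpy as np

def l1_subgradient(x, at_zero=0.0):
    """A subgradient of ||.||_1: sign(x_i) where x_i != 0,
    any fixed value in [-1, 1] (here `at_zero`) where x_i == 0."""
    g = np.sign(x).astype(float)
    g[x == 0] = at_zero
    return g

# check the defining inequality f(x0 + d) >= f(x0) + g^T d
rng = np.random.default_rng(2)
x0 = np.array([1.5, 0.0, -2.0, 0.0])
g = l1_subgradient(x0)
for _ in range(1000):
    d = rng.normal(size=4)
    assert np.abs(x0 + d).sum() >= np.abs(x0).sum() + g @ d - 1e-12
print("subgradient inequality holds")
```

Any choice of `at_zero` in $[-1, 1]$ gives a valid subgradient; this freedom at the zero coordinates is exactly what the certificate construction on the later slides exploits.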

  5. Exact Reconstruction Property. Subgradient Thm.: if $Ax_0 = b$, there exists a subgradient $\nabla f$ of $\|\cdot\|_1$ at $x_0$ with $\nabla f = A^T w$ for some $w$, and the columns of $A$ corresponding to the support of $x_0$ are linearly independent, then $x_0$ minimizes $\|x\|_1$ and is unique. (Minimum): assume $Ax = b$; we show $\|x\|_1 \ge \|x_0\|_1$. Let $\Delta = x - x_0$; then $A\Delta = Ax - Ax_0 = 0$, so $(\nabla f)^T \Delta = w^T A \Delta = 0$, and $\|x\|_1 = \|x_0 + \Delta\|_1 \ge \|x_0\|_1 + (\nabla f)^T \Delta = \|x_0\|_1$.

  6. Exact Reconstruction Property. (Uniqueness): assume $\tilde{x}_0$ is another minimum, so $\|\tilde{x}_0\|_1 = \|x_0\|_1$. The subgradient $\nabla f$ at $x_0$ is also a subgradient at $\tilde{x}_0$: for all $\Delta$ with $A\Delta = 0$, $\|\tilde{x}_0 + \Delta\|_1 = \|x_0 + (\tilde{x}_0 - x_0 + \Delta)\|_1 \ge \|x_0\|_1 + (\nabla f)^T (\tilde{x}_0 - x_0 + \Delta) = \|\tilde{x}_0\|_1 + (\nabla f)^T \Delta$, using $(\nabla f)^T (\tilde{x}_0 - x_0) = w^T A (\tilde{x}_0 - x_0) = 0$. As a subgradient of $\|\cdot\|_1$ at both points, $(\nabla f)_i = \mathrm{sign}(x_{0i}) = \mathrm{sign}(\tilde{x}_{0i})$ if either is non-zero, and otherwise equal to $0$; hence $x_0$ and $\tilde{x}_0$ have the same sparsity pattern. Since $Ax_0 = A\tilde{x}_0 = b$ and the columns of $A$ on this support are linearly independent, $x_0 = \tilde{x}_0$.

  7. Restricted Isometry Property. Matrix $A$ satisfies the restricted isometry property (RIP) if for any $k$-sparse $x$ there exists $\delta_k$ such that $(1 - \delta_k)\|x\|_2^2 \le \|Ax\|_2^2 \le (1 + \delta_k)\|x\|_2^2$. Exact isometry: all eigenvalues are $1$, and for orthogonal $x, y$: $x^T A^T A y = 0$. Let $A_S$ be the set of columns of $A$ in set $S$. Lem: if $A$ satisfies RIP and $\delta_{s_1 + s_2} \le \delta_{s_1} + \delta_{s_2}$, then: for $S$ of size $s$, the singular values of $A_S$ lie in $[\sqrt{1 - \delta_s}, \sqrt{1 + \delta_s}]$; for any orthogonal $x, y$ with supports of size $s_1, s_2$: $|x^T A^T A y| \le \frac{3}{2}\|x\|\,\|y\|\,(\delta_{s_1} + \delta_{s_2})$.
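The singular-value characterization can be probed empirically: for a Gaussian matrix scaled by $1/\sqrt{m}$, the singular values of random $k$-column submatrices concentrate near $1$. A sketch with illustrative dimensions (sampling supports gives only a lower bound on the true $\delta_k$, which quantifies over all supports):

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, k = 500, 400, 5

# Gaussian matrix scaled by 1/sqrt(m), so E[||Ax||_2^2] = ||x||_2^2
A = rng.normal(size=(m, n)) / np.sqrt(m)

# empirical lower bound on delta_k: over random supports S of size k,
# all singular values of A_S should lie in [sqrt(1-delta), sqrt(1+delta)]
delta = 0.0
for _ in range(200):
    S = rng.choice(n, size=k, replace=False)
    s = np.linalg.svd(A[:, S], compute_uv=False)
    delta = max(delta, s.max()**2 - 1, 1 - s.min()**2)
print(round(delta, 3))
```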

  8. Restricted Isometry Property. Lem: if $A$ satisfies RIP and $\delta_{s_1 + s_2} \le \delta_{s_1} + \delta_{s_2}$: for $S$ of size $s$, the singular values of $A_S$ lie in $[\sqrt{1 - \delta_s}, \sqrt{1 + \delta_s}]$; for any orthogonal $x, y$ with supports of size $s_1, s_2$: $|x^T A^T A y| \le \frac{3}{2}\|x\|\,\|y\|\,(\delta_{s_1} + \delta_{s_2})$. Proof: w.l.o.g. $\|x\| = \|y\| = 1$, so $\|x + y\|_2^2 = 2$ and $x + y$ is $(s_1 + s_2)$-sparse. By RIP, $2(1 - \delta_{s_1 + s_2}) \le \|A(x + y)\|_2^2 \le 2(1 + \delta_{s_1 + s_2})$, hence $2(1 - \delta_{s_1} - \delta_{s_2}) \le \|A(x + y)\|_2^2 \le 2(1 + \delta_{s_1} + \delta_{s_2})$. Also $1 - \delta_{s_1} \le \|Ax\|_2^2 \le 1 + \delta_{s_1}$ and $1 - \delta_{s_2} \le \|Ay\|_2^2 \le 1 + \delta_{s_2}$.

  9. Restricted Isometry Property. Expanding $\|A(x + y)\|_2^2 = \|Ax\|_2^2 + \|Ay\|_2^2 + 2 x^T A^T A y$ gives $2 x^T A^T A y = \|A(x + y)\|_2^2 - \|Ax\|_2^2 - \|Ay\|_2^2$, so $|2 x^T A^T A y| \le 2(1 + \delta_{s_1} + \delta_{s_2}) - (1 - \delta_{s_1}) - (1 - \delta_{s_2}) = 3(\delta_{s_1} + \delta_{s_2})$. Hence $|x^T A^T A y| \le \frac{3}{2}(\delta_{s_1} + \delta_{s_2})$ and, scaling back to general $x, y$: $|x^T A^T A y| \le \frac{3}{2}\|x\|\,\|y\|\,(\delta_{s_1} + \delta_{s_2})$.
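The identity driving this step, $2 x^T A^T A y = \|A(x+y)\|_2^2 - \|Ax\|_2^2 - \|Ay\|_2^2$, is easy to confirm numerically (illustrative dimensions):

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 30, 60
A = rng.normal(size=(m, n))

# x and y with disjoint supports, hence orthogonal
x = np.zeros(n); x[:3] = rng.normal(size=3)
y = np.zeros(n); y[3:8] = rng.normal(size=5)

lhs = 2 * (x @ A.T @ A @ y)
rhs = (np.linalg.norm(A @ (x + y))**2
       - np.linalg.norm(A @ x)**2
       - np.linalg.norm(A @ y)**2)
print(abs(lhs - rhs))  # ~0 up to floating-point error
```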

  10. Reconstruction from RIP. Thm: if $A$ satisfies RIP with $\delta_{k+1} \le \frac{1}{10\sqrt{k}}$, and $x_0$ is $k$-sparse and satisfies $Ax_0 = b$, then a subgradient of $\|\cdot\|_1$ exists at $x_0$ which satisfies the conditions of the subgradient theorem. This implies that $x_0$ is the unique minimum 1-norm solution to $Ax = b$. Let $S = \{i : x_{0i} \ne 0\}$ and $\bar{S} = \{i : x_{0i} = 0\}$. To find the subgradient $u$, search for $w$ with $u = A^T w$ such that $u_i = \mathrm{sign}(x_{0i})$ for $i \in S$, and the 2-norm of the coordinates of $u$ in $\bar{S}$ is minimized.

  11. Reconstruction from RIP. Let $z$ be the vector with support $S$ defined by $z_i = \mathrm{sign}(x_{0i})$ for $i \in S$, and set $w = A_S (A_S^T A_S)^{-1} z_S$; by RIP, $A_S$ has linearly independent columns, so $A_S^T A_S$ is invertible. For coordinates in $S$: $u_S = A_S^T w = A_S^T A_S (A_S^T A_S)^{-1} z_S = z_S$, i.e. $u_i = \mathrm{sign}(x_{0i})$. For coordinates in $\bar{S}$: the eigenvalues of $(A_S^T A_S)^{-1}$ lie in $[\frac{1}{1 + \delta_k}, \frac{1}{1 - \delta_k}]$ and $\|z_S\|_2 = \sqrt{k}$, so $v = (A_S^T A_S)^{-1} z_S$ satisfies $\|v\|_2 \le \frac{\sqrt{k}}{1 - \delta_k}$. For $j \in \bar{S}$: $u_j = a_j^T A_S v$, where $a_j = A e_j$ and $e_j$ is a 1-sparse vector with support disjoint from $S$. By the lemma, $|u_j| \le \frac{3}{2}(\delta_1 + \delta_k)\|v\|_2 \le 3\delta_{k+1}\frac{\sqrt{k}}{1 - \delta_k} \le \frac{3}{10(1 - \delta_k)} < 1$, so $u$ is a valid subgradient at $x_0$.
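The certificate construction above can be sketched numerically: build $w = A_S (A_S^T A_S)^{-1}\,\mathrm{sign}(x_{0,S})$ and check that $u = A^T w$ equals the sign pattern on $S$ and stays below $1$ in magnitude off $S$. Dimensions here are illustrative, not the slide's regime:

```python
import numpy as np

rng = np.random.default_rng(5)
n, m, k = 400, 200, 4

# k-sparse x0 with support S, Gaussian A scaled by 1/sqrt(m)
x0 = np.zeros(n)
S = rng.choice(n, size=k, replace=False)
x0[S] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)

# dual certificate: w = A_S (A_S^T A_S)^{-1} sign(x0_S); then u = A^T w
# matches sign(x0) exactly on S and should be small off S
A_S = A[:, S]
w = A_S @ np.linalg.solve(A_S.T @ A_S, np.sign(x0[S]))
u = A.T @ w

off = np.setdiff1d(np.arange(n), S)
print(np.allclose(u[S], np.sign(x0[S])))  # exact on the support
print(np.abs(u[off]).max())               # should be < 1
```

When the off-support coordinates of $u$ stay strictly below $1$, $u$ is a valid subgradient of the form $A^T w$, and the subgradient theorem certifies $x_0$ as the unique $\ell_1$ minimizer.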
