COURSE DESCRIPTION
In many signal recovery problems, one attempts to recover an n-dimensional signal from as few measurements as possible. If the signal has some structure, such as sparsity, then it can sometimes be recovered from far fewer than n measurements. The recovery process often involves l1 minimization or semidefinite programming. This course will focus on rigorous recovery theorems and computational algorithms for a variety of structured signal recovery problems. The objective of this course is to familiarize students with the important ideas and proof techniques in structured signal recovery.
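As a concrete illustration of the l1-minimization approach described above (this sketch is not part of the course materials; the dimensions, sparsity level, and random seed are illustrative assumptions), the problem min ||x||_1 subject to Ax = b can be recast as a linear program by splitting x = u - v with u, v >= 0, so that ||x||_1 = sum(u + v):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, s = 50, 20, 3  # ambient dimension, number of measurements, sparsity (illustrative)

# Plant an s-sparse ground-truth signal.
x_true = np.zeros(n)
x_true[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)

# Gaussian measurement matrix and noiseless measurements.
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true

# l1 minimization as a linear program:
#   minimize 1^T [u; v]  subject to  A(u - v) = b,  u >= 0, v >= 0.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]

print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With m on the order of s log(n/s) Gaussian measurements, recovery is typically exact up to solver tolerance, even though the linear system Ax = b is highly underdetermined.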
Topics:
- compressed sensing
- random matrix theory
- high-dimensional geometry
- duality in convex programming
- phase retrieval
- sparse principal component analysis
Books: This course will cover Chapters 1 and 5 of Compressed Sensing: Theory and Applications by Eldar and Kutyniok, Chapter 7 of Sparse Image and Signal Processing by Starck, Murtagh, and Fadili, and papers that are freely available from the arXiv.
Prerequisites: Familiarity with Matlab; analysis (CAAM 501, MATH 302, or similar); optimization (CAAM 571 or similar); and probability theory (e.g., multivariate Gaussians, chi-squared random variables, central limit theorems).
Class structure and grading: Before each day of class, the instructor will assign book sections and/or research papers for you to read. Accompanying the reading will be a set of questions. You will be expected to complete the reading and write out answers to each of these questions by the beginning of class. Class will then be a discussion of these questions and the reading. You will be expected to share your thoughts and questions regarding the readings in class. Students taking the class for 3 credits are expected (1) to submit well-thought-out answers to the assigned questions; (2) to present their answers upon request; and (3) to participate in class discussions. The grade will be based on the degree of honest effort in all three of these aspects. Students enrolling for 1 credit are expected to attend class.
Disabilities: Any student with a disability needing academic accommodations is requested to speak with me as soon as possible. All discussions will remain confidential. Students should also contact Disability Support Services in the Ley Student Center.
Schedule
Day | Topics | Reading Assignment | Class notes |
---|---|---|---|
Jan 12 | Intro to compressed sensing (CS), least squares | | Notes |
Jan 14 | Spark, coherence | Chapter 1. Questions. | Notes |
Jan 21 | Null space property | Blog post. Questions. | Notes |
Jan 26 | NSP and not-exactly-sparse signals | | Notes |
Jan 28 | RIP recovery guarantees | Questions | Notes |
Feb 2 | Gaussian matrices have RIP | Questions | Notes |
Feb 4 | Singular values of Gaussian matrices | Questions | Notes |
Feb 9 | Singular values of matrices with subgaussian rows | | Notes |
Feb 16 | Heavy-tailed random variables | | Notes |
Feb 18 | Singular values of matrices with heavy-tailed rows | | Notes |
Feb 23 | Dual certification and Fuchs's criterion | Reading. Questions | Notes |
Mar 9 | Gaussian widths | | Notes |
Mar 16 | Gradient descent and proximal operators | | Notes |
Mar 18 | Dual methods and ADMM | Boyd's slides on ADMM | Notes |
Mar 23 | Greedy algorithms for CS | Chapter 8 of Eldar and Kutyniok | Notes |
Mar 25 | Phase retrieval | | Notes |
Apr 6 | Phase retrieval | | Notes |
Apr 8 | Phase retrieval | | Notes |
Apr 15 | Phase retrieval | | Notes |
Apr 20 | Sparse PCA | | |
Apr 22 | Sparse PCA | Reading and Questions | |