COURSE DESCRIPTION
This course introduces the theory and numerical algorithms for several fundamental signal recovery tasks. Topics include L1 minimization, sparse regression, compressed sensing, orthogonal matching pursuit, proximal operators, ADMM algorithms, iterative reweighted least squares, nuclear norm minimization, matrix completion, and robust principal component analysis. The objectives of this course are: (1) to provide you with a firm understanding of the basic tools involved in signal recovery, (2) to improve your ability to design signal recovery algorithms, and (3) to improve your ability to prove signal recovery guarantees.
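The following is a small illustrative sketch (not part of the course materials) of the most basic task above: recovering a sparse vector from a few random linear measurements by L1 minimization (basis pursuit). The problem sizes, the Gaussian sensing matrix, and the linear-programming reformulation are my own assumptions for illustration.

```python
# Basis pursuit sketch: recover a sparse x from y = A x with m < n measurements
# by solving  min ||x||_1  subject to  A x = y  as a linear program.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, s = 100, 40, 5                              # ambient dim, measurements, sparsity (illustrative)
A = rng.standard_normal((m, n)) / np.sqrt(m)      # Gaussian sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
y = A @ x_true

# Split x = u - v with u, v >= 0, so ||x||_1 = 1^T u + 1^T v and the
# constraint A x = y becomes A u - A v = y: a standard-form LP.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

The same recovery problem motivates several of the algorithms listed later in the schedule (OMP, forward-backward methods, and ADMM for L1), which replace the generic LP solver with methods that exploit the problem's structure.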
Books: This course will cover Chapters 1 and 5 of Compressed Sensing: Theory and Applications by Eldar and Kutyniok, Chapter 5 of Convex Optimization by Boyd and Vandenberghe, and papers that are freely available from the arXiv. I also recommend All of Statistics by Wasserman.
Class structure and grading: You will have biweekly homework assignments due on Tuesdays. One homework assignment will be pledged and will serve as an exam. In groups of three, you will read and present two recent research papers near the end of the semester. Your grade will consist of homework (30%), the exam (30%), the paper presentations (30%), and class participation (10%). You are expected to attend class (almost) every day. If you miss more than 4 classes, the class participation part of your grade will drop to zero.
Disabilities: Any student with a disability needing academic accommodations is requested to speak with me as soon as possible. All discussions will remain confidential. Students should also contact Disability Support Services in the Ley Student Center.
HOMEWORKS
Event | Date | Related Documents |
---|---|---|
HW 1 | Jan 31 in class | Problems. |
HW 2 | Feb 21 in class | Problems. |
HW 3 | Mar 28 in class | Problems. |
HW 4 | Apr 20 in class | Problems. |
SCHEDULE
Day | Topics | Reading Assignment | Class notes |
---|---|---|---|
Jan 10 | Signal recovery problems | | Notes |
Jan 12 | Least squares | | Notes |
Jan 17 | Concentration Estimates | Wasserman's notes on probability inequalities | Notes |
Jan 19 | Concentration Estimates | Wasserman's notes on probability inequalities | Notes |
Jan 24 | Maximum of Gaussians | | Notes |
Jan 26 | Chi Squared variables | | Notes |
Jan 31 | Spectral Norm of Random Matrices | Vershynin's notes | |
Feb 1 | Spectral Norm of Random Matrices | Vershynin's notes | |
Feb 7 | Spectral Norm of Random Matrices | Vershynin's notes | Notes |
Feb 14 | Convex functions and sets | | Notes |
Feb 16 | Convex programs and Linear Programs | | Notes |
Feb 23 | Convex duality with equality and inequality constraints | | Notes |
Feb 28 | Convex duality with inequality constraints | | Notes |
Mar 2 | Cones and conic constraints | | Notes |
Mar 7 | Subgradients and L1 optimization | | Notes |
Mar 9 | Subgradients and L1 optimization | | Notes |
| | Compressed Sensing | | Notes |
| | Null Space Property | | Notes |
| | Compressed Sensing and RIP | | Notes |
| | Forward Backward Methods | | Notes |
| | ADMM for L1 | | Notes |
| | OMP | | |
| | Matrix Completion | | |
| | Phase Retrieval | | |
| | Phase Retrieval | | |