CS 7180: Special Topics in AI

Imaging and Deep Learning - Experiments, Derivations, and Theory

Time & Location:

Tuesdays, 6:00-9:15pm, Hurtig Hall 224

Staff

Instructor: Paul Hand
    Office: TBD

Overview

Each day of class, the first half will consist of a lecture given by the instructor. In the second half, there will be presentations and discussions of two papers related to the previous class’s lecture. A student (or a pair of students) will lead the presentation and discussion of each paper.

Students will present at most two papers during the semester and will complete one project. The project involves implementing the methods of a paper related to the class content: you will replicate some of the paper's results and present additional observations that were not remarked upon in the paper. You will give a 5-minute presentation during the last two weeks of class and write a paper of at most 4 pages detailing your findings. Please use the NeurIPS Style Files. Your writeup should describe the scientific context of your project, clearly state a question or hypothesis that is being (partially or completely) answered, detail what you did, state what you observed, and then discuss the implications and subsequent work motivated by your results. It is fine if your project is a negative result. Your project should include references, which do not count toward the 4-page limit. The paper is due on the last day of class.

Students taking the class for a grade are expected to participate in the discussions in almost all classes, in addition to presenting their papers and completing the project.

Prerequisites

The class will assume students are fluent in linear algebra and probability. Some experience with neural networks, Python, and PyTorch/TensorFlow will be helpful.

Schedule

1. T 1/7
   First half: Course Overview
   Second half: Signal Recovery and Compressed Sensing

2. T 1/14
   First half: Discrete Fourier Transforms, Wavelet Transforms, Algorithms for Sparse Recovery
   Second half:
   - Improved Pediatric MR Imaging with Compressed Sensing
   - Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
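
For students who have not seen sparse recovery before, here is a minimal NumPy sketch of orthogonal matching pursuit, the algorithm studied in the Tropp-Gilbert paper above. It is an illustrative sketch, not course code; the matrix sizes and sparsity level are made up.

```python
import numpy as np

def omp(A, y, k):
    """Greedy orthogonal matching pursuit: recover a k-sparse x with y ~ A @ x.
    Assumes the columns of A are roughly unit norm."""
    m, n = A.shape
    residual = y.copy()
    support = []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the selected support, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x

# Toy example: a 3-sparse vector from 40 random Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 17, 42]] = [1.0, -2.0, 0.5]
x_hat = omp(A, A @ x_true, k=3)
```
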
3. T 1/21
   First half: ML Framework: Bias-Variance Tradeoff, Linear Regression, Logistic Regression, KL Divergence, Cross Entropy
   Second half:
   - Recovering low-rank matrices from few coefficients in any basis
   - Robust Principal Component Analysis?
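
As a quick reference for the KL divergence and cross entropy topics above: cross entropy decomposes as H(p, q) = H(p) + KL(p || q). A small NumPy check, with distributions chosen arbitrarily for illustration:

```python
import numpy as np

# Two arbitrary distributions over three outcomes.
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.5, 0.3, 0.2])

cross_entropy = -np.sum(p * np.log(q))   # H(p, q)
entropy = -np.sum(p * np.log(p))         # H(p)
kl = np.sum(p * np.log(p / q))           # KL(p || q)

assert np.isclose(cross_entropy, entropy + kl)
```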

4. T 1/28
   First half: Optimization Methods: GD, SGD, Adam, LBFGS
   Second half:
   - Reconciling modern machine learning and the bias-variance trade-off
   - Deep Double Descent: Where Bigger Models and More Data Hurt
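
A minimal sketch of how the optimizers above are swapped in PyTorch. This is illustrative only; the quadratic objective is a stand-in for a real training loss.

```python
import torch

# A toy objective: minimize ||x||^2 from a random start.
x = torch.randn(10, requires_grad=True)
opt = torch.optim.Adam([x], lr=1e-2)  # swap in torch.optim.SGD to compare;
                                      # LBFGS instead needs a closure passed to step()

for step in range(100):
    opt.zero_grad()
    loss = (x ** 2).sum()
    loss.backward()
    opt.step()
```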

5. T 2/4
   Work on Projects

6. T 2/11
   First half: Architectural Elements: Convolutions, Transpose Convolutions, Activation Functions, Batch Normalization, Upsampling/Downsampling
   Second half:
   - Stochastic First- and Zeroth-order Methods for Nonconvex Stochastic Programming (see also the Zeroth Order Optimization with Applications to Adversarial Machine Learning slides online)
   - Accelerating Stochastic Gradient Descent using Predictive Variance Reduction
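
A toy PyTorch block combining the architectural elements from the first half. Channel counts and shapes are chosen only for illustration and are not from any course assignment.

```python
import torch
import torch.nn as nn

# Downsample with a strided convolution, normalize, apply a nonlinearity,
# then upsample back with a transpose convolution.
block = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),           # 32x32 -> 16x16
    nn.BatchNorm2d(16),
    nn.ReLU(),
    nn.ConvTranspose2d(16, 3, kernel_size=4, stride=2, padding=1),  # 16x16 -> 32x32
)

x = torch.randn(1, 3, 32, 32)
print(block(x).shape)  # torch.Size([1, 3, 32, 32])
```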

7. T 2/18
   First half: End-to-end Approaches
   Second half:
   - Deconvolution and Checkerboard Artifacts
   - How Does Batch Normalization Help Optimization?

8. T 2/25
   First half: Variational Autoencoders
   Second half:
   - Image Super-Resolution Using Deep Convolutional Networks
   - A Deep Learning Approach to Structured Signal Recovery
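
For the VAE lecture: with a Gaussian encoder and a unit-Gaussian prior, the KL term of the ELBO has a closed form. A minimal PyTorch sketch; the tensor shapes are illustrative.

```python
import torch

def kl_to_standard_normal(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over latent dimensions."""
    return -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=1)

# When q(z|x) already matches the prior, the KL term is zero.
mu = torch.zeros(4, 8)
logvar = torch.zeros(4, 8)
print(kl_to_standard_normal(mu, logvar))  # tensor of zeros
```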

T 3/3: No Class (Spring Break)

9. T 3/10
   First half: GANs: Derivation, Variations
   Second half:
   - The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks
   - One ticket to win them all: generalizing lottery ticket initializations across datasets and optimizers
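
For reference, the standard GAN minimax objective (Goodfellow et al.) that the derivation lecture starts from:

```latex
\min_G \max_D \;
\mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[\log D(x)\right]
+ \mathbb{E}_{z \sim p_z}\!\left[\log\bigl(1 - D(G(z))\bigr)\right]
```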

10. T 3/17
    First half: GANs: Recovery Theory (notes, video)
    Second half:
    - Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks
    - Progressive Growing of GANs for Improved Quality, Stability, and Variation

11. T 3/24
    First half: Untrained Nets: Deep Image Prior, Deep Decoder, Deep Geometric Prior (notes, video)
    Second half:
    - Phase Retrieval Under a Generative Prior
    - The spiked matrix model with generative priors
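
A toy-scale sketch of the deep-image-prior idea from the first half: fit an untrained network to a single corrupted image and rely on early stopping as the regularizer. The architecture, image, and step count below are placeholders, not the setup from the papers.

```python
import torch
import torch.nn as nn

# A small untrained generator; in the paper this is a deep encoder-decoder.
net = nn.Sequential(
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 3, kernel_size=3, padding=1),
)
z = torch.randn(1, 32, 64, 64)     # fixed random input code
noisy = torch.rand(1, 3, 64, 64)   # stand-in for a corrupted image

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(500):            # stopping early is the key hyperparameter
    opt.zero_grad()
    loss = ((net(z) - noisy) ** 2).mean()
    loss.backward()
    opt.step()
```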

12. T 3/31
    First half: Invertible Nets (notes, video)
    Second half:
    - "Double-DIP": Unsupervised Image Decomposition via Coupled Deep-Image-Priors
    - Denoising and Regularization via Exploiting the Structural Bias of Convolutional Generators
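
A minimal sketch of one standard building block for invertible nets: an additive coupling layer in the NICE/RealNVP style. Layer sizes are illustrative; this is not code from the papers above.

```python
import torch
import torch.nn as nn

class AdditiveCoupling(nn.Module):
    """Invertible by construction: the first half of the coordinates
    passes through unchanged and parameterizes a shift of the second half."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim // 2, 64), nn.ReLU(),
                                 nn.Linear(64, dim // 2))

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)
        return torch.cat([x1, x2 + self.net(x1)], dim=-1)

    def inverse(self, y):
        y1, y2 = y.chunk(2, dim=-1)
        return torch.cat([y1, y2 - self.net(y1)], dim=-1)

layer = AdditiveCoupling(8)
x = torch.randn(2, 8)
assert torch.allclose(layer.inverse(layer(x)), x, atol=1e-6)
```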

13. T 4/7
    Project presentations

14. T 4/14
    Project presentations