CS 7150: Deep Learning - Spring 2021

Time & Location:

Mondays and Wednesdays, 2:50 - 4:30 PM Eastern Time. Location: Hurtig 224; see Canvas for the Zoom link.

Staff

Instructor: Paul Hand
Email: p.hand@northeastern.edu     Office Hours: Fridays 4-5 PM and by appointment

TA: Jorio Cocola
Email: cocola.j@northeastern.edu     Office Hours: Tuesdays 1-2 PM and by appointment

TA: Sean Gunn
Email: gunn.s@northeastern.edu     Office Hours: Thursdays 12-1 PM and by appointment

IA: Anmol Srivastava
Email: srivastava.anm@northeastern.edu     

Course Description

Note: This differs from the official course description. Please read carefully.

Introduction to deep learning, including the statistical learning framework, empirical risk minimization, loss function selection, fully connected layers, convolutional layers, pooling layers, batch normalization, multi-layer perceptrons, convolutional neural networks, autoencoders, U-nets, residual networks, gradient descent, stochastic gradient descent, backpropagation, autograd, visualization of neural network features, robustness and adversarial examples, interpretability, continual learning, and applications in computer vision and natural language processing. Assumes students already have a basic knowledge of machine learning, optimization, linear algebra, and statistics.

Overview

The learning objectives of this course are that students should:

Course Structure and Etiquette:

Because of social distancing requirements, this course will be conducted primarily over Zoom. The instructor will be screensharing papers and/or digital notes from the classroom and will facilitate classwide discussion over Zoom. Participation in the class discussion is expected, and everyone will get a chance to speak. Lectures will be recorded and posted on Canvas. During class, students should be fully engaged, to the best of their ability. In particular, students should be willing to share their responses to the prepared questions with the whole class. Students are encouraged to leave their video connection on for the majority of class. Participants should mute themselves when they are not speaking.

Student Work:

Students will be expected to complete the following work: preparation questions before each class discussion, participation in class discussions, homework assignments, and a final project.

Course grades:

Course grades will be based on: 30% preparation questions for class, 20% participation, 30% homework assignments, 20% project.
Letter grades will be assigned on the following scale: 93%+ A, 90-92% A-, 87-89% B+, 83-86% B, 80-82% B-, 77-79% C+, 73-76% C, 70-72% C-, 60-69% D, 0-59% F.
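
As a concrete illustration of how these weights combine, here is a minimal sketch in Python. The component weights and letter-grade cutoffs come from the scale above; the individual component scores are hypothetical.

    # Minimal sketch of the grade computation; component scores below are hypothetical.
    weights = {"prep_questions": 0.30, "participation": 0.20, "homework": 0.30, "project": 0.20}
    scores  = {"prep_questions": 95.0, "participation": 88.0, "homework": 91.0, "project": 85.0}

    # Weighted average on a 0-100 scale.
    final = sum(weights[k] * scores[k] for k in weights)

    # Letter-grade cutoffs from the scale above.
    cutoffs = [(93, "A"), (90, "A-"), (87, "B+"), (83, "B"), (80, "B-"),
               (77, "C+"), (73, "C"), (70, "C-"), (60, "D"), (0, "F")]
    letter = next(grade for cutoff, grade in cutoffs if final >= cutoff)
    print(f"{final:.1f} -> {letter}")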


Prerequisites

Students are expected to have taken a course in machine learning. The class will assume students are comfortable with linear algebra, probability, and statistics. Some experience with neural networks, Python, and PyTorch/TensorFlow will be helpful but can be acquired while taking this class. If you do not have experience with PyTorch, a good resource is the book Deep Learning with PyTorch.
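
For students gauging their PyTorch background, the short sketch below (not part of the official course materials) shows the level of usage that will be helpful by the time the homework begins: building a small network, running a forward pass, and taking one gradient step with autograd.

    # Minimal sketch of basic PyTorch usage: a small model, a loss, and one SGD step.
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
    loss_fn = nn.MSELoss()

    x = torch.randn(64, 10)   # a random batch of inputs
    y = torch.randn(64, 1)    # random targets

    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()           # backpropagation via autograd
    optimizer.step()          # one stochastic gradient descent update
    print(loss.item())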

Schedule (Class, Day, Date, and Discussion Topic):
1 W 1/20

Machine Learning Review (Notes)

A DARPA Perspective on Artificial Intelligence

Preparation Questions for Class

2 M 1/25

Machine Learning Review (Notes) (Annotated)

3 W 1/27 Deep Learning

Understanding deep learning requires rethinking generalization

Preparation Questions for Class (tex)

4 M 2/1 Architectural Elements of Neural Networks. (Notes) (Annotated).
5 W 2/3

Visualizing and Understanding Convolutional Networks

Visualizing Higher-Layer Features of a Deep Network

Preparation Questions for Class (tex)

6 M 2/8 Gradient Descent and Stochastic Gradient Descent (Notes) (Annotated)
7 W 2/10
HW 1 (pdf, tex) DUE (F 2/12)
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

How Does Batch Normalization Help Optimization?

Preparation Questions for Class (tex)

- M 2/15 No Class - Holiday
8 W 2/17

Deep Learning Book - Chapter 8

Adam: A Method for Stochastic Optimization

Preparation Questions for Class (tex)

Annotated Notes

9 M 2/22 Neural Network Architectures for Images (Notes)

Annotated Notes

10 W 2/24 Deep Residual Learning for Image Recognition (ResNets)

ImageNet Classification with Deep Convolutional Neural Networks (AlexNet)

Preparation Questions for Class (tex)

Annotated Notes

11 M 3/1

Watch DeepLearningAI videos on Object Localization and Detection: 1, 2, 3, 4, 6, 7, 8, 9, 10

You Only Look Once: Unified, Real-Time Object Detection

Class notes (unannotated, annotated)

12 W 3/3
HW 2 (pdf, tex) DUE (F 3/5)

Attention Is All You Need

Language Models are Few-Shot Learners

Preparation questions for class (tex)

Class notes (annotated)

13 M 3/8 Adversarial Examples for Deep Neural Networks (Notes)

Class notes (unannotated, annotated)

14 W 3/10 Explaining and Harnessing Adversarial Examples

Robust Physical-World Attacks on Deep Learning Models

Preparation Questions for Class (tex)

Class notes (annotated)

15 M 3/15

"Why Should I Trust You?" Explaining the Predictions of Any Classifier

Watch this video explanation by one of the authors.

A Survey on Neural Network Interpretability (optional)

Class notes (unannotated, annotated)

16 W 3/17 Guest Lecture - Generative Priors
17 M 3/22 Continual Learning and Catastrophic Forgetting (Notes)

Class notes (unannotated, annotated)

- W 3/24
HW 3 (pdf, tex) DUE (F 3/26)
No class - University Care Day
18 M 3/29
Project Planning Document (pdf, tex) Due (M 3/29)
Automatic Differentiation, Backpropagation

Watch this video on automatic differentiation.

Watch this lecture (from the start until time 38:30) on backpropagation in neural networks.

Class notes (annotated)

19 W 3/31 Overcoming catastrophic forgetting in neural networks

Preparation Questions for Class (tex)

Class notes (unannotated, annotated)

20 M 4/5 Generative Adversarial Networks (notes)

Class notes (unannotated, annotated)

21 W 4/7 Variational Autoencoders (notes)
- M 4/12 No class - University Care Day
22 W 4/14 Final Discussion
23 M 4/19 Project Presentations
24 W 4/21 Project Presentations