Exam: Thursday, April 12

Open book, notes, etc.

The exam is designed to take two hours, but you may use the full three hours if you need them.

 

Exam questions will be limited to (a subset of) these topics:

 

Decision trees: how constructed, entropy, information gain
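
As a refresher on this item, here is a minimal sketch of entropy and information gain (function names and data are mine, not from the course):

```python
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    probs = [labels.count(c) / n for c in set(labels)]
    return -sum(p * math.log2(p) for p in probs)

def information_gain(labels, split):
    """Entropy of `labels` minus the weighted entropy after partitioning
    by the boolean feature `split` (same length as `labels`)."""
    n = len(labels)
    gain = entropy(labels)
    for v in (True, False):
        subset = [y for y, s in zip(labels, split) if s == v]
        if subset:
            gain -= len(subset) / n * entropy(subset)
    return gain
```

A perfectly informative split has gain equal to the parent entropy; an uninformative split has gain 0.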

Neural nets: perceptrons, the perceptron learning algorithm, linear separability, backpropagation (including its derivation via gradient descent)
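
A minimal sketch of the perceptron algorithm (the AND data in the test is invented; convergence is guaranteed only on linearly separable data):

```python
def perceptron(examples, epochs=20, lr=1.0):
    """examples: list of (x_vector, label) with label in {+1, -1}.
    Returns weights including a bias term at index 0."""
    dim = len(examples[0][0])
    w = [0.0] * (dim + 1)            # w[0] is the bias weight
    for _ in range(epochs):
        for x, y in examples:
            xb = [1.0] + list(x)     # prepend constant bias input
            activation = sum(wi * xi for wi, xi in zip(w, xb))
            pred = 1 if activation > 0 else -1
            if pred != y:            # update only on mistakes
                w = [wi + lr * y * xi for wi, xi in zip(w, xb)]
    return w

def predict(w, x):
    xb = [1.0] + list(x)
    return 1 if sum(wi * xi for wi, xi in zip(w, xb)) > 0 else -1
```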

Version spaces: restricted hypothesis spaces, e.g., pure conjunctive concepts
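
For the pure-conjunctive item, a sketch of FIND-S, which computes the most specific boundary of the version space (attribute values in the test are invented):

```python
def find_s(examples):
    """FIND-S for pure conjunctive concepts: return the most specific
    conjunction consistent with the positive examples. Each example is
    (attribute_tuple, is_positive); '?' in the output means 'any value'."""
    h = None
    for x, positive in examples:
        if not positive:
            continue               # FIND-S ignores negative examples
        if h is None:
            h = list(x)            # start at the first positive example
        else:
            # generalize minimally: keep matching attributes, wildcard the rest
            h = [hi if hi == xi else '?' for hi, xi in zip(h, x)]
    return h
```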

Probability theory: joint distribution, Bayes classifiers, naïve Bayes, maximum likelihood and Bayesian parameter estimation
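
For the naive Bayes / maximum-likelihood item, a minimal counting-based sketch (data layout and names are mine; note that pure ML estimates have no smoothing, so unseen feature values get probability zero):

```python
from collections import defaultdict

def train_nb(examples):
    """Maximum-likelihood naive Bayes over discrete features.
    examples: list of (feature_tuple, class_label).
    Returns (priors, conditionals) as dicts of probabilities."""
    class_counts = defaultdict(int)
    feat_counts = defaultdict(int)       # (class, index, value) -> count
    for x, c in examples:
        class_counts[c] += 1
        for i, v in enumerate(x):
            feat_counts[(c, i, v)] += 1
    n = len(examples)
    priors = {c: class_counts[c] / n for c in class_counts}
    cond = {k: v / class_counts[k[0]] for k, v in feat_counts.items()}
    return priors, cond

def classify_nb(priors, cond, x):
    """argmax_c P(c) * prod_i P(x_i | c)."""
    best, best_p = None, -1.0
    for c, p in priors.items():
        for i, v in enumerate(x):
            p *= cond.get((c, i, v), 0.0)
        if p > best_p:
            best, best_p = c, p
    return best
```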

Bayes nets: independence, conditional independence
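
For this item, a sketch that checks conditional independence directly from a joint distribution over binary variables (the common-cause example in the test is invented):

```python
from itertools import product

def cond_independent(joint, i, j, k, tol=1e-9):
    """Check X_i _||_ X_j | X_k in a joint distribution over binary
    variables, given as {assignment_tuple: probability}: for each value
    of X_k, P(x_i, x_j | x_k) must factor as P(x_i | x_k) P(x_j | x_k)."""
    for zk in (0, 1):
        pk = sum(p for a, p in joint.items() if a[k] == zk)
        if pk == 0:
            continue
        for xi, xj in product((0, 1), repeat=2):
            p_ij = sum(p for a, p in joint.items()
                       if a[i] == xi and a[j] == xj and a[k] == zk) / pk
            p_i = sum(p for a, p in joint.items()
                      if a[i] == xi and a[k] == zk) / pk
            p_j = sum(p for a, p in joint.items()
                      if a[j] == xj and a[k] == zk) / pk
            if abs(p_ij - p_i * p_j) > tol:
                return False
    return True
```

In a common-cause net C -> A, C -> B, A and B are conditionally independent given C; if B instead depends directly on A, they are not.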

Overfitting: cross-validation, early stopping (neural nets)
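
For the cross-validation item, a plain k-fold sketch (the callback interface is my own framing):

```python
def k_fold_cv(data, k, train_and_eval):
    """Plain k-fold cross-validation. `train_and_eval(train, test)`
    should fit on `train`, evaluate on `test`, and return a score;
    the mean score over the k folds is returned."""
    folds = [data[i::k] for i in range(k)]     # simple interleaved split
    scores = []
    for i in range(k):
        test = folds[i]
        train = [x for j, f in enumerate(folds) if j != i for x in f]
        scores.append(train_and_eval(train, test))
    return sum(scores) / k
```

Each example appears in exactly one test fold, so the mean score estimates generalization error without touching held-out data during training.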

PAC learning (know which formulas to apply, the meanings of δ and ε, etc.)
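
One bound likely among the formulas meant here (this is my assumption about which one) is the consistent-learner sample complexity for a finite hypothesis class:

```python
import math

def pac_sample_bound(h_size, epsilon, delta):
    """Consistent-learner bound for a finite hypothesis class H:
    with m >= (1/epsilon)(ln|H| + ln(1/delta)) examples, any hypothesis
    consistent with the sample has true error <= epsilon with
    probability >= 1 - delta."""
    return math.ceil((math.log(h_size) + math.log(1 / delta)) / epsilon)
```

Note the roles of the two parameters: ε bounds the error of the returned hypothesis, δ bounds the probability that the guarantee fails.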

VC dimension (be able to derive it, using the definition or other knowledge)
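
As a worked instance of the definition (hypothesis class chosen by me for illustration): closed intervals on the real line shatter any 2 points but no 3 points, so their VC dimension is 2. A brute-force shattering check:

```python
from itertools import product

def shatters_with_intervals(points):
    """True iff closed intervals [a, b] shatter the point set, i.e.
    every +/- labelling is realised by some interval. Candidate
    endpoints drawn from the points themselves (plus one empty
    interval) suffice for this hypothesis class."""
    pts = sorted(points)
    candidates = [(a, b) for a in pts for b in pts if a <= b]
    candidates.append((1.0, 0.0))   # empty interval: labels everything negative
    for labels in product([True, False], repeat=len(pts)):
        if not any(all((a <= p <= b) == lab for p, lab in zip(pts, labels))
                   for a, b in candidates):
            return False
    return True
```

Three points fail because the labelling +, -, + would require the interval to contain the middle point.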

SVMs: maximum margin, kernels
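
On the kernels item, the key identity is that a kernel computes an inner product in an implicit feature space. A sketch for the degree-2 polynomial kernel in 2-D (feature-map layout is the standard one, written out by me):

```python
import math

def poly_kernel(x, z, d=2):
    """Polynomial kernel (x . z + 1)^d."""
    return (sum(a * b for a, b in zip(x, z)) + 1) ** d

def phi2(x):
    """Explicit degree-2 feature map for 2-D input, chosen so that
    phi2(x) . phi2(z) == poly_kernel(x, z, 2) without ever forming
    the 6-dimensional vectors during kernelized training."""
    x1, x2 = x
    r2 = math.sqrt(2)
    return (1.0, r2 * x1, r2 * x2, x1 * x1, x2 * x2, r2 * x1 * x2)
```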

MDPs: value functions, value iteration, policy iteration
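
A minimal value-iteration sketch (the two-state MDP in the test is invented; with γ = 0.9, staying in the rewarding state is worth 1/(1 - 0.9) = 10):

```python
def value_iteration(states, actions, trans, reward, gamma=0.9, tol=1e-8):
    """Value iteration with the Bellman optimality backup
    V(s) <- max_a [ R(s,a) + gamma * sum_s' P(s'|s,a) V(s') ].
    trans[(s, a)] = list of (prob, next_state); reward[(s, a)] = float."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(
                reward[(s, a)] + gamma * sum(p * V[s2] for p, s2 in trans[(s, a)])
                for a in actions
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V
```

Policy iteration alternates full policy evaluation with greedy improvement instead of folding the max into every sweep.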

Reinforcement learning: Q-learning, TD methods
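
A tabular Q-learning sketch on an invented deterministic chain MDP (environment, constants, and seed are mine, chosen so the run is reproducible):

```python
import random

def q_learning(n_states=4, episodes=500, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Q-learning on a chain: states 0..n-1, actions +1/-1 (step
    right/left, clipped at the ends), reward 1 for entering the
    rightmost state, which is terminal. Returns the Q-table."""
    rng = random.Random(seed)
    actions = (1, -1)
    Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # epsilon-greedy action selection
            if rng.random() < eps:
                a = rng.choice(actions)
            else:
                a = max(actions, key=lambda act: Q[(s, act)])
            s2 = min(max(s + a, 0), n_states - 1)
            r = 1.0 if s2 == n_states - 1 else 0.0
            future = 0.0 if s2 == n_states - 1 else \
                gamma * max(Q[(s2, act)] for act in actions)
            # TD update: move Q(s, a) toward the one-step target
            Q[(s, a)] += alpha * (r + future - Q[(s, a)])
            s = s2
    return Q
```

The learned values decay geometrically with distance from the goal: Q(2, +1) ≈ 1, Q(1, +1) ≈ 0.9, Q(0, +1) ≈ 0.81.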

Instance-based learning: k-nearest neighbor (also, radial basis functions, kernel regression)
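
A minimal k-nearest-neighbor sketch with Euclidean distance (toy data invented; tie-breaking and distance weighting, as in kernel regression, are omitted):

```python
from collections import Counter

def knn_classify(train, x, k=3):
    """Classify x by majority vote of its k nearest training points.
    train: list of (point_tuple, label)."""
    dist = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    nearest = sorted(train, key=lambda ex: dist(ex[0], x))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```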

Ensemble learning: weighted majority, AdaBoost algorithm, margins
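
A compact AdaBoost sketch using threshold stumps as weak learners (stump search and the 1-D data in the test are my own simplifications):

```python
import math

def stump_predictions(X, feature, threshold, polarity):
    """Decision stump: +polarity if x[feature] > threshold, else -polarity."""
    return [polarity if x[feature] > threshold else -polarity for x in X]

def adaboost(X, y, rounds=3):
    """AdaBoost: X is a list of feature tuples, y labels in {+1, -1}.
    Returns a list of weighted stumps (alpha, feature, threshold, polarity)."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        best = None
        # exhaustive search for the lowest weighted-error stump
        for f in range(len(X[0])):
            for t in sorted({x[f] for x in X}):
                for pol in (1, -1):
                    preds = stump_predictions(X, f, t, pol)
                    err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                    if best is None or err < best[0]:
                        best = (err, f, t, pol, preds)
        err, f, t, pol, preds = best
        err = max(err, 1e-12)                    # avoid division by zero
        alpha = 0.5 * math.log((1 - err) / err)  # stump weight
        ensemble.append((alpha, f, t, pol))
        # reweight: misclassified points gain weight, then renormalize
        w = [wi * math.exp(-alpha * yi * p) for wi, p, yi in zip(w, preds, y)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def ada_predict(ensemble, x):
    s = sum(alpha * (pol if x[f] > t else -pol)
            for alpha, f, t, pol in ensemble)
    return 1 if s > 0 else -1
```

The margin of a point is its weighted vote y * s normalized by the total alpha; AdaBoost tends to keep increasing margins even after training error hits zero.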

Unsupervised learning: Gaussian mixture models, EM algorithm
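
An EM sketch for a two-component 1-D Gaussian mixture (the min/max initialization and the two-cluster data in the test are my own illustrative choices):

```python
import math

def em_gmm_1d(data, iters=50):
    """EM for a 2-component 1-D Gaussian mixture.
    Returns (mixing weights, means, variances)."""
    k = 2
    mu = [min(data), max(data)]        # crude but deterministic init
    var = [1.0] * k
    pi = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibilities r[i][j] = P(component j | x_i)
        r = []
        for x in data:
            dens = [pi[j] * math.exp(-(x - mu[j]) ** 2 / (2 * var[j]))
                    / math.sqrt(2 * math.pi * var[j]) for j in range(k)]
            s = sum(dens)
            r.append([d / s for d in dens])
        # M-step: maximum-likelihood re-estimates from weighted counts
        for j in range(k):
            nj = sum(ri[j] for ri in r)
            pi[j] = nj / len(data)
            mu[j] = sum(ri[j] * x for ri, x in zip(r, data)) / nj
            var[j] = max(sum(ri[j] * (x - mu[j]) ** 2
                             for ri, x in zip(r, data)) / nj, 1e-6)
    return pi, mu, var
```

Each iteration is guaranteed not to decrease the data log-likelihood, though EM may converge to a local optimum.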

 

HMMs and DBNs will not be covered.