Created:
Thu 08 Sep 2005
Last modified:
Syllabus
- Lecture 01, Fri, Sep 09 2005
- Administrivia
- Introduction to information theory and its applications
- Lecture 02, Tue, Sep 13 2005
- Probability primer I
- Homework 01 assigned
- Lecture 03, Fri, Sep 16 2005
- Lecture 04, Tue, Sep 20 2005
- Entropy and its properties
- Reading: Cover and Thomas 2.1-2
- Lecture 05, Fri, Sep 23 2005
- Conditional entropy, relative entropy, mutual information
- Reading: Cover and Thomas 2.2-4
- Homework 01 due
- Homework 02 assigned
- Lecture 06, Tue, Sep 27 2005
- Chain rules, data processing inequality, Fano's inequality
- Reading: Cover and Thomas 2.5-6, 2.8, 2.11
- Lecture 07, Fri, Sep 30 2005
- Markov chains, entropy rate of stochastic processes
- Reading: Cover and Thomas 4
- Homework 02 due
- Homework 03 assigned
- Lecture 08, Tue, Oct 04 2005
- Compression I: codes and decodability, Kraft's inequality,
bounds on optimal codes
- Reading: Cover and Thomas 5.1-4
- Lecture 09, Fri, Oct 07 2005
- Compression II: more Kraft's inequality, block coding, Huffman codes
- Reading: Cover and Thomas 5.5-8
- Homework 03 due
- Homework 04 assigned
- Lecture 10, Tue, Oct 11 2005
- Compression III: twenty questions, arithmetic coding, randomness
- Reading: Cover and Thomas 5.7, 5.10, 5.12
- Lecture 11, Fri, Oct 14 2005
- Asymptotic Equipartition Property (AEP) and its consequences
- Reading: Cover and Thomas 3
- Homework 04 due
- Lecture 12, Tue, Oct 18 2005
- Information theory and statistics:
the method of types and applications
- Reading: Cover and Thomas 12.1-2
- Lecture 13, Fri, Oct 21 2005
- Lempel-Ziv, universal source coding
- Reading: Cover and Thomas 12.10
- Lecture 14, Tue, Oct 25 2005
- Channel coding
- Reading: Cover and Thomas 8
- Lecture 15, Fri, Oct 28 2005
- The maximum entropy method
- Reading: Cover and Thomas 11.1-2
- Remainder of the term...
- Applications of information theory:
lectures and student presentations
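
As a quick taste of the quantities covered in Lectures 04-05 (entropy and relative entropy, as defined in Cover and Thomas 2.1-3), here is a minimal sketch in Python; the function names `entropy` and `kl_divergence` are illustrative choices, not part of the course materials:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits; zero-probability outcomes contribute 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Relative entropy D(p||q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# A fair coin carries one bit; four equally likely outcomes carry two.
print(entropy([0.5, 0.5]))    # 1.0
print(entropy([0.25] * 4))    # 2.0
# D(p||q) >= 0, with equality iff p == q (Cover and Thomas, Theorem 2.6.3).
print(kl_divergence([0.5, 0.5], [0.25, 0.75]))
```

Note the convention 0 log 0 = 0, handled here by skipping zero entries; it is justified by continuity, since x log x -> 0 as x -> 0.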
jaa@ccs.neu.edu