Lecture 1, Jan 9: Introductory lecture.
Lecture 2, Jan 11: Information measures: entropy, joint and conditional entropy, and relative entropy (see the sketch after the schedule).
Lecture 3, Jan 16: Asymptotic Equipartition Property (AEP) and typicality (simulation sketch below).
Lecture 4, Jan 18: Variable-length lossless compression: prefix codes and Shannon codes.
Lecture 5, Jan 23: The Kraft-McMillan inequality and Huffman coding (sketch below).
Lecture 6, Jan 25: Entropy rates and universal compression.
Lecture 7, Jan 30: Reliable communication and channel capacity.
Lecture 8, Feb 1: Information measures for continuous random variables.
Lecture 9, Feb 6: The additive white Gaussian noise (AWGN) channel (capacity example below).
Lecture 10, Feb 8: The converse to the channel coding theorem.
Lecture 11, Feb 13: Joint AEP and the channel coding theorem.
Lecture 12, Feb 15: Polar codes.
Lecture 13, Feb 20: Lossy compression and rate-distortion theory (see the R(D) example after the schedule).
Lecture 14, Feb 22: Lossy compression and rate-distortion theory, continued.
Lecture 15, Feb 27: The method of types.
Lecture 16, Feb 29: Strong, conditional, and joint typicality.
Lecture 17, Mar 5: Joint source-channel coding and the separation theorem.
Lecture 18, Mar 7: Joint source-channel coding and the separation theorem, continued.
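
Below are a few short, self-contained Python sketches illustrating some of the lecture topics; all distributions and parameters in them are made-up examples, not course material. First, the information measures of Lecture 2: entropy, joint and conditional entropy, mutual information, and relative entropy for a small hypothetical joint pmf.

    import math

    def H(p):
        """Shannon entropy in bits of a probability vector (0 log 0 := 0)."""
        return -sum(x * math.log2(x) for x in p if x > 0)

    def D(p, q):
        """Relative entropy D(p || q) in bits; assumes q > 0 wherever p > 0."""
        return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

    # Hypothetical joint distribution of (X, Y) as a 2x2 table (rows: X, columns: Y).
    p_xy = [[1/2, 1/4],
            [1/8, 1/8]]

    p_x = [sum(row) for row in p_xy]            # marginal of X
    p_y = [sum(col) for col in zip(*p_xy)]      # marginal of Y
    H_xy = H([p for row in p_xy for p in row])  # joint entropy H(X, Y)
    H_x, H_y = H(p_x), H(p_y)
    H_y_given_x = H_xy - H_x                    # chain rule: H(Y|X) = H(X,Y) - H(X)
    I_xy = H_x + H_y - H_xy                     # mutual information I(X; Y)

    print(f"H(X)={H_x:.3f}  H(Y)={H_y:.3f}  H(X,Y)={H_xy:.3f}")
    print(f"H(Y|X)={H_y_given_x:.3f}  I(X;Y)={I_xy:.3f}")
    print(f"D(p_x || uniform) = {D(p_x, [0.5, 0.5]):.3f}")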
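
For Lecture 3, a quick simulation of the AEP: for an i.i.d. source, -(1/n) log2 p(X^n) concentrates around H(X) as n grows. The source pmf and block lengths are illustrative.

    import math
    import random

    pmf = {'a': 0.5, 'b': 0.25, 'c': 0.25}            # hypothetical i.i.d. source
    H = -sum(p * math.log2(p) for p in pmf.values())  # H(X) = 1.5 bits

    random.seed(0)
    symbols, weights = zip(*pmf.items())
    for n in [10, 100, 1000, 10000]:
        xs = random.choices(symbols, weights=weights, k=n)
        rate = -sum(math.log2(pmf[x]) for x in xs) / n  # -(1/n) log2 p(X^n)
        print(f"n = {n:5d}: -(1/n) log2 p(X^n) = {rate:.4f}  (H = {H})")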
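
For Lecture 5, a minimal Huffman-coding sketch over a hypothetical source pmf. It checks the Kraft-McMillan inequality on the resulting codeword lengths and Shannon's bound H(X) <= E[L] < H(X) + 1.

    import heapq
    import math

    def huffman_lengths(probs):
        """Codeword lengths of a binary Huffman code for the given pmf."""
        # Heap entries: (probability, tie-breaker, symbol indices in the subtree).
        heap = [(p, i, [i]) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        tie = len(probs)
        while len(heap) > 1:
            p1, _, s1 = heapq.heappop(heap)
            p2, _, s2 = heapq.heappop(heap)
            for i in s1 + s2:  # each merge adds one bit to every symbol below it
                lengths[i] += 1
            heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
            tie += 1
        return lengths

    probs = [0.4, 0.2, 0.2, 0.1, 0.1]      # hypothetical source pmf
    lengths = huffman_lengths(probs)
    kraft = sum(2 ** -l for l in lengths)  # <= 1 for any uniquely decodable code
    avg = sum(p * l for p, l in zip(probs, lengths))
    ent = -sum(p * math.log2(p) for p in probs)

    print("lengths:", lengths)
    print(f"Kraft sum = {kraft}")
    print(f"H(X) = {ent:.3f} <= E[L] = {avg:.3f} < H(X) + 1 = {ent + 1:.3f}")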
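
For Lecture 9, a numerical look at the AWGN capacity formula C = (1/2) log2(1 + P/N) bits per real channel use; the SNR values below are arbitrary.

    import math

    def awgn_capacity(snr):
        """Capacity in bits per real channel use at signal-to-noise ratio P/N."""
        return 0.5 * math.log2(1 + snr)

    for snr_db in [0, 10, 20, 30]:
        snr = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
        print(f"SNR = {snr_db:2d} dB -> C = {awgn_capacity(snr):.3f} bits/use")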
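
For Lectures 13-14, the rate-distortion function of a Bernoulli(p) source under Hamming distortion, R(D) = h(p) - h(D) for 0 <= D <= min(p, 1-p), where h is the binary entropy function; the parameters below are illustrative.

    import math

    def h2(x):
        """Binary entropy in bits (0 log 0 := 0)."""
        return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

    def rate_distortion_bernoulli(p, D):
        """R(D) for a Bernoulli(p) source with Hamming distortion."""
        if D >= min(p, 1 - p):
            return 0.0  # this distortion level is achievable at zero rate
        return h2(p) - h2(D)

    p = 0.3
    for D in [0.0, 0.05, 0.1, 0.2, 0.3]:
        print(f"D = {D:.2f} -> R(D) = {rate_distortion_bernoulli(p, D):.3f} bits/symbol")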