Lecture 1, Jan 7: Introductory lecture
[Slides]
Lecture 2, Jan 9: Information measures: entropy (joint, relative, and conditional)
[Notes]
Similar in coverage to 2020 Video Lecture 2 (on Canvas).
Corresponds to Elements of Information Theory Chapter 2
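As a quick numerical companion to the Chapter 2 material, here is a minimal Python sketch of entropy and relative entropy in bits (the function names are illustrative, not from the course):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a probability vector p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

print(entropy([0.5, 0.5]))                    # 1.0 (fair coin)
print(entropy([0.9, 0.1]))                    # ≈ 0.469 (biased coin)
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # ≈ 0.737
```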
Lecture 3, Jan 14: Asymptotic Equipartition Property (AEP) and typicality
Similar in coverage to 2020 Video Lecture 3 (on Canvas).
Corresponds to Elements of Information Theory Chapter 3
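The AEP says that for an i.i.d. source, -(1/n) log2 p(X^n) concentrates around the entropy H(X); this is what makes the typical set work. A small simulation sketch for a Bernoulli(p) source (the sampling setup is my own illustration):

```python
import math
import random

def empirical_rate(p, n, seed=0):
    """-(1/n) log2 P(X^n) for one i.i.d. Bernoulli(p) sample of length n.
    By the AEP this is close to H(p) for large n."""
    rng = random.Random(seed)
    xs = [1 if rng.random() < p else 0 for _ in range(n)]
    logp = sum(math.log2(p if x else 1 - p) for x in xs)
    return -logp / n

H = -(0.3 * math.log2(0.3) + 0.7 * math.log2(0.7))  # H(0.3) ≈ 0.881 bits
print(empirical_rate(0.3, 100000), H)  # the two numbers are close
```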
Lecture 4, Jan 16: Variable length lossless compression: prefix codes
Corresponds to Elements of Information Theory Chapter 5
Lecture 5, Jan 21: Shannon code, the Kraft-McMillan inequality and Huffman code
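To illustrate the Kraft inequality and the Huffman construction from this lecture, here is a short sketch; the heap-based merging is one standard way to compute Huffman codeword lengths (names are illustrative):

```python
import heapq
import itertools

def kraft_sum(lengths):
    """Kraft sum; a binary prefix code with these lengths exists iff it is <= 1."""
    return sum(2.0 ** -l for l in lengths)

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary prefix (Huffman) code.
    Repeatedly merge the two least probable groups; every symbol in a
    merged group gets one bit added to its codeword length."""
    tiebreak = itertools.count()
    lengths = [0] * len(probs)
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tiebreak), s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
print(huffman_lengths(probs))  # [1, 2, 3, 3]; expected length 1.75 = H(p)
```

For this dyadic distribution the Huffman code meets the entropy exactly and the Kraft sum equals 1.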
Lecture 6, Jan 23: Entropy rates and universal compression
Corresponds to Elements of Information Theory Chapter 4
Lecture 7, Jan 28: Reliable communication and channel capacity
Corresponds to Elements of Information Theory Chapter 7
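A concrete instance of channel capacity covered here is the binary symmetric channel, with C = 1 - H(p) for crossover probability p; a tiny sketch (function names are my own):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel: C = 1 - H(p) bits per use."""
    return 1 - h2(p)

print(bsc_capacity(0.0))  # 1.0 (noiseless channel)
print(bsc_capacity(0.5))  # 0.0 (output independent of input)
```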
Lecture 8, Jan 30: Information measures for continuous random variables
Corresponds to Elements of Information Theory Chapter 8
Lecture 9, Feb 4: AWGN channel
Corresponds to Elements of Information Theory Chapter 9
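The headline result of this chapter, C = (1/2) log2(1 + P/N) bits per channel use for the real AWGN channel under power constraint P and noise variance N, is easy to evaluate; a minimal sketch:

```python
import math

def awgn_capacity(snr):
    """Capacity of the real AWGN channel in bits per channel use,
    C = 0.5 * log2(1 + SNR), where SNR = P/N."""
    return 0.5 * math.log2(1 + snr)

print(awgn_capacity(1.0))   # 0.5 bits per use
print(awgn_capacity(15.0))  # 2.0 bits per use
```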
Lecture 10, Feb 6: Joint AEP and channel coding theorem (direct part)
Lecture 11, Feb 11: Proof of channel coding theorem (direct part)
Lecture 12, Feb 13: Proof of channel coding theorem (converse part)
Lecture 13, Feb 18: Polar codes
[Slides from lecture]
[Successive Cancellation Decoding]
Corresponds to Elements of Information Theory Chapter 7
Lecture 14, Feb 20: Lossy compression and rate distortion
Corresponds to Elements of Information Theory Chapter 10
Lecture 15, Feb 25: Lossy compression and rate distortion continued
Lecture 16, Mar 4: Converse part of rate distortion theory
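For a Bernoulli(p) source under Hamming distortion, the rate-distortion function worked out in these lectures is R(D) = H(p) - H(D) for 0 <= D <= min(p, 1-p), and zero beyond; a small sketch (function names are my own):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_bernoulli(p, D):
    """R(D) = H(p) - H(D) for a Bernoulli(p) source with Hamming distortion,
    valid for 0 <= D <= min(p, 1-p); R(D) = 0 for larger D."""
    if D >= min(p, 1 - p):
        return 0.0
    return h2(p) - h2(D)

print(rate_distortion_bernoulli(0.5, 0.0))  # 1.0: lossless needs H(p) bits/symbol
print(rate_distortion_bernoulli(0.5, 0.1))  # strictly between 0 and 1
```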
Lecture 17, Mar 6: Method of types
Corresponds to Elements of Information Theory (C&T) Chapter 11
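A key counting fact from the method of types is that the type class T(P) of an empirical distribution P has exactly n! / prod(n_i!) sequences, and at most 2^{n H(P)} of them (a standard method-of-types bound); a quick numerical check:

```python
import math

def type_class_size(counts):
    """Exact size of the type class T(P): the multinomial n! / prod(n_i!)."""
    n = sum(counts)
    size = math.factorial(n)
    for c in counts:
        size //= math.factorial(c)
    return size

counts = [6, 4]                                   # n = 10, empirical P = (0.6, 0.4)
n = sum(counts)
H = -sum(c / n * math.log2(c / n) for c in counts)
print(type_class_size(counts), 2 ** (n * H))      # exact size vs. 2^{nH(P)} bound
```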
Lecture 18, Mar 11: Strong and joint typicality, direct part of rate distortion theory