EE 276: Course Outline

Stanford University, Tsachy Weissman, Winter Quarter 2024-25
  • Lecture 1, Jan 7: Introductory lecture

  • Lecture 2, Jan 9: Information measures: entropy (joint, relative, and conditional)

    • [Notes]

    • Similar in coverage to 2020 Video Lecture 2 (on canvas).

    • Corresponds to Elements of Information Theory Chapter 2
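As a quick companion to the entropy definitions in this lecture (not part of the official course materials), a minimal sketch computing entropy, joint entropy, and mutual information for an example joint pmf; the distribution here is made up for illustration:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution given as a list of probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Example joint pmf for (X, Y) over {0,1} x {0,1} (illustrative values)
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

p_x = [0.75, 0.25]    # marginal of X
p_y = [0.625, 0.375]  # marginal of Y

h_xy = entropy(list(joint.values()))  # joint entropy H(X,Y)
h_x, h_y = entropy(p_x), entropy(p_y)
mi = h_x + h_y - h_xy                 # mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)
```

For this dyadic joint pmf, H(X,Y) = 1.75 bits, and the mutual information comes out small but nonnegative, as the chapter's identities require.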

  • Lecture 3, Jan 14: Asymptotic Equipartition Property (AEP) and typicality

    • [Notes]

    • Similar in coverage to 2020 Video Lecture 3 (on canvas).

    • Corresponds to Elements of Information Theory Chapter 3
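A small numerical illustration of the AEP (an informal aside, not course material): for an iid Bernoulli(p) sequence, the normalized log-probability -(1/n) log2 p(x^n) concentrates around H(p) as n grows. The parameter p = 0.3 and sequence length are arbitrary choices:

```python
import math
import random

def empirical_log_prob_rate(n, p=0.3, seed=0):
    """Draw one iid Bernoulli(p) sequence of length n and return -(1/n) log2 p(x^n).
    By the AEP this converges to the entropy H(p) as n grows."""
    rng = random.Random(seed)
    xs = [1 if rng.random() < p else 0 for _ in range(n)]
    k = sum(xs)  # number of ones
    return -(k * math.log2(p) + (n - k) * math.log2(1 - p)) / n

h = -(0.3 * math.log2(0.3) + 0.7 * math.log2(0.7))  # H(0.3), about 0.881 bits
rate = empirical_log_prob_rate(100_000)
```

With n = 100,000 the empirical rate lands within a few thousandths of H(0.3), which is the typicality phenomenon the lecture formalizes.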

  • Lecture 4, Jan 16: Variable length lossless compression: prefix codes

    • Corresponds to Elements of Information Theory Chapter 5

  • Lecture 5, Jan 21: Shannon code, the Kraft-McMillan inequality and Huffman code

    • Corresponds to Elements of Information Theory Chapter 5
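As an informal supplement to this lecture (not part of the course materials), a compact Huffman construction using a heap, plus a check of the Kraft-McMillan sum; the four-symbol dyadic source is an illustrative choice for which the average length meets the entropy exactly:

```python
import heapq
from itertools import count

def huffman(probs):
    """Build a binary Huffman code for probs (symbol -> probability).
    Returns a dict mapping each symbol to its codeword as a '0'/'1' string."""
    tiebreak = count()  # prevents the heap from ever comparing the dicts
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman(probs)

# Kraft-McMillan: sum of 2^{-l(w)} <= 1, with equality for a complete prefix code
kraft = sum(2 ** -len(w) for w in code.values())
avg_len = sum(probs[s] * len(code[s]) for s in probs)  # equals H = 1.75 bits here
```

Because the probabilities are dyadic, the average codeword length equals the source entropy (1.75 bits) and the Kraft sum is exactly 1.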

  • Lecture 6, Jan 23: Entropy rates and universal compression

    • Corresponds to Elements of Information Theory Chapter 4

  • Lecture 7, Jan 28: Reliable communication and channel capacity

    • Corresponds to Elements of Information Theory Chapter 7
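A one-formula illustration of channel capacity (an aside, not course material): the binary symmetric channel with crossover probability eps has capacity C = 1 - H(eps) bits per channel use:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Capacity of the binary symmetric channel with crossover probability eps:
    C = 1 - H(eps) bits per channel use."""
    return 1.0 - h2(eps)
```

A noiseless BSC (eps = 0) has capacity 1 bit per use, and a completely noisy one (eps = 0.5) has capacity 0, matching the intuition developed in lecture.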

  • Lecture 8, Jan 30: Information measures for continuous random variables

    • Corresponds to Elements of Information Theory Chapter 8

  • Lecture 9, Feb 4: AWGN channel

    • Corresponds to Elements of Information Theory Chapter 9
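The AWGN capacity formula from this lecture in one line (an informal aside, not course material): a real AWGN channel with signal-to-noise ratio P/N has capacity C = (1/2) log2(1 + P/N) bits per channel use:

```python
import math

def awgn_capacity(snr):
    """Capacity in bits per channel use of a real AWGN channel
    with signal-to-noise ratio snr = P/N: C = (1/2) log2(1 + snr)."""
    return 0.5 * math.log2(1 + snr)
```

For example, SNR = 1 gives half a bit per use, and SNR = 3 gives exactly one bit per use.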

  • Lecture 10, Feb 6: Joint AEP and channel coding theorem (direct part)

    • Corresponds to Elements of Information Theory Chapter 7

  • Lecture 11, Feb 11: Proof of channel coding theorem (direct part)

    • Corresponds to Elements of Information Theory Chapter 7

  • Lecture 12, Feb 13: Proof of channel coding theorem (converse part)

    • Corresponds to Elements of Information Theory Chapter 7

  • Lecture 14, Feb 20: Lossy compression and rate distortion

    • Corresponds to Elements of Information Theory Chapter 10

  • Lecture 15, Feb 25: Lossy compression and rate distortion continued

    • Corresponds to Elements of Information Theory Chapter 10
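A closed-form example of a rate distortion function (an informal aside, not course material): for a Bernoulli(p) source under Hamming distortion, R(D) = H(p) - H(D) for 0 <= D < min(p, 1-p), and R(D) = 0 beyond that:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_bernoulli(p, d):
    """R(D) for a Bernoulli(p) source under Hamming distortion:
    R(D) = H(p) - H(D) for 0 <= D < min(p, 1-p), and 0 otherwise."""
    if d < min(p, 1 - p):
        return h2(p) - h2(d)
    return 0.0
```

At D = 0 this recovers lossless compression (R = H(p)), and once D reaches min(p, 1-p) no rate is needed at all, since guessing the more likely symbol already achieves that distortion.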

  • Lecture 16, Mar 4: Converse part of rate distortion theory

    • Corresponds to Elements of Information Theory Chapter 10

  • Lecture 17, Mar 6: Method of types

    • Corresponds to Elements of Information Theory Chapter 11

  • Lecture 18, Mar 11: Strong and joint typicality, direct part of rate distortion theory

    • Corresponds to Elements of Information Theory Chapter 10