MIT 6.050J Information and Entropy, Spring 2008
This course explores the ultimate limits to communication and computation, with an emphasis on the physical nature of information and information processing. Topics include: information and computation; digital signals; codes and compression; applications such as biological representations of information; logic circuits; computer architectures; algorithmic information; noise; probability; error correction; reversible and irreversible operations; the physics of computation; and quantum computation. The concept of entropy is applied to channel capacity and to the second law of thermodynamics. Created by MIT OpenCourseWare.
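Entropy, the course's central quantity, can be sketched concretely. The short Python function below (an illustrative sketch, not material from the course itself) computes the Shannon entropy of a discrete source, measured in bits per symbol:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) over the outcome
    probabilities, in bits per symbol. Zero-probability outcomes
    contribute nothing and are skipped."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit per flip.
print(entropy([0.5, 0.5]))

# A biased coin carries less information per flip (about 0.469 bits),
# which is why biased sources can be compressed.
print(entropy([0.9, 0.1]))
```

The same quantity reappears throughout the units above: compression (Unit 2) exploits sources with low entropy, and channel capacity (Unit 5) bounds how many bits per symbol can cross a noisy channel.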
Pick a lesson
1: Unit 1: Bits and Codes
2: Unit 2: Compression
3: Unit 3: Noise and Errors
4: Unit 4: Probability, Lecture 1
5: Unit 4: Probability, Lecture 2
6: Unit 5: Communications, Lecture 1
7: Unit 5: Communications, Lecture 2
8: Unit 6: Processes
9: Unit 7: Inference, Lecture 1
10: Unit 7: Inference, Lecture 2
11: Unit 8: Maximum Entropy, Lecture 1
12: Unit 8: Maximum Entropy, Lecture 2
13: Unit 10: Physical Systems, Lecture 1
14: Unit 10: Physical Systems, Lecture 3
15: Unit 11: Energy, Lecture 1
16: Unit 11: Energy, Lecture 2
17: Unit 12: Temperature, Lecture 1
18: Unit 12: Temperature, Lecture 2
19: Unit 13: Quantum Information