Course syllabus adopted 2026-02-04 by Head of Programme (or corresponding).
Overview
- Swedish name: Informationsteori
- Code: SSY210
- Credits: 7.5 credits
- Owner: MPICT
- Education cycle: Second-cycle
- Main field of study: Electrical Engineering
- Department: Electrical Engineering
- Grading: UG - Pass, Fail
Course round 1
- Teaching language: English
- Application code: 90129
- Open for exchange students: Yes
Credit distribution
| Module | Sp1 | Sp2 | Sp3 | Sp4 | Summer | Not Sp | Examination dates |
|---|---|---|---|---|---|---|---|
| 0108 Oral examination, 7.5 c, Grading: UG | 7.5 c | | | | | | |
In programmes
Examiner
- Giuseppe Durisi
- Full Professor, Communication, Antennas and Optical Networks, Electrical Engineering
Eligibility
General entry requirements for Master's level (second cycle). Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling the requirements.
Specific entry requirements
English 6 (or by other approved means with the equivalent proficiency level). Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling the requirements.
Course specific prerequisites
A solid foundation in probability and calculus. The course difficulty is at the Ph.D. level, which means that it is mathematically more advanced and runs at a higher pace than most Master's courses.
Aim
This course offers an introduction to information theory and its application to digital communication, statistics, and machine learning.
One important feature of the information-theory approach is its ability to provide fundamental results, i.e., results that demonstrate the optimality of certain procedures.
Obtaining results of this flavor is useful for many reasons: for example, we can assess whether achieving a target error probability in the transmission of information is feasible; we can determine how many data samples need to be collected to distinguish between two or more statistical hypotheses, or how many examples are needed to train a machine learning algorithm.
Learning outcomes (after completion of the course the student should be able to)
- Define entropy, relative entropy, and mutual information and explain their operational meaning
- Describe and demonstrate Shannon's source coding and channel coding theorems
- Compute the capacity of discrete communication channels
- Describe the fundamental performance metrics in binary hypothesis testing, their trade-off, their asymptotic behavior, and the structure of the optimal test
- Explain how relative entropy can help characterize the generalization error in statistical learning
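To give a concrete feel for the first learning outcome, here is a minimal Python sketch (not part of the course material; the toy joint distribution is an arbitrary example) that computes entropy, relative entropy, and mutual information, and checks the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def relative_entropy(p, q):
    """Relative entropy (Kullback-Leibler divergence) D(p || q) in bits."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

def mutual_information(joint):
    """I(X;Y) = D(P_XY || P_X P_Y) for a joint distribution given as a matrix."""
    px = [sum(row) for row in joint]                 # marginal of X
    py = [sum(col) for col in zip(*joint)]           # marginal of Y
    return sum(
        joint[i][j] * math.log2(joint[i][j] / (px[i] * py[j]))
        for i in range(len(px)) for j in range(len(py))
        if joint[i][j] > 0
    )

# Toy joint distribution of (X, Y): X and Y agree with probability 0.8
joint = [[0.4, 0.1],
         [0.1, 0.4]]
I = mutual_information(joint)

# Identity check: I(X;Y) = H(X) + H(Y) - H(X,Y)
HX = entropy([sum(row) for row in joint])
HY = entropy([sum(col) for col in zip(*joint)])
HXY = entropy([p for row in joint for p in row])
assert abs(I - (HX + HY - HXY)) < 1e-12
print(round(I, 4))  # → 0.2781
```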
Content
- Shannon's information metrics: entropy, relative entropy (a.k.a. Kullback-Leibler divergence), mutual information
- Asymptotic equipartition property and typicality
- Data compression and the source coding theorem
- Data transmission and the channel coding theorem
- Binary hypothesis testing, Neyman-Pearson lemma, Stein's lemma
- Generalization error in statistical learning theory and probably-approximately correct (PAC) Bayesian bounds
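As a taste of the channel-capacity computations covered in the course, the following Python sketch (illustrative only; the crossover probability 0.11 is an arbitrary choice) evaluates the capacity of the binary symmetric channel, C = 1 - h(p), and verifies numerically that the uniform input distribution maximizes the mutual information:

```python
import math

def binary_entropy(p):
    """Binary entropy function h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

def bsc_mutual_information(a, p):
    """I(X;Y) for a BSC with input distribution P(X=1) = a and crossover p."""
    q = a * (1 - p) + (1 - a) * p   # output distribution P(Y=1)
    return binary_entropy(q) - binary_entropy(p)

p = 0.11
C = bsc_capacity(p)

# A grid search over input distributions confirms the uniform input attains C.
best = max(bsc_mutual_information(a / 1000, p) for a in range(1001))
assert abs(C - best) < 1e-9
print(round(C, 4))
```

The grid search is of course unnecessary here (the optimal input follows from symmetry), but the same numerical approach extends to channels without a closed-form capacity.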
Organisation
Approximately 15 lectures and 7 exercise sessions
Literature
- S. M. Moser, Information Theory (Lecture Notes), 6th ed., ETH Zurich, Switzerland and National Chiao Tung University, Taiwan, Oct. 2018. [Online]. Available: https://moser-isi.ethz.ch/docs/it_script_v616.pdf
- Y. Polyanskiy and Y. Wu, Information theory: from coding to learning, Cambridge University Press, 2025. [Online]. Available: https://people.lids.mit.edu/yp/homepage/papers.html
- J. Duchi, Information theory and statistics (lecture notes). Stanford, CA: Stanford University, 2021. [Online]. Available: https://web.stanford.edu/class/stats311/
- A. El Gamal and Y.-H. Kim, Network information theory. Cambridge, U.K.: Cambridge Univ. Press, 2011. Available online via Chalmers library.
Examination including compulsory elements
Mandatory weekly assignments; oral exam (pass or fail), 7.5 hp. The course examiner may assess individual students in other ways than what is stated above if there are special reasons for doing so, for example if a student has a decision from Chalmers about disability study support.
