# The departments' doctoral courses

Start dates and periodicity may vary between courses. See the details of each course for up-to-date information. To register, contact the respective course organizer.

## Information Theory

• Course code: FSSY020
• ECTS credits: 7.5
• Department: Electrical Engineering
• Graduate school: Signals and Systems
• Periodicity: Every second year; next given in study period 4 (SP4), spring 2019.
• Language of instruction: The course will be given in English
### Aim
This course offers an introduction to information theory and its application to the design of reliable and efficient communication and storage systems. Students obtain in this course a basic understanding of the fundamental limits on data compression, storage, and communications, and familiarity with fundamental measures of information and uncertainty such as entropy, relative entropy, and mutual information. Such knowledge is invaluable for both researchers and practitioners in the areas of communications and signal processing.
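As a small taste of the fundamental measures mentioned above, here is a minimal sketch (not course material; the function names are illustrative) of how entropy and relative entropy can be computed for a discrete distribution:

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    """KL divergence D(p||q) in bits; assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin carries exactly one bit of uncertainty.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, hence lower entropy.
print(entropy([0.9, 0.1]))
# Cost (in bits/symbol) of coding a fair coin as if it were biased.
print(relative_entropy([0.5, 0.5], [0.9, 0.1]))
```

These three quantities are the building blocks for the compression and capacity results covered in the course.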

This course is highly recommended to
• master's students with an interest in the theory behind engineering approaches
• master's students planning to continue their education with doctoral studies
• doctoral students

in the fields of wireless and optical communications, data compression, statistical signal processing, neuroscience, or machine learning.

### Content

• Entropy, relative entropy, mutual information, Jensen's inequality, data-processing inequality, Fano's inequality
• Data compression, the source coding theorem, Kraft inequality, Shannon-Fano-Elias codes, Huffman codes
• Entropy rate, compression of discrete stationary sources, LZ78
• Channel capacity, the channel coding theorem, computing channel capacity, the Blahut-Arimoto algorithm
• Error exponent, coding rate at finite blocklength
• Differential entropy, capacity of additive Gaussian noise channel, parallel channels
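The Blahut-Arimoto algorithm listed above computes the capacity of a discrete memoryless channel by alternating maximization. A minimal sketch (illustrative only, not the course's reference implementation):

```python
import math

def blahut_arimoto(W, iters=100):
    """Capacity (bits/channel use) of a DMC with transition matrix
    W[x][y] = P(y|x), via Blahut-Arimoto alternating maximization."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx                     # start from the uniform input
    for _ in range(iters):
        # Output distribution q(y) induced by the current input p(x).
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # c(x) = exp( sum_y W(y|x) ln( W(y|x) / q(y) ) ), in nats.
        c = [math.exp(sum(W[x][y] * math.log(W[x][y] / q[y])
                          for y in range(ny) if W[x][y] > 0))
             for x in range(nx)]
        Z = sum(p[x] * c[x] for x in range(nx))
        p = [p[x] * c[x] / Z for x in range(nx)]  # re-normalized update
    # At convergence, I(X;Y) = ln Z nats; convert to bits.
    return math.log2(Z)

# Binary symmetric channel with crossover 0.1: C = 1 - H2(0.1) ≈ 0.531 bits.
print(blahut_arimoto([[0.9, 0.1], [0.1, 0.9]]))
```

For the binary symmetric channel the uniform input is already optimal, so the iteration converges immediately; for asymmetric channels the input distribution is reshaped over the iterations.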
### Learning outcomes
• Define entropy, relative entropy, and mutual information and explain their operational meaning
• Describe and demonstrate Shannon's source coding and channel coding theorems
• Apply Huffman and Shannon-Fano-Elias codes to compress discrete memoryless sources losslessly
• Compute the capacity of discrete memoryless point-to-point channels
• Describe the capacity formula of the AWGN channel and differentiate between power-limited and bandwidth-limited regimes
• Use the water-filling technique to optimally allocate power in parallel Gaussian channels
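The water-filling technique in the last outcome admits a compact numerical sketch: pour total power P over parallel Gaussian channels so that each channel receives max(mu - N_i, 0), with the water level mu found here by bisection (an illustrative solution method, one of several possible):

```python
def water_filling(noise, P, tol=1e-9):
    """Power allocation over parallel Gaussian channels with noise levels
    noise[i]: P_i = max(mu - N_i, 0), mu chosen so that sum_i P_i = P."""
    lo, hi = min(noise), max(noise) + P      # brackets for the water level
    while hi - lo > tol:
        mu = (lo + hi) / 2
        used = sum(max(mu - n, 0.0) for n in noise)
        if used > P:
            hi = mu                          # water level too high
        else:
            lo = mu                          # water level too low
    mu = (lo + hi) / 2
    return [max(mu - n, 0.0) for n in noise]

# Three subchannels, total power 3: the quietest channels get the most power.
print(water_filling([1.0, 2.0, 4.0], 3.0))  # ≈ [2.0, 1.0, 0.0]
```

Note how the noisiest subchannel is switched off entirely when the power budget is tight, exactly the behavior the capacity derivation for parallel Gaussian channels predicts.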
### Application information
Contact Giuseppe Durisi
### Literature
• T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed. New York, NY, U.S.A.: Wiley, 2006
• D. J. C. MacKay, Information Theory, Inference, and Learning Algorithms. Cambridge, U.K.: Cambridge Univ. Press, 2003
• S. M. Moser, Information theory: Lecture notes, National Chiao Tung University, Taiwan, Nov. 2012
### Lecturer
Giuseppe Durisi
Telephone: 031 772 18 21
Email: durisi@chalmers.se
Page manager. Published: Wed, 10 Feb 2021.