Departments' graduate courses

Course start and periodicity may vary. Please see the details for each course for up-to-date information. The courses are managed and administered by the respective departments. For more information about a course, how to sign up, and other practical issues, please contact the examiner or the course contact listed in the course information.

Information Theory

  • Course code: FSSY020
  • Course higher education credits: 7.5
  • Graduate school: Signals and Systems
  • Course is normally given: Every second year; next given in study period 4 (SP4), spring 2019.
  • Language: The course will be given in English
This course offers an introduction to information theory and its application to the design of reliable and efficient communication and storage systems. In this course, students obtain a basic understanding of the fundamental limits on data compression, storage, and communication, and become familiar with fundamental measures of information and uncertainty such as entropy, relative entropy, and mutual information. Such knowledge is invaluable for both researchers and practitioners in the areas of communications and signal processing.

This course is highly recommended to
  • master's students with an interest in the theory behind engineering approaches
  • master's students planning to continue their education with doctoral studies
  • doctoral students

in the fields of wireless and optical communications, data compression, statistical signal processing, neuroscience, or machine learning.

Course content
  • Entropy, relative entropy, mutual information, Jensen's inequality, data-processing inequality, Fano's inequality
  • Data compression, the source coding theorem, Kraft inequality, Shannon-Fano-Elias codes, Huffman codes
  • Entropy rate, compression of discrete stationary sources, LZ78
  • Channel capacity, the channel coding theorem, computing channel capacity, the Blahut-Arimoto algorithm
  • Error exponent, coding rate at finite blocklength
  • Differential entropy, capacity of additive Gaussian noise channel, parallel channels
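Two of the quantities above, entropy and channel capacity, can be illustrated with a short numerical sketch. The snippet below is not part of the course material; it is a minimal illustration computing the Shannon entropy of a discrete distribution and, from it, the capacity of a binary symmetric channel (BSC), C = 1 − H_b(ε), where H_b is the binary entropy function.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability
    eps: C = 1 - H_b(eps), achieved by a uniform input distribution."""
    return 1.0 - entropy([eps, 1.0 - eps])

print(entropy([0.5, 0.5]))   # a fair coin carries 1.0 bit
print(bsc_capacity(0.0))     # noiseless channel: capacity 1.0
print(bsc_capacity(0.5))     # pure noise: capacity 0.0
```

A BSC with crossover probability 0.11 already loses about half its capacity, which hints at why coding for noisy channels is a central topic of the course.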
Learning outcomes
  • Define entropy, relative entropy, and mutual information and explain their operational meaning
  • Describe and demonstrate Shannon's source coding and channel coding theorems
  • Apply Huffman and Shannon-Fano-Elias codes to compress discrete memoryless sources losslessly
  • Compute the capacity of discrete memoryless point-to-point channels
  • Describe the capacity formula of the AWGN channel and differentiate between power-limited and bandwidth-limited regimes
  • Use the water-filling technique to optimally allocate power in parallel Gaussian channels
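The last outcome, water-filling over parallel Gaussian channels, admits a compact numerical sketch. The code below is an illustration, not course material: it allocates power p_i = max(μ − N_i, 0) across channels with noise variances N_i, with the water level μ found by bisection so that the powers sum to the total budget P.

```python
import numpy as np

def water_filling(noise, P, tol=1e-12):
    """Water-filling power allocation over parallel Gaussian channels.
    noise: per-channel noise variances N_i; P: total power budget.
    Returns p with p_i = max(mu - N_i, 0) and sum(p) = P."""
    noise = np.asarray(noise, dtype=float)
    lo, hi = noise.min(), noise.max() + P   # mu lies in this interval
    while hi - lo > tol:                    # bisect on the water level mu
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - noise, 0.0).sum() > P:
            hi = mu                         # too much water: lower mu
        else:
            lo = mu                         # too little: raise mu
    return np.maximum(0.5 * (lo + hi) - noise, 0.0)

# Three channels with noise 1, 2, 4 and total power 3: the noisiest
# channel receives no power, since the water level settles at mu = 3.
p = water_filling([1.0, 2.0, 4.0], P=3.0)
print(p)   # approximately [2. 1. 0.]
```

The resulting sum capacity is Σ_i ½ log2(1 + p_i/N_i), and the allocation pours more power into the quieter channels, exactly as the water analogy suggests.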
Application information
Contact Giuseppe Durisi

More information
Course webpage
Literature
  • T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed. New York, NY, USA: Wiley, 2006
  • D. J. C. MacKay, Information Theory, Inference, and Learning Algorithms. Cambridge, U.K.: Cambridge Univ. Press, 2003
  • S. M. Moser, Information Theory: Lecture Notes. National Chiao Tung University, Taiwan, Nov. 2012
Giuseppe Durisi
Telephone: 031 772 18 21

Published: Tue 22 Aug 2017.