
Syllabus for SSY210 - Information theory, advanced level
Syllabus adopted 2021-04-29 by Head of Programme (or corresponding)
Owner: MPCOM
7.5 Credits
Grading: UG - Pass, Fail
Education cycle: Second-cycle
Main field of study: Electrical Engineering

The course round is cancelled. For further questions, please contact the director of studies of MPCOM: Communication Engineering, MSc Programme. This course round is planned to be given every other year.

Teaching language: English
Application code: 13112
Open for exchange students: Yes

Module                   Credits   Grading
0108 Oral examination    7.5 c     UG

Examiner: Giuseppe Durisi



General entry requirements for Master's level (second cycle)
Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling the requirements above.

Specific entry requirements

English 6 (or equivalent proficiency demonstrated by other approved means)
Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling the requirements above.

Course specific prerequisites

A solid foundation in probability and calculus. The course is taught at Ph.D. level: it is mathematically more advanced and runs at a higher pace than most Master's courses.


Aim

This course offers an introduction to information theory and its applications to digital communication, statistics, and machine learning.

One important feature of the information-theory approach is its ability to provide fundamental results, i.e., results that demonstrate the optimality of certain procedures.

Results of this kind are useful for many reasons: for example, we can assess whether a target error probability in the transmission of information is achievable; we can determine how many data samples must be collected to distinguish between two or more statistical hypotheses, or how many examples are needed to train a machine-learning algorithm.

Learning outcomes (after completion of the course the student should be able to)

  • Define entropy, relative entropy, and mutual information and explain their operational meaning

  • Describe and demonstrate Shannon’s source coding and channel coding theorems

  • Compute the capacity of discrete communication channels

  • Describe the fundamental performance metrics in binary hypothesis testing, their trade-off, their asymptotic behavior, and the structure of the optimal test

  • Explain how relative entropy can be used to characterize the generalization error in statistical learning

  • Apply Fano’s inequality to demonstrate impossibility results in group testing, graphical model selection, and sparse linear regression
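As a small illustration of the first learning outcome (not part of the syllabus; the function names below are our own), entropy, relative entropy, and mutual information can be computed directly from their definitions for discrete distributions:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits."""
    p = np.asarray(p, dtype=float)
    nz = p > 0  # 0 log 0 = 0 by convention
    return -np.sum(p[nz] * np.log2(p[nz]))

def kl_divergence(p, q):
    """Relative entropy D(p||q) in bits (assumes supp(p) is contained in supp(q))."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

def mutual_information(pxy):
    """I(X;Y) = D(p(x,y) || p(x)p(y)) for a joint pmf given as a matrix."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X (rows)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y (columns)
    return kl_divergence(pxy.ravel(), (px * py).ravel())

# A fair coin has exactly one bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0
```

Mutual information is the relative entropy between the joint distribution and the product of its marginals, which is why it vanishes exactly when the two variables are independent.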


Content

  • Shannon’s information metrics: entropy, relative entropy (a.k.a. Kullback-Leibler divergence), mutual information

  • Asymptotic equipartition property and typicality

  • Data compression and the source coding theorem

  • Data transmission and the channel coding theorem

  • Binary hypothesis testing, Neyman-Pearson Lemma, Stein’s lemma

  • Generalization error in statistical learning theory and probably-approximately correct (PAC) Bayesian bounds

  • Minimax bounds in statistical estimation and Fano’s method
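As a concrete (non-syllabus) illustration of the channel coding material above: the binary symmetric channel with crossover probability p has the closed-form capacity C = 1 - h(p), where h is the binary entropy function. A minimal Python sketch:

```python
import numpy as np

def binary_entropy(p):
    """Binary entropy h(p) = -p log2(p) - (1-p) log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0  # h(0) = h(1) = 0 by convention
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_capacity(crossover):
    """Capacity of a binary symmetric channel: C = 1 - h(p) bits per channel use."""
    return 1.0 - binary_entropy(crossover)

print(bsc_capacity(0.0))  # 1.0: a noiseless channel carries one bit per use
print(bsc_capacity(0.5))  # 0.0: a fully random channel carries nothing
```

The capacity of a general discrete memoryless channel has no closed form and is instead computed by maximizing mutual information over input distributions, e.g. with the Blahut-Arimoto algorithm.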


Approximately 15 lectures and 7 exercise sessions


The course is partly based on the following references:

Examination including compulsory elements

Mandatory weekly assignments and an oral exam (pass/fail), 7.5 hp

The course examiner may assess individual students in other ways than what is stated above if there are special reasons for doing so, for example if a student has a decision from Chalmers on educational support due to disability.

Published: Mon 28 Nov 2016.