
SSY125 - Digital communications (Swedish title: Digital kommunikation)
Syllabus adopted 2021-02-26 by Head of Programme (or corresponding)
Owner: MPCOM
7.5 Credits
Grading: TH - Pass with distinction (5), Pass with credit (4), Pass (3), Fail
Education cycle: Second-cycle
Major subject: Electrical Engineering

Teaching language: English
Application code: 13120
Open for exchange students: Yes
Block schedule: A
Status, available places (updated regularly): Yes

Module: 0107 Examination, 7.5 credits (Grading: TH)
Examination dates: 15 Jan 2022 pm J, 13 Apr 2022 am J, 22 Aug 2022 pm J

Examiner: Alexandre Graell i Amat



General entry requirements for Master's level (second cycle)
Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling the requirements above.

Specific entry requirements

English 6 (or by other approved means with the equivalent proficiency level)
Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling the requirements above.

Course specific prerequisites

A passing grade in SSY121 Introduction to Communication Engineering, or a similar course, is required. Working knowledge of probability theory and of signals and systems (especially transforms, filtering, convolution, and the sampling theorem), as well as experience with MATLAB, is also required. Knowledge of random processes is very useful but not essential; a course in random signal analysis, e.g., MVE136 or a similar course, is therefore recommended.


This course introduces the basics of information and coding theory. We will be concerned with the design of a system that transfers information from a transmitter (source) to a receiver over a physical channel. Of course, we would like to do this at the lowest possible cost, but at the same time we must ensure that the quality of the information transfer is acceptable.
Several questions immediately come to mind when reading the above paragraph. What is meant by information? How can we compress a source? How is the transmission cost calculated? How is quality defined and measured? How much information can be transmitted reliably over the channel? What design tradeoffs can be made? The aim of this course is to answer these questions.
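As a first taste of the question "what is meant by information?", Shannon's entropy quantifies the average information content of a source in bits per symbol. The sketch below is purely illustrative (in Python rather than the MATLAB used in the course) and not part of the course material:

```python
import math

def entropy(probs):
    """Shannon entropy in bits/symbol of a discrete source with the given symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A four-symbol source: a uniform source carries 2 bits/symbol,
# while a skewed source carries less and can be compressed further.
print(entropy([0.25] * 4))                  # 2.0 bits/symbol
print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits/symbol
```

The second, skewed source can in principle be compressed to 1.75 bits per symbol on average, which is exactly what a Huffman code achieves for these probabilities.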

Learning outcomes (after completion of the course the student should be able to)

  • Compute or estimate the symbol and bit error probability for transmission over linear additive white Gaussian noise channels for simple modulations (e.g., PAM, PSK, QAM) 
  • Define entropy and mutual information and explain their operational meaning
  • Describe Shannon's source coding and channel coding theorems
  • Apply Huffman codes to compress discrete memoryless sources losslessly
  • Compute the capacity of discrete memoryless point-to-point channels
  • Describe Shannon's capacity formula for the AWGN channel and explain the fundamental tradeoff between power and bandwidth in reaching a given error performance of the communication link
  • Estimate the performance of communication links (i.e., modulation formats, channel codes and decoders, and equalizers) over linear additive white Gaussian noise channels by computer simulation. This includes determining simulation parameters to reach the desired accuracy as well as programming the simulation in MATLAB 
  • Explain the advantages and disadvantages of block and convolutional channel coding, define and compare some major decoding methods (syndrome, Viterbi), and estimate the error performance for channel-coded systems 
  • Design communication links (modulation, channel coding, and receiver algorithms) for linear additive white Gaussian noise channels such that specified requirements on power and spectral efficiency are satisfied.
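The simulation-based outcome above can be illustrated with a minimal Monte Carlo sketch. The course uses MATLAB; the Python version below is an illustrative assumption, not course code. It estimates the bit error rate of BPSK over an AWGN channel and compares it with the theoretical value Q(sqrt(2 Eb/N0)):

```python
import math
import random

def q_function(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def simulate_bpsk_ber(ebn0_db, n_bits=200_000, seed=1):
    """Monte Carlo estimate of the BPSK bit error rate over an AWGN channel."""
    random.seed(seed)
    ebn0 = 10 ** (ebn0_db / 10)
    sigma = math.sqrt(1 / (2 * ebn0))   # noise std dev for unit-energy symbols
    errors = 0
    for _ in range(n_bits):
        bit = random.randint(0, 1)
        x = 1.0 if bit == 0 else -1.0   # BPSK mapping: 0 -> +1, 1 -> -1
        y = x + random.gauss(0, sigma)  # AWGN channel
        if (y < 0) != (bit == 1):       # ML decision: sign of y
            errors += 1
    return errors / n_bits

# Theory: Pb = Q(sqrt(2 Eb/N0)); the estimate should match it closely.
for ebn0_db in (0, 4):
    sim = simulate_bpsk_ber(ebn0_db)
    theory = q_function(math.sqrt(2 * 10 ** (ebn0_db / 10)))
    print(f"{ebn0_db} dB: simulated {sim:.4f}, theory {theory:.4f}")
```

Note that the number of simulated bits controls the accuracy of the estimate: the standard error of a BER estimate scales as sqrt(p(1-p)/n), so rare errors at high SNR require far more bits.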


Content

  • From continuous-time to discrete-time AWGN channel
  • Entropy, data compression, prefix-free codes, Kraft's inequality, Huffman codes, the source coding theorem
  • Mutual information, channel capacity, the channel coding theorem
  • Detection theory: maximum likelihood (ML) and maximum a posteriori detection
  • Methods for computing and bounding symbol and bit error probabilities: decision regions, Q-function, union bound techniques
  • Analysis of linear modulation formats (PAM, PSK, QAM), power and spectral efficiency
  • Channel coding, Hamming distance, hard- and soft-decision decoding
  • Linear binary block codes: generator and parity check matrices, syndrome decoding, error correction and error detection capability
  • Binary convolutional codes: trellis diagram, ML decoding, Viterbi algorithm, union bound on the error probability for soft and hard ML decoding
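As a concrete instance of the block-coding items above, the sketch below implements syndrome decoding for a systematic (7,4) Hamming code, which corrects any single bit error. It is illustrative only; the particular parity matrix is an assumption, not taken from the course material.

```python
# Systematic (7,4) Hamming code: G = [I | P], H = [P^T | I], arithmetic mod 2.
P = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]

def encode(msg):
    """4 message bits -> 7-bit codeword [msg | parity]."""
    parity = [sum(msg[i] * P[i][j] for i in range(4)) % 2 for j in range(3)]
    return msg + parity

def syndrome(word):
    """3-bit syndrome s = H r^T; zero iff the word is a codeword."""
    return [(sum(word[i] * P[i][j] for i in range(4)) + word[4 + j]) % 2
            for j in range(3)]

def decode(word):
    """Correct at most one bit error via a syndrome lookup (two or more errors are not handled)."""
    s = syndrome(word)
    if any(s):
        # A single error at position p makes the syndrome equal column p of H.
        cols = P + [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # columns of H, listed as rows
        word = word[:]
        word[cols.index(s)] ^= 1
    return word[:4]   # systematic code: the message is the first 4 bits

msg = [1, 0, 1, 1]
cw = encode(msg)
cw[2] ^= 1                 # flip one bit on the channel
print(decode(cw) == msg)   # True: the single error is corrected
```

Because the rows of P are distinct, nonzero, and different from the unit vectors, every single-error position yields a unique nonzero syndrome, which is what makes the lookup-table correction work.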


Organisation

The course comprises approximately 18 lectures, 11 exercise sessions, 3 quizzes, and 1 project.


Literature

Stefan M. Moser and Po-Ning Chen, A Student's Guide to Coding and Information Theory, Cambridge University Press, 2012. The book is available at Cremona.

Examination including compulsory elements

The final grade (TH) is based on scores from a project, quizzes, and a written exam. The project and the literature study are mandatory in the sense that they must be passed to pass the course.

The course examiner may assess individual students in other ways than what is stated above if there are special reasons for doing so, for example if a student has a decision from Chalmers on educational support due to disability.

Published: Mon 28 Nov 2016.