
Syllabus for SSY125 - Digital communications
Syllabus adopted 2014-02-13 by Head of Programme (or corresponding)
Owner: MPCOM
7,5 Credits
Grading: TH - Five, Four, Three, Not passed
Education cycle: Second-cycle
Major subject: Electrical Engineering

Teaching language: English
Open for exchange students
Block schedule: A

Course module: 0107 Examination, 7,5 c, Grading: TH
Credit distribution: 7,5 c
Examination dates: 12 Jan 2015 pm V, 14 Apr 2015 am M, 24 Aug 2015 pm M

In programs

Associate Professor Alexandre Graell i Amat

ESS140   Digital communications
ESS195   Digital communications

Eligibility
In order to be eligible for a second cycle course the applicant needs to fulfil the general and specific entry requirements of the programme that owns the course. (If the second cycle course is owned by a first cycle programme, second cycle entry requirements apply.)
Exemption from the eligibility requirement: Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling these requirements.

Course specific prerequisites

A passing grade in SSY121 Introduction to Communication Engineering, or a similar course, is required. Working knowledge of probability theory and of signals and systems, as well as experience with MATLAB, is also required. Knowledge of random processes is very useful but not essential; the course Random signals analysis is therefore recommended.


Aim

This course introduces the basics of information and coding theory. We will be concerned with the design of a system that transfers information from a transmitter (source) to a receiver over a physical channel. Of course, we would like to do this efficiently and at the lowest possible cost, but at the same time we must ensure that the quality of the information transfer is acceptable.

Several questions immediately come to mind when reading the above paragraph. What is meant by information? How can we compress a source? How is the transmission cost calculated? How is quality defined and measured? What design trade-offs can be made? The aim of this course is to answer these questions.

Learning outcomes (after completion of the course the student should be able to)

  • Compute or estimate the symbol and bit error probability for transmission over linear additive white Gaussian noise channels for simple modulations (e.g., PAM, PSK, QAM) 
  • Define entropy and mutual information and explain their operational meaning
  • Describe Shannon's source coding and channel coding theorems
  • Apply Huffman codes to compress discrete memoryless sources losslessly
  • Compute the capacity of discrete memoryless point-to-point channels
  • Describe Shannon's capacity formula for the AWGN channel and elaborate on the fundamental trade-off between transmitted signal power and bandwidth required to reach a certain error performance of the communication link
  • Estimate the performance of communication links (i.e., modulation formats, channel codes and decoders, and equalizers) over linear additive white Gaussian noise channels by computer simulations. This includes determining simulation parameters to reach the desired accuracy as well as programming the simulation in MATLAB 
  • Explain the advantages and disadvantages of block and convolutional channel coding, define and compare some major decoding methods (syndrome, Viterbi), and estimate the error performance for channel-coded systems 
  • Design communication links (modulation, channel coding, and receiver algorithms) for linear additive white Gaussian noise channels such that specified requirements on power and spectral efficiency are satisfied.
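Several of the outcomes above (computing error probabilities and estimating link performance by simulation) come together in a small experiment. The sketch below, written in Python rather than the MATLAB used in the course, estimates the bit error rate of BPSK (2-PAM) over an AWGN channel by Monte Carlo simulation and compares it with the theoretical value Q(sqrt(2 Eb/N0)); the function names are illustrative, not from any course material.

```python
import numpy as np
from math import erfc, sqrt

def q_function(x):
    # Q(x) = 0.5 * erfc(x / sqrt(2))
    return 0.5 * erfc(x / sqrt(2))

def simulate_bpsk_ber(ebn0_db, num_bits=100_000, seed=0):
    """Monte Carlo estimate of the BPSK bit error rate over AWGN."""
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, num_bits)
    symbols = 1 - 2 * bits                  # bit 0 -> +1, bit 1 -> -1 (Eb = Es = 1)
    ebn0 = 10 ** (ebn0_db / 10)
    noise_std = np.sqrt(1 / (2 * ebn0))     # sigma^2 = N0/2 with Eb/N0 = 1/(2 sigma^2)
    received = symbols + noise_std * rng.standard_normal(num_bits)
    decisions = (received < 0).astype(int)  # ML decision: sign of the received sample
    return np.mean(decisions != bits)

for ebn0_db in (0, 2, 4, 6):
    est = simulate_bpsk_ber(ebn0_db)
    theory = q_function(sqrt(2 * 10 ** (ebn0_db / 10)))
    print(f"Eb/N0 = {ebn0_db} dB: simulated {est:.4f}, theory {theory:.4f}")
```

The number of simulated bits controls the accuracy: the standard deviation of the estimate is roughly sqrt(p(1-p)/N), so low error rates require correspondingly more bits, which is exactly the simulation-parameter question raised in the learning outcomes.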


Content

  • From continuous-time to discrete-time AWGN channel
  • Methods for computing and bounding symbol and bit error probabilities: decision regions, Q-function, union bound techniques
  • Error analysis of common modulation formats: PAM, PSK, QAM
  • Detection theory: maximum likelihood (ML) and maximum a posteriori detection
  • Power spectrum and spectral efficiency
  • Entropy, mutual information, Jensen's inequality, data-processing inequality
  • Data compression, the source coding theorem, Kraft inequality, Huffman codes
  • Channel capacity for the Gaussian channel, the channel coding theorem
  • Linear binary block codes: generator and parity-check matrices, syndrome decoding, error-correcting and error-detecting capability, union bound for soft and hard ML decoding
  • Binary convolutional codes: state diagram, trellis, ML decoding, Viterbi algorithm, union bound on bit error probability for soft and hard ML decoding
Organisation

The course comprises approximately 16 lectures, 12 exercise sessions, 3 quizzes, and 1 project.
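The source-coding topics in the content list above (entropy, the Kraft inequality, Huffman codes) can be illustrated with a short sketch, here in Python rather than the MATLAB used in the course. It builds a binary Huffman code for a small discrete memoryless source and checks that the average codeword length lies between H(X) and H(X)+1, as the source coding theorem guarantees; all names are illustrative.

```python
import heapq
from math import log2

def huffman_code(probs):
    """Build a binary Huffman code for symbol probabilities {symbol: p}."""
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # merge the two least likely subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
entropy = -sum(p * log2(p) for p in probs.values())
avg_len = sum(probs[s] * len(w) for s, w in code.items())
kraft = sum(2.0 ** -len(w) for w in code.values())
print(code)              # a prefix-free code (exact 0/1 labels depend on tie-breaking)
print(entropy, avg_len)  # both 1.75 here: the source is dyadic, so Huffman is optimal
print(kraft)             # the Kraft sum is at most 1 for any prefix code
```

For a dyadic source like this one the Huffman code achieves the entropy exactly; for general sources the average length satisfies H(X) <= L < H(X) + 1.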


Literature

To be decided.



Examination

The final grade (TH) is based on scores from the project, the quizzes, and a written exam. The project and the literature study are mandatory: both must be passed in order to pass the course.

Published: Thu 04 Feb 2021.