Syllabus for

SSY340 - Deep machine learning
Djup maskininlärning
 
Syllabus adopted 2019-02-14 by Head of Programme (or corresponding)
Owner: MPSYS
7,5 Credits
Grading: TH - Five, Four, Three, Fail
Education cycle: Second-cycle
Major subject: Automation and Mechatronics Engineering, Computer Science and Engineering, Electrical Engineering
Department: 32 - ELECTRICAL ENGINEERING

The course is full. For waiting list, please contact the director of studies: per-anders.traff@chalmers.se
Teaching language: English
Application code: 35124
Open for exchange students: No
Block schedule: A
Maximum participants: 140

Module                                                  Credit distribution   Examination dates
0117 Project, 3,0 c, Grading: TH                        3,0 c
0217 Written and oral assignments, 4,5 c, Grading: TH   4,5 c

In programs

MPCAS COMPLEX ADAPTIVE SYSTEMS, MSC PROGR, Year 2 (elective)
MPCSN COMPUTER SYSTEMS AND NETWORKS, MSC PROGR, Year 2 (elective)
MPALG COMPUTER SCIENCE - ALGORITHMS, LANGUAGES AND LOGIC, MSC PROGR, Year 2 (elective)
MPSYS SYSTEMS, CONTROL AND MECHATRONICS, MSC PROGR, Year 2 (elective)

Examiner:

Lennart Svensson



Eligibility:


In order to be eligible for a second cycle course the applicant needs to fulfil the general and specific entry requirements of the programme that owns the course. (If the second cycle course is owned by a first cycle programme, second cycle entry requirements apply.)
Exemption from the eligibility requirement: Applicants enrolled in a programme at Chalmers where the course is included in the study programme are exempted from fulfilling these requirements.

Course specific prerequisites

Students should have a working knowledge of basic probability, linear algebra and programming. It is desirable to have basic knowledge of statistics and machine learning, corresponding to, e.g., ESS101 - Modelling and simulation, SSY230 - System identification or TDA231 - Algorithms for machine learning and inference, but this is not a strict requirement.

Aim

The purpose of this course is to give a thorough introduction to deep machine learning, also known as deep learning or deep neural networks. Over the last few years, deep machine learning has dramatically improved state-of-the-art performance in various fields, including speech recognition, computer vision and reinforcement learning (used, e.g., to learn how to play Go). We focus primarily on the basic principles of how these networks are constructed and trained, but we also cover many of the key techniques used in different applications. The overall objective is to provide a solid understanding of how and why deep machine learning is useful, as well as the skills to apply these methods to solve problems of practical importance.

Learning outcomes (after completion of the course the student should be able to)


  • explain the fundamental principles of supervised (and unsupervised) learning, including basic techniques like cross-validation to avoid overfitting

  • describe the standard cost functions optimised during supervised training (in particular the cross entropy) and the standard solution techniques (stochastic gradient descent, back propagation, etc.)

  • explain how traditional feed-forward networks are constructed and why they can approximate "almost" any function (the universality theorem)

  • understand the problem with vanishing gradients and modern tools to mitigate it (e.g., batch normalisation and residual networks)

  • summarise the key components in convolutional neural networks (CNNs) and their key advantages

  • describe common types of recurrent neural networks (RNN) and their applications

  • provide an overview of some of the many modern variations of the deep learning networks

  • argue for the benefits of generative models, transfer learning and data augmentation in situations when we have a limited amount of annotated/labelled data

  • explain what a Markov decision problem and reinforcement learning (RL) are

  • make use of deep learning to solve RL problems using, e.g., deep Q-learning

  • train and apply CNNs to image applications and RNNs to applications involving time sequences

  • use a suitable deep learning library (e.g., TensorFlow or Torch) to solve a variety of practical problems (a brief illustrative sketch follows after this list)
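
To make the training-related outcomes more concrete, here is a minimal sketch of supervised training with cross-entropy minimisation, back propagation and stochastic gradient descent in PyTorch (one of the libraries mentioned above). It is an illustration only, not course material; the network size, learning rate and data shapes are arbitrary assumptions.

  import torch
  import torch.nn as nn

  # Small feed-forward classifier; layer sizes are arbitrary assumptions.
  model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
  loss_fn = nn.CrossEntropyLoss()                           # cross-entropy cost function
  optimizer = torch.optim.SGD(model.parameters(), lr=0.1)   # stochastic gradient descent

  def training_step(x, y):
      """One SGD step: forward pass, cross-entropy loss, back propagation, weight update."""
      optimizer.zero_grad()
      loss = loss_fn(model(x), y)   # average cross-entropy over the mini-batch
      loss.backward()               # back propagation computes the gradients
      optimizer.step()              # gradient descent update of the parameters
      return loss.item()

  # One step on a random mini-batch of 32 samples (placeholder data).
  training_step(torch.randn(32, 784), torch.randint(0, 10, (32,)))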

Content


  • supervised learning by cross-entropy minimisation and cross-validation

  • back propagation and stochastic gradient descent

  • a suitable programming language for implementing deep learning algorithms

  • feedforward neural networks and convolutional neural networks

  • recurrent neural networks and long short-term memory networks

  • techniques for efficient training such as momentum and batch normalisation

  • modern variations of neural networks (e.g., attention and residual networks)

  • transfer learning and data augmentation

  • reinforcement learning, Markov decision problems, Q-learning and deep Q-learning

  • application of convolutional neural networks to image recognition and reinforcement learning (a CNN sketch follows after this list)
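
As a rough illustration of the convolutional architectures listed above, the sketch below defines a minimal CNN for small greyscale images in PyTorch. The layer sizes and input resolution are arbitrary assumptions and do not correspond to any specific network used in the course.

  import torch
  import torch.nn as nn

  class SmallCNN(nn.Module):
      """Minimal CNN: two convolution/pooling stages followed by a linear classifier."""
      def __init__(self, num_classes=10):
          super().__init__()
          self.features = nn.Sequential(
              nn.Conv2d(1, 16, kernel_size=3, padding=1),   # local, weight-shared filters
              nn.ReLU(),
              nn.MaxPool2d(2),                              # 28x28 -> 14x14
              nn.Conv2d(16, 32, kernel_size=3, padding=1),
              nn.ReLU(),
              nn.MaxPool2d(2),                              # 14x14 -> 7x7
          )
          self.classifier = nn.Linear(32 * 7 * 7, num_classes)

      def forward(self, x):
          return self.classifier(self.features(x).flatten(start_dim=1))

  # Forward pass on a batch of 8 random 28x28 greyscale images (placeholder data).
  logits = SmallCNN()(torch.randn(8, 1, 28, 28))            # shape: (8, 10)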

Organisation

The course comprises online lectures (to watch before class), active learning sessions (where we review material from the corresponding lecture), home assignments, a project and tutorial sessions (primarily related to the home assignments).

Literature

We mainly use
Ian Goodfellow, Yoshua Bengio and Aaron Courville, Deep Learning, MIT Press, 2016,
which is available online at http://www.deeplearningbook.org.

Examination including compulsory elements

There is no written exam in this course. Instead, students are evaluated individually based on their performance in the different course activities; more specifically, the grade is obtained by weighting the results of the hand-in assignments, the project and the degree of attendance.

