
Syllabus for FFR135 - Artificial neural networks
Owner: FCMAS
5,0 Credits (ECTS 7,5)
Grading: TH - Five, Four, Three, Not passed
Level: A
Department: 0707 - Physical resource theory


Teaching language: English

Course module                          Credit distribution               Examination dates
                                       Sp1    Sp2    Sp3    Sp4   No Sp
0100 Examination, 5,0 c, Grading: TH   5,0 c

In programs

TTFYA ENGINEERING PHYSICS, Year 4 (elective)
FCMAS MSc PROGRAMME IN COMPLEX ADAPTIVE SYSTEMS, Year 1 (compulsory)
TELTA ELECTRICAL ENGINEERING, Year 4 (elective)
TAUTA AUTOMATION AND MECHATRONICS ENGINEERING, Year 4 (elective)
TDATA COMPUTER SCIENCE AND ENGINEERING - Cognition Technology and Multimedia, Year 4 (elective)
TDATA COMPUTER SCIENCE AND ENGINEERING, Year 3 (elective)

Examiner:




Eligibility:

For single-subject courses within Chalmers programmes, the same eligibility requirements apply as for the programme(s) that the course is part of.

Aim

Neural networks are distributed computational models inspired by information processing in the human brain, consisting of many simple analog processing elements connected in a network. Neural networks are increasingly used in many different fields of engineering for tasks such as pattern recognition, prediction, and control. The theory of neural networks is a cross-disciplinary field that draws on neurobiology, computer science, and statistical physics.
The course gives an overview and a fundamental theoretical understanding of the most important neural network algorithms. These include models of associative memory (e.g., the Hopfield model), algorithms for learning from examples (e.g., perceptron learning, back-propagation, temporal difference learning), and models of self-organization (e.g., Hebbian learning, self-organizing feature maps). By comparing these methods with techniques from statistics and computer science, students can develop an understanding of when neural networks are useful in application problems.
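To give a flavour of the associative-memory models mentioned above, here is a minimal sketch of the Hopfield model in plain Python. The six-unit pattern and the update schedule are illustrative choices only, not course material:

```python
# Hopfield model sketch: Hebbian storage of binary (+/-1) patterns and
# asynchronous threshold updates. Illustrative only; the course develops
# the full theory (energy function, storage capacity, etc.).

def train_hopfield(patterns):
    """Hebbian weights w_ij = (1/N) * sum over patterns of x_i * x_j, zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, state, sweeps=5):
    """Asynchronous updates: s_i <- sign(sum_j w_ij * s_j)."""
    s = list(state)
    n = len(s)
    for _ in range(sweeps):
        for i in range(n):
            h = sum(w[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

pattern = [1, -1, 1, -1, 1, -1]        # a single stored pattern (toy example)
w = train_hopfield([pattern])
noisy = [1, -1, 1, -1, 1, 1]           # the stored pattern with one bit flipped
print(recall(w, noisy))                # recovers the stored pattern
```

Starting from a corrupted cue, the update rule drives the state back to the nearest stored pattern, which is what makes the network an associative memory.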

Content

Introduction to neurobiology
Associative memory. The Hopfield model. Optimization problems.
The perceptron
Multilayer perceptrons. Back-propagation and other learning algorithms. Radial basis network.
Introduction to learning theory. Generalization.
Reinforcement learning - learning in situations with infrequent information from the environment.
Self-organization in neural networks. Hebbian learning. Self-organizing feature maps.
Neural network applications.
Classification and prediction using methods from statistics and machine learning.
To develop a thorough understanding of the basic neural network algorithms, students are expected to develop their own implementations (in a language of their choice) and apply them to simple application problems. This is done through a sequence of small projects and a larger exam project, which form an essential part of the course.
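As a starting point for such an implementation, a minimal sketch of perceptron learning in plain Python (the AND dataset, learning rate, and epoch limit are illustrative choices, not prescribed by the course):

```python
# Perceptron learning rule sketch: on each misclassified example,
# w <- w + eta * y * x, with the bias absorbed as a constant extra input.
# Converges on linearly separable data (perceptron convergence theorem).

def perceptron_train(data, eta=0.1, epochs=100):
    """data: list of (x, y) pairs with x a tuple of inputs and y in {-1, +1}."""
    n = len(data[0][0]) + 1                  # +1 for the bias input
    w = [0.0] * n
    for _ in range(epochs):
        errors = 0
        for x, y in data:
            xb = (1.0,) + tuple(x)           # prepend constant bias input
            out = 1 if sum(wi * xi for wi, xi in zip(w, xb)) >= 0 else -1
            if out != y:                     # update only on mistakes
                w = [wi + eta * y * xi for wi, xi in zip(w, xb)]
                errors += 1
        if errors == 0:                      # converged: a full error-free pass
            break
    return w

# Learn the logical AND function, a linearly separable toy problem
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w = perceptron_train(data)
```

After training, the weight vector classifies all four points correctly; replacing AND with XOR would never converge, which motivates the multilayer perceptrons and back-propagation covered later in the course.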

Literature

J. Hertz, A. Krogh, and R.G. Palmer, Introduction to the Theory of Neural Computation, Addison-Wesley, 1991. S. Haykin, Neural Networks: A Comprehensive Foundation, Macmillan/IEEE Press, 1994. Lecture notes and articles. Additional recommended reading: P.S. Churchland and T.J. Sejnowski, The Computational Brain, MIT Press, 1992.

Examination

Examination is based on a sequence of small projects and a larger exam project, in which students implement the basic models in the course and use them in application problems. Projects are presented in writing and orally.


Published: Thu 03 Nov 2022.