Course syllabus for

TDA231 - Algorithms for machine learning and inference
The course syllabus was adopted 2009-02-24 by the Head of Programme (or equivalent)
Owner: MPALG
7.5 Credits
Grading scale: TH - Five, Four, Three, Fail
Education cycle: Second cycle
Major subject: Computer Science and Engineering, Information Technology
Department: 37 - COMPUTER SCIENCE AND ENGINEERING


Language of instruction: English

Module   Credit distribution   Examination dates
Lp1 Lp2 Lp3 Lp4 Summer course
0101 Project 7.5 hp, Grading scale: TH   7.5 hp

In programmes

MPSYS SYSTEMS, CONTROL AND MECHATRONICS, MSC PROGR, Year 2
ITIDM INTELLIGENT SYSTEMS DESIGN, Year 2
MPALG COMPUTER SCIENCE - ALGORITHMS, LANGUAGES AND LOGIC, MSC PROGR, Year 2
MPALG COMPUTER SCIENCE - ALGORITHMS, LANGUAGES AND LOGIC, MSC PROGR, Year 1
TKITE INFORMATIONSTEKNIK, CIVILINGENJÖR, Year 3 (elective)
MPBIS BIOINFORMATICS AND SYSTEMS BIOLOGY, MSC PROGR, Year 2 (elective)
MPBIS BIOINFORMATICS AND SYSTEMS BIOLOGY, MSC PROGR - Systems Biology track, Year 1 (elective)

Examiner:

Professor Devdatt Dubhashi
Associate Professor Peter Damaschke


Course evaluation:

http://document.chalmers.se/doc/1343713745


Go to course homepage

Eligibility:

For courses within Chalmers' educational programmes, the same eligibility requirements apply as for the programme(s) in which the course is included.

Course specific prerequisites

Basic courses in Algorithms and Mathematical Statistics.
Other courses which complement this course are AI, Applied Optimization and Information Theory.

Aim

This course discusses the theory and application of algorithms for machine learning and inference from an AI perspective. In this context, learning means drawing conclusions from given data or experience, resulting in a model that generalises these data; inference means computing the desired answers or actions based on the model.
Algorithms of this kind are commonly used in, for example, classification tasks (character recognition, predicting whether a new customer is creditworthy, etc.) and in expert systems (for example, for medical diagnosis). A newer and commercially important area of application is data mining, where the algorithms are used to automatically detect interesting information and relations in large commercial or scientific databases.
The course intends to give a good understanding of this cross-disciplinary area, with sufficient depth to use and evaluate the available methods and to understand the scientific literature.

Learning outcomes (after completion of the course, the student should be able to)

- distinguish "learning" in an Artificial Intelligence perspective from human learning
- explain the basic notions of concept learning (training data, hypothesis, version space, generalization, etc.) and apply these terms properly to new concept learning problems, including some analysis and modelling of real-world scenarios
- recognize the implicit model assumptions (inductive bias) in learning algorithms, and be aware that the results crucially depend on them
- represent hypotheses in different forms (decision trees, boolean formulae, rule sets), transform such representations, infer them from data by some exemplary methods (e.g., decision tree learning), know the advantages, drawbacks, and some variations of the methods
- relate the complexity of models and hypotheses (e.g., description length) to the amount of training data, explain the phenomenon of overfitting and counteract it
- explain the basic principles of some types of artificial neural networks and what they can represent, in particular, explain certain network training algorithms as local search algorithms for error minimization
- explain the basic notions and rules of Bayesian inference: conditional probability, a priori and a posteriori probability, maximum-likelihood hypothesis, etc., explain Bayesian classification methods, their underlying ideas and computational aspects
- do calculations for specific problems in the Bayesian framework, represent knowledge in form of Bayesian belief networks and other probabilistic models
- apply instance-based learning, especially nearest-neighbor learning and clustering, being aware of some problems involved (choice of parameters, dimensionality, implementation issues)
- evaluate the quality of hypotheses in a statistically sound way, and estimate the necessary sample size for learning a given hypothesis space, using suitable mathematical concepts
- see connections to related fields like pattern recognition, data mining, commerce, bioinformatics, etc. (depending on your interests)

Content

Introduction. Some basic concepts and definitions: learning, model, inference, utility. Different kinds of models and inference. Generalisation and bias.
Rule induction. Concept learning. Version space algorithms. Inductive bias. Decision trees: ID3 with variations. Overfitting. Learning sets of rules. CN2.
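As a small illustration of how ID3 chooses splits (a background sketch with made-up toy labels, not material from the syllabus), the attribute with the highest information gain, i.e. the largest reduction in entropy, is selected at each node:

```python
import math

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def information_gain(labels, groups):
    """Entropy reduction from partitioning `labels` into `groups`;
    ID3 greedily picks the attribute whose partition maximises this."""
    n = len(labels)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)

labels = ["yes", "yes", "no", "no"]
# A split that separates the two classes perfectly gains the full 1 bit:
print(information_gain(labels, [["yes", "yes"], ["no", "no"]]))  # 1.0
```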
Neural networks. Basic principles. Hopfield nets. Feedforward nets. Backpropagation.
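The "local search for error minimisation" view of network training can be sketched on the smallest possible case, a single sigmoid unit trained by gradient descent on squared error (a toy illustration with invented data, not code from the course):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy data: learn the identity function on {0, 1} with one sigmoid unit.
data = [((0.0,), 0.0), ((1.0,), 1.0)]

w, b, lr = 0.0, 0.0, 1.0
for _ in range(2000):
    for (x,), target in data:
        y = sigmoid(w * x + b)
        # Gradient of the squared error (y - target)^2 / 2 with respect
        # to the pre-activation, using the sigmoid derivative y * (1 - y):
        delta = (y - target) * y * (1 - y)
        w -= lr * delta * x
        b -= lr * delta

# The unit's outputs for x = 0 and x = 1 move toward 0 and 1 respectively:
print(sigmoid(b), sigmoid(w + b))
```

Backpropagation generalises exactly this update to multi-layer feedforward nets by propagating the `delta` terms backwards through the layers.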
Instance based learning. Distance metrics. Nearest neighbour. Discriminant analysis. Case based reasoning. Cluster analysis.
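Nearest-neighbour classification, the simplest instance-based method above, can be written in a few lines (a minimal 1-NN sketch with made-up points, assuming the Euclidean metric):

```python
import math

def nearest_neighbour(train, query):
    """Classify `query` with the label of its closest training point
    (plain 1-NN under the Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    _, label = min(train, key=lambda pair: dist(pair[0], query))
    return label

# Toy data: labelled points in the plane
train = [((0.0, 0.0), "A"), ((1.0, 0.0), "A"), ((5.0, 5.0), "B")]
print(nearest_neighbour(train, (4.0, 4.5)))  # closest point is (5, 5): "B"
```

The problems mentioned in the learning outcomes (choice of k, dimensionality, implementation cost of the distance computations) all show up already in this tiny version.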
Bayesian statistics. Conditional probability, Bayes theorem. Bayesian inference. Prior probability. Naive Bayesian classifier.
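A worked example of the Bayes-theorem calculations referred to above (a hypothetical diagnostic-test scenario with invented numbers, not from the course material):

```python
# Hypothetical diagnostic test:
#   prior P(disease) = 0.01,
#   sensitivity P(positive | disease) = 0.9,
#   false-positive rate P(positive | healthy) = 0.05.
p_d = 0.01
p_pos_d = 0.9
p_pos_h = 0.05

# Law of total probability: overall chance of a positive result
p_pos = p_pos_d * p_d + p_pos_h * (1 - p_d)

# Bayes' theorem: a posteriori probability of disease given a positive test
p_d_pos = p_pos_d * p_d / p_pos
print(round(p_d_pos, 3))  # 0.154 -- low despite the sensitive test
```

The naive Bayesian classifier applies the same rule with the additional assumption that the attributes are conditionally independent given the class.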
Probabilistic expert systems. Rule based expert systems. Graphical models/Markov graphs. Mathematical properties of graphical models.
Inference and learning in probabilistic expert systems. Inference in polytrees. Modifying the structure of general networks. Stochastic simulation. General learning difficulties. Algorithms for learning probabilistic networks.
Automatic design of algorithms through evolution. Short overview of evolutionary computation. Specification of algorithmic problems. Efficient incremental search for programs. The ADATE system.
Theories of learning. Relation between Bayesian inference and learning. Minimum description length principle. Non-informative priors. Kolmogorov complexity. PAC learning. VC-dimension.
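As background for the last topic (a standard result, not stated in the syllabus): for a finite hypothesis space H, the basic PAC bound says that

```latex
m \;\ge\; \frac{1}{\varepsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right)
```

training examples suffice for any consistent learner to output, with probability at least 1 - δ, a hypothesis with true error at most ε. This is the kind of sample-size estimate referred to in the learning outcomes.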
The course includes compulsory homework exercises, giving practical experience of the different kinds of algorithms.
See the course home page for the most up to date information.

Organisation

Lectures and homework assignments.

Literature

Mitchell (1997): Machine Learning. Handouts.

Examination

Homework exercises plus a final report summarizing the course contents.


Page manager. Published: Mon 13 Jul 2020.