Graphical Markov Models

Code: XEP33GMM
Completion: ZK
Credits: 4
Extent: 2+1s
Language of instruction:
Guarantor: Boris Flach
Course provided by: Department of Cybernetics

Markov models on graphs represent a model class widely applied in many areas of computer science, such as computer networks, data security, robotics and pattern recognition. The first part of the course covers inference and learning for Markov models on chains and trees. All of these tasks, including structure learning, can be solved by efficient algorithms. The second part addresses graphical models on general graphs, where, in contrast, practically all inference and learning tasks are NP-complete. The focus is therefore on efficient approximate algorithms.
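The remark that inference on chains admits efficient algorithms can be made concrete with the classical forward recursion: the probability of an observed sequence is obtained in time linear in the sequence length instead of by summation over exponentially many hidden-state sequences. The following minimal sketch uses invented toy parameters and is not taken from the course materials:

import numpy as np

def forward_log_likelihood(pi, A, B, obs):
    """log p(obs) for an HMM with initial distribution pi, transitions A, emissions B."""
    alpha = pi * B[:, obs[0]]              # alpha_1(k) = p(x_1, z_1 = k)
    c = alpha.sum()                        # rescale to avoid numerical underflow
    alpha = alpha / c
    log_p = np.log(c)
    for x in obs[1:]:
        alpha = (alpha @ A) * B[:, x]      # alpha_t(k) = sum_j alpha_{t-1}(j) A[j, k] B[k, x_t]
        c = alpha.sum()
        alpha = alpha / c
        log_p += np.log(c)
    return log_p

# hypothetical 2-state, 3-symbol model; all numbers are invented for illustration
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
print("log p(obs) =", forward_log_likelihood(pi, A, B, [0, 1, 2, 2, 1]))

The recursion runs in O(T K^2) time for T observations and K hidden states; the same scheme generalizes from chains to trees by message passing.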

The results of the student survey for this course are available at: http://www.fel.cvut.cz/anketa/aktualni/courses/XEP33GMM


Prerequisites:

Basics of probability theory, graphs, and graph algorithms.

Syllabus of lectures:

1. Markov chains, equivalent representations, ergodicity, convergence theorem for homogeneous Markov chains

2. Hidden Markov models on chains for speech recognition: pre-processing, dynamic time warping, HMMs

3. Recognizing the generating model - calculating the emission probability for a measured signal sequence.

4. Recognizing the most probable sequence of hidden states and the sequence of most probable states (a sketch of the former follows this outline).

5. Possible formulations for supervised and unsupervised learning tasks (parameter estimation).

6. Supervised and unsupervised learning according to the Maximum-Likelihood principle, the Expectation Maximization algorithm

7. Hidden Markov models on acyclic graphs (trees). Estimating the graph structure.

8. Hidden Markov models with continuous state spaces. Kalman filter and particle filters.

9. Markov random fields: Markov models on general graphs. Equivalence to Gibbs models, examples from computer vision.

10. Relations to Constraint Satisfaction Problems and Energy Minimization tasks, unified formulation, semi-rings.

11. Searching for the most probable state configuration: transforming the task into a MinCut problem in the submodular case.

12. Searching for the most probable state configuration: approximation algorithms for the general case.

13. The partition function and marginal probabilities: approximation algorithms for their estimation.

14. Duality between marginal probabilities and Gibbs potentials. The Expectation Maximization algorithm for parameter learning.
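
As a companion to item 4 of the outline, here is a minimal sketch (again with invented toy parameters, not taken from the course) of the Viterbi dynamic program that finds the most probable sequence of hidden states on a chain; the sequence of most probable states would instead be read off the posterior marginals computed by the forward-backward recursions:

import numpy as np

def viterbi(pi, A, B, obs):
    """Most probable hidden-state sequence of an HMM, computed in the log domain."""
    T, K = len(obs), len(pi)
    delta = np.log(pi) + np.log(B[:, obs[0]])    # best log-probability of paths ending in each state at t = 0
    back = np.zeros((T, K), dtype=int)           # back-pointers
    for t in range(1, T):
        scores = delta[:, None] + np.log(A)      # scores[j, k]: best path ending in j, then transition j -> k
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(delta.argmax())]                 # backtrack from the best final state
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1], float(delta.max())

# hypothetical toy model (same numbers as in the earlier sketch)
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
states, logp = viterbi(pi, A, B, [0, 1, 2, 2, 1])
print("most probable state sequence:", states, "log-probability:", logp)

Both tasks are tractable on chains and trees; on general graphs the corresponding maximization becomes the NP-hard MAP problem addressed in items 11 and 12.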

Syllabus of tutorials:

The lectures are accompanied by seminars whose aim is to discuss advanced exercises, to deepen the acquired knowledge, and to develop the skill of recognizing where the learned concepts can be applied.

Course objectives:

(1) To gain an in-depth understanding of the chain "model-problem-algorithm" in the context of probabilistic Markov models, especially for inference and learning problems.

(2) To enable students to adapt and apply the learned problem formulations and algorithms in different application contexts.

(3) To provide a basic ability to recognize the applicability of the learned concepts in specific applications of artificial intelligence and machine learning.

Study materials:

[1] Stan Z. Li; Markov Random Field Modeling in Image Analysis, Springer Verlag, 3rd edition, 2009

[2] Michail I. Schlesinger and Vaclav Hlavac; Ten Lectures on Statistical and Structural Pattern Recognition, Kluwer Academic Publishers, 2002

[3] Daphne Koller and Nir Friedman; Probabilistic Graphical Models: Principles and Techniques, MIT Press, 2009


URL: http://cw.felk.cvut.cz/doku.php/courses/xep33gmm/start

Further information:

Schedule for the winter semester 2018/2019:
Lectures: room KN:E-128 (Karlovo nám., Cvičebna K3), B. Flach, lecture group 1.
Tutorials: room KN:E-112 (Karlovo nám., Cvičebna Vyčichlova), B. Flach, lecture group 1, parallel class 101.

Schedule for the summer semester 2018/2019: not prepared.
The course is part of the following study plans:
Data valid as of 20 June 2019
Updates of the above information can be found at http://bilakniha.cvut.cz/cs/predmet1952506.html