CZECH TECHNICAL UNIVERSITY IN PRAGUE
STUDY PLANS
2011/2012

Graphical Markov Models

Code: XEP33GMM
Completion: ZK (exam)
Credits: 4
Extent: 2+1s
Language of instruction:
Lecturer:
Boris Flach (guarantor)
Tutor:
Boris Flach (guarantor)
The course is provided by:
Department of Cybernetics
Annotation:

Markov models on graphs are a model class widely applied in many areas of computer science, such as computer networks, data security, robotics and pattern recognition. The first part of the course covers inference and learning for Markov models on chains and trees. All of these tasks, including structure learning, can be solved by efficient algorithms. The second part addresses graphical models on general graphs, where, in contrast, practically all inference and learning tasks are NP-complete. The focus there is therefore on efficient approximate algorithms.
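To illustrate the kind of efficient inference meant here, the probability of an observed sequence under a hidden Markov model on a chain can be computed by the classical forward recursion in O(T·K²) time instead of summing over all Kᵀ hidden-state sequences. The following is a minimal sketch; the two-state model and all transition/emission values are invented for illustration and are not taken from the course materials:

```python
import numpy as np

def forward(pi, A, B, obs):
    """Emission probability P(obs) of an HMM via the forward recursion.

    pi  : (K,)   initial state distribution
    A   : (K, K) transition matrix, A[i, j] = P(s_{t+1}=j | s_t=i)
    B   : (K, M) emission matrix,   B[i, o] = P(x_t=o | s_t=i)
    obs : sequence of observation indices
    """
    alpha = pi * B[:, obs[0]]          # alpha_1(i) = pi_i * b_i(x_1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # alpha_{t+1}(j) = sum_i alpha_t(i) a_ij * b_j(x_{t+1})
    return alpha.sum()                 # P(obs) = sum_i alpha_T(i)

# Hypothetical two-state model with made-up parameters
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3],
               [0.4, 0.6]])
B  = np.array([[0.9, 0.1],
               [0.2, 0.8]])
print(forward(pi, A, B, [0, 1, 0]))  # ≈ 0.10893
```

The same recursion, with sums replaced by maximizations, yields the most probable hidden-state sequence; this shared structure is what the semiring viewpoint in the syllabus formalizes.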

Requirements:

Basics of probability theory, graphs and graph algorithms.

Lecture syllabus:

1. Markov chains, equivalent representations, ergodicity, convergence theorem for homogeneous Markov chains

2. Hidden Markov Models on chains for speech recognition: pre-processing, dynamic time warping, HMMs

3. Recognizing the generating model - calculating the emission probability for a measured signal sequence.

4. Recognizing the most probable sequence of hidden states and the sequence of most probable states.

5. Possible formulations for supervised and unsupervised learning tasks (parameter estimation).

6. Supervised and unsupervised learning according to the Maximum-Likelihood principle, the Expectation Maximization algorithm

7. Hidden Markov models on acyclic graphs (trees). Estimating the graph structure.

8. Hidden Markov models with continuous state spaces. Kalman filter and particle filters.

9. Markov Random Fields - Markov models on general graphs. Equivalence to Gibbs models; examples from computer vision.

10. Relations to Constraint Satisfaction Problems and Energy Minimization tasks, unified formulation, semi-rings.

11. Searching the most probable state configuration: transforming the task into a MinCut-problem for the submodular case.

12. Searching the most probable state configuration: approximate algorithms for the general case.

13. The partition function and marginal probabilities: approximate algorithms for their estimation.

14. Duality between marginal probabilities and Gibbs potentials. The Expectation Maximization algorithm for parameter learning.
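The search for the most probable hidden-state sequence on a chain (item 4 above) is solved by the Viterbi dynamic program. The sketch below reuses the same kind of hypothetical two-state model as before; all parameter values are made up for illustration:

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Most probable hidden-state sequence argmax_s P(s, obs) by dynamic programming."""
    K, T = len(pi), len(obs)
    delta = np.log(pi) + np.log(B[:, obs[0]])    # log-domain avoids numerical underflow
    back = np.zeros((T, K), dtype=int)           # back-pointers for path reconstruction
    for t in range(1, T):
        scores = delta[:, None] + np.log(A)      # scores[i, j]: best path ending with i -> j
        back[t] = scores.argmax(axis=0)          # best predecessor of each state j
        delta = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(delta.argmax())]                 # backtrack from the best final state
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Hypothetical parameters, identical in spirit to a toy two-state HMM
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3],
               [0.4, 0.6]])
B  = np.array([[0.9, 0.1],
               [0.2, 0.8]])
print(viterbi(pi, A, B, [0, 1, 0]))  # → [0, 1, 0]
```

Note that this maximizes the joint probability of a whole sequence; the sequence of individually most probable states (also item 4) is a different task and is obtained from the forward-backward marginals instead.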

Seminar syllabus:

The lectures are accompanied by seminars whose aim is to discuss advanced exercises, deepen the knowledge gained in the lectures, and develop the skill of recognizing where the learned concepts are applicable.

Study objectives:
Study materials:

[1] Stan Z. Li: Markov Random Field Modeling in Image Analysis. Springer Verlag, 3rd edition, 2009

[2] Michail I. Schlesinger and Vaclav Hlavac: Ten Lectures on Statistical and Structural Pattern Recognition. Kluwer Academic Publishers, 2002

[3] Daphne Koller and Nir Friedman: Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009

Note:
Schedule for the winter semester 2011/2012:
Monday 14:30–16:00, room KN:E-127 (Karlovo nám., Kotkova cvičebna K4), Flach B. (lecture parallel 1)

Schedule for the summer semester 2011/2012:
The schedule has not been prepared.
The course is part of the following study plans:
Data valid as of 9 July 2012
Updates to the above information can be found at http://bilakniha.cvut.cz/cs/predmet1952506.html