Graphical Markov Models
Code      Completion  Credits  Range  Language

XEP33GMM  ZK          4        2P+1S
 Lecturer:
 Boris Flach (guarantor)
 Tutor:
 Boris Flach (guarantor)
 Supervisor:
 Department of Cybernetics
 Synopsis:

Markov models on graphs are a model class widely applied in many areas of computer science, such as computer networks, data security, robotics and pattern recognition. The first part of the course covers inference and learning for Markov models on chains and trees. All these tasks, including structure learning, can be solved by efficient algorithms. The second part addresses graphical models on general graphs. Here, by contrast, practically all inference and learning tasks are NP-complete. The focus is therefore on efficient approximate algorithms.
 Requirements:

Basics of probability theory, graphs and graph algorithms.
 Syllabus of lectures:

1. Markov chains, equivalent representations, ergodicity, convergence theorem for homogeneous Markov chains
2. Hidden Markov Models on chains for speech recognition: preprocessing, dynamic time warping, HMMs
3. Recognizing the generating model: calculating the emission probability of a measured signal sequence.
4. Recognizing the most probable sequence of hidden states and the sequence of most probable states (a minimal sketch of both chain recursions follows this list).
5. Possible formulations for supervised and unsupervised learning tasks (parameter estimation).
6. Supervised and unsupervised learning according to the Maximum Likelihood principle; the Expectation-Maximization (EM) algorithm.
7. Hidden Markov models on acyclic graphs (trees). Estimating the graph structure.
8. Hidden Markov models with continuous state spaces. Kalman filter and particle filters.
9. Markov Random Fields: Markov models on general graphs. Equivalence to Gibbs models; examples from computer vision.
10. Relations to Constraint Satisfaction Problems and Energy Minimization tasks, unified formulation, semirings.
11. Searching for the most probable state configuration: transforming the task into a min-cut problem in the submodular case.
12. Searching for the most probable state configuration: approximation algorithms for the general case.
13. The partition function and marginal probabilities: approximation algorithms for their estimation (see the sampling sketch after this list).
14. Duality between marginal probabilities and Gibbs potentials. The Expectation Maximization algorithm for parameter learning.
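
Illustrative example for items 3 and 4: a minimal Python sketch, not part of the official course material, of the two classical chain recursions: the forward recursion for the emission probability of an observed sequence, and Viterbi decoding for the most probable hidden state sequence. The two-state toy model and all numbers are made-up assumptions.

import numpy as np

def forward_log_prob(log_pi, log_A, log_B, obs):
    # Log emission probability log p(obs) of an observed sequence
    # (item 3), computed with the forward recursion in log space.
    alpha = log_pi + log_B[:, obs[0]]
    for o in obs[1:]:
        # log-sum-exp over the previous state for numerical stability
        alpha = log_B[:, o] + np.logaddexp.reduce(alpha[:, None] + log_A, axis=0)
    return np.logaddexp.reduce(alpha)

def viterbi(log_pi, log_A, log_B, obs):
    # Most probable hidden state sequence (item 4), by dynamic programming.
    T, n = len(obs), log_A.shape[0]
    delta = log_pi + log_B[:, obs[0]]
    back = np.zeros((T, n), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_A   # scores[i, j]: best path ending with transition i -> j
        back[t] = np.argmax(scores, axis=0)
        delta = log_B[:, obs[t]] + np.max(scores, axis=0)
    path = [int(np.argmax(delta))]        # best final state
    for t in range(T - 1, 0, -1):         # backtrack through the pointers
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy two-state, two-symbol model (illustrative numbers only)
pi = np.array([0.6, 0.4])                 # initial distribution
A  = np.array([[0.7, 0.3], [0.4, 0.6]])   # transition probabilities
B  = np.array([[0.9, 0.1], [0.2, 0.8]])   # emission probabilities
obs = [0, 1, 1, 0]
print(forward_log_prob(np.log(pi), np.log(A), np.log(B), obs))
print(viterbi(np.log(pi), np.log(A), np.log(B), obs))

The sequence of most probable states (also item 4) would instead be read off the forward-backward marginals; that recursion is analogous, with the maximization replaced by summation.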
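Illustrative example for items 9 and 13: a minimal Gibbs-sampling sketch, again a toy under stated assumptions rather than course material, approximating node marginals of a small Ising-type Markov random field on a grid. Exact marginalization is intractable on general graphs, which is why such sampling approximations matter. Grid size, coupling strength, and iteration counts are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
H, W = 4, 4                               # grid dimensions
J = 0.5                                   # pairwise coupling strength
h = rng.normal(0.0, 1.0, size=(H, W))     # random unary potentials

def neighbors(i, j):
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        if 0 <= i + di < H and 0 <= j + dj < W:
            yield i + di, j + dj

x = rng.choice([-1, 1], size=(H, W))      # initial spin configuration
counts = np.zeros((H, W))                 # visits with x[i, j] = +1
burn_in, n_samples = 500, 5000
for sweep in range(burn_in + n_samples):
    for i in range(H):
        for j in range(W):
            # resample x[i, j] from its conditional given the Markov blanket
            field = h[i, j] + J * sum(x[a, b] for a, b in neighbors(i, j))
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))
            x[i, j] = 1 if rng.random() < p_plus else -1
    if sweep >= burn_in:
        counts += (x == 1)
print(counts / n_samples)                 # approximate marginals p(x[i, j] = +1)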
 Syllabus of tutorials:

The lectures are accompanied by seminars and labs. Advanced exercises are discussed in the seminars. The aim is to deepen the knowledge gained in the lectures and to develop the ability to recognize where the learned concepts apply.
 Study Objective:

Course objectives:
(1) To gain an in-depth understanding of the chain "model-problem-algorithm" in the context of probabilistic Markov models, especially for inference and learning problems.
(2) To enable students to adapt and apply the learned problem formulations and algorithms in different application contexts.
(3) To provide a basic ability to recognize the applicability of the learned concepts to specific applications in artificial intelligence and machine learning.
 Study materials:

[1] Stan Z. Li: Markov Random Field Modeling in Image Analysis. Springer-Verlag, 3rd edition, 2009.
[2] Michail I. Schlesinger and Václav Hlaváč: Ten Lectures on Statistical and Structural Pattern Recognition. Kluwer Academic Publishers, 2002.
[3] Daphne Koller and Nir Friedman: Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009.
 Note:
 Further information:
 http://cw.felk.cvut.cz/doku.php/courses/xep33gmm/start
 Timetable for winter semester 2019/2020:

(timetable grid omitted)
 Timetable for summer semester 2019/2020:
 Timetable is not available yet
 The course is a part of the following study plans:

 Doctoral studies, daily studies (compulsory elective course)
 Doctoral studies, combined studies (compulsory elective course)