Pattern Recognition and Machine Learning
Code       Completion  Credits  Range  Language
BE5B33RPZ  Z,ZK        6        2+2c
 The course cannot be taken simultaneously with:
 Recognition and machine learning (B4B33RPZ)
 The course is a substitute for:
 Recognition and machine learning (B4B33RPZ)
 Lecturer:
 Jiří Matas (guarantor), Ondřej Drbohlav
 Tutor:
 Jiří Matas (guarantor), Javier Alejandro Aldana Iuit, Filip Radenović
 Supervisor:
 Department of Cybernetics
 Synopsis:

The basic formulations of the statistical decision problem are presented. The necessary knowledge about the (statistical) relationship between observations and classes of objects is acquired by learning on the training set. The course covers both well-established and advanced classifier learning methods, such as the Perceptron, AdaBoost, Support Vector Machines, and Neural Nets.
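As a rough illustration of the statistical decision problem the synopsis describes, the sketch below implements the Bayes decision rule with 0/1 loss for two classes with Gaussian class-conditional densities. The priors, means, and variance are made-up toy values, not course material:

```python
import numpy as np

# Toy setup (assumed, not from the course): two classes with
# Gaussian class-conditional densities p(x|k) and priors P(k).
priors = np.array([0.6, 0.4])   # P(k)
means = np.array([0.0, 2.0])    # class-conditional means
sigma = 1.0                     # shared standard deviation

def likelihood(x, k):
    """Gaussian class-conditional density p(x|k)."""
    z = (x - means[k]) / sigma
    return np.exp(-0.5 * z ** 2) / (sigma * np.sqrt(2 * np.pi))

def bayes_decide(x):
    """Pick the class with the largest posterior p(x|k) * P(k);
    with 0/1 loss this minimizes the expected loss."""
    posteriors = [likelihood(x, k) * priors[k] for k in (0, 1)]
    return int(np.argmax(posteriors))

print(bayes_decide(-1.0))  # far from the class-1 mean -> 0
print(bayes_decide(3.0))   # -> 1
```

With other loss matrices the same recipe applies, except that one minimizes the expected loss of each decision rather than maximizing the posterior.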
 Requirements:

Knowledge of linear algebra, mathematical analysis and
probability and statistics.
 Syllabus of lectures:

1. The pattern recognition problem. Overview of the course. Basic notions.
2. The Bayesian decision-making problem, i.e. minimization of expected loss.
3. Non-Bayesian decision problems.
4. Parameter estimation. The maximum likelihood method.
5. The nearest neighbour classifier.
6. Linear classifiers. Perceptron learning.
7. The AdaBoost method.
8. Learning as a quadratic optimization problem. SVM classifiers.
9. Feed-forward neural nets. The backpropagation algorithm.
10. Decision trees.
11. Logistic regression.
12. The EM (Expectation-Maximization) algorithm.
13. Sequential decision-making (Wald's sequential test).
14. Recap.
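Perceptron learning (lecture 6 above) reduces to a simple mistake-driven update: whenever a sample is misclassified, the weights are moved toward it. A minimal sketch on made-up linearly separable 2-D data (not course code or the STPR toolbox):

```python
import numpy as np

# Toy linearly separable data; labels in {-1, +1}. Assumed example.
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])

w = np.zeros(2)
b = 0.0
for _ in range(100):                 # epochs; converges if data are separable
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:   # misclassified (or on the boundary)
            w += yi * xi             # perceptron update
            b += yi
            errors += 1
    if errors == 0:                  # a full pass with no mistakes: done
        break

predictions = np.sign(X @ w + b)
print(predictions)                   # matches y on this separable toy set
```

The perceptron convergence theorem guarantees a finite number of updates on linearly separable data; on non-separable data the loop above would simply exhaust its epoch budget.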
 Syllabus of tutorials:

Students solve four or five pattern recognition problems, for instance a simplified version of OCR (optical character recognition), face detection or spam detection using either classical methods or trained classifiers.
1. Introduction to MATLAB and the STPR toolbox; a simple recognition experiment.
2. The Bayes recognition problem.
3. Non-Bayesian problems I: the Neyman-Pearson problem.
4. Non-Bayesian problems II: the minimax problem.
5. Maximum likelihood estimates.
6. Non-parametric estimates, Parzen windows.
7. Linear classifiers, the perceptron algorithm.
8. AdaBoost.
9. Support Vector Machines I.
10. Support Vector Machines II.
11. The EM algorithm I.
12. The EM algorithm II.
13. Submission of reports. Discussion of results.
14. Submission of reports. Discussion of results.
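Several tutorial tasks (face detection, OCR) can be attacked with the nearest neighbour classifier from lecture 5, which needs no training beyond storing the data. A minimal 1-NN sketch on toy 2-D points (assumed data, not the tutorial's):

```python
import numpy as np

# Stored training set (toy, assumed): two well-separated clusters.
train_X = np.array([[0.0, 0.0], [0.2, 0.1], [3.0, 3.0], [2.8, 3.2]])
train_y = np.array([0, 0, 1, 1])

def nn_classify(x):
    """Assign the label of the closest training point (Euclidean distance)."""
    dists = np.linalg.norm(train_X - x, axis=1)
    return int(train_y[np.argmin(dists)])

print(nn_classify(np.array([0.1, 0.2])))  # near the first cluster -> 0
print(nn_classify(np.array([3.1, 2.9])))  # near the second cluster -> 1
```

Extending this to k-NN means taking a majority vote over the k smallest distances instead of the single minimum.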
 Study Objective:

To teach the student to formalize statistical decision-making problems, to use machine learning techniques, and to solve pattern recognition problems with the most popular classifiers (SVM, AdaBoost, neural net, nearest neighbour).
 Study materials:

1.Duda, Hart, Stork: Pattern Classification, 2001.
2.Bishop: Pattern Recognition and Machine Learning, 2006.
3.Schlesinger, Hlavac: Ten Lectures on Statistical and Structural Pattern Recognition, 2002.
 Note:
 Further information:
 http://cw.felk.cvut.cz/doku.php/courses/ae4b33rpz/start
 Timetable for winter semester 2017/2018:

 Timetable for summer semester 2017/2018:
 Timetable is not available yet
 The course is a part of the following study plans:

 Electrical Engineering and Computer Science (EECS) (compulsory elective course)
 Open Informatics - Computer Science 2016 (compulsory course in the program)
 Open Informatics - Internet of Things 2016 (compulsory course in the program)
 Open Informatics - Software 2016 (compulsory course in the program)
 Open Informatics - Computer Games and Graphics 2016 (compulsory course in the program)
 Open Informatics (compulsory course in the program)
 Open Informatics - Artificial Intelligence and Computer Science 2018 (compulsory course in the program)
 Open Informatics - Internet of Things 2018 (compulsory course in the program)
 Open Informatics - Software 2018 (compulsory course in the program)
 Open Informatics - Computer Games and Graphics 2018 (compulsory course in the program)