CZECH TECHNICAL UNIVERSITY IN PRAGUE
STUDY PLANS
2024/2025

Probabilistic Learning Models

Code: 01PMU
Completion: ZK (exam)
Credits: 2
Range: 2+0
Language: Czech
Course guarantor:
Lecturer:
Tutor:
Supervisor:
Department of Mathematics
Synopsis:

Introduction to the theory of the PAC learning model; VC-dimension of finite sets; Sauer's, Cover's, and Radon's lemmas; VC-dimension of composed mappings; application of the VC-dimension to lower bounds on the number of training patterns needed; analysis of the properties of delta-rule-based learning processes; extensions of the PAC learning model and PAO (probably approximately optimal) learning; probabilistic search for the Fourier coefficients of Boolean functions.
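As an illustration of the VC-dimension topics in the synopsis, the following sketch brute-forces the VC-dimension of closed intervals on a finite grid by checking shattering directly. The function names, grid size, and the `max_d` search cap are illustrative choices, not part of the course materials:

```python
from itertools import combinations

def interval_concepts(grid):
    """All closed intervals [a, b] with endpoints on the grid,
    plus the empty concept (represented by None)."""
    return [(a, b) for a in grid for b in grid if a <= b] + [None]

def label(c, x):
    """Membership of point x in interval concept c."""
    return c is not None and c[0] <= x <= c[1]

def shatters(concepts, points):
    """True iff the concept class realizes all 2^|points| labelings."""
    labelings = {tuple(label(c, x) for x in points) for c in concepts}
    return len(labelings) == 2 ** len(points)

def vc_dimension(concepts, domain, max_d=4):
    """Largest k (up to max_d) such that some k-point subset
    of the domain is shattered by the concept class."""
    d = 0
    for k in range(1, max_d + 1):
        if any(shatters(concepts, s) for s in combinations(domain, k)):
            d = k
    return d

grid = list(range(6))
cs = interval_concepts(grid)
print(vc_dimension(cs, grid))  # intervals on the line: VC-dimension 2
```

Any two points can be shattered (via the empty concept and the three intervals covering one or both points), while the labeling 1, 0, 1 of three ordered points is unrealizable by an interval, which is why the search returns 2.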

Requirements:
Syllabus of lectures:

1. PAC learning introduction

2. Concepts and concept classes

3. PAC learning in finite case

4. Vapnik-Chervonenkis dimension (Sauer's lemma, Cover's lemma, Radon's lemma)

5. VC-dimension of finite set systems

6. VC-dimension of union and intersection

7. VC-dimension of linear concepts

8. Application of Cover's lemma

9. Vapnik-Chervonenkis dimension of composed mappings

10. Pattern complexity and VC-dimension

11. Minimal number of patterns and PAC learning

12. Delta rule learning algorithm

13. Lower bound on the maximal number of steps of the delta rule algorithm

14. Polynomial learning and pattern dimension

15. Almost optimal solution of the set cover problem

16. Polynomial learning and conceptual complexity

17. Probabilistic learning algorithms

18. Probabilistic approximation of Fourier expansion

19. Probabilistic search of Fourier coefficients of Boolean functions

20. Probably approximately optimal learning
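The delta rule of lecture 12 is the least-mean-squares update w ← w + η(t − w·x)x for a linear unit. A minimal sketch of the online variant follows; the learning rate, epoch count, and the Boolean AND data set are illustrative assumptions, not prescribed by the course:

```python
def delta_rule(samples, eta=0.05, epochs=500):
    """Online delta-rule (LMS) updates for a linear unit with bias:
    w <- w + eta * (t - w.x) * x, cycling over the sample set."""
    dim = len(samples[0][0])
    w = [0.0] * (dim + 1)            # last component acts as the bias
    for _ in range(epochs):
        for x, t in samples:
            xb = list(x) + [1.0]     # constant input feeds the bias
            y = sum(wi * xi for wi, xi in zip(w, xb))
            w = [wi + eta * (t - y) * xi for wi, xi in zip(w, xb)]
    return w

# Learn a linear threshold for the Boolean AND function (targets +-1).
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w = delta_rule(data)
signs = [1 if sum(wi * xi for wi, xi in zip(w, list(x) + [1.0])) > 0 else -1
         for x, _ in data]
```

The weights converge toward the least-squares solution (here near w = (1, 1) with bias −1.5), whose sign separates AND correctly, so `signs` reproduces the target labels.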

Syllabus of tutorials:
Study Objective:

To acquaint students with the theoretical and mathematical foundations of PAC-like learning algorithms.
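One such algorithm, corresponding to lectures 18-19, is the probabilistic estimation of Fourier coefficients of a Boolean function by sampling. The sketch below is illustrative only (the sample size, seed, and example function are assumptions, not course material); it estimates f̂(S) = E_x[f(x)·χ_S(x)] over uniform inputs:

```python
import random

def chi(S, x):
    """Parity character chi_S(x) = (-1)^(sum of x_i for i in S)."""
    return -1 if sum(x[i] for i in S) % 2 else 1

def estimate_fourier(f, n, S, samples=20000, rng=None):
    """Monte Carlo estimate of f_hat(S) = E_x[f(x) * chi_S(x)]
    for uniform x in {0,1}^n."""
    rng = rng or random.Random(0)    # fixed seed for reproducibility
    total = 0
    for _ in range(samples):
        x = tuple(rng.randint(0, 1) for _ in range(n))
        total += f(x) * chi(S, x)
    return total / samples

# Example: f is parity on bits {0, 1} of a 3-bit input; its only
# nonzero Fourier coefficient is f_hat({0, 1}) = 1.
f = lambda x: chi((0, 1), x)
est = estimate_fourier(f, 3, (0, 1))
print(round(est, 3))  # 1.0, since f * chi_{(0,1)} is identically 1
```

By a Chernoff-style bound, the estimate of any single coefficient is within epsilon of its true value with high probability once the sample size grows like 1/epsilon^2, which is what makes this a polynomial-time probabilistic procedure.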

Study materials:

Key references:

F. Hakl, M. Holeňa. Úvod do teorie neuronových sítí (Introduction to the Theory of Neural Networks). Ediční středisko ČVUT, Praha, 1997.

Recommended references:

V. Roychowdhury, K.-Y. Siu, A. Orlitsky. Theoretical Advances in Neural Computation and Learning. Kluwer Academic Publishers, 1994.

M. Anthony, N. Biggs. Computational Learning Theory. Press Syndicate of the University of Cambridge, 1992.

A. Blumer, A. Ehrenfeucht, D. Haussler, M. K. Warmuth. Learnability and the Vapnik-Chervonenkis Dimension. Journal of the Association for Computing Machinery, 36:929-965, October 1989.

Note:
Further information:
No time-table has been prepared for this course
The course is a part of the following study plans:
Data valid to 2024-04-11
Updated versions of the information above are available at https://bilakniha.cvut.cz/en/predmet24173305.html