CZECH TECHNICAL UNIVERSITY IN PRAGUE
STUDY PLANS
2019/2020

Applications of Optimization Methods

The course is not on the list of scheduled courses; no time-table is available.
Code: 01AOM
Completion: Z
Credits: —
Range: —
Lecturer:
Tutor:
Supervisor:
Department of Mathematics
Synopsis:

The aim of this course is to deepen students' knowledge of optimization methods and to show their practical applications. Several methods are applied to support-vector machines; subsequently, methods for large-scale problems and for training deep artificial neural networks are explained. Finally, advanced methods for regret minimization and sparsity-inducing methods are covered. All methods are demonstrated on real problems.

Requirements:
Syllabus of lectures:

1. Introduction to advanced optimization methods.

2. Support-vector machines.

3. Artificial neural networks in optimization.

4. Hessian inverse approximation, BFGS method.

5. Stochastic gradient descent.

6. Convex optimization for regret minimization.

7. Sparsity-inducing optimization methods.
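To give a flavour of topic 5, the following is a minimal sketch of stochastic gradient descent applied to a one-dimensional least-squares fit. It is an illustration only, not part of the course materials; the function name, learning rate, and data are assumptions chosen for this example.

```python
import random

def sgd_least_squares(data, lr=0.1, epochs=100, seed=0):
    """Fit y ~ w*x + b by stochastic gradient descent on squared error."""
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)               # visit samples in random order
        for x, y in data:
            err = (w * x + b) - y       # residual for one sample
            w -= lr * 2 * err * x       # gradient of err**2 w.r.t. w
            b -= lr * 2 * err           # gradient of err**2 w.r.t. b
    return w, b

# Noise-free data drawn from y = 2x + 1; SGD should recover w ~ 2, b ~ 1.
data = [(x / 10, 2 * (x / 10) + 1) for x in range(20)]
w, b = sgd_least_squares(data)
```

Each update uses the gradient of the loss on a single sample rather than the full dataset, which is what makes the method scale to the large problems and neural-network training discussed in topics 3 and 5.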

Syllabus of tutorials:
Study Objective:
Study materials:

Compulsory literature:

[1] M. J. Kochenderfer, T. A. Wheeler, Algorithms for Optimization, The MIT Press, 2019.

[2] S. Sra, S. Nowozin, S. J. Wright, Optimization for Machine Learning, The MIT Press, 2012.

[3] D. P. Bertsekas, Convex Optimization Algorithms, Athena Scientific, 2015.

Optional literature:

[4] I. Goodfellow, Y. Bengio, A. Courville, Deep Learning, The MIT Press, 2016.

[5] C. C. Aggarwal, Neural Networks and Deep Learning, Springer, 2018.

Note:
Further information:
No time-table has been prepared for this course
The course is a part of the following study plans:
Data valid to 2020-07-04
For updated information see http://bilakniha.cvut.cz/en/predmet6384706.html