CZECH TECHNICAL UNIVERSITY IN PRAGUE
STUDY PLANS
2023/2024

Deep Learning

Code: BEV033DLE
Completion: Z,ZK (assessment and exam)
Credits: 6
Scope: 2P+2C (2 hours of lectures + 2 hours of tutorials per week)
Language of instruction: English
Course guarantor:
Boris Flach
Lecturers:
Boris Flach, Oleksandr Shekhovtsov
Tutors:
Boris Flach, Oleksandr Shekhovtsov, Jan Šochman
Course provided by:
Department of Cybernetics
Annotation:

The course introduces deep neural networks and deep learning, a branch of machine learning and artificial intelligence. Starting from a recap of generic concepts of machine learning (empirical risk minimisation, linear classifiers and regressions, generalisation bounds), it introduces deep networks as model classes for prediction (classification) and regression and discusses their model complexity and generalisation bounds. The course aims at a solid understanding of all concepts and algorithms needed to successfully design, implement and learn deep networks in machine learning applications. This includes error backpropagation and stochastic gradient methods, weight initialisation and normalisation, deterministic and stochastic regularisation methods, and data augmentation, as well as adversarially robust learning approaches. The course concludes with an introductory discussion of generative neural networks (VAEs and GANs) as well as recurrent neural networks (GRU and LSTM) for structured output classification.
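
As a concrete illustration of the ingredients named above (empirical risk minimisation, error backpropagation, stochastic gradient methods), a minimal training step for a small feed-forward classifier could look as follows. This is only a sketch: PyTorch and all sizes and hyperparameters are assumptions made for the illustration, and the course does not prescribe a particular framework.

# Minimal sketch (assumption: PyTorch; sizes and hyperparameters are illustrative).
# Empirical risk = mean cross-entropy over a mini-batch; gradients are obtained by
# error backpropagation (autograd) and the weights are updated by SGD with momentum.
import torch
import torch.nn as nn

model = nn.Sequential(            # a small feed-forward classifier
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Linear(64, 3),             # logits for three classes
)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

x = torch.randn(32, 20)           # a mini-batch of 32 feature vectors
y = torch.randint(0, 3, (32,))    # ground-truth class labels

optimizer.zero_grad()             # clear gradients from the previous step
loss = loss_fn(model(x), y)       # forward pass + empirical risk on the batch
loss.backward()                   # error backpropagation
optimizer.step()                  # stochastic gradient update
print(f"batch loss: {loss.item():.4f}")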

Students will gain solid knowledge of all related methods and concepts, as well as the practical skills needed to successfully design, implement and learn deep networks for machine learning applications. At the same time, the course provides a solid foundation for subsequent courses (e.g. computer vision), which consider specialised and often more complex variants of neural networks, loss functions and learning approaches for solving machine learning tasks in their respective areas.

Requirements:

Fundamentals of mathematics comparable to the following courses: Linear Algebra (B0B01LAG), Calculus (B0B01MA2), Optimization (B0B33OPT), as well as Probability, Statistics, and Theory of Information (B0B01PST).

Besides proficient knowledge of the mathematics listed above, students are expected to have solid knowledge in the following areas of computer science and artificial intelligence: basics of graph theory and related algorithms; basics of pattern recognition, empirical risk minimisation, linear classifiers and support vector machines, as covered in Pattern Recognition and Machine Learning (B4B33RPZ or BE4B33RPZ).

Syllabus of lectures:

1. Recap: linear classifiers, linear regression, logistic regression, loss function, empirical risk minimisation, regularisation

2. Artificial neurons, activation functions, network architectures; sidestep: stochastic neurons; sidestep: biological neurons

3. Neural networks as classifiers, empirical risk minimisation, loss functions, model complexity and generalisation bounds; neural networks as nonlinear regression models, loss functions

4. Backpropagation for feed-forward networks with arbitrary DAG structure, simplification/modularisation for layered networks

5. NN loss landscape, stochastic gradient descent for convex functions, SGD for nonlinear functions, (Nesterov) momentum (see the sketch after this list)

6. Convolutional neural networks, architectures, application examples; sidestep: visual cortex

7. Training neural networks 0: project pipeline, data collection, training/validation/test set, model selection (architecture), overfitting, early stopping

8. Training neural networks 1: data preprocessing, weight initialisation, batch normalisation

9. Training neural networks 2: Adaptive SGD methods

10. Training neural networks 3: regularisation, L1/L2 weight regularisation, randomised predictors, dropout, data augmentation

11. Training neural networks 4: adversarial patterns, robust learning approaches

12. Generative models: VAE, GANs (introductory level)

13. Recurrent neural networks: recurrent back-propagation, RNN, GRU, LSTM

14. Reserve, other topics not covered, e.g. graph neural networks, convolutions on graphs
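
Items 4 and 5 above refer to the following sketch: backpropagation written out by hand for a tiny two-layer network, followed by one SGD step with (Nesterov) momentum. NumPy, the squared-error loss and all sizes are assumptions made only for this illustration, not part of the syllabus.

# Minimal sketch (assumptions: NumPy, squared-error loss, illustrative sizes).
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(5)            # one input vector
t = rng.standard_normal(2)            # regression target
W1 = 0.1 * rng.standard_normal((8, 5))
W2 = 0.1 * rng.standard_normal((2, 8))
lr, mu = 0.01, 0.9                    # learning rate, momentum coefficient
v1, v2 = np.zeros_like(W1), np.zeros_like(W2)   # momentum buffers

# Nesterov momentum evaluates the gradient at the look-ahead point W + mu * v.
W1_la, W2_la = W1 + mu * v1, W2 + mu * v2

# Forward pass at the look-ahead weights.
a = W1_la @ x                         # hidden pre-activations
h = np.maximum(a, 0.0)                # ReLU
y = W2_la @ h                         # network output
loss = 0.5 * np.sum((y - t) ** 2)     # squared-error loss

# Backpropagation: push the loss gradient from the output back to the weights.
dy = y - t                            # dL/dy
dW2 = np.outer(dy, h)                 # dL/dW2
dh = W2_la.T @ dy                     # dL/dh
da = dh * (a > 0)                     # through the ReLU
dW1 = np.outer(da, x)                 # dL/dW1

# Momentum update followed by the weight step.
v1 = mu * v1 - lr * dW1
v2 = mu * v2 - lr * dW2
W1 += v1
W2 += v2
print(f"loss before the update: {loss:.4f}")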

Syllabus of tutorials:

Two types of labs (tutorials) alternate throughout the course:

• practical labs devoted to homework assignments, in which students implement selected methods from the course and experiment with them

• theoretical labs in which students will discuss solutions of theoretical assignments (made available before the class).

Study objectives:

The course aims to provide the algorithmic and theoretical concepts needed for successfully designing and training neural networks. At the same time, it strives to provide technical and practical skills in this domain.

Study materials:

I. Goodfellow, Y. Bengio and A. Courville, Deep Learning, MIT Press, 2016

Note:
Further information:
https://cw.fel.cvut.cz/wiki/courses/bev033dle/start
Timetable for the winter semester 2023/2024:
The timetable has not been prepared
Timetable for the summer semester 2023/2024:
Wed  11:00–12:30  (lecture parallel 1), Flach B., Shekhovtsov O., room KN:E-301, Karlovo nám., Šrámkova posluchárna K9
Thu  09:15–10:45  (lecture parallel 1, parallel 101), Flach B., Shekhovtsov O., room KN:E-126, Karlovo nám., Trnkova posluchárna K5
Thu  11:00–12:30  (lecture parallel 1, parallel 102), Flach B., Shekhovtsov O., room KN:E-126, Karlovo nám., Trnkova posluchárna K5
Thu  11:00–12:30  (lecture parallel 1, parallel 103), room KN:E-126, Karlovo nám., Trnkova posluchárna K5
The course is part of the following study plans:
Data valid as of 16 March 2024
For updates to the above information, see https://bilakniha.cvut.cz/cs/predmet6170206.html