CZECH TECHNICAL UNIVERSITY IN PRAGUE
STUDY PLANS
2025/2026

Neural Networks, Machine Learning, and Randomness

Code: 01NSN
Completion: Z,ZK
Credits: 2
Range: 1P+1C
Language: Czech
Course guarantor:
Martin Holeňa
Lecturer:
Martin Holeňa
Tutor:
Martin Holeňa
Supervisor:
Department of Mathematics
Synopsis:

The remarkable rise of artificial intelligence is largely due to generative systems built using modern machine learning methods, especially advanced variants of large neural networks. Stochastic methods, which rely on randomness, play a crucial role in constructing and training these networks and other machine learning models. Although students are introduced to probability and statistics in other courses, this course offers a systematic explanation of how stochastic methods relate to training neural networks and machine learning models. It explores various types of neural networks that fundamentally depend on randomness, as well as specific stochastic techniques used in their training. In the final topics, the course presents a general stochastic approach to training neural networks and shows how machine learning models, including neural networks, are used in one of the most important applications of randomness: stochastic optimization methods, such as evolutionary algorithms.

Requirements:
Syllabus of lectures:

1. Recalling what you already know:

Neural networks, signal transmission, network architecture, model training, feature selection, model evaluation, interpretability, supervised / unsupervised / reinforcement learning, probability distributions.

2. Neural networks based on randomness:

Extreme Learning Machines (ELM), randomized convolutional networks, Echo State Networks (ESN), Bayesian Neural Networks (BNN).
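
To make the first of these concrete: in an Extreme Learning Machine, the hidden layer is generated at random and never trained, and only the linear output layer is fitted by least squares. Below is a minimal NumPy sketch on a made-up regression task; all names, sizes, and constants are illustrative, not taken from the course materials.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression task: learn y = sin(x) from noisy samples.
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

    # ELM: random, fixed input-to-hidden weights; only the output layer is fitted.
    n_hidden = 50
    W = rng.standard_normal((1, n_hidden))   # random input weights (never trained)
    b = rng.standard_normal(n_hidden)        # random biases (never trained)

    H = np.tanh(X @ W + b)                   # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # least-squares output weights

    y_hat = H @ beta
    print("train MSE:", np.mean((y - y_hat) ** 2))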

3. Stochastic methods for neural networks:

Dropout techniques (Bernoulli, Gaussian), stochastic gradient descent (SGD), posterior probability approximations.
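
Both Bernoulli dropout and SGD are available off the shelf in PyTorch, one of the platforms used in the tutorials. The sketch below wires them into a toy training loop; the architecture, data, and hyperparameters are arbitrary placeholders.

    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Toy data: noisy linear target.
    X = torch.randn(256, 10)
    y = X @ torch.randn(10, 1) + 0.1 * torch.randn(256, 1)

    # MLP with Bernoulli dropout: each hidden unit is zeroed with probability
    # p=0.5 during training (survivors rescaled by 1/(1-p)); at evaluation
    # time dropout is a no-op.
    model = nn.Sequential(
        nn.Linear(10, 64),
        nn.ReLU(),
        nn.Dropout(p=0.5),   # Bernoulli dropout
        nn.Linear(64, 1),
    )
    opt = torch.optim.SGD(model.parameters(), lr=0.01)  # plain SGD
    loss_fn = nn.MSELoss()

    model.train()
    for step in range(200):
        idx = torch.randint(0, 256, (32,))  # random mini-batch: the "stochastic" in SGD
        loss = loss_fn(model(X[idx]), y[idx])
        opt.zero_grad()
        loss.backward()
        opt.step()

    model.eval()  # disables dropout for prediction
    print("final training loss:", loss.item())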

4. Stochastic methods for machine learning:

Observable vs. latent variables, MCMC (Metropolis-Hastings), variational inference (VI), deep Kalman filters.
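
As an illustration of the MCMC topic, here is a minimal random-walk Metropolis-Hastings sampler in NumPy for a one-dimensional target density known only up to its normalizing constant; the target and step size are chosen arbitrarily for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    def log_target(x):
        # Unnormalized log-density: a two-component Gaussian mixture.
        return np.logaddexp(-0.5 * (x - 2) ** 2, -0.5 * (x + 2) ** 2)

    x = 0.0
    samples = []
    for _ in range(10_000):
        proposal = x + 0.8 * rng.standard_normal()        # symmetric random-walk proposal
        log_alpha = log_target(proposal) - log_target(x)  # acceptance ratio (log scale)
        if np.log(rng.uniform()) < log_alpha:
            x = proposal                                  # accept; otherwise keep x
        samples.append(x)

    # Sample mean after burn-in (close to 0 for this symmetric target).
    print("estimated mean:", np.mean(samples[1000:]))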

5. General stochastic approach to neural networks:

Learning via expected values vs. random sampling, law of large numbers, central limit theorem, hypothesis testing, network pruning.
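
The connection between expected values and random sampling is the law of large numbers in action: an expectation that is hard to compute exactly is replaced by an average over random samples, which converges as the sample size grows. A small NumPy illustration:

    import numpy as np

    rng = np.random.default_rng(0)

    # E[f(X)] for X ~ N(0, 1) with f(x) = x^2 has the exact value 1 (the variance).
    # A Monte Carlo estimate averages f over random samples; by the law of large
    # numbers it converges to the expectation as the sample size grows.
    for n in (10, 1_000, 100_000):
        x = rng.standard_normal(n)
        print(f"n={n:>7}: estimate of E[X^2] = {np.mean(x ** 2):.4f} (exact: 1.0)")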

6. Machine learning and neural networks in stochastic optimization:

CMA-ES algorithm, surrogate modeling for expensive black-box functions, model selection strategies, surrogate models using neural networks, Gaussian processes, random forests, ordinal regression.
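
Full CMA-ES adapts the covariance matrix and step size of its sampling distribution, which does not fit in a few lines; the sketch below shows only the underlying (mu, lambda) evolution-strategy loop with a fixed isotropic Gaussian mutation, on a toy objective standing in for an expensive black-box function. All constants are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    def objective(x):
        # Toy black-box objective (sphere function); in practice it is expensive,
        # which motivates surrogate models that approximate it cheaply.
        return np.sum(x ** 2)

    dim, lam, mu, sigma = 5, 20, 5, 0.5
    mean = rng.uniform(-5, 5, dim)  # mean of the initial search distribution

    for gen in range(50):
        # Sample lambda offspring from an isotropic Gaussian around the mean.
        # (CMA-ES additionally adapts a full covariance matrix and the step size.)
        pop = mean + sigma * rng.standard_normal((lam, dim))
        fitness = np.array([objective(x) for x in pop])
        elite = pop[np.argsort(fitness)[:mu]]  # select the mu best offspring
        mean = elite.mean(axis=0)              # recombine into the new mean

    print("best objective after 50 generations:", objective(mean))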

Syllabus of tutorials:

1. Basics of machine learning in Python using NumPy, Pandas, Seaborn, PyTorch

2. Automatic gradient computation, simple MLP architectures

3. Supervised, unsupervised, reinforcement, and self-supervised learning

4. Linear and logistic regression, cost functions, model evaluation; the perceptron learning algorithm

5. Train/test splits, overfitting, bias-variance trade-off, regularization, early stopping, dropout, batching, cross-validation, double descent

6. Bayes' theorem, maximum likelihood vs. maximum a posteriori estimation. Naive Bayes vs. k-Nearest Neighbors classifiers (compared in the sketch below)
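
To make the last tutorial topic concrete, here is a minimal comparison of the two classifiers on a standard toy dataset; scikit-learn is used purely for brevity and is an assumption on top of the libraries listed above.

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier

    # Toy benchmark: both classifiers on the Iris dataset.
    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    for clf in (GaussianNB(), KNeighborsClassifier(n_neighbors=5)):
        clf.fit(X_tr, y_tr)
        print(type(clf).__name__, "test accuracy:", clf.score(X_te, y_te))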

Study Objective:

Knowledge:

Understand the role of randomness in neural networks and machine learning, and the application of both in stochastic optimization methods such as evolutionary algorithms.

Skills:

Implement the stochastic techniques covered in the course using one of three major platforms: MATLAB's Deep Learning Toolbox, PyTorch, or TensorFlow.

Study materials:

Required:

Presentations on the lecturer's website

Recommended:

I. Goodfellow, Y. Bengio, A. Courville. Deep Learning. MIT Press, Cambridge, MA, 2016.

Z.-H. Zhou. Machine Learning. Springer Nature, Singapore, 2021.

Note:
Data valid to 2025-11-08
For updated information see http://bilakniha.cvut.cz/en/predmet8381706.html