CZECH TECHNICAL UNIVERSITY IN PRAGUE
STUDY PLANS
2025/2026

Theory of Neural Networks

The course is not on the list of offered courses.
Code: NI-TNN
Completion: Z,ZK (credit and exam)
Credits: 5
Range: 2P+1C (2 hours of lectures + 1 hour of tutorials per week)
Language: Czech
Course guarantor:
Lecturer:
Tutor:
Supervisor: Department of Applied Mathematics
Synopsis:

Artificial neural networks are now the foundation of artificial intelligence and the fastest-growing area of machine learning. This course introduces their theoretical foundations. It begins with general concepts: structure, active dynamics, and adaptive dynamics (i.e., learning). It then covers the theoretical basis of the most common types of artificial neural networks, from the perceptron of the 1950s to the transformer of 2017. Finally, using function approximation theory, it rigorously establishes the most important theoretical result: the universal approximation capability of neural networks.

Requirements:

Probability and linear algebra knowledge at the bachelor level.

Syllabus of lectures:

1. Basic Concepts of Artificial Neural Networks

Neurons, connections, input/output/hidden neurons, network topology. Neuron activity and its evolution over time. Synaptic and somatic operations, activation functions. Global and local active dynamics, finite-dimensional parametrization. Adaptive dynamics (learning), supervised learning, reinforcement learning. Optimization algorithms and objective functions.
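The synaptic operation (weighted sum) and somatic operation (activation function) described above can be sketched for a single neuron. The weights, bias, and tanh activation below are illustrative choices, not values fixed by the course:

```python
import numpy as np

def neuron(x, w, b, activation=np.tanh):
    """A single artificial neuron: the synaptic operation forms the
    weighted sum w.x + b; the somatic operation applies the
    activation function to that sum."""
    return activation(np.dot(w, x) + b)

# Illustrative two-input neuron (weights and bias are arbitrary).
x = np.array([1.0, -0.5])
w = np.array([0.8, 0.3])
out = neuron(x, w, b=0.1)   # tanh(0.8*1.0 + 0.3*(-0.5) + 0.1) = tanh(0.75)
```

A network's global active dynamics then consist of composing many such local maps according to the topology.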

2. Most Common Types of Artificial Neural Networks

McCulloch-Pitts binary element as a precursor to the perceptron. Perceptron and its learning algorithm. Associative memory and its learning algorithm, linear associative memory. Hopfield network. Multilayer perceptron and backpropagation algorithm. Recurrent networks, LSTM. Convolutional neural networks. Autoencoders, denoising autoencoders. Transformer.
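Of the models listed, the perceptron has the simplest learning algorithm, sketched here on toy data for logical AND. The labels in {-1, +1}, the learning rate, and the epoch count are illustrative assumptions:

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=1.0):
    """Rosenblatt's perceptron learning rule: whenever a sample is
    misclassified, move the weight vector toward its correct side.
    Labels y are expected in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (np.dot(w, xi) + b) <= 0:   # misclassified (or on the boundary)
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Linearly separable toy data: logical AND with +/-1 labels.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = perceptron_train(X, y)
preds = np.sign(X @ w + b)
```

Since the data are linearly separable, the perceptron convergence theorem guarantees the loop stops making updates after finitely many passes.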

3. Neural Networks from the Perspective of Function Approximation Theory

Connection to expressing multivariable functions using fewer variables, Hilbert's 13th problem, the Kolmogorov-Arnold theorem, Vitushkin's theorem. Overview of Banach function spaces: integrable functions, continuous functions, Sobolev spaces, functions with continuous derivatives. Universal approximation as the density of the set of functions computed by multilayer perceptrons in these function spaces. Specific universal approximation results with sigmoid activation functions and their relationship to the Kolmogorov-Arnold theorem.
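Universal approximation is a density statement, not an algorithm, but its flavor can be illustrated numerically: fix a single hidden layer of sigmoid units with random weights and fit only the output layer by least squares; adding hidden units drives the error down. The target function, unit count, and weight scales below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Target: a continuous function on [0, 1].
f = lambda x: np.sin(2 * np.pi * x)

# Random sigmoid hidden layer; only the output weights c are fitted.
x = np.linspace(0, 1, 200)
n_hidden = 50
W = rng.normal(scale=10.0, size=n_hidden)       # hidden weights (arbitrary scale)
b = rng.uniform(-10, 10, size=n_hidden)         # hidden biases
H = sigmoid(np.outer(x, W) + b)                 # hidden activations, shape (200, 50)
c, *_ = np.linalg.lstsq(H, f(x), rcond=None)    # least-squares output layer
approx = H @ c
max_err = np.max(np.abs(approx - f(x)))
```

The uniform error `max_err` is small already at 50 units; the theorem asserts that, with enough units, it can be made arbitrarily small for any continuous target on a compact set.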

Syllabus of tutorials:

1. Decision trees, random forests: construction, splitting criteria, implementation

2. Principal Component Analysis (PCA), dimensionality reduction, mathematical concepts (SVD), data visualization

3. Neural networks: architecture, gradient descent, activation functions, hyperparameter selection

4. Optimization algorithms: SGD, Momentum, Nesterov, Adagrad, Adadelta, Adam, AdamW

5. Advanced architectures: recurrent networks, transformers

6. Advanced architectures: graph neural networks
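The optimizers in tutorial 4 all refine the basic gradient step; as one example, a single Adam update can be written out explicitly. The hyperparameters below are the commonly used defaults, and the quadratic objective is a toy choice for illustration:

```python
import numpy as np

def adam_step(theta, grad, state, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m)
    and its elementwise square (v), with bias correction, scale the step."""
    m, v, t = state
    t += 1
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)          # bias-corrected first moment
    v_hat = v / (1 - beta2**t)          # bias-corrected second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, (m, v, t)

# Toy objective f(x) = x^2, minimized from x = 1.
theta = np.array([1.0])
state = (np.zeros(1), np.zeros(1), 0)
for _ in range(500):
    grad = 2 * theta                    # gradient of x^2
    theta, state = adam_step(theta, grad, state)
```

Momentum, Nesterov, Adagrad, Adadelta, and AdamW differ only in how `m`, `v`, and the weight-decay term enter this update.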

Study Objective:

Knowledge:

Understanding of the basic concepts and most common types of artificial neural networks, and mathematical insight into their universal approximation capability.

Skills:

Ability to implement simple neural networks using one of three major platforms: MATLAB Deep Learning Toolbox, PyTorch, or TensorFlow.

Study materials:

Required:

Lecturer's presentations, available on the course website

Recommended:

C. M. Bishop. Neural Networks for Pattern Recognition. Clarendon Press, Oxford.

I. Goodfellow, Y. Bengio, A. Courville. Deep Learning. MIT Press, Cambridge, MA.

Note:

The course is taught in Czech. Additional course materials are available at https://courses.fit.cvut.cz/NI-TNN

Further information:
https://courses.fit.cvut.cz/NI-TNN
No time-table has been prepared for this course
The course is a part of the following study plans:
Data valid to 2025-09-18
For updated information see http://bilakniha.cvut.cz/en/predmet6141506.html