Neural Networks 2
| Code | Completion | Credits | Range | Language |
|---|---|---|---|---|
| 18NES2 | KZ (graded assessment) | 3 | 0P+2C (0 lecture + 2 tutorial hours/week) | Czech |
- Course guarantor:
- Lecturer:
- Zuzana Petříčková
- Tutor:
- Zuzana Petříčková
- Supervisor:
- Department of Software Engineering
- Synopsis:
-
The aim of the course "Neural Networks 2" is to acquaint students with basic models of deep neural networks and teach them how to apply these models and methods to solve practical tasks.
- Requirements:
-
Course Requirements
1) Regular (active) participation in exercises. In justified cases (e.g., timetable conflicts), independent preparation may be permitted upon prior agreement with the instructor.
2) Completion of a project and its oral presentation during an exercise session.
Mandatory Deadlines
1) The student shall select a project topic and obtain the instructor's approval no later than 25 November 2025.
2) The student shall submit and present the project in front of other students during an exercise session on a date arranged in advance, but no later than the week commencing 15 December 2025.
3) Should the project not be completed on time, the student shall present its current state during the exercise session. The final version must subsequently be submitted and defended during an individual consultation no later than 4 September 2026.
Assessment Criteria
The final grade will be determined on the basis of:
1) the quality of the completed project and adherence to deadlines,
2) the degree of active participation in exercises (regular attendance, attention to explanations, engagement in problem-solving, experimentation, asking questions, etc.).
- Syllabus of lectures:
- Syllabus of tutorials:
-
The exercises will focus on experimenting with various deep learning models using popular frameworks (such as Keras, TensorFlow, or PyTorch) on practical tasks (processing of image and sequential data, object detection, segmentation, etc.). Students will gain experience in analyzing results and learn about practical aspects of model implementation and tuning, which will help them better understand deep learning. (A minimal illustrative code sketch follows the syllabus below.)
1. Introduction to Deep Learning: History and Basic Concepts. Existing Frameworks for Deep Learning. Basic Work with Keras, TensorFlow or PyTorch. Creating a Simple Neural Network with Numerical Data.
2. Deep Neural Networks: Architectures and Activation Functions. Implementing and Training a Deep Neural Network on the MNIST Dataset.
3. Introduction to Solving Basic Types of Tasks (Classification, Regression, Time Series Prediction). Specifics of Each Type of Task.
4. Convolutional Neural Networks: Basics and Principles. Classification Tasks. Architectures of Convolutional Neural Networks.
5. Deep Learning and Data. Acquisition, Preparation, and Processing of Data. Normalization and Standardization. Data Augmentation.
6. Algorithms for Deep Neural Network Training, Hyperparameter Optimization and Tuning (Grid Search, Random Search, Bayesian Optimization), Regularization Techniques for Deep Neural Networks.
7. Pre-training and Fine-tuning of Deep Neural Networks. Transfer Learning.
8. Recurrent Neural Networks and Sequential Data Processing.
9. Architectures of Recurrent Neural Networks.
10-11. Convolutional Network Architectures for Object Detection and Segmentation.
12. Autoencoders: Principles and Applications (Denoising, Dimensionality Reduction).
13. Introduction to Other Neural Network Models (Generative Models, Transformers, Reinforcement Learning).
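The sketch below illustrates the kind of exercise described in tutorial items 1 and 2: loading MNIST, normalizing the inputs, and training a small feedforward network in Keras. The specific architecture, optimizer, and hyperparameters are illustrative assumptions for this sketch, not prescribed course material.

```python
# Minimal sketch (assumed setup): a small dense network trained on MNIST with Keras.
from tensorflow import keras
from tensorflow.keras import layers

# Load MNIST and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.astype("float32") / 255.0
x_test = x_test.astype("float32") / 255.0

# A small feedforward network for 10-class digit classification
# (layer sizes chosen only for illustration).
model = keras.Sequential([
    keras.Input(shape=(28, 28)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

# Train briefly with a validation split, then report test accuracy.
model.fit(x_train, y_train, epochs=5, batch_size=128, validation_split=0.1)
test_loss, test_acc = model.evaluate(x_test, y_test, verbose=0)
print(f"Test accuracy: {test_acc:.3f}")
```

Analogous workflows with convolutional or recurrent layers, data augmentation, and pretrained models correspond to the later tutorial topics.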
- Study Objective:
-
Students will become familiar with various basic models of deep neural networks (including feedforward networks, convolutional neural networks, recurrent neural networks, and selected generative models). They will learn how to implement and apply the discussed models and methods to solve practical tasks.
- Study materials:
-
[1] F. Chollet, M. Watson: Deep Learning with Python, Second Edition, 2021 (Third Edition, 2025).
[2] M. Nielsen: Neural Networks and Deep Learning, 2019.
[3] A. Kapoor, A. Gulli, S. Pal: Deep Learning with TensorFlow and Keras, Third Edition, 2022.
[4] I. Goodfellow, Y. Bengio, A. Courville: Deep Learning, MIT Press, 2016.
[5] C. C. Aggarwal: Neural Networks and Deep Learning: A Textbook, Springer, 2018.
[6] I. Vasilev, D. Slater: Python Deep Learning, Packt Publishing, 2019.
[7] A. W. Trask: Grokking Deep Learning, Manning Publications, 2019.
- Note:
- Further information:
- http://zuzka.petricek.net/
- Time-table for winter semester 2025/2026:
- Time-table is not available yet
- Time-table for summer semester 2025/2026:
- Time-table is not available yet
- The course is a part of the following study plans:
-
- Aplikace informatiky v přírodních vědách (elective course)
- Aplikované matematicko-stochastické metody (elective course)
- Kvantové technologie (elective course)