Neural Networks 1
| Code | Completion | Credits | Range | Language |
|---|---|---|---|---|
| 818NEUS1 | Z | 3 | 1+1 | Czech |
- Lecturer:
- Tutor:
- Supervisor:
- Synopsis:
Mathematical analysis, model theory, and biological context are used to construct simple models of neural structures. These models can learn from pattern sets, and their structures and parameters are subject to optimization.
- Requirements:
Basic knowledge of linear algebra.
- Syllabus of lectures:
1. Biological neural networks and their models.
2. Artificial neural networks, basic terms.
3. ANN topology, acyclic and hierarchical networks.
4. Bipolar perceptron as a switching element.
5. Logical functions as perceptron networks.
6. Hebbian learning, LSQ learning, pseudoinversion, OLAM.
7. Robust learning principles, pruning.
8. Rosenblatt learning, Widrow delta learning (a learning-rule sketch follows this list).
9. Non-linear preprocessing and Cover's theorem.
10. Smooth perceptron, delta rule, stochastic gradient method.
11. Multi-layer perceptron, universal approximation, backpropagation.
12. RBF (radial basis function) networks: structure, learning.
13. Hamming network, associative memories.
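As a purely illustrative aid for item 8 above, the following minimal sketch shows Rosenblatt's perceptron learning rule for a single bipolar perceptron; the NumPy implementation, the learning-rate value, and the toy OR data set are assumptions of this sketch, not material taken from the course.

```python
import numpy as np

def rosenblatt_train(X, t, lr=0.1, epochs=100):
    """Train a single bipolar perceptron y = sign(w.x + b) with Rosenblatt's rule."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x, target in zip(X, t):
            y = 1 if np.dot(w, x) + b >= 0 else -1
            if y != target:                 # update only on a misclassified pattern
                w += lr * target * x
                b += lr * target
                errors += 1
        if errors == 0:                     # stop once the training set is separated
            break
    return w, b

# Hypothetical example: learning the linearly separable OR function in bipolar coding.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
t = np.array([-1, 1, 1, 1])
w, b = rosenblatt_train(X, t)
print(w, b)
```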
- Syllabus of tutorials:
1. Biological neural networks and their models.
2. Artificial neural networks, basic terms.
3. ANN topology, acyclic and hierarchical networks.
4. Bipolar perceptron as a switching element.
5. Logical functions as perceptron networks.
6. Hebbian learning, LSQ learning, pseudoinversion, OLAM (an associative-memory sketch follows this list).
7. Robust learning principles, pruning.
8. Rosenblatt learning, Widrow delta learning.
9. Non-linear preprocessing and Cover's theorem.
10. Smooth perceptron, delta rule, stochastic gradient method.
11. Multi-layer perceptron, universal approximation, backpropagation.
12. RBF (radial basis function) networks: structure, learning.
13. Hamming network, associative memories.
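As a companion to items 6 and 13 above, this minimal sketch shows an optimal linear associative memory (OLAM) whose weight matrix is obtained from the Moore-Penrose pseudoinverse of the stored input patterns; the pattern pairs and the NumPy code are hypothetical examples, not course material.

```python
import numpy as np

def olam_weights(X, Y):
    """X: inputs as columns (n_in, n_patterns); Y: targets as columns (n_out, n_patterns).
    Returns the weight matrix W minimising ||W @ X - Y|| via the pseudoinverse of X."""
    return Y @ np.linalg.pinv(X)

# Hypothetical pattern pairs: three bipolar key vectors mapped to two-dimensional values.
X = np.array([[ 1, -1,  1],
              [ 1,  1, -1],
              [-1,  1,  1]], dtype=float)
Y = np.array([[ 1,  1, -1],
              [-1,  1,  1]], dtype=float)

W = olam_weights(X, Y)
print(np.sign(W @ X))   # recalls the stored output for each key column
```

Because the key vectors in this example are linearly independent, W @ X reproduces the stored outputs exactly; for linearly dependent keys the pseudoinverse yields the least-squares recall instead.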
- Study Objective:
Knowledge:
Elements of artificial neural networks.
Abilities:
Representation of logical functions as perceptron networks (as sketched below); use of algorithms for computing the weights of perceptron networks and associative memories.
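To illustrate the first of these abilities, the following minimal sketch represents a logical function (XOR) as a small network of threshold perceptrons with hand-picked weights; the specific weights and the Python code are illustrative assumptions, not part of the course materials.

```python
import numpy as np

def step(s):
    """Heaviside threshold unit."""
    return (np.asarray(s) >= 0).astype(int)

def xor_network(x):
    """XOR(x1, x2) as a two-layer network of threshold perceptrons."""
    x = np.asarray(x, dtype=float)
    # Hidden layer: h1 computes OR(x1, x2), h2 computes NAND(x1, x2).
    W_h = np.array([[ 1.0,  1.0],
                    [-1.0, -1.0]])
    b_h = np.array([-0.5, 1.5])
    h = step(W_h @ x + b_h)
    # Output layer: AND(h1, h2), which equals XOR of the original inputs.
    w_o = np.array([1.0, 1.0])
    b_o = -1.5
    return int(step(w_o @ h + b_o))

for pattern in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(pattern, xor_network(pattern))   # prints 0, 1, 1, 0
```

The hidden units realise OR and NAND and the output unit realises AND of the two, which is one standard way of composing XOR from linear threshold units.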
- Study materials:
Compulsory literature:
[1] J. Šíma, R. Neruda: Teoretické otázky neuronových sítí, Matfyzpress, Praha, 1996.
[2] M. Šnorek: Neuronové sítě a neuropočítače, ČVUT, Praha, 2002.
Recommended literature:
[3] S. Haykin: Neural Networks, Macmillan, New York, 1994.
[4] L.V. Fausett: Fundamentals of Neural Networks: Architectures, Algorithms and Applications, Prentice Hall, New Jersey, 1994.
- Note:
- Time-table for winter semester 2011/2012:
- Time-table is not available yet
- Time-table for summer semester 2011/2012:
- Time-table is not available yet
- The course is a part of the following study plans: