Information Theory
| Code | Completion | Credits | Range | Language |
|---|---|---|---|---|
| QNI-TIN | Z,ZK | 6 | 2P+2C | Czech |
- Course guarantor:
- Pavel Hrabák
- Lecturer:
- Pavel Hrabák
- Tutor:
- Pavel Hrabák
- Supervisor:
- Department of Applied Mathematics
- Synopsis:

The course focuses on the mathematical description of a random message source, its coding, and its transmission through a noisy channel. The coding problem is treated probabilistically, with emphasis on the relation between the mean length of an optimal code and the entropy and entropy rate of the random source. For the noisy channel, the focus is on the set of typical sequences and their coding by error-correcting codes. The course also reviews prerequisite concepts such as conditional distributions, goodness-of-fit and independence tests, and gives an introduction to Markov chains.
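To make the relation between entropy and optimal code length concrete, here is a minimal sketch (not part of the official course materials; the source distribution and the `huffman` helper are made up for illustration). It builds a binary Huffman code for a small memoryless source and compares the mean codeword length L with the entropy H; the source coding theorem gives H ≤ L < H + 1 for an optimal code.

```python
import heapq
from math import log2

def huffman(probs):
    """Build a binary Huffman code; returns {symbol: codeword}."""
    # Heap entries are (probability, tiebreak, {symbol: suffix so far});
    # the tiebreak integer keeps the dicts from ever being compared.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        # Merge the two least probable subtrees, prefixing 0 and 1.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # made-up distribution
code = huffman(source)
H = -sum(p * log2(p) for p in source.values())           # source entropy [bits]
L = sum(p * len(code[s]) for s, p in source.items())     # mean codeword length
print(code)
print(f"H = {H:.3f} bits, L = {L:.3f} bits")
```

For dyadic probabilities, as here, the Huffman code is exactly optimal and L = H = 1.75 bits; in general L may exceed H by up to one bit per symbol.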
- Requirements:

Familiarity with the basic concepts of probability and statistics, to the extent of the course BIE-PST at FIT CTU, is required.
- Syllabus of lectures:

1. Review of probability, random variables and their distributions.
2. Random vectors, multivariate distribution, conditional distribution.
3. Characteristics of random vectors.
4. Distances between distributions, relation to goodness-of-fit tests, tests of independence.
5. Hypothesis testing, tests of independence.
6. Message source, entropy, conditional entropy, relative entropy, mutual information.
7. Differential entropy, maximum entropy principle.
8. Data compression, instantaneous and uniquely decodable codes, Huffman coding, relation to source entropy.
9. Introduction to Markov chain theory.
10. Entropy and coding of Markov sources.
11. Information channels, channel capacity (see the sketch after this list).
12. Transmission of a source through the information channel, typical sequences.
13. Simulation of a random message source.
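As a small numeric companion to lectures 11 and 12, the sketch below evaluates the capacity of a binary symmetric channel, C = 1 − H_b(p) bits per use, for a few crossover probabilities p (the values of p are chosen arbitrarily; `binary_entropy` and `bsc_capacity` are hypothetical helpers written for this example, not course code).

```python
from math import log2

def binary_entropy(p):
    """H_b(p) = -p*log2(p) - (1-p)*log2(1-p), with H_b(0) = H_b(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.05, 0.1, 0.5):
    print(f"p = {p:4.2f}: C = {bsc_capacity(p):.3f} bits per channel use")
```

At p = 0.5 the capacity drops to zero: the channel output is independent of the input, so no information can be transmitted.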
- Syllabus of tutorials:

1. Review of probability, random variables and their distributions.
2. Random vectors, multivariate distribution.
3. Characteristics of random vectors, conditional distribution.
4. Distances between distributions, goodness-of-fit tests.
5. Tests of independence.
6. Message source, entropy, conditional entropy, relative entropy, mutual information.
7. Maximum entropy principle.
8. Huffman coding, relation to source entropy.
9. Markov chains, Markov property.
10. Entropy and coding of Markov sources.
11. Information channels, channel capacity.
12. Coding of typical messages, Hamming codes.
13. Simulation of a random message source (see the sketch after this list).
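Tutorial items 9, 10 and 13 fit together in the following sketch (illustrative only; the two-state transition matrix is made up): it simulates a Markov message source and compares the entropy rate estimated from the generated sample with the theoretical value H = Σᵢ πᵢ Σⱼ Pᵢⱼ log₂(1/Pᵢⱼ), where π is the stationary distribution.

```python
import random
from math import log2

P = [[0.9, 0.1],  # made-up transition matrix of a two-state source
     [0.4, 0.6]]

def simulate(P, n, state=0):
    """Generate n symbols from the Markov chain with transition matrix P."""
    out = []
    for _ in range(n):
        out.append(state)
        state = 0 if random.random() < P[state][0] else 1
    return out

def entropy_rate(P, pi):
    """H = sum_i pi_i * sum_j P_ij * log2(1/P_ij), in bits per symbol."""
    return sum(pi[i] * sum(-p * log2(p) for p in row if p > 0)
               for i, row in enumerate(P))

# Stationary distribution of a two-state chain: pi_0 = P10 / (P01 + P10).
pi = [P[1][0] / (P[0][1] + P[1][0]), P[0][1] / (P[0][1] + P[1][0])]

sample = simulate(P, 100_000)
counts = [[0, 0], [0, 0]]                  # empirical transition counts
for a, b in zip(sample, sample[1:]):
    counts[a][b] += 1
P_hat = [[c / sum(row) for c in row] for row in counts]
pi_hat = [sum(row) / (len(sample) - 1) for row in counts]

print(f"theoretical H = {entropy_rate(P, pi):.4f} bits/symbol")
print(f"estimated   H = {entropy_rate(P_hat, pi_hat):.4f} bits/symbol")
```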
- Study Objective:

The course focuses on the mathematical description of a random message source, its coding, and its transmission through a noisy channel. The coding problem is treated probabilistically, with emphasis on the relation between the mean length of an optimal code and the entropy and entropy rate of the random source. For the noisy channel, the focus is on the set of typical sequences and their coding by error-correcting codes. The course also reviews prerequisite concepts such as conditional distributions, goodness-of-fit and independence tests, and gives an introduction to Markov chains.
- Study materials:

1. Cover, T. M., Thomas, J. A.: Elements of Information Theory, 2nd Edition. Wiley-Interscience, 2006. ISBN 9780471241959.
2. Johnson, J. L.: Probability and Statistics for Computer Science. Wiley-Interscience, 2008. ISBN 9780471326724.
3. Wilde, M. M.: Quantum Information Theory. Cambridge University Press, 2013. ISBN 9781316809976.
- Note:
Information about the course and teaching materials can be found at https://courses.fit.cvut.cz/QNI-TIN.
- Further information:
- https://courses.fit.cvut.cz/QNI-TIN
- No timetable has been prepared for this course
- The course is part of the following study plans:
- Quantum Informatics (compulsory course in the program)