Applied Information Theory
| Code | Completion | Credits | Range | Language |
|---|---|---|---|---|
| 01ATI | ZK | 3 | 2+0 | Czech |
- Course guarantor:
- Lecturer:
- Tutor:
- Supervisor: Department of Mathematics
- Synopsis:
Information theory explores the fundamental limits of representing and transmitting information. We will focus on the definition and implications of (information) entropy, the source coding theorem, and the channel coding theorem. These concepts provide a vital background for researchers in data compression, signal processing, control, and pattern recognition. Emphasis will be placed on solving concrete problems.
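For orientation, the central quantities of the course can be stated in a line; this is the standard textbook formulation, added here as an illustration and not part of the original annotation:

```latex
% Entropy of a discrete random variable X with pmf p(x),
% conditional entropy given Y, and mutual information:
H(X) = -\sum_{x} p(x)\,\log_2 p(x), \qquad
H(X \mid Y) = -\sum_{x,y} p(x,y)\,\log_2 p(x \mid y), \qquad
I(X;Y) = H(X) - H(X \mid Y).
```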
- Requirements:
A basic course in calculus and probability (to the extent of the courses 01MAB3, 01MAB4, and 01PRST taught at FNSPE CTU in Prague).
- Syllabus of lectures:
1. Definition of entropy, conditional entropy and mutual information
2. Source coding and data compression, Huffman codes
3. Information channels and their capacities (the capacity definition is shown after this list)
4. Information processing and preservation theorems
5. Fisher information
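For item 3, the standard definition of channel capacity (a textbook formulation added here for illustration) is the maximum of the mutual information over all input distributions; for the binary symmetric channel with crossover probability ε it reduces to a closed form:

```latex
% Capacity of a discrete memoryless channel, and its value for the
% binary symmetric channel (BSC) with crossover probability \varepsilon:
C = \max_{p(x)} I(X;Y), \qquad
C_{\mathrm{BSC}} = 1 - H(\varepsilon)
  = 1 + \varepsilon\log_2\varepsilon + (1-\varepsilon)\log_2(1-\varepsilon).
```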
- Syllabus of tutorials:
- Study Objective:
Knowledge:
Basic notions and principles of information theory.
Skills:
Application of the acquired knowledge to the solution of practical problems, such as finding optimal Huffman codes, computing the stationary distribution of a Markov chain, and computing the capacity of an information channel.
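As a small illustration of the kind of problems listed above, the sketch below (an illustrative example, not part of the official course materials; the symbol probabilities and the crossover probability are assumed values) builds an optimal Huffman code for a four-symbol source and evaluates the capacity C = 1 - H(ε) of a binary symmetric channel.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) in bits of a probability vector."""
    return -sum(p * log2(p) for p in probs if p > 0)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability eps,
    C = 1 - H(eps)."""
    return 1.0 - entropy([eps, 1.0 - eps])

def huffman_code(probs):
    """Optimal binary prefix (Huffman) code for symbols 0..n-1 with the
    given probabilities; returns a dict mapping symbol -> codeword."""
    # Heap entries: (probability, tie-breaker, {symbol: codeword-so-far}).
    heap = [(p, i, {i: ""}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    counter = len(probs)  # unique tie-breaker so dicts are never compared
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least probable groups
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

if __name__ == "__main__":
    p = [0.5, 0.25, 0.125, 0.125]          # assumed source distribution
    code = huffman_code(p)
    avg_len = sum(p[s] * len(w) for s, w in code.items())
    print("H(X)           =", entropy(p), "bits")   # 1.75
    print("Huffman code   =", code)
    print("average length =", avg_len)              # 1.75
    print("BSC capacity   =", bsc_capacity(0.1))    # eps = 0.1
```

Because the chosen probabilities are dyadic, the average codeword length equals the entropy exactly; in general the source coding theorem only guarantees H(X) ≤ L < H(X) + 1 for an optimal code.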
- Study materials:
Key references:
[1] T. Cover and J. Thomas: Elements of Information Theory, Wiley, 1994
Recommended references:
[2] D. J. C. MacKay: Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003
- Note:
- Further information:
- No time-table has been prepared for this course
- The course is a part of the following study plans: