Background of Information Theory
| Code | Completion | Credits | Range | Language |
| --- | --- | --- | --- | --- |
| 18ZTI | KZ | 2 | 2+0 | Czech |
- Course guarantor:
- Lecturer:
- Tutor:
- Supervisor:
- Department of Software Engineering
- Synopsis:
-
Entropy as a measure of uncertainty and its use to quantify the amount of information. Possibilities of applying the information-theoretic approach in various fields of science, engineering, economics, etc., to solve specific problems.
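A minimal sketch of the central quantity, assuming the standard Shannon definition (the notation $X$, $p_i$, $n$ is introduced here only for illustration): for a discrete random variable $X$ taking $n$ values with probabilities $p_1, \dots, p_n$, the entropy is

$$
H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i .
$$

With the base-2 logarithm, entropy is measured in bits; a fair coin toss ($p_1 = p_2 = 1/2$), for example, has entropy 1 bit, while a certain outcome has entropy 0.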
- Requirements:
- Syllabus of lectures:
-
The concept of information and its interpretation.
Some historical approaches to quantification.
Uncertainty as the starting concept.
Entropy.
Shannon's model for determining the level of uncertainty.
Mathematical properties of entropy and their practical interpretation.
Information and its quantification using Shannon's approach.
Examples of problems; logic problems.
A random variable and its entropy. Information of random variables.
Specific examples of problems and applications in the discrete and continuous case.
Entropy, language, and encoding.
Redundancy, channel capacity.
- Syllabus of tutorials:
- Study Objective:
-
The concept of information is now an essential communication tool throughout the world. In cybernetics, the need arose to quantify the "amount of information". Entropy proved to be a good approach to this problem, and it became the basis for a much wider range of applications than originally intended, which is documented in the lectures.
- Study materials:
-
Key references:
1. A. M. Jaglom, I. M. Jaglom, Probability and Information, Nakladatelství ČSAV, Praha, 1964
Recommended references:
1. Jiří Adámek, Coding, Mathematics for Technical Universities, Workbook XXXI, SNTL, 1989
2. Igor Vajda, Information Theory, lecture notes, Vydavatelství ČVUT, FJFI, 2004
- Note:
- Further information:
- Course may be repeated
- No time-table has been prepared for this course
- The course is a part of the following study plans: