CZECH TECHNICAL UNIVERSITY IN PRAGUE
STUDY PLANS
2011/2012

Information Theory

The course is not on the list. Without time-table.
Code       Completion   Credits   Range   Language
AE4B01TIN  Z,ZK         6         2+2s    Czech
Lecturer:
Tutor:
Supervisor:
Department of Mathematics
Synopsis:

Annotation: The goal of the course is to explain the basic principles of mathematical information theory and related questions of source coding. In particular, the following topics are treated: Shannon's entropy, mutual (conditional) information of two sources, codes, and the relationship between entropy and optimal codelength (Shannon's source coding theorem). Further, information channels and their capacity are explained, in particular Shannon's theorem on channel capacity. Data compression is studied. Although the exposition focuses primarily on the discrete probability model, the entropy of a continuous random variable is also treated.
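
As a small illustration of the first notions mentioned in the annotation, the following Python sketch computes the Shannon entropy of a discrete distribution and the mutual information of two sources from their joint distribution. It is not part of the official course materials, and the distributions used are made-up examples.

```python
# Illustrative sketch only -- not part of the official course materials.
# Computes Shannon entropy H(X) and mutual information I(X;Y) for a
# small, made-up joint distribution p(x, y).
from math import log2

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution given as probabilities."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint distribution given as a 2-D list."""
    px = [sum(row) for row in joint]         # marginal distribution of X
    py = [sum(col) for col in zip(*joint)]   # marginal distribution of Y
    pxy = [p for row in joint for p in row]  # flattened joint distribution
    return entropy(px) + entropy(py) - entropy(pxy)

if __name__ == "__main__":
    joint = [[0.25, 0.25],
             [0.25, 0.25]]            # two independent fair bits
    print(entropy([0.5, 0.5]))        # 1.0 bit
    print(mutual_information(joint))  # 0.0 bits, since X and Y are independent
```

Replacing the joint distribution by a perfectly dependent one, e.g. [[0.5, 0.0], [0.0, 0.5]], gives I(X;Y) = H(X) = 1 bit.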

Requirements:
Syllabus of lectures:

1. Shannon's entropy of a discrete distribution - axiomatic approach.

2. Minimal and maximal entropy. Conditional entropy. Chain rule. Subadditivity.

3. Fano's inequality. Mutual information.

4. Codes, prefix codes, non-singular codes. Kraft-McMillan inequality.

5. Estimation of average codelength in terms of the entropy. Huffman coding (see the sketch after this list).

6. Data compression using the Law of Large Numbers. Typical messages.

7. Discrete random processes. Stationary processes. Markov chains.

8. Entropy rate of a stationary source.

9. Information channel and its capacity. Basic information channels.

10. Shannon's coding theorem.

11. Universal coding. Lempel-Ziv coding.

12. Statistical information. Bayesian errors.

13. Entropy of a continuous random variable. Von Neumann entropy of a matrix. Quantum information theory.
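
As an illustration of lecture topics 4 and 5 above (Kraft-McMillan inequality, entropy bounds on the average codelength, Huffman coding), the following Python sketch builds a Huffman code for a made-up dyadic distribution and checks both Kraft's inequality and the bound H(X) <= L < H(X) + 1 on the expected codelength L. It is an independent illustration, not official course material.

```python
# Illustrative sketch only (not course material): Huffman coding for a
# made-up distribution, with checks of the Kraft inequality and of the
# bound H(X) <= expected codelength < H(X) + 1.
import heapq
from math import log2

def huffman_code(probs):
    """Return {symbol: binary codeword} for a dict of symbol probabilities."""
    # Each heap entry: (probability, unique tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two least probable groups, prepending one more code bit.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

if __name__ == "__main__":
    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # made-up source
    code = huffman_code(probs)
    H = -sum(p * log2(p) for p in probs.values())            # source entropy
    L = sum(probs[s] * len(w) for s, w in code.items())      # expected codelength
    kraft = sum(2 ** -len(w) for w in code.values())         # Kraft sum
    print(code)      # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
    print(H, L)      # here H = L = 1.75 bits, since the probabilities are dyadic
    print(kraft)     # 1.0; the Kraft sum is at most 1 for any prefix code
```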

Syllabus of tutorials:

1. Shannon's entropy of a discrete distribution - axiomatic approach.

2. Minimal and maximal entropy. Conditional entropy. Chain rule. Subadditivity.

3. Fano's inequality. Mutual information.

4. Codes, prefix codes, non-singular codes. Kraft-McMillan inequality.

5. Estimation of average codelength in terms of the entropy. Huffman coding.

6. Data compression using the Law of Large Numbers. Typical messages.

7. Discrete random processes. Stationary processes. Markov chains.

8. Entropy rate of a stationary source.

9. Information channel and its capacity. Basic information channels.

10. Shannon's coding theorem.

11. Universal coding. Lempel-Ziv coding.

12. Statistical information. Bayesian errors.

13. Entropy of a continuous random variable. Von Neumann entropy of a matrix.

Study Objective:
Study materials:

[1] David J. C. MacKay: Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.

[2] T. M. Cover and J. A. Thomas: Elements of Information Theory. Wiley, 1991.

Note:
Further information:
No time-table has been prepared for this course
The course is a part of the following study plans:
Generated on 2012-7-9
For updated information see http://bilakniha.cvut.cz/en/predmet12819204.html