CZECH TECHNICAL UNIVERSITY IN PRAGUE
STUDY PLANS
2011/2012

Information Theory

The course is not offered. It is not scheduled.
Code        Completion  Credits  Extent  Language of instruction
AE4B01TIN   Z,ZK        6       2+2s    Czech
Lecturers:
Tutors:
The course is provided by:
Department of Mathematics
Annotation:

The goal of the course is to explain the basic principles of mathematical information theory and the related questions of source coding. In particular, the following topics are treated: Shannon entropy, the mutual (conditional) information of two sources, codes, and the relationship between entropy and optimal codelength (Shannon's source coding theorem). Further, information channels and their capacity are explained, in particular Shannon's theorem on the capacity of a channel. Data compression is studied. Although the exposition focuses primarily on the discrete probability model, the entropy of a continuous random variable is also treated.
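
As a concrete illustration of the central quantity: for a discrete distribution p, the Shannon entropy is H(p) = -Σ_x p(x) log2 p(x) bits, and the source coding theorem bounds the optimal expected length L of a binary prefix code by H(p) <= L < H(p) + 1. A minimal Python sketch of the entropy computation (the function name and the example distribution are illustrative choices, not course material):

    import math

    def shannon_entropy(probs):
        """Shannon entropy (in bits) of a discrete distribution.

        Terms with p == 0 contribute nothing, by the convention
        0 * log(0) = 0.
        """
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Illustrative example: a biased four-symbol source.
    print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits per symbol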

Requirements:

Probability and Statistics, Linear Algebra, Mathematical Analysis, Discrete Mathematics

Lecture syllabus:

1. Shannon entropy of a discrete distribution: the axiomatic approach.

2. Minimal and maximal entropy. Conditional entropy. Chain rule. Subadditivity.

3. Fano's inequality. Mutual information.

4. Codes, prefix codes, non-singular codes. The Kraft-McMillan inequality.

5. Estimates of the average codelength in terms of the entropy. Huffman coding (a sketch follows after this list).

6. Data compression using the Law of Large Numbers. Typical messages.

7. Discrete random processes. Stationary processes. Markov chains.

8. Entropy rate of a stationary source.

9. The information channel and its capacity. Basic information channels.

10. Shannon's channel coding theorem.

11. Universal coding. Lempel-Ziv coding.

12. Statistical information. Bayesian errors.

13. Entropy of a continuous random variable. Von Neumann entropy of a matrix. Quantum information theory.
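
To illustrate items 4 and 5 above, here is a minimal Python sketch (an illustrative example; the function and variable names are our own) that builds a binary Huffman code from a heap of weighted subtrees and then checks the Kraft-McMillan inequality, i.e. that the codeword lengths l_i satisfy Σ 2^(-l_i) <= 1:

    import heapq
    import itertools

    def huffman_code(freqs):
        """Build a binary Huffman code for a {symbol: frequency} dict.

        Returns {symbol: codeword}. A running counter breaks ties so
        that heap comparisons never reach the code dictionaries.
        """
        tiebreak = itertools.count()
        heap = [(f, next(tiebreak), {s: ""}) for s, f in freqs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            f1, _, c1 = heapq.heappop(heap)  # two least frequent subtrees
            f2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
        return heap[0][2]

    freqs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    code = huffman_code(freqs)
    print(code)  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
    # Kraft-McMillan inequality: the sum is <= 1 (here exactly 1.0).
    print(sum(2 ** -len(w) for w in code.values()))
    # For dyadic probabilities the average codelength meets the entropy, 1.75.
    print(sum(f * len(code[s]) for s, f in freqs.items()))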

Exercise syllabus:

1. Shannon entropy of a discrete distribution: the axiomatic approach.

2. Minimal and maximal entropy. Conditional entropy. Chain rule. Subadditivity.

3. Fano's inequality. Mutual information.

4. Codes, prefix codes, non-singular codes. The Kraft-McMillan inequality.

5. Estimates of the average codelength in terms of the entropy. Huffman coding.

6. Data compression using the Law of Large Numbers. Typical messages.

7. Discrete random processes. Stationary processes. Markov chains.

8. Entropy rate of a stationary source.

9. The information channel and its capacity. Basic information channels.

10. Shannon's channel coding theorem.

11. Universal coding. Lempel-Ziv coding (a sketch follows after this list).

12. Statistical information. Bayesian errors.

13. Entropy of a continuous random variable. Von Neumann entropy of a matrix.
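
Exercise item 11 mentions Lempel-Ziv coding; the following minimal Python sketch of LZ78 parsing (one Lempel-Ziv variant; the function name and example string are illustrative) shows the incremental dictionary idea behind universal coding:

    def lz78_parse(text):
        """LZ78 parse: emit (dictionary index, next character) pairs.

        Index 0 denotes the empty phrase; every emitted pair adds a
        new phrase to the dictionary.
        """
        dictionary = {"": 0}
        phrase = ""
        pairs = []
        for ch in text:
            if phrase + ch in dictionary:
                phrase += ch  # extend the current match
            else:
                pairs.append((dictionary[phrase], ch))
                dictionary[phrase + ch] = len(dictionary)
                phrase = ""
        if phrase:  # flush a trailing phrase already in the dictionary
            pairs.append((dictionary[phrase[:-1]], phrase[-1]))
        return pairs

    print(lz78_parse("abababab"))
    # [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'a'), (0, 'b')]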

Study objectives:
Study materials:

[1] David J. C. MacKay: Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003.

[2] T. M. Cover and J. A. Thomas: Elements of Information Theory, Wiley, 1991.

Note:

Extent of instruction in the combined form of study: 14p+6s

Further information:
No timetable is being prepared for this course
The course is a part of the following study plans:
Data valid as of 9 July 2012
For updates to the above information, see http://bilakniha.cvut.cz/cs/predmet12819204.html