Probability, Statistics, and Theory of Information
Code: QBPSI    Completion: Z,ZK    Credits: 6    Range: 4+2
 Lecturer:
 Tutor:
 Supervisor:
 Department of Mathematics
 Synopsis:

Basics of probability theory, mathematical statistics, information theory, and coding. Topics include descriptions of probability, random variables and their distributions, and characteristics of and operations with random variables. Basics of mathematical statistics: point and interval estimates, methods of parameter estimation and hypothesis testing, the least squares method. Basic notions and results of the theory of Markov chains. Shannon entropy, mutual and conditional information.
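As a small illustration of the Bayes formula listed among the topics above, here is a minimal Python sketch. The medical-test numbers are invented for the example and are not part of the course materials.

```python
# Bayes' formula: P(A|B) = P(B|A) * P(A) / P(B), on a toy medical test.
# All probabilities below are illustrative assumptions.
p_disease = 0.01               # prior P(A): 1% of the population is ill
p_pos_given_disease = 0.99     # sensitivity P(B|A)
p_pos_given_healthy = 0.05     # false-positive rate P(B|not A)

# Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior probability of disease given a positive test.
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # 0.167
```

Despite the 99% sensitivity, a positive result implies only about a 17% chance of disease, because the prior is small; this is the standard caution the Bayes formula makes precise.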
 Requirements:

Linear Algebra, Calculus, Discrete Mathematics
 Syllabus of lectures:

1. Basic notions of probability theory. Kolmogorov model of probability. Independence, conditional probability, Bayes formula.
2. Random variables and their description. Random vector. Probability distribution function.
3. Quantile function. Mixture of random variables.
4. Characteristics of random variables and their properties. Operations with random variables. Basic types of distributions.
5. Characteristics of random vectors. Covariance, correlation. Chebyshev inequality. Law of large numbers. Central limit theorem.
6. Basic notions of statistics. Sample mean, sample variance. Interval estimates of mean and variance.
7. Method of moments, method of maximum likelihood. EM algorithm.
8. Hypothesis testing. Goodness-of-fit tests, tests of correlation, nonparametric tests.
9. Discrete random processes. Stationary processes. Markov chains.
10. Classification of states of Markov chains.
11. Asymptotic properties of Markov chains. Overview of applications.
12. Shannon's entropy of a discrete distribution and its axiomatic formulation. Theorem on minimal and maximal entropy.
13. Conditional entropy. Chain rule. Subadditivity. Entropy of a continuous variable.
14. Fano's inequality. Information of message Y in message X.
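Lecture topics 12–13 can be illustrated with a short Python sketch of Shannon entropy; the example distributions below are assumptions for illustration, not taken from the lectures.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i) of a discrete distribution,
    with the convention 0 * log 0 = 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# The uniform distribution attains maximal entropy: a fair coin gives 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less information per toss.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
# A deterministic outcome has minimal (zero) entropy.
print(shannon_entropy([1.0]))        # 0.0
```

This matches the theorem on minimal and maximal entropy: over n outcomes, H ranges from 0 (a point mass) up to log2 n (the uniform distribution).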
 Syllabus of tutorials:

1. Elementary probability.
2. Kolmogorov model of probability. Independence, conditional probability, Bayes formula.
3. Mixture of random variables. Mean. Unary operations with random variables.
4. Dispersion (variance). Random vector, joint distribution. Binary operations with random variables.
5. Sample mean, sample variance. Chebyshev inequality. Central limit theorem.
6. Interval estimates of mean and variance.
7. Method of moments, method of maximum likelihood.
8. Hypothesis testing. Goodness-of-fit tests, tests of correlation, nonparametric tests.
9. Discrete random processes. Stationary processes. Markov chains.
10. Classification of states of Markov chains.
11. Asymptotic properties of Markov chains.
12. Shannon's entropy of a discrete distribution. Theorem on minimal and maximal entropy.
13. Conditional entropy. Chain rule. Subadditivity. Entropy of a continuous variable.
14. Fano's inequality. Information of message Y in message X.
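The Markov-chain tutorials (items 9–11) can be sketched with a small Python example; the two-state "weather" transition matrix below is an assumed toy model, not part of the course.

```python
# Asymptotics of a regular Markov chain: repeatedly applying the transition
# matrix P to any initial distribution converges to the stationary
# distribution pi, which satisfies pi = pi * P.

def step(dist, P):
    """One step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Toy two-state chain (rows: current state, columns: next state).
P = [[0.9, 0.1],   # sunny -> sunny 0.9, sunny -> rainy 0.1
     [0.5, 0.5]]   # rainy -> sunny 0.5, rainy -> rainy 0.5

dist = [1.0, 0.0]  # start surely sunny
for _ in range(100):
    dist = step(dist, P)
print(dist)  # close to the stationary distribution [5/6, 1/6]
```

The limit is independent of the starting distribution because the chain is regular; solving pi = pi P directly gives pi = (5/6, 1/6), which the iteration approaches geometrically.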
 Study Objective:

Basics of probability theory and their application in statistical estimates and tests.
The use of Markov chains in modeling.
Basic notions of information theory.
 Study materials:

[1] Wasserman, L.: All of Statistics: A Concise Course in Statistical Inference. Springer Texts in Statistics, Corr. 2nd printing, 2004.
[2] Papoulis, A., Pillai, S.U.: Probability, Random Variables, and Stochastic Processes. McGraw-Hill, Boston, USA, 4th edition, 2002.
[3] Mood, A.M., Graybill, F.A., Boes, D.C.: Introduction to the Theory of Statistics. 3rd ed., McGraw-Hill, 1974.
 Note:
 Further information:
 http://cmp.felk.cvut.cz/~navara/psi/
 No timetable has been prepared for this course
 The course is a part of the following study plans: