Probability, Statistics, and Theory of Information
| Code | Completion | Credits | Range |
|---|---|---|---|
| QB-PSI | Z,ZK | 6 | 4+2s |
- Lecturer:
- Mirko Navara (guarantor), Tomáš Kroupa, Libor Nentvich
- Tutor:
- Mirko Navara (guarantor), Tomáš Kroupa, Libor Nentvich
- Supervisor:
- Department of Mathematics
- Synopsis:
-
Basics of probability theory, mathematical statistics, information theory, and coding. Covers the description of probability, random variables and their distributions, and characteristics of and operations with random variables. Basics of mathematical statistics: point and interval estimates (a brief numerical sketch follows), methods of parameter estimation, hypothesis testing, and the least squares method. Basic notions and results of the theory of Markov chains. Shannon entropy, mutual and conditional information.
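A minimal sketch of an interval estimate in Python, assuming NumPy; the sample is synthetic and the large-sample normal quantile 1.96 is used, so this illustrates the idea rather than any method fixed by the course:

```python
import numpy as np

# Synthetic sample standing in for measured data (hypothetical example).
rng = np.random.default_rng(1)
x = rng.normal(loc=10.0, scale=2.0, size=200)

n = x.size
mean = x.mean()
s2 = x.var(ddof=1)        # sample variance (unbiased estimator)
se = np.sqrt(s2 / n)      # standard error of the sample mean

# Large-sample 95% interval estimate of the mean; 1.96 is the 0.975
# quantile of the standard normal (a t-quantile would be used for small n).
lo, hi = mean - 1.96 * se, mean + 1.96 * se
print(f"mean = {mean:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```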
- Requirements:
-
Linear Algebra, Calculus, Discrete Mathematics
- Syllabus of lectures:
-
1. Basic notions of probability theory. Kolmogorov model of probability. Independence, conditional probability, Bayes formula.
2. Random variables and their description. Random vector. Probability distribution function.
3. Quantile function. Mixture of random variables.
4. Characteristics of random variables and their properties. Operations with random variables. Basic types of distributions.
5. Characteristics of random vectors. Covariance, correlation. Chebyshev inequality. Law of large numbers. Central limit theorem.
6. Basic notions of statistics. Sample mean, sample variance. Interval estimates of mean and variance.
7. Method of moments, method of maximum likelihood. EM algorithm.
8. Hypothesis testing. Goodness-of-fit tests, tests of correlation, non-parametric tests.
9. Discrete random processes. Stationary processes. Markov chains.
10. Classification of states of Markov chains.
11. Asymptotic properties of Markov chains. Overview of applications.
12. Shannon's entropy of a discrete distribution and its axiomatic formulation. Theorem on minimal and maximal entropy.
13. Conditional entropy. Chain rule. Subadditivity. Entropy of a continuous variable.
14. Fano's inequality. Information in message Y about message X (standard forms of several of these formulas are collected after this list).
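For orientation, the standard textbook forms of several formulas named above; the notation is generic and not fixed by the course:

```latex
\begin{align*}
  P(A \mid B) &= \frac{P(B \mid A)\,P(A)}{P(B)}
      && \text{(Bayes formula, lecture 1)} \\
  P(|X - \mathbb{E}X| \ge \varepsilon) &\le \frac{\operatorname{var} X}{\varepsilon^{2}}
      && \text{(Chebyshev inequality, lecture 5)} \\
  H(X) &= -\sum_{i} p_{i} \log p_{i}
      && \text{(Shannon entropy, lecture 12)} \\
  H(X, Y) &= H(X) + H(Y \mid X)
      && \text{(chain rule, lecture 13)} \\
  I(X; Y) &= H(X) - H(X \mid Y)
      && \text{(information of $Y$ about $X$, lecture 14)}
\end{align*}
```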
- Syllabus of tutorials:
-
1. Elementary probability.
2. Kolmogorov model of probability. Independence, conditional probability, Bayes formula.
3. Mixture of random variables. Mean. Unary operations with random variables.
4. Dispersion (variance). Random vector, joint distribution. Binary operations with random variables.
5. Sample mean, sample variance. Chebyshev inequality. Central limit theorem.
6. Interval estimates of mean and variance.
7. Method of moments, method of maximum likelihood.
8. Hypothesis testing. Goodness-of-fit tests, tests of correlation, non-parametric tests.
9. Discrete random processes. Stationary processes. Markov chains.
10. Classification of states of Markov chains.
11. Asymptotic properties of Markov chains (see the simulation sketch after this list).
12. Shannon's entropy of a discrete distribution. Theorem on minimal and maximal entropy.
13. Conditional entropy. Chain rule. Subadditivity. Entropy of a continuous variable.
14. Fano's inequality. Information in message Y about message X.
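A minimal simulation sketch of the asymptotic behaviour from tutorials 9–11, assuming NumPy; the 3-state transition matrix is a hypothetical example, not taken from the course:

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1); any irreducible,
# aperiodic chain would illustrate the same point.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# The stationary distribution pi solves pi P = pi: take the left
# eigenvector of P for eigenvalue 1 and normalise it to sum to one.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi /= pi.sum()

# Asymptotic property: long-run state frequencies of a simulated
# trajectory approach pi regardless of the initial state.
rng = np.random.default_rng(0)
state, counts = 0, np.zeros(3)
for _ in range(100_000):
    state = rng.choice(3, p=P[state])
    counts[state] += 1

print("stationary pi      :", np.round(pi, 3))
print("empirical frequency:", np.round(counts / counts.sum(), 3))
```

The empirical frequencies match the stationary distribution because this chain is irreducible and aperiodic, which is exactly the situation treated in the classification and asymptotics tutorials.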
- Study Objective:
-
Basics of probability theory and their application to statistical estimates and tests.
The use of Markov chains in modeling.
Basic notions of information theory.
- Study materials:
-
[1] Wasserman, L.: All of Statistics: A Concise Course in Statistical Inference. Springer Texts in Statistics, Corr. 2nd printing, 2004.
[2] Papoulis, A., Pillai, S.U.: Probability, Random Variables, and Stochastic Processes. McGraw-Hill, Boston, USA, 4th edition, 2002.
[3] Mood, A.M., Graybill, F.A., Boes, D.C.: Introduction to the Theory of Statistics. 3rd ed., McGraw-Hill, 1974.
- Note:
- Time-table for winter semester 2011/2012:
-
- Time-table for summer semester 2011/2012:
- Time-table is not available yet
- The course is a part of the following study plans: