Web page:
https://cw.fel.cvut.cz/wiki/courses/B4B33RPZ
Annotation:
The basic formulations of the statistical decision problem are presented. The necessary knowledge about the (statistical) relationship between observations and classes of objects is acquired by learning on the training set. The course covers both well-established and advanced classifier learning methods, such as the Perceptron, AdaBoost, Support Vector Machines, and neural networks.
This course is also part of the inter-university programme prg.ai Minor. It pools the best of AI education in Prague to provide students with a deeper and broader insight into the field of artificial intelligence. More information is available at
https://prg.ai/minor.
Study targets:
To teach the student to formalize statistical decision-making problems, to use machine-learning techniques, and to solve pattern recognition problems with the most popular classifiers (SVM, AdaBoost, neural network, nearest neighbour).
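As a minimal illustration of one of the classifiers named above (not official course material), a 1-nearest-neighbour classifier can be sketched in Python, the language used in the exercises; the data points here are made-up toys:

```python
import numpy as np

def nearest_neighbour(train_X, train_y, query):
    """Classify `query` by the label of its closest training point
    (Euclidean distance): a minimal 1-NN sketch."""
    dists = np.linalg.norm(train_X - query, axis=1)
    return train_y[int(np.argmin(dists))]

# Toy data: two well-separated 2-D classes.
X = np.array([[0.0, 0.0], [0.1, 0.2], [3.0, 3.0], [2.9, 3.1]])
y = np.array([0, 0, 1, 1])

print(nearest_neighbour(X, y, np.array([0.2, 0.1])))  # 0
print(nearest_neighbour(X, y, np.array([3.0, 2.8])))  # 1
```

The sketch stores all training points and defers all work to query time, which is exactly the trade-off the lecture on nearest-neighbour classifiers examines.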
Course outlines:
| 1. | | The pattern recognition problem. Overview of the Course. Basic notions. |
| 2. | | The Bayesian decision-making problem, i.e. minimization of expected loss. |
| 3. | | Non-Bayesian decision problems. |
| 4. | | Parameter estimation. The maximum likelihood method. |
| 5. | | The nearest neighbour classifier. |
| 6. | | Linear classifiers. Perceptron learning. |
| 7. | | The Adaboost method. |
| 8. | | Learning as a quadratic optimization problem. SVM classifiers. |
| 9. | | Feed-forward neural nets. The backpropagation algorithm. |
| 10. | | Decision trees. |
| 11. | | Logistic regression. |
| 12. | | The EM (Expectation Maximization) algorithm. |
| 13. | | Sequential decision-making (Wald's sequential test). |
| 14. | | Recap. |
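Topic 2 above, the Bayesian decision problem, amounts to choosing the decision with the smallest posterior expected loss. A small sketch under assumed numbers (the priors, likelihoods, and loss matrix below are invented for illustration, not taken from the course):

```python
import numpy as np

# Two classes, a binary observation x in {0, 1}, and a loss matrix
# loss[d, k] = cost of deciding d when the true class is k.
priors = np.array([0.6, 0.4])
likelihood = np.array([[0.7, 0.3],   # p(x | class 0) for x = 0, 1
                       [0.2, 0.8]])  # p(x | class 1) for x = 0, 1
loss = np.array([[0.0, 5.0],
                 [1.0, 0.0]])

def bayes_decision(x):
    """Return the decision minimizing the expected loss given x."""
    joint = priors * likelihood[:, x]   # p(k) * p(x | k)
    posterior = joint / joint.sum()     # p(k | x) by Bayes' rule
    risk = loss @ posterior             # R(d | x) for each decision d
    return int(np.argmin(risk))

print(bayes_decision(0))  # 0
print(bayes_decision(1))  # 1
```

With a symmetric 0/1 loss this reduces to picking the class with the largest posterior; the asymmetric loss above shows how the matrix shifts the decision boundary.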
Exercises outline:
Students follow the lecture topics and implement most of the discussed algorithms in Python.
| 1. | | Introduction, work with Python, simple example |
| 2. | | Bayesian decision task |
| 3. | | Non-Bayesian tasks - the minimax task |
| 4. | | Non-parametric estimates - Parzen windows |
| 5. | | MLE, MAP and Bayes parameter estimation |
| 6. | | Logistic regression |
| 7. | | Problem solving / exam questions |
| 8. | | Linear classifier - perceptron |
| 9. | | Support Vector Machine |
| 10. | | AdaBoost |
| 11. | | K-means clustering |
| 12. | | Convolutional neural networks |
| 13. | | Problem solving / exam questions |
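Exercise 8 above implements the perceptron. A minimal sketch of the classical learning rule (toy data and parameter names are my own, not the assignment's):

```python
import numpy as np

def perceptron(X, y, max_epochs=100):
    """Perceptron learning for labels in {-1, +1}. The bias is folded
    in as a constant 1 feature (homogeneous coordinates)."""
    Xh = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xh.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(Xh, y):
            if yi * (w @ xi) <= 0:   # misclassified (or on the boundary)
                w += yi * xi         # perceptron update rule
                errors += 1
        if errors == 0:              # converged: every point correct
            return w
    return w                         # gave up after max_epochs

# Toy linearly separable data.
X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
```

On linearly separable data the algorithm is guaranteed to converge in a finite number of updates (Novikoff's theorem), which is the key property discussed in the corresponding lecture.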
Literature:
| 1. | | Duda, Hart, Stork: Pattern Classification, John Wiley and Sons, 2nd edition, New York, 2001. |
| 2. | | Schlesinger, Hlavac: Ten Lectures on Statistical and Structural Pattern Recognition, 2002. |
Requirements:
Knowledge of linear algebra, mathematical analysis and probability and statistics.
Keywords:
pattern recognition, statistical decision-making, machine learning, classification
The subject is included in these academic programs:
| Page updated 7.12.2025 17:51:48, semester: Z/2025-6, L/2026-7, L/2025-6, L/2024-5, Z/2026-7 |
Proposal and Realization: I. Halaška (K336), J. Novák (K336) |