Statistical learning theory 2020
Lectures: Saturday 9h30 - 10h50, zoom https://zoom.us/j/96210489901
Teachers: Bruno Bauwens and Vladimir Podolskii
Seminar for group 1: Saturday 11h10 - 12h30, Bruno Bauwens and Vladimir Podolskii, zoom https://zoom.us/j/94186131884
Seminar for group 2: Tuesday ??, Nikita Lukyanenko
|Date||Summary||Lecture notes||Problem list||Solutions|
|12 Sept||Introduction and sample complexity in the realizable setting||lecture1.pdf||Problem list 1||Solutions 1|
|19 Sept||VC-dimension and sample complexity|
|26 Sept||Risk bounds and the fundamental theorem of statistical learning theory|
|03 Oct||Rademacher complexity and the margin assumption|
<!-- A gentle introduction to the material of the first 3 lectures, together with an overview of probability theory, can be found in chapters 1-6 and 11-12 of the following book: Sanjeev Kulkarni and Gilbert Harman: An Elementary Introduction to Statistical Learning Theory, 2012. -->
<!-- Afterward, we hope to cover chapters 1-8 from the book: Foundations of Machine Learning, Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2012. These books can be downloaded from http://gen.lib.rus.ec/ .
(We will study a new boosting algorithm, based on the paper: ) -->
The following links might help students who have trouble with English. A lecture on VC-dimension was given by K. Vorontsov. A course on Statistical Learning Theory by Nikita Zhivotovsky is given at MIPT. A short description of PAC learning can be found on p. 136 of Peter Flach's book "Machine Learning: The Art and Science of Algorithms that Make Sense of Data" (Russian edition: «Наука и искусство построения алгоритмов, которые извлекают знания из данных», Петер Флах). On machinelearning.ru you can find brief and clear definitions.