Statistical learning theory 2020

Material from the Wiki of the Faculty of Computer Science
Version of 18:21, 11 September 2020; Bbauwens (talk | contribs)


General Information

Lectures: Saturday 9h30 - 10h50, zoom

Teachers: Bruno Bauwens and Vladimir Podolskii

Seminar for group 1: Saturday 11h10 - 12h30, Bruno Bauwens and Vladimir Podolskii, zoom

Seminar for group 2: Tuesday ??, Nikita Lukyanenko

Course materials

Date Summary Lecture notes Problem list Solutions
12 Sept Introduction and sample complexity in the realizable setting lecture1.pdf Problem list 1 Solutions 1
19 Sept VC-dimension and sample complexity
26 Sept Risk bounds and the fundamental theorem of statistical learning theory
03 Oct Rademacher complexity and margin assumption

<!-- A gentle introduction to the material of the first 3 lectures, along with an overview of probability theory, can be found in chapters 1-6 and 11-12 of the following book: Sanjeev Kulkarni and Gilbert Harman: An Elementary Introduction to Statistical Learning Theory, 2012. -->

<!-- Afterward, we hope to cover chapters 1-8 of the book: Foundations of Machine Learning, Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2012. These books can be downloaded from .

(We will study a new boosting algorithm, based on the paper: ) -->

Office hours

Person Monday Tuesday Wednesday Thursday Friday
Bruno Bauwens Room 620

Russian texts

The following links might help students who have trouble with English. A lecture on VC-dimension was given by K. Vorontsov. A course on Statistical Learning Theory by Nikita Zhivotovsky is given at MIPT. A short description of PAC learning appears on p. 136 of the book "Наука и искусство построения алгоритмов, которые извлекают знания из данных" by Петер Флах (Peter Flach). On you can find brief and clear definitions.