Statistical learning theory 2021
General Information
Teachers: Bruno Bauwens and Nikita Lukianenko
Lectures: Saturdays, 14:40-16:00, on Zoom.
Seminars: Tuesdays, 16:20-17:40, on Google Meet.
Practical information is shared in the Telegram group.
The course is similar to last year's, except for the order of topics and Part 3.
Problems exam
Dec 22, 12:00 - 15:30
During the exam:
-- You may consult notes, books, and search the internet
-- You may not interact with other humans (e.g. by phone, forums, etc.)
Colloquium
Saturday December 11
rules and list of questions (version Dec 10)
Homeworks
Email solutions to brbauwens-at-gmail.com, with a subject line starting with SLT-HW. Results
Deadlines are before the lecture, every other lecture:
25 Sept: see problem lists 1 and 2
09 Oct: see problem lists 3 and 4
29 Oct: see problem lists 5 and 6
13 Nov: see problem lists 7 and 8
30 Nov, 08:00 [extended]: see problem lists 9 and 10
Course materials
Date (video) | Summary | Slides | Lecture notes | Problem list | Solutions |
---|---|---|---|---|---|
Part 1. Online learning | |||||
4 Sept | Lecture: philosophy. Seminar: the online mistake bound model, the weighted majority algorithm, and the perceptron algorithm (movies) | sl01 | ch00 ch01 | 01prob (9 Sept) | 01sol |
11 Sept | The perceptron algorithm in the agnostic setting. Kernels. The standard optimal algorithm. | sl02 | ch02 ch03 | 02prob (23 Sept) | 02sol |
18 Sept (rec to do) | Prediction with expert advice and the exponentially weighted majority algorithm. Recap of probability theory. | sl03 | ch04 ch05 | 03prob (30 Sept) | 03sol |
Part 2. Risk bounds for binary classification | |||||
25 Sept | Sample complexity in the realizable setting, simple examples and bounds using VC-dimension | sl04 | ch06 | 04prob | 04sol |
2 Oct | Growth functions, VC-dimension, and the characterization of sample complexity via VC-dimension | sl05 | ch07 ch08 | 05prob | 05sol |
9 Oct | Risk decomposition and the fundamental theorem of statistical learning theory | sl06 | ch09 | 06prob | 06sol |
16 Oct | Bounded differences inequality and Rademacher complexity | sl07 | ch10 ch11 | 07prob | 07sol |
30 Oct | Simple regression, support vector machines, margin risk bounds, and neural nets | sl08 | ch12 ch13 | 08prob | 08sol |
6 Nov | Kernels: risk bounds, RKHS, representer theorem, design | sl09 | ch14 | 09prob (Nov 23) | 09sol |
13 Nov | AdaBoost and risk bounds | sl10 | Mohri et al., chapter 7 | 10prob (Nov 23) | 10sol |
Part 3. Other topics | |||||
20 Nov | Clustering | sl11 | Mohri et al., ch. 7; lecture | | |
27 Nov | Dimensionality reduction and the Johnson-Lindenstrauss lemma | sl12 | Mohri et al., ch. 15; lecture | 12prob | |
4 Dec | No lecture | | | | |
11 Dec | Colloquium | | | | |
The lectures in October and November are based on the book Foundations of Machine Learning, 2nd ed., by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. The book can be downloaded from http://gen.lib.rus.ec/ .
Office hours
Person | Monday | Tuesday | Wednesday | Thursday | Friday |
---|---|---|---|---|---|
Bruno Bauwens, Zoom | 12h30-14h30 | 14h-20h | Room S834 Pokrovkaya 11 | | |
Nikita Lukianenko, Telegram | 14h30-16h30 | 14h30-16h30 | Room S831 Pokrovkaya 11 | | |
It is always good to send an email in advance. Questions and feedback are welcome.