Statistical learning theory 2021
General Information
Teachers: Bruno Bauwens and Nikita Lukianenko
Lectures: Saturday, 14:40 - 16:00. The lectures are here in Zoom.
Seminars: Tuesday, 16:20 - 17:40. The seminars are here in Google Meet.
See RUZ for the rooms.
Practical information is shared in a Telegram group. Join here.
The course is similar to last year's course (http://wiki.cs.hse.ru/Statistical_learning_theory_2020), except for the order of topics and part 3.
Colloquium
Saturday, December 11
Rules and list of questions: https://www.dropbox.com/s/u8hyo1omvaoujle/colloqQuest.pdf?dl=0 (draft; at the moment it contains only questions from lectures 1-7)
Homeworks
Email your solutions to brbauwens-at-gmail.com. Start the subject line with SLT-HW. Results
Deadlines are before the lecture, every other lecture:
25 Sept: see problem lists 1 and 2
09 Oct: see problem lists 3 and 4
29 Oct: see problem lists 5 and 6
13 Nov: see problem lists 7 and 8
27 Nov: see problem lists 9 and 10
11 Dec: see problem lists 11 and 12
Course materials
Date / video | Summary | Slides | Lecture notes | Problem list | Solutions |
---|---|---|---|---|---|
Part 1. Online learning | |||||
4 Sept | Lecture: philosophy. Seminar: the online mistake-bound model, the weighted majority, and perceptron algorithms (see the code sketch below the table). movies | 01sl | 00ch 01ch | 01prob (9 Sept) | 01sol |
11 Sept | The perceptron algorithm in the agnostic setting. Kernels. The standard optimal algorithm. | 02sl | 02ch 03ch | 02prob (23 Sept) | 02sol |
18 Sept (rec to do) | Prediction with expert advice and the exponentially weighted majority algorithm. Recap of probability theory. | 03sl | ch04 ch05 | 03prob (30 Sept) | 03sol |
Part 2. Risk bounds for binary classification | |||||
25 Sept | Sample complexity in the realizable setting, simple examples and bounds using VC-dimension | sl04 | ch06 | 04prob | 04sol |
2 Oct | Growth functions, VC-dimension, and the characterization of sample complexity via the VC-dimension | sl05 | ch07 ch08 | 05prob | 05sol |
9 Oct | Risk decomposition and the fundamental theorem of statistical learning theory | sl06 | ch09 | 06prob | 06sol |
16 Oct | Bounded differences inequality and Rademacher complexity | sl07 | ch10 ch11 | 07prob | 07sol |
30 Oct | Simple regression, support vector machines, margin risk bounds, and neural nets | sl08 | ch12 ch13 | prob08 | sol08 |
6 Nov | Kernels: risk bounds, RKHS, representer theorem, design | sl09 | ch14 | prob09 | sol09 |
13 Nov | AdaBoost and risk bounds | sl10 | Mohri et al, chapt 7 | prob10 | |
Part 3. Other topics | |||||
20 Nov | Clustering | sl11 | Mohri et al, chapt 7, 8; David Sontag lecture | ||
27 Nov | Dimensionality reduction and the Johnson-Lindenstrauss lemma | ||||
4 Dec | No lecture | ||||
11 Dec | Colloquium |
The lectures in October and November are based on the book: Foundations of Machine Learning, 2nd ed., Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. This book can be downloaded from http://gen.lib.rus.ec/ .
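For readers who want a concrete reference point for Part 1, here is a minimal sketch of the perceptron algorithm in the online mistake-bound setting discussed in the first lectures. The function name and the toy data are illustrative only and are not part of the course materials.

```python
import numpy as np

def perceptron(X, y, max_passes=100):
    """Minimal online perceptron: update the weights on every mistake.

    X: (n, d) array of feature vectors, y: (n,) array of labels in {-1, +1}.
    Returns the weight vector found after at most max_passes over the data.
    """
    w = np.zeros(X.shape[1])
    for _ in range(max_passes):
        mistakes = 0
        for x_i, y_i in zip(X, y):
            if y_i * np.dot(w, x_i) <= 0:  # prediction wrong (or on the boundary)
                w += y_i * x_i             # perceptron update
                mistakes += 1
        if mistakes == 0:                  # a full pass with no mistakes: stop
            break
    return w

# Toy usage on two linearly separable points.
X = np.array([[1.0, 2.0], [2.0, -1.0]])
y = np.array([1, -1])
w = perceptron(X, y)
print(w, np.sign(X @ w))
```

For linearly separable data with margin gamma and feature norms bounded by R, the classical result is that the number of updates is at most (R/gamma)^2; this is the kind of mistake bound treated in Part 1.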
Office hours
Person | Monday | Tuesday | Wednesday | Thursday | Friday | |
---|---|---|---|---|---|---|
Bruno Bauwens, Zoom | 12h30-14h30 | 14h-20h | Room S834 Pokrovkaya 11 | |||
Nikita Lukianenko, Telegram | 14h30-16h30 | 14h30-16h30 | Room S831 Pokrovkaya 11 |
It is always good to send an email in advance. Questions and feedback are welcome.
I am traveling from Sept 12 -- Sept 30 and Oct 16 -- Oct 26. On Fridays I'm available till 16h30.