Statistical learning theory 2022
General Information
Lectures: Friday 16h20 -- 17h40, Bruno Bauwens, Maxim Kaledin
Seminars: Friday 18h10 -- 19h30, Artur Goldman
To discuss the materials, join the Telegram group.
The course is similar to last year's edition.
Homeworks
Email solutions to brbauwens-at-gmail.com. Start the subject line with SLT-HW.
Homework is due before the lecture, every other lecture:
16 Sept: see problem lists 1 and 2
30 Sept: see problem lists 3 and 4
14 Oct: see problem lists 5 and 6
04 Nov: see problem lists 7 and 8
28 Nov: see problem lists 9 and 10
02 Dec: see problem lists 11 and 12
Course materials
Video | Summary | Slides | Lecture notes | Problem list | Solutions |
---|---|---|---|---|---|
Part 1. Online learning | |||||
02 Sept | Philosophy. The online mistake bound model. Weighted majority and perceptron algorithms. | sl01 | ch00 ch01 | list 1 | |
09 Sept | The perceptron algorithm in the agnostic setting. Kernels. The standard optimal algorithm. | sl02 | ch02 ch03 | list 2 | |
16 Sept | Prediction with expert advice and the exponentially weighted majority algorithm. Recap probability theory. | sl03 | ch04 ch05 | list 3 | |
Part 2. Distribution independent risk bounds | |||||
23 Sept | Sample complexity in the realizable setting, simple examples and bounds using VC-dimension | sl04 | ch06 | list 4 | |
30 Sept | Growth functions, VC-dimension and the characterization of sample complexity via VC-dimension | sl05 | ch07 ch08 | list 5 | |
07 Oct | Risk decomposition and the fundamental theorem of statistical learning theory | sl06 | ch09 | list 6 | |
14 Oct | Bounded differences inequality, Rademacher complexity, symmetrization, contraction lemma | sl07 | ch10 ch11 | list 7 | |
Part 3. Margin risk bounds with applications | |||||
21 Oct | Simple regression, support vector machines, margin risk bounds, and neural nets | sl08 | ch12 ch13 | list 8 | |
04 Nov | Kernels: RKHS, representer theorem, risk bounds | sl09 | ch14 | list 9 | |
11 Nov | AdaBoost and the margin hypothesis | sl10 | Mohri et al., chapter 7 | list 10 | |
18 Nov | Implicit regularization of stochastic gradient descent in neural nets | list 11 | |||
Part 4. Other topics | |||||
25 Nov | Regression I: classic noise assumption, sub-Gaussian and sub-exponential noise | list 12 |||
02 Dec | Regression II: Ridge and Lasso regression | list 13 | |||
09 Dec | Multi-armed bandits | list 14 |||
16 Dec | Colloquium |
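As a taste of the online learning material in Part 1, here is a minimal NumPy sketch of the perceptron algorithm with its mistake bound. The toy data and the margin value 0.5 are made up for the illustration; by the perceptron convergence theorem, the number of mistakes on a sequence that is linearly separable with margin γ is at most (R/γ)², where R is the largest example norm.

```python
import numpy as np

def perceptron(X, y):
    """One pass of the online perceptron.

    X: (n, d) array of examples; y: labels in {-1, +1}.
    Returns the final weight vector and the number of mistakes made.
    """
    w = np.zeros(X.shape[1])
    mistakes = 0
    for x_i, y_i in zip(X, y):
        if y_i * np.dot(w, x_i) <= 0:  # wrong (or undefined) prediction
            w += y_i * x_i             # additive correction
            mistakes += 1
    return w, mistakes

# Toy data, separable with margin 0.5 by the unit vector u = (1, 0):
# the label is the sign of the first coordinate, and |x_0| > 0.5.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
X = X[np.abs(X[:, 0]) > 0.5]
y = np.sign(X[:, 0])

w, m = perceptron(X, y)
R = np.linalg.norm(X, axis=1).max()
# Mistake bound: m <= (R / 0.5) ** 2, regardless of the order of examples.
```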
The lectures in October and November are based on the book: Foundations of machine learning, 2nd ed., Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. This book can be downloaded from http://gen.lib.rus.ec/.
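For the prediction-with-expert-advice lecture, the exponentially weighted average forecaster can be sketched in a few lines. The loss matrix, horizon T, and learning rate η below are made up for the example; the assertion in the comment is the standard regret bound for losses in [0, 1]: the learner's cumulative (expected) loss exceeds the best expert's by at most ln(N)/η + ηT/8.

```python
import numpy as np

def exponentially_weighted_average(expert_losses, eta):
    """Run the EWA forecaster over T rounds.

    expert_losses: (T, N) array with entries in [0, 1].
    Returns the learner's cumulative expected loss (mixing experts
    with the normalized exponential weights each round).
    """
    T, N = expert_losses.shape
    w = np.ones(N)
    total = 0.0
    for t in range(T):
        p = w / w.sum()                       # distribution over experts
        total += p @ expert_losses[t]         # learner's expected loss
        w *= np.exp(-eta * expert_losses[t])  # exponential weight update
    return total

# Hypothetical setup: 10 experts, 500 rounds, uniformly random losses.
rng = np.random.default_rng(1)
T, N = 500, 10
losses = rng.random((T, N))
eta = np.sqrt(8 * np.log(N) / T)  # tuned rate for the bound below

L = exponentially_weighted_average(losses, eta)
best = losses.sum(axis=0).min()
# Regret bound: L <= best + ln(N)/eta + eta*T/8, for any loss sequence.
```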
Problems exam
Dates, problems TBA
During the exam
-- You may consult notes, books and search on the internet
-- You may not interact with other humans (e.g., by phone or on forums)
Office hours
Person | Monday | Tuesday | Wednesday | Thursday | Friday | |
---|---|---|---|---|---|---|
Bruno Bauwens | 14h--20h | |||||
Maxim Kaledin |
It is always good to send an email in advance. Questions and feedback are welcome.