Statistical learning theory 2022
General Information
Lectures: Friday 16h20 -- 17h40, Bruno Bauwens and Maxim Kaledin
Seminars: Friday 18h10 -- 19h30, Artur Goldman
For discussion of the materials, join the Telegram group.
The course is similar to last year's edition.
Homeworks
Email to brbauwens-at-gmail.com. Start the subject line with SLT-HW.
Deadlines are before the lecture, every other lecture:
23 Sept: see problem lists 1 and 2
07 Oct: see problem lists 3 and 4
21 Oct: see problem lists 5 and 6
04 Nov: see problem list 7
18 Nov: see problem lists 8 and 9
02 Dec: see problem lists 10 and 11
Course materials
Video | Summary | Slides | Lecture notes | Problem list | Solutions
---|---|---|---|---|---
Part 1. Online learning | | | | |
02 Sept | Lecture: philosophy. The online mistake bound model, the weighted majority and perceptron algorithms (movies; a toy perceptron sketch appears below the table) | sl01 | ch00 ch01 | |
09 Sept | The perceptron algorithm in the agnostic setting. Kernels. The standard optimal algorithm. | sl02 | ch02 ch03 | |
16 Sept | Prediction with expert advice and the exponentially weighted majority algorithm (sketch below the table). Recap of probability theory. | sl03 | ch04 ch05 | |
Part 2. Distribution-independent risk bounds | | | | |
23 Sept | Sample complexity in the realizable setting, simple examples and bounds using VC-dimension | sl04 | ch06 | |
30 Sept | Growth functions, VC-dimension, and the characterization of sample complexity via VC-dimension | sl05 | ch07 ch08 | |
07 Oct | Risk decomposition and the fundamental theorem of statistical learning theory | sl06 | ch09 | |
14 Oct | Bounded differences inequality, Rademacher complexity, symmetrization, contraction lemma | sl07 | ch10 ch11 | |
Part 3. Margin risk bounds with applications | | | | |
21 Oct | Simple regression, support vector machines, margin risk bounds, and neural nets | sl08 | ch12 ch13 | |
04 Nov | Kernels: RKHS, representer theorem, risk bounds | sl09 | ch14 | |
11 Nov | AdaBoost and the margin hypothesis | sl10 | Mohri et al., chapter 7 | |
18 Nov | Implicit regularization of stochastic gradient descent in neural nets | | | |
Part 4. Other topics | | | | |
25 Nov | Regression I: classical noise assumptions, sub-Gaussian and sub-exponential noise | | | |
02 Dec | Regression II: ridge and lasso regression | | | |
09 Dec | Multi-armed bandits | | | |
16 Dec | Colloquium | | | |
The lectures in October and November are based on the book: Foundations of Machine Learning, 2nd ed., Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. The book can be downloaded from http://gen.lib.rus.ec/.
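The first two lectures cover the perceptron in the online mistake-bound model. Below is a minimal illustrative sketch (not code from the course; the function name and the toy data are assumptions), showing the predict-then-update loop and the mistake counter that the mistake bound refers to.

```python
import numpy as np

def perceptron_online(stream):
    """Run the online perceptron on a stream of (x, y) pairs, y in {-1, +1}.

    Predicts with the current linear separator and updates only on
    mistakes. If the stream is linearly separable with margin gamma
    and ||x|| <= R, the number of mistakes is at most (R / gamma)**2.
    """
    w = None
    mistakes = 0
    for x, y in stream:
        if w is None:
            w = np.zeros_like(x, dtype=float)  # start with the zero vector
        y_hat = 1.0 if w @ x >= 0 else -1.0    # predict with the current w
        if y_hat != y:                         # update only on a mistake
            w += y * x
            mistakes += 1
    return w, mistakes

# Toy usage: a linearly separable stream in the plane.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X @ np.array([1.0, -2.0]) >= 0, 1.0, -1.0)
w, m = perceptron_online(zip(X, y))
print(f"mistakes: {m}, final w: {w}")
```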
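Similarly, here is a toy version of the exponentially weighted majority algorithm from the 16 Sept lecture; again a sketch under assumed conventions (losses in [0, 1], a fixed learning rate eta), not the course's own code.

```python
import numpy as np

def exp_weighted_majority(expert_losses, eta=0.5):
    """expert_losses: array of shape (T, N) with losses in [0, 1].

    Maintains weights proportional to exp(-eta * cumulative loss) and
    suffers the weighted average loss each round. The regret against
    the best expert is at most ln(N)/eta + eta*T/8.
    """
    T, N = expert_losses.shape
    log_w = np.zeros(N)                  # log-weights, uniform start
    total_loss = 0.0
    for t in range(T):
        p = np.exp(log_w - log_w.max())  # numerically stable exponentiation
        p /= p.sum()                     # normalized weight vector
        total_loss += p @ expert_losses[t]
        log_w -= eta * expert_losses[t]  # exponential weight update
    return total_loss

# Toy usage: compare against the best of five experts in hindsight.
rng = np.random.default_rng(1)
L = rng.uniform(size=(1000, 5))
print(f"algorithm: {exp_weighted_majority(L):.1f}, "
      f"best expert: {L.sum(axis=0).min():.1f}")
```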
Problems exam
Dates and problems: TBA.
During the exam
-- You may consult notes, books, and search on the internet.
-- You may not interact with other humans (e.g. by phone or on forums).
Office hours
Person | Monday | Tuesday | Wednesday | Thursday | Friday
---|---|---|---|---|---
Bruno Bauwens | 14h--20h | | | |
Maxim Kaledin | | | | |
It is always good to send an email in advance. Questions and feedback are welcome.