Statistical Learning Theory 2022
Contents
General Information
Lectures: Friday 16h20 -- 17h40, Bruno Bauwens, Maxim Kaledin, room M202 and on Zoom
Seminars: Saturday 14h40 -- 16h00, Artur Goldman, room M202 and on Zoom (the link will be in Telegram)
To discuss the materials, join the Telegram group. The course is similar to last year's.
Problems exam
December 21, 13h-16h, computer room G403 (Zoom link for students abroad)
-- You may use handwritten notes, lecture materials from this wiki (either printed or through your PC), and Mohri's book
-- You may not search on the internet or interact with other humans (e.g. by phone, forums, etc.)
Course materials
Date (video) | Summary | Slides | Lecture notes | Problem list | Solutions |
---|---|---|---|---|---|
Part 1. Online learning | |||||
02 Sept | Philosophy. The online mistake bound model. The halving and weighted majority algorithms (movies) | sl01 | ch00 ch01 | list 1 update 05.09 | solutions 1 |
09 Sept | The perceptron algorithm. The standard optimal algorithm. | sl02 | ch02 ch03 | list 2 update 25.09 | solutions 2 |
16 Sept | Kernels and the kernel perceptron algorithm. Prediction with expert advice. Recap probability theory. | sl03 | ch04 ch05 | list 3 | solutions 3 |
Part 2. Distribution independent risk bounds | |||||
23 Sept | Sample complexity in the realizable setting, simple examples and bounds using VC-dimension | sl04 | ch06 | list 4 | solutions 4 |
30 Sept | Growth functions, VC-dimension, and the characterization of sample complexity via the VC-dimension | sl05 | ch07 ch08 | list 5 | solutions 5 |
07 Oct | Risk decomposition and the fundamental theorem of statistical learning theory | sl06 | ch09 | list 6 | solutions 6 |
14 Oct | Bounded differences inequality, Rademacher complexity, symmetrization, contraction lemma, quiz | sl07 | ch10 ch11 | list 7 update 15.10 | solutions 7 |
Part 3. Margin risk bounds with applications | |||||
21 Oct | Simple regression, support vector machines, margin risk bounds, and neural nets | sl08 | ch12 ch13 | list 8 | solutions 8 |
04 Nov | Kernels: RKHS, representer theorem, risk bounds | sl09 | ch14 | list 9 | solutions 9 |
11 Nov | AdaBoost and the margin hypothesis | sl10 | ch15 | list 10 | solutions 10 |
18 Nov | Implicit regularization of stochastic gradient descent in neural nets | | ch16 | no seminar | |
Part 4. Other topics | |||||
25 Nov | Regression I: fixed design with sub-Gaussian noise | | notes12 | list 12 | solutions 12 |
02 Dec | Multi-armed bandits I | | notes13 | list 13 | |
09 Dec | Multi-armed bandits II (optional) | | notes14 | notebook, notebook (solved) | |
16 Dec | Colloquium | | | | |
The lectures in October and November are based on the book: Foundations of Machine Learning, 2nd ed., Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. The book can be downloaded from Library Genesis (the link changes sometimes, and sometimes a VPN is needed).
Grading formula
Final grade = 0.35 * [homework score] + 0.35 * [colloquium score] + 0.3 * [exam score] + bonus from quizzes.
All homework questions have the same weight. Each solved extra homework task increases the score of the final exam by 1 point.
There is no rounding except on the final grade: fractional grades above 5/10 are rounded up, those below 5/10 are rounded down.
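For concreteness, here is a minimal sketch of how the weights, the extra-task bonus, and the rounding rule combine. The function name, its signature, and the cap at 10 are illustrative assumptions; only the weights and the rounding rule come from the rules above.

```python
import math

def final_grade(homework, colloquium, exam, quiz_bonus=0.0, extra_tasks=0):
    """Hypothetical helper; all scores are on a 0..10 scale."""
    # Each solved extra homework task adds 1 point to the exam score
    # (the cap at 10 is an assumption, not an official rule).
    exam = min(exam + extra_tasks, 10)
    # Weighted sum from the grading formula; no intermediate rounding.
    raw = 0.35 * homework + 0.35 * colloquium + 0.3 * exam + quiz_bonus
    # Only the final grade is rounded: fractional grades above 5/10
    # round up, those below 5/10 round down.
    rounded = math.ceil(raw) if raw > 5 else math.floor(raw)
    return min(rounded, 10)

# Example: 8/10 homework, 7/10 colloquium, 6/10 exam, 0.5 bonus from quizzes
print(final_grade(8, 7, 6, quiz_bonus=0.5))  # 7.55 -> rounds up to 8
```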
Autogrades: if a score of 4/10 on the exam already suffices for the maximal final grade, it is given automatically. This may happen because of extra questions and bonuses from quizzes.
For students who want to pass with 4/10 with minimal effort: each year on the exam, I ask you to calculate the VC-dimension or Rademacher complexity of some class, so it should be easy to get 4/10 on the final exam. If you understand all lecture notes, you will pass the colloquium with the maximal score. Together this is enough. If only a few students fail and their grades are at least 3.8/10, then failed students may resubmit a few homework tasks to pull up the grade. (This has happened in each of the last 3 years.)
Colloquium
Rules and questions. Update 12/12: added question 24 and corrected typos.
Homeworks
Email to brbauwens-at-gmail.com. Start the subject line with SLT-HW.
The deadline is before the start of the lecture, every other lecture.
Sat. 17 Sept 18h10: problems 1.7, 1.8, 2.9, and 2.11
Sat. 01 Oct 18h10: see lists 3 and 4, and 2.10
Fri. 14 Oct 16h20: see problem lists 5 and 6
Sat. 05 Nov 20h00: see problem lists 7 and 8
Sat. 19 Nov 20h00: see problem lists 9 and 10
Sun. 04 Dec 23h59: see problem list 12; send it to maxkaledin@gmail.com with subject line SLT-HW-Reg <YourName>_<YourSurname>
Office hours
Person | Monday | Tuesday | Wednesday | Thursday | Friday |
---|---|---|---|---|---|
Bruno Bauwens | 15-20h | 18-20h | | | |
Maxim Kaledin | Write in Telegram; the time is flexible | | | | |
It is always good to send an email in advance. Questions and feedback are welcome.