Statistical learning theory 2023/24


General Information

Lectures: Tuesday 14h40--16h00, Bruno Bauwens and Maxim Kaledin, room S321 and on Zoom.

Seminars: TBA, Artur Goldman. The first seminar will be on 11.09 in room D201, 9:30--10:50, and also on Zoom: https://us02web.zoom.us/j/82300259484?pwd=NWxXekxBeE5yMm9UTmwvLzNNNGlnUT09.

To discuss the materials, join the Telegram group. The course is similar to last year's.


Homeworks

Deadlines are every two weeks, before the seminar.

Email your solutions to brbauwens-at-gmail.com. Start the subject line with SLT-HW. -Link with results todo-.


Course materials

| Date | Video | Summary | Slides | Lecture notes | Problem list | Solutions |
|---|---|---|---|---|---|---|
| Part 1. Online learning | | | | | | |
| 05 Sept | movies | Philosophy. The online mistake-bound model. The halving and weighted majority algorithms | sl01 | ch00 ch01 | prob01 | |
| 12 Sept | | The perceptron algorithm. Kernels. The standard optimal algorithm. | sl02 | ch02 ch03 | | |
| 19 Sept | | Prediction with expert advice. Recap of probability theory. Multi-armed bandits. | sl03 | ch04 ch05 | | |
| 26 Sept | | Multi-armed bandits. | sl03 | ch04 ch05 | | |
| Part 2. Distribution-independent risk bounds | | | | | | |
| 03 Oct | | Sample complexity in the realizable setting, simple examples and bounds using VC-dimension | sl04 | ch06 | | |
| 10 Oct | | Growth functions, VC-dimension, and the characterization of sample complexity with VC-dimensions | sl05 | ch07 ch08 | | |
| 17 Oct | | Risk decomposition and the fundamental theorem of statistical learning theory | sl06 | ch09 | | |
| 24 Oct | | Bounded differences inequality, Rademacher complexity, symmetrization, contraction lemma, quiz | sl07 | ch10 ch11 | | |
| Part 3. Margin risk bounds with applications | | | | | | |
| 07 Nov | | Simple regression, support vector machines, margin risk bounds, and neural nets | sl08 | ch12 ch13 | | |
| 14 Nov | | Kernels: RKHS, representer theorem, risk bounds | sl09 | ch14 | | |
| 21 Nov | | AdaBoost and the margin hypothesis | sl10 | ch15 | | |
| 28 Nov | | Implicit regularization of stochastic gradient descent in neural nets | | ch16 | | |
| Part 4. Neural tangent kernels | | | | | | |
| 05 Dec | | Optional: part 1. | | | | |
| 12 Dec | | Colloquium | | | | |
| 19 Dec | | Optional: part 2. | | | | |

Background on multi-armed bandits: A. Slivkins, Introduction to Multi-Armed Bandits, https://arxiv.org/pdf/1904.07272.pdf, 2022.

The lectures in October and November are based on the book: Foundations of Machine Learning, 2nd ed., Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. This book can be downloaded from Library Genesis (the link changes sometimes, and sometimes a VPN is needed).


Grading formula

Final grade = 0.35 * [homework score] + 0.35 * [colloquium score] + 0.3 * [exam score] + bonus from quizzes.

All homework questions have the same weight. Each solved extra homework task increases the score of the final exam by 1 point.

There is no rounding except for the final grade: fractional grades above 5/10 are rounded up, and those below 5/10 are rounded down.

Autograding: if a score of 4/10 on the exam would already give you the maximal final grade, that grade is awarded automatically. This may happen because of extra questions and bonuses from quizzes.
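
For concreteness, here is a minimal sketch of the grading arithmetic in Python. The function name, the cap at 10, and the treatment of a grade of exactly 5/10 are illustrative assumptions, not official course rules.

```python
import math

def final_grade(homework, colloquium, exam, quiz_bonus=0.0, extra_hw_tasks=0):
    """All component scores are on a 10-point scale."""
    # Each solved extra homework task adds 1 point to the exam score
    # (the cap at 10 is an assumption).
    exam = min(10.0, exam + extra_hw_tasks)
    raw = 0.35 * homework + 0.35 * colloquium + 0.3 * exam + quiz_bonus
    # Rounding happens only here: up above 5/10, down below 5/10.
    rounded = math.ceil(raw) if raw > 5 else math.floor(raw)
    return min(10, rounded)

# Example: 0.35*8 + 0.35*7 + 0.3*6 + 0.5 = 7.55, which rounds up to 8.
print(final_grade(homework=8, colloquium=7, exam=6, quiz_bonus=0.5))
```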


Colloquium

Rules and questions from the previous year.


Problems exam

December 21--30, TBA.
-- You may use handwritten notes, lecture materials from this wiki (either printed or on your PC), and Mohri's book.
-- You may not search the internet or interact with other people (e.g., by phone or on forums).


Office hours

Bruno Bauwens: Wednesday 13h--16h, Friday 14h--20h (please send an email in advance).

Maxim Kaledin: write in Telegram; the time is flexible.