Statistical learning theory 2022

General Information

Lectures: Fridays, 16h20 -- 17h40. Lecturers: Bruno Bauwens and Maxim Kaledin.

Seminars: 09.09 -- 01.10 on Saturdays, 14h40 -- 16h00; from 07.10 on Fridays, 18h10 -- 19h30. Seminars are taught by Artur Goldman.

To discuss the course materials, join the Telegram group.

The course is similar to last year's edition.

Homeworks

Email your solutions to brbauwens-at-gmail.com. Start the subject line with SLT-HW.

Deadlines are before the start of the lecture, every other lecture.

Sat. 17 Sept 18h10: problems 1.7, 1.8, 2.9, and 2.11
Sat. 01 Oct 18h10: see lists 3 and 4, and 2.10
Fri. 14 Oct 16h20: see problem lists 5 and 6
Fri. 04 Nov 16h20: see problem lists 7 and 8
Fri. 28 Nov 16h20: see problem lists 9 and 10
Fri. 02 Dec 16h20: see problem lists 11 and 12

Course materials

{| class="wikitable"
! Date !! Summary !! Video !! Slides !! Lecture notes !! Problem list !! Solutions
|-
| colspan="7" | '''Part 1. Online learning'''
|-
| 02 Sept || Philosophy. The online mistake bound model. The halving and weighted majority algorithms || movies || sl01 || ch00 ch01 || list 1 <span style="color:red">update 05.09</span> || solutions 1
|-
| 09 Sept || The perceptron algorithm. The standard optimal algorithm. || || [https://www.dropbox.com/s/sy959ee81mov5cr/02slides.pdf?dl=0 sl02] || [https://www.dropbox.com/s/p3auugqwc89132b/02book_sequentialOptimalAlgorithm.pdf?dl=0 ch02] [https://www.dropbox.com/s/b00dcqk1rob7rdz/03book_perceptron.pdf?dl=0 ch03] || [https://www.dropbox.com/s/88jgjvxo16zfrjs/02sem.pdf?dl=0 list 2] <span style="color:red">update 25.09</span> || [https://www.dropbox.com/s/pqblktfky8to5hr/02sol.pdf?dl=0 solutions 2]
|-
| 16 Sept || Kernels and the kernel perceptron algorithm. Prediction with expert advice. Recap of probability theory. || || sl03 || ch04 ch05 || list 3 || solutions 3
|-
| colspan="7" | '''Part 2. Distribution independent risk bounds'''
|-
| 23 Sept || Sample complexity in the realizable setting, simple examples and bounds using VC-dimension || || sl04 || ch06 || list 4 || solutions 4
|-
| 30 Sept || Growth functions, VC-dimension and the characterization of sample complexity with VC-dimensions || || sl05 || ch07 ch08 || list 5 ||
|-
| 07 Oct || Risk decomposition and the fundamental theorem of statistical learning theory || || sl06 || ch09 || list 6 ||
|-
| 14 Oct || Bounded differences inequality, Rademacher complexity, symmetrization, contraction lemma || || sl07 || ch10 ch11 || list 7 ||
|-
| colspan="7" | '''Part 3. Margin risk bounds with applications'''
|-
| 21 Oct || Simple regression, support vector machines, margin risk bounds, and neural nets || || sl08 || ch12 ch13 || list 8 ||
|-
| 04 Nov || Kernels: RKHS, representer theorem, risk bounds || || sl09 || ch14 || list 9 ||
|-
| 11 Nov || AdaBoost and the margin hypothesis || || sl10 || Mohri et al., chapter 7 || list 10 ||
|-
| 18 Nov || Implicit regularization of stochastic gradient descent in neural nets || || || || list 11 ||
|-
| colspan="7" | '''Part 4. Other topics'''
|-
| 25 Nov || Regression I: classical noise assumption, sub-Gaussian and sub-exponential noise || || || || list 12 ||
|-
| 02 Dec || Regression II: Ridge and Lasso regression || || || || list 13 ||
|-
| 09 Dec || Multi-armed bandits || || || || list 14 ||
|-
| 16 Dec || Colloquium || || || || ||
|}

The lectures in October and November are based on the book Foundations of Machine Learning, 2nd edition, by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. The book can be downloaded from Library Genesis (the link changes from time to time, and a VPN is sometimes needed).


Problems exam

Dates and problems: TBA

During the exam:
-- you may consult notes, books, and search the internet;
-- you may not interact with other people (e.g. by phone, forums, etc.).



Office hours

Person Monday Tuesday Wednesday Thursday Friday
Bruno Bauwens 15-20h 18-20h
Maxim Kaledin

It is always good to send an email in advance. Questions and feedback are welcome.