Statistical learning theory 2021

General Information

Grading

Teachers: Bruno Bauwens and Nikita Lukianenko

Lectures: Saturday 14:40 - 16:00. Lectures take place at Pokrovkaya and are also streamed in Zoom. On 18 and 25 Sept they are online only.

Seminars: Tuesday 16:20 - 17:40. Seminars take place at Pokrovkaya and are also streamed in Zoom.

See ruz for the rooms.

Practical information is in the Telegram group.

The course is similar to last year's, except for the order of topics and part 3.

Homeworks

Email homework to brbauwens-at-gmail.com. Start the subject line with SLT-HW.

Deadlines are before the lecture, every 2 weeks.

25 Sept: see problem lists 1 and 2 [updated Sept 16; hint for 2.8 added on Sept 23].

09 Oct: see problem lists 3 and 4

Etc.

Course materials

Date | Summary | Video | Slides | Lecture notes | Problem list | Solutions
Part 1. Online learning
4 Sept | Lecture: philosophy. Seminar: the online mistake bound model, the weighted majority, and perceptron algorithms | movies | 01sl | 00ch 01ch | 01prob (9 Sept) | 01sol
11 Sept | The perceptron algorithm in the agnostic setting. Kernels. The standard optimal algorithm. | | 02sl | 02ch 03ch | 02prob (23 Sept) | 02sol
18 Sept (online) | Prediction with expert advice and the exponentially weighted majority algorithm. Recap of probability theory. | | 03sl | ch05 ch04 todo | 03prob | 03sol
Part 2. Risk bounds for binary classification
25 Sept | Sample complexity in the realizable setting, simple examples and bounds using VC-dimension | | sl04 | ch06 ch07(draft) | |
2 Oct | Risk decomposition and the fundamental theorem of statistical learning theory | | | ch08(draft) | |
9 Oct | Rademacher complexity | | | | |
16 Oct | Support vector machines and margin risk bounds | | | | |
29 Oct | Kernels: risk bounds, design, and the representer theorem | | | | |
6 Nov | AdaBoost and risk bounds | | | | |
Part 3. Other topics
13 Nov | Clustering | | | | |
20 Nov | Dimensionality reduction and the Johnson-Lindenstrauss lemma | | | | |
27 Nov | Active learning | | | | |
4 Dec | Extra slot for a lesson, in the likely case we are running a bit behind. | | | | |
11 Dec | Colloquium | | | | |


The lectures in October and November are based on the book Foundations of Machine Learning, 2nd edition, by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. This book can be downloaded from http://gen.lib.rus.ec/ .


Office hours

Person Monday Tuesday Wednesday Thursday Friday
Bruno Bauwens, Zoom 12h30-14h30 14h-20h Room S834 Pokrovkaya 11
Nikita Lukianenko, Telegram 14h30-16h30 14h30-16h30 Room S831 Pokrovkaya 11

It is always good to send an email in advance. Questions and feedback are welcome.

I am traveling from Sept 12 to Sept 30 and from Oct 16 to Oct 26. On Fridays I'm available till 16h30.