Statistical learning theory 2021

From the Wiki of the Faculty of Computer Science

General Information


Teachers: Bruno Bauwens and Nikita Lukianenko

Lectures: Saturday 14:40 - 16:00, in room R308 (Pokrovkaya); also streamed here in Zoom.

Seminars: Tuesday 16:20 - 17:40, in room ?? (Pokrovkaya); also streamed here in Zoom.

Practical information is posted in the Telegram group.

The course is similar to last year's, except for the order of topics and Part 3.

Course materials

Date Summary Slides Lecture notes Problem list Solutions
Part 1. Online learning
4 Sept Lecture: philosophy (video). Seminar: the online mistake-bound model, the weighted majority and perceptron algorithms 01sl.pdf
11 Sept The standard optimal algorithm, prediction with expert advice, exponentially weighted algorithm
18 Sept Better mistake bounds using the VC-dimension. Recap of probability theory. Leave-one-out risk for SVM.
Part 2. Supervised classification
25 Sept Sample complexity in the realizable setting, simple example and bounds using VC-dimension
2 Oct Risk decomposition and the fundamental theorem of statistical learning theory
9 Oct Rademacher complexity
16 Oct Support vector machines and margin risk bounds
29 Oct Kernels: risk bounds, design, and representer theorem
6 Nov AdaBoost and risk bounds
Part 3. Other topics
13 Nov Clustering
20 Nov Dimensionality reduction and the Johnson-Lindenstrauss lemma
27 Nov Active learning
4 Dec Spare slot, in the likely case we run a bit behind schedule.
11 Dec Colloquium
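
The first seminar covers the weighted majority algorithm. As an illustrative sketch only (the function name and the weight-halving factor beta are my own choices, not taken from the course materials), the algorithm can be written as:

```python
# Sketch of the weighted majority algorithm (online mistake-bound model).
# Each expert starts with weight 1; the learner follows the weighted vote,
# and every expert that errs has its weight multiplied by beta (here 1/2).

def weighted_majority(expert_preds, outcomes, beta=0.5):
    """Run weighted majority over binary predictions.

    expert_preds: per-round lists, expert_preds[t][i] in {0, 1}
    outcomes: true labels, outcomes[t] in {0, 1}
    Returns the number of mistakes made by the learner.
    """
    n = len(expert_preds[0])
    weights = [1.0] * n
    mistakes = 0
    for preds, y in zip(expert_preds, outcomes):
        # Weighted vote of the experts for each label.
        vote_one = sum(w for w, p in zip(weights, preds) if p == 1)
        vote_zero = sum(w for w, p in zip(weights, preds) if p == 0)
        guess = 1 if vote_one >= vote_zero else 0
        if guess != y:
            mistakes += 1
        # Penalize every expert that erred this round.
        weights = [w * beta if p != y else w
                   for w, p in zip(weights, preds)]
    return mistakes
```

With beta = 1/2 the standard bound guarantees at most 2.41 (m* + log2 n) mistakes, where m* is the number of mistakes of the best expert and n the number of experts.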

The lectures in October and November are based on the book Foundations of Machine Learning, 2nd ed., by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. The book can be downloaded from .

Office hours

Person Monday Tuesday Wednesday Thursday Friday
Bruno Bauwens, Zoom 12h30-14h30 14h-20h Room S834 Pokrovkaya 11

It is always good to send an email in advance. Questions and feedback are welcome.