Statistical learning theory 2021
From the Wiki of the Faculty of Computer Science
Revision as of 12:45, 2 September 2021; Bbauwens (talk | contribs)
General Information
Teachers: Bruno Bauwens and Nikita Lukianenko
Lectures: Tuesdays, 9h30 - 10h50, Zoom
Seminars: Tuesdays, 11h10 - 12h30
Practical information is in the Telegram group.
The course is similar to last year's, except for the order of topics.
Course materials
Date | Summary | Lecture notes | Problem list | Solutions
---|---|---|---|---
Part 1. Online learning | | | |
7 Sept | Introduction, the online mistake-bound model, the weighted majority and perceptron algorithms | | |
14 Sept | The standard optimal algorithm, prediction with expert advice, the exponentially weighted algorithm | | |
21 Sept | Better mistake bounds using VC-dimensions. Recap of probability theory. Leave-one-out risk for SVM. | | |
Part 2. Supervised classification | | | |
28 Sept | Sample complexity in the realizable setting, a simple example, and bounds using the VC-dimension | | |
5 Oct | Risk decomposition and the fundamental theorem of statistical learning theory | | |
12 Oct | Rademacher complexity | | |
26 Oct | Support vector machines and margin risk bounds | | |
2 Nov | Kernels: risk bounds, design, and the representer theorem | | |
9 Nov | AdaBoost and risk bounds | | |
Part 3. Other topics | | | |
16 Nov | Clustering | | |
23 Nov | Dimensionality reduction and the Johnson-Lindenstrauss lemma | | |
30 Nov | Active learning | | |
7 Dec | Reserve slot, in the likely case we run behind schedule | | |
14 Dec | Colloquium | | |
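As a taste of the first lecture's material, here is a minimal sketch of the weighted majority algorithm in the mistake-bound model: the learner follows a weighted vote of the experts and multiplicatively penalizes each expert that errs. The function name, the penalty parameter `eta`, and the toy data are illustrative choices, not taken from the course materials.

```python
import numpy as np

def weighted_majority(expert_preds, outcomes, eta=0.5):
    """Run weighted majority over T rounds.

    expert_preds: (T, n) array of {0,1} expert predictions per round.
    outcomes:     (T,) array of true {0,1} labels.
    eta:          penalty rate; a wrong expert's weight is scaled by (1 - eta).
    Returns the learner's mistake count and the final expert weights.
    """
    n_experts = expert_preds.shape[1]
    weights = np.ones(n_experts)
    mistakes = 0
    for preds, y in zip(expert_preds, outcomes):
        # Predict by weighted majority vote over the experts.
        vote = 1 if weights[preds == 1].sum() >= weights[preds == 0].sum() else 0
        if vote != y:
            mistakes += 1
        # Multiplicatively penalize every expert that was wrong this round.
        weights[preds != y] *= (1 - eta)
    return mistakes, weights
```

With eta = 1/2 the standard bound says the learner makes at most O(log n + m*) mistakes, where m* is the mistake count of the best expert; the lecture derives the exact constants.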
The lectures in October and November are based on the book: Foundations of Machine Learning, 2nd ed., Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. This book can be downloaded from http://gen.lib.rus.ec/ .
Office hours
Person | Monday | Tuesday | Wednesday | Thursday | Friday
---|---|---|---|---|---
Bruno Bauwens, Zoom (email in advance) | 12h30-14h30 | 14h-20h | Room S834, Pokrovsky Bulvar 11 | |
It is always good to send an email in advance. Questions and feedback are welcome.