Statistical learning theory 2021
From the Wiki of the Faculty of Computer Science
Version of 12:12, 2 September 2021 by Bbauwens (talk | contribs)
General Information
Teachers: Bruno Bauwens and Nikita Lukianenko
Lectures: Tuesdays 9h30 - 10h50, Zoom
Seminars: Tuesdays 11h10 - 12h30
Practical information is posted in the Telegram group.
Course materials
| Date | Summary | Lecture notes | Problem list | Solutions |
|---|---|---|---|---|
| Part 1. Online learning | | | | |
| 7 Sept | Introduction, the online mistake-bound model, the weighted majority and perceptron algorithms | | | |
| 14 Sept | The standard optimal algorithm, prediction with expert advice, the exponentially weighted algorithm | | | |
| 21 Sept | Better mistake bounds using VC-dimensions. Recap of probability theory. Leave-one-out risk for SVM. | | | |
| Part 2. Supervised classification | | | | |
| 28 Sept | Sample complexity in the realizable setting, a simple example, and bounds using VC-dimension | | | |
| 5 Oct | Risk decomposition and the fundamental theorem of statistical learning theory | | | |
| 12 Oct | Rademacher complexity | | | |
| 26 Oct | Support vector machines and margin risk bounds | | | |
| 2 Nov | AdaBoost and risk bounds | | | |
| Part 3. Other topics | | | | |
| 9 Nov | Clustering | | | |
| 16 Nov | Dimensionality reduction and the Johnson-Lindenstrauss lemma | | | |
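As a taster for the first lecture's topic, here is a minimal sketch of the deterministic weighted majority algorithm (illustrative only, not part of the official course materials): each of n experts predicts a bit, the learner predicts by weighted vote, and every expert that errs has its weight multiplied by beta.

```python
def weighted_majority(expert_predictions, outcomes, beta=0.5):
    """Deterministic weighted majority algorithm (sketch).

    expert_predictions: list of rounds, each a list of n bits (one per expert).
    outcomes: list of the true bits, one per round.
    Returns the number of mistakes made by the learner.
    """
    n = len(expert_predictions[0])
    weights = [1.0] * n  # all experts start with equal weight
    mistakes = 0
    for preds, y in zip(expert_predictions, outcomes):
        # Predict the label with the larger total weight behind it.
        vote_one = sum(w for w, p in zip(weights, preds) if p == 1)
        vote_zero = sum(w for w, p in zip(weights, preds) if p == 0)
        guess = 1 if vote_one >= vote_zero else 0
        if guess != y:
            mistakes += 1
        # Multiplicatively penalize every expert that was wrong this round.
        weights = [w * beta if p != y else w
                   for w, p in zip(weights, preds)]
    return mistakes
```

If some expert is always correct, the learner's mistakes are bounded by O(log n), which is one of the mistake bounds discussed in the first lecture.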
The lectures in October and November are based on the book: Foundations of Machine Learning, 2nd ed., Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. This book can be downloaded from http://gen.lib.rus.ec/ .
For online learning, we also study a few topics from the lecture notes by Н. К. Животовский.
Office hours
| Person | Monday | Tuesday | Wednesday | Thursday | Friday |
|---|---|---|---|---|---|
| Bruno Bauwens, Zoom (email in advance) | 14h-18h | 16h15-20h | Room S834, Pokrovkaya 11 | | |
It is always good to send an email in advance. Questions are welcome.