Statistical learning theory 2021
Version of 12:10, 2 September 2021
General Information
Teachers: Bruno Bauwens and Nikita Lukianenko
Lectures: Tuesdays, 9h30-10h50, on Zoom
Seminars: Tuesdays, 11h10-12h30
Practical information is posted in the Telegram group.
Course materials
Date | Summary | Lecture notes | Problem list | Solutions
---|---|---|---|---
Part 1. Online learning | | | |
7 Sept | Introduction, the online mistake bound model, the weighted majority and perceptron algorithms | | |
14 Sept | The standard optimal algorithm, prediction with expert advice, the exponentially weighted algorithm | | |
21 Sept | Better mistake bounds using VC-dimensions. Recap of probability theory. Leave-one-out risk for SVMs. | | |
Part 2. Supervised classification | | | |
28 Sept | Sample complexity in the realizable setting, a simple example, and bounds using VC-dimension | | |
5 Oct | Risk decomposition and the fundamental theorem of statistical learning theory | | |
12 Oct | Rademacher complexity | | |
26 Oct | Support vector machines and margin risk bounds | | |
2 Nov | AdaBoost and risk bounds | | |
Part 3. Other topics | | | |
9 Nov | Clustering | | |
16 Nov | Dimensionality reduction and the Johnson-Lindenstrauss lemma | | |
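To give a flavor of the first lecture's topic, here is a minimal sketch of the online perceptron in the mistake-bound model: the algorithm updates its weight vector only when it misclassifies a point, and the number of updates is the quantity bounded in class. The function name and the toy dataset below are illustrative assumptions, not part of the course materials.

```python
import numpy as np

def perceptron(X, y, max_epochs=100):
    """Online perceptron for labels y in {-1, +1}.
    Returns the learned weight vector and the number of mistakes (updates)."""
    w = np.zeros(X.shape[1])
    mistakes = 0
    for _ in range(max_epochs):
        updated = False
        for x, label in zip(X, y):
            # A mistake: the current weights do not classify (x, label) correctly.
            if label * np.dot(w, x) <= 0:
                w += label * x          # the perceptron update rule
                mistakes += 1
                updated = True
        if not updated:                 # a full pass with no mistakes: done
            break
    return w, mistakes

# Linearly separable toy data; the last coordinate is a constant bias feature.
X = np.array([[1, 2, 1], [2, 3, 1], [-1, -2, 1], [-2, -1, 1]], dtype=float)
y = np.array([1, 1, -1, -1])
w, m = perceptron(X, y)
assert all(np.sign(X @ w) == y)   # all points classified correctly
```

On separable data the classical bound guarantees at most (R/gamma)^2 mistakes, where R bounds the norms of the points and gamma is the margin of the best separator.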
The lectures in October and November are based on the book: Foundations of Machine Learning, 2nd ed., by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. The book can be downloaded from http://gen.lib.rus.ec/ .
For online learning, we also study a few topics from lecture notes by Н. К. Животовский.
Office hours
Person | Monday | Tuesday | Wednesday | Thursday | Friday
---|---|---|---|---|---
Bruno Bauwens, Zoom (email in advance) | 14h-18h | 16h15-20h | Room S834, Pokrovkaya 11 | |
It is always good to send an email in advance. Questions are welcome.