Statistical learning theory 2021
Version of 18:27, 2 September 2021
General Information
Teachers: Bruno Bauwens and Nikita Lukianenko
Lectures: Saturday, 14:40-16:00. Lectures can be attended in Pokrovkaya and are also streamed on Zoom: https://us02web.zoom.us/j/82173400975?pwd=L1lhTzFTc2lGem5BVFdRcFEyVUhqZz09
Seminars: Tuesday, 16:20-17:40. Seminars take place in Pokrovkaya and are also streamed on Zoom: https://us02web.zoom.us/j/82612783590?pwd=U0FwOUVkRjYzZlF1blc2d1FNT1FZQT09
Practical information is posted in the Telegram group: https://t.me/joinchat/IER2-8hc0wUxNDQ0
The course is similar to last year's, except for the order of topics and Part 3.
Course materials
Date | Summary | Lecture notes | Problem list | Solutions
---|---|---|---|---
Part 1. Online learning | | | |
7 Sept | Introduction, the online mistake bound model, and the weighted majority and perceptron algorithms | | |
14 Sept | The standard optimal algorithm, prediction with expert advice, and the exponentially weighted algorithm | | |
21 Sept | Better mistake bounds using VC dimensions. Recap of probability theory. Leave-one-out risk for SVM. | | |
Part 2. Supervised classification | | | |
28 Sept | Sample complexity in the realizable setting: a simple example and bounds using the VC dimension | | |
5 Oct | Risk decomposition and the fundamental theorem of statistical learning theory | | |
12 Oct | Rademacher complexity | | |
26 Oct | Support vector machines and margin risk bounds | | |
2 Nov | Kernels: risk bounds, design, and the representer theorem | | |
9 Nov | AdaBoost and risk bounds | | |
Part 3. Other topics | | | |
16 Nov | Clustering | | |
23 Nov | Dimensionality reduction and the Johnson-Lindenstrauss lemma | | |
30 Nov | Active learning | | |
7 Dec | Reserve slot, in the likely case we run behind schedule | | |
14 Dec | Colloquium | | |
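The 7 Sept lecture introduces the weighted majority algorithm. The following is a minimal sketch of its deterministic variant, not the course's own code; the function name, the penalty factor `beta`, and the tie-breaking rule are illustrative assumptions.

```python
def weighted_majority(expert_preds, outcomes, beta=0.5):
    """Deterministic weighted majority over binary experts (a sketch).

    expert_preds: one list of 0/1 predictions per round, one entry per expert.
    outcomes: the true 0/1 label for each round.
    Returns the number of mistakes made by the aggregated predictor.
    """
    n_experts = len(expert_preds[0])
    weights = [1.0] * n_experts
    mistakes = 0
    for preds, y in zip(expert_preds, outcomes):
        # Predict by weighted vote (ties broken towards 1, an arbitrary choice).
        vote1 = sum(w for w, p in zip(weights, preds) if p == 1)
        vote0 = sum(w for w, p in zip(weights, preds) if p == 0)
        guess = 1 if vote1 >= vote0 else 0
        if guess != y:
            mistakes += 1
        # Multiplicatively shrink the weight of every expert that erred.
        weights = [w * beta if p != y else w for w, p in zip(weights, preds)]
    return mistakes
```

With one perfect expert among the pool, the multiplicative penalty quickly concentrates the vote on it, which is the source of the mistake bound of the form O(log n) plus a constant times the best expert's mistakes.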
The lectures in October and November are based on the book: Foundations of Machine Learning, 2nd ed., by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. The book can be downloaded from http://gen.lib.rus.ec/ .
Office hours
Person | Monday | Tuesday | Wednesday | Thursday | Friday
---|---|---|---|---|---
Bruno Bauwens, Zoom | 12h30-14h30 | 14h-20h | Room S834, Pokrovkaya 11 | |
It is always good to send an email in advance. Questions and feedback are welcome.