Statistical learning theory 2020
Version of 18:22, 11 September 2020
General Information
Lectures: Saturday 9h30 - 10h50, zoom https://zoom.us/j/96210489901
Teachers: Bruno Bauwens and Vladimir Podolskii
Seminar for group 1: Saturday 11h10 - 12h30, Bruno Bauwens and Vladimir Podolskii, zoom https://zoom.us/j/94186131884
Seminar for group 2: Tuesday ??, Nikita Lukyanenko
Course materials
| Date | Summary | Lecture notes | Problem list | Solutions |
|---|---|---|---|---|
| 12 Sept | Introduction and sample complexity in the realizable setting | lecture1.pdf | Problem list 1 | Solutions 1 |
| 19 Sept | VC-dimension and sample complexity | | | |
| 26 Sept | Risk bounds and the fundamental theorem of statistical learning theory | | | |
| 03 Nov | Rademacher complexity and margin assumption | | | |
A gentle introduction to the material of the first three lectures, together with an overview of probability theory, can be found in chapters 1-6 and 11-12 of: Sanjeev Kulkarni and Gilbert Harman, An Elementary Introduction to Statistical Learning Theory, 2012.

Afterward, we hope to cover chapters 1-8 of: Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, Foundations of Machine Learning, 2012. These books can be downloaded from http://gen.lib.rus.ec/ . (We will study a new boosting algorithm, based on the paper: )

Office hours