Statistical learning theory 2020


General Information

Lectures: Saturday 9h30 - 10h50, Zoom: https://zoom.us/j/96210489901

Teachers: Bruno Bauwens and Vladimir Podolskii

Seminar for group 1: Saturday 11h10 - 12h30, Bruno Bauwens and Vladimir Podolskii, Zoom: https://zoom.us/j/94186131884

Seminar for group 2: Tuesday ??, Nikita Lukyanenko

Course materials

{|
! Date !! Summary !! Lecture notes !! Problem list !! Solutions
|-
| 12 Sept || Introduction and sample complexity in the realizable setting || [https://www.dropbox.com/s/l8e8xjfe2f8tjz8/01lect.pdf?dl=0 lecture1.pdf] || [https://www.dropbox.com/s/kicoo9xf356eam5/01lect.pdf?dl=0 Problem list 1] ||
|-
| 19 Sept || VC-dimension and sample complexity || || ||
|-
| 26 Sept || Risk bounds and the fundamental theorem of statistical learning theory || || ||
|-
| 03 Oct || Rademacher complexity and margin assumption || || ||
|}


A gentle introduction to the material of the first three lectures, together with an overview of probability theory, can be found in chapters 1-6 and 11-12 of the book: Sanjeev Kulkarni and Gilbert Harman, An Elementary Introduction to Statistical Learning Theory, 2012.

Afterward, we hope to cover chapters 1-8 of the book: Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, Foundations of Machine Learning, 2012. Both books can be downloaded from http://gen.lib.rus.ec/.

Office hours

{|
! Person !! Monday !! Tuesday !! Wednesday !! Thursday !! Friday
|-
| Bruno Bauwens (Room 620) || || || || ||
|}