Statistical learning theory 2018 2019

From the Wiki of the Faculty of Computer Science

Version as of 15:12, 24 October 2018

General Information

The syllabus

Questions for the colloquium on 29 October. (Lectures 1–8 updated 24/10.)

Deadline for homework 1: October 2nd. Questions: see seminars 3 and 4.

Deadline for homework 2: October 27th. Questions: see seminars 5–8 below.

Deadline for homework 3: TBA.

Marks

Intermediate exams: October 29th.

Course materials

{|
! Date !! Summary !! Lecture notes !! Problem list !! Solutions
|-
| 3 Sept || PAC-learning in the realizable setting: definitions || [https://www.dropbox.com/s/l8e8xjfe2f8tjz8/01lect.pdf?dl=0 lecture1.pdf] updated 23/09 || [https://www.dropbox.com/s/4ic3ce71znglmu9/01sem.pdf?dl=0 Problem list 1] || [https://www.dropbox.com/s/cixli4sghy0w01q/01solution.pdf?dl=0 Solutions 1]
|-
| 10 Sept || VC-dimension and growth functions || [https://www.dropbox.com/s/q1jc2dlotwdn9e2/02lect.pdf?dl=0 lecture2.pdf] updated 23/09 || [https://www.dropbox.com/s/4gimo3fij5p7lnc/02sem.pdf?dl=0 Problem list 2] || [https://www.dropbox.com/s/69pnkefexsmq6nu/02solution.pdf?dl=0 Solutions 2]
|-
| 17 Sept || Proof that finite VC-dimension implies PAC-learnability || [https://www.dropbox.com/s/9rfvwvf0ne95j8e/03lect.pdf?dl=0 lecture3.pdf] updated 23/09 || [https://www.dropbox.com/s/jb9mriumhtdpn8m/03sem.pdf?dl=0 Problem list 3] || [https://www.dropbox.com/s/f0gnrfxv9i7at91/03solution.pdf?dl=0 Solutions 3]
|-
| 24 Sept || Applications to decision trees and threshold neural networks. Agnostic PAC-learnability. || [https://www.dropbox.com/s/9oa2zg7jz2ovquf/04lect.pdf?dl=0 lecture4.pdf] || [https://www.dropbox.com/s/l2d9f7u77smrx4u/04sem.pdf?dl=0 Problem list 4] ||
|-
| 1 Oct || Agnostic PAC-learnability is equivalent to finite VC-dimension; structural risk minimization || lecture5.pdf updated 14/10 || Problem list 5 ||
|-
| 9 Oct || Boosting (Mohri's book, pages 121–131) || lecture6.pdf updated 23/10 || Problem list 6 ||
|-
| 15 Oct || Rademacher complexity and the contraction lemma (= Talagrand's lemma); Mohri's book, pages 33–41 and 78–79 || lecture7.pdf || Problem list 7 ||
|-
| 21 Oct || Margin theory and risk bounds for boosting || lecture8.pdf || Problem list 8 ||
|}
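As a small illustration of the growth-function material from the 10 Sept lecture (this sketch is not part of the course materials, and the function names are ours), the following code checks by brute force that threshold classifiers h_t(x) = 1[x ≥ t] on the real line shatter any single point but no pair of points, so their VC-dimension is 1.

```python
def labelings_by_thresholds(points):
    """Return the set of labelings of `points` realizable by some threshold.

    For thresholds h_t(x) = 1[x >= t], only threshold positions just below,
    between, or above the sample points yield distinct labelings, so it
    suffices to try one representative threshold per gap.
    """
    candidates = [min(points) - 1] + list(points) + [max(points) + 1]
    return {tuple(1 if x >= t else 0 for x in points) for t in candidates}


def is_shattered(points):
    """True iff thresholds realize all 2^n labelings of `points`."""
    return len(labelings_by_thresholds(points)) == 2 ** len(points)


print(is_shattered([0.5]))       # True: any single point is shattered
print(is_shattered([0.3, 0.7]))  # False: the labeling (1, 0) is unrealizable
```

With two points x1 < x2, a threshold can never label x1 positive and x2 negative, so only 3 of the 4 labelings occur; this matches the growth function Π(m) = m + 1 for thresholds.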

A gentle introduction to the material of the first three lectures, together with an overview of probability theory, can be found in chapters 1–6 and 11–12 of: Sanjeev Kulkarni and Gilbert Harman, An Elementary Introduction to Statistical Learning Theory, 2012.

Afterward, we hope to cover chapters 1–8 of Foundations of Machine Learning by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2012. Both books can be downloaded from http://gen.lib.rus.ec/ .


Office hours

Person Monday Tuesday Wednesday Thursday Friday
Bruno Bauwens 16:45–19:00 15:05–18:00 Room 620


Russian texts

The following links may help students who have difficulty with English. A lecture on VC-dimension was given by K. Vorontsov. A course on statistical learning theory by Nikita Zhivotovsky is given at MIPT. A short description of PAC learning can be found on p. 136 of the book «Наука и искусство построения алгоритмов, которые извлекают знания из данных» by Peter Flach (the Russian translation of his machine learning textbook). On machinelearning.ru you can find brief and clear definitions.