Statistical learning theory

Material from the Wiki of the Faculty of Computer Science
Version of 16:41, 9 October 2017

General Information

Syllabus for the 1st module

The intermediate exam (exercises) will take place on Tuesday, Oct. 31; the colloquium (theory questions) will be held in smaller groups around this date.

Homework

Homework module 1 (Oct. 1: corrected a typo in question 2b.) The deadline for submission is Sunday, the 15th of October (3-day extension upon request). Submit either by email, on paper during the lecture, or by placing it under the door of office 620.

Additional clarifications based on students' questions. Homework defenses will take place from Oct. 16 to Oct. 20. Check your HSE email to reserve a time slot.

Course materials

Date | Summary | Lecture notes | Problem list
5 Sept | PAC-learning and VC-dimension: definitions | 1st and 2nd lecture (updated on the 13th of Sept.) | Problem list 1
12 Sept | PAC-learning and VC-dimension: proof of the fundamental theorem | | Problem list 2
19 Sept | Sauer's lemma, neural networks and agnostic PAC-learning | 3rd lecture (updated on the 23rd of Sept.) | Problem list 3
26 Sept | Measure concentration, agnostic PAC-learning and computational learning theory | 4th lecture | Problem list 4
3 Oct | Boosting: the AdaBoost algorithm | 5th lecture (part about agnostic learning); on boosting, see chapter 6 of Mohri's book (below) | Problem list 5
10 Oct | Boosting: risk bounds using Rademacher complexities, and a new algorithm | Mohri's book: pp. 33–40, 75–83 and 131–137 (not all proofs) |
17 Oct | Online learning algorithms | |
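As a quick illustration of the AdaBoost algorithm covered in the 3 Oct lecture, here is a minimal sketch using decision stumps as weak learners. This is not course code; all names and the toy data are illustrative only.

```python
import numpy as np

def adaboost(X, y, T=20):
    """Minimal AdaBoost with decision stumps; labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                      # example weights D_t
    stumps, alphas = [], []
    for _ in range(T):
        # pick the stump (feature j, threshold thr, sign s) with least weighted error
        best = None
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, s)
        err, j, thr, s = best
        err = max(err, 1e-12)                    # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)    # weight of this weak learner
        pred = s * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)           # upweight misclassified examples
        w /= w.sum()
        stumps.append((j, thr, s))
        alphas.append(alpha)

    def predict(Xq):
        agg = sum(a * s * np.where(Xq[:, j] <= thr, 1, -1)
                  for a, (j, thr, s) in zip(alphas, stumps))
        return np.sign(agg)
    return predict

# toy usage: a 1-D threshold concept
X = np.array([[0.], [1.], [2.], [3.], [4.], [5.]])
y = np.array([1, 1, 1, -1, -1, -1])
h = adaboost(X, y, T=5)
print(h(X))  # recovers the labels on this separable toy set
```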

A gentle introduction to the material of the first three lectures, together with an overview of probability theory, can be found in chapters 1–6 and 11–12 of the following book: Sanjeev Kulkarni and Gilbert Harman, An Elementary Introduction to Statistical Learning Theory, 2012.

Foundations of Machine Learning, Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2012. These books can be downloaded from http://gen.lib.rus.ec/ .

(We will study a new algorithm based on the paper: Multi-class deep boosting, V. Kuznetsov, M. Mohri, and U. Syed, Advances in Neural Information Processing Systems, pp. 2501–2509, 2014. Notes will be provided.)
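The risk bounds in the 10 Oct lecture (and in the deep boosting paper above) are stated in terms of empirical Rademacher complexity. For a small finite hypothesis class it can be estimated by Monte Carlo; the sketch below is illustrative only, with made-up function names and data.

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_rademacher(preds, n_samples=2000):
    """Monte-Carlo estimate of R_hat(H) = E_sigma [ sup_h (1/m) sum_i sigma_i h(x_i) ].

    preds: array of shape (|H|, m), row h gives the predictions h(x_i) in {-1, +1}.
    """
    m = preds.shape[1]
    total = 0.0
    for _ in range(n_samples):
        sigma = rng.choice([-1, 1], size=m)   # i.i.d. Rademacher signs
        total += np.max(preds @ sigma) / m    # sup over hypotheses of correlation
    return total / n_samples

# a toy class of 3 threshold classifiers evaluated on 5 points
preds = np.array([[ 1,  1,  1,  1,  1],
                  [ 1,  1, -1, -1, -1],
                  [-1, -1, -1, -1, -1]])
r = empirical_rademacher(preds)
print(round(r, 2))
```

A richer class correlates better with random signs, so the estimate grows with |H|; this is exactly the quantity that enters the generalization bounds on pp. 33–40 of Mohri's book.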

Office hours

Person Monday Tuesday Wednesday Thursday Friday
Bruno Bauwens 15:05–18:00 15:05–18:00 Room 620
Quentin Paris