Statistical learning theory


Help yourself and others

A number of students have trouble with the math and the English. Help yourself and other students by emailing your questions. Don't be ashamed! Tell me what is difficult to understand in the notes. If you find papers or books in Russian, tell me. I will maintain a list of questions, answers, and advice here (as soon as I receive them).

Exams module 1

There are two exams.

Problems exam: Tuesday 31 Oct, 12h10-15h00. This exam has weight 0.2 in your final grade. You solve exercises similar to those from the seminars. You may bring lecture notes, handwritten notes, and pages from Chapt. 3, Sect. 4.4, and Chapt. 6 of the book "Foundations of Machine Learning" by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar.

Colloquium exam: This exam counts for 0.2 of your final grade. You will receive a lemma, proposition, or theorem from the lecture notes (plus a few topics from the seminars). You write out the proof, and the teacher asks questions to check your understanding. A list of questions will be posted here. You can find your subgroup in the table below.

Group | Date | Time | Room
БПМИ 141-1 | Wednesday 1st of November | 12h10-15h40 |
БПМИ 141-2 | Wednesday 1st of November | 13h40-16h10 |
БПМИ 142-1 | Wednesday 1st of November | 16h40-18h40 |
БПМИ 142-2 | Wednesday 1st of November | 17h40-19h40 |
БПМИ 143+145 | Thursday 2nd of November | 15h10-17h10 |
БПМИ 144 | Thursday 2nd of November | 16h40-18h40 |
3rd year | Friday 3rd of November | 15h10-17h40 |

Your homework score has weight 0.1 in your final grade. Activities in the second module account for the remaining 0.5 of the final grade.

Homework

Homework module 1

General Information

Syllabus for the 1st module


Course materials

Date | Summary | Lecture notes | Problem list
5 Sept | PAC-learning and VC-dimension: definitions | 1st and 2nd lecture (updated on the 13th of Sept) | Problem list 1
12 Sept | PAC-learning and VC-dimension: proof of the fundamental theorem | | Problem list 2
19 Sept | Sauer's lemma, neural networks, and agnostic PAC-learning | 3rd lecture (updated on the 23rd of Sept) | Problem list 3
26 Sept | Measure concentration, agnostic PAC-learning, and computational learning theory | 4th lecture | Problem list 4
3 Oct | Boosting: the AdaBoost algorithm | 5th lecture (part about agnostic learning); for boosting, see Chapt. 6 in Mohri's book (see below) | Problem list 5
10 Oct | Boosting: risk bounds using Rademacher complexities | Mohri's book, p. 33-40; Talagrand's lemma; McDiarmid's inequality; 6th lecture (draft) | Problem list 6
17 Oct | Margin theory and a deep boosting algorithm | Mohri's book, p. 75-83 and p. 131-136 (see the paper below) |

A gentle introduction to the material of the first three lectures, together with an overview of probability theory, can be found in chapters 1-6 and 11-12 of the following book: Sanjeev Kulkarni and Gilbert Harman, An Elementary Introduction to Statistical Learning Theory, 2012.

Foundations of Machine Learning, Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2012. Both books can be downloaded from http://gen.lib.rus.ec/ .

(We will study a new boosting algorithm based on the paper: Multi-class deep boosting, V. Kuznetsov, M. Mohri, and U. Syed, Advances in Neural Information Processing Systems, p. 2501-2509, 2014. Notes will be provided.)
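
For the 3 Oct lecture on AdaBoost, the following minimal Python sketch may help when working through problem list 5. It is an illustrative toy implementation, not the course's reference code: labels are assumed to be in {-1, +1}, the weak learners are decision stumps, and the dataset at the end is made up for the example.

import numpy as np

def train_stump(X, y, w):
    # Exhaustive search for the stump (feature, threshold, sign) with the
    # smallest weighted error under the current distribution w.
    best_err, best_stump = np.inf, None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (+1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best_stump = err, (j, thr, sign)
    return best_err, best_stump

def adaboost(X, y, T=30):
    n = len(y)
    w = np.full(n, 1.0 / n)                    # uniform initial distribution D_1
    ensemble = []
    for _ in range(T):
        err, (j, thr, sign) = train_stump(X, y, w)
        err = max(err, 1e-12)                  # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this weak learner
        pred = sign * np.where(X[:, j] <= thr, 1, -1)
        w = w * np.exp(-alpha * y * pred)      # exponential reweighting
        w = w / w.sum()                        # renormalize to get D_{t+1}
        ensemble.append((alpha, j, thr, sign))
    return ensemble

def predict(ensemble, X):
    # Sign of the weighted vote of all stumps.
    score = sum(a * s * np.where(X[:, j] <= t, 1, -1)
                for a, j, t, s in ensemble)
    return np.sign(score)

# Toy usage: learn a noise-free interval on the line.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.where(np.abs(X[:, 0]) < 0.5, 1, -1)
ensemble = adaboost(X, y, T=30)
print("training error:", np.mean(predict(ensemble, X) != y))

A single stump cannot represent the interval, but a weighted combination of stumps can, which is exactly the point of boosting. The brute-force stump search is fine for a toy dataset but scales quadratically with the sample size.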

Office hours

Person | Hours | Room
Bruno Bauwens | 15:05-18:00 (two days a week) | 620
Quentin Paris | |