Statistical learning theory
Help yourself and others
Some students have trouble with the math and the English. Send me emails with questions, and tell me what is difficult to understand in the notes. If you find helpful papers or books in Russian, tell me. Questions and answers that I have received so far.
Tatiana has sent me the following links that might help those who have trouble with English. A lecture on VC-dimension by K. Vorontsov. A course on Statistical Learning Theory by Nikita Zhivotovsky, given at MIPT. A short description of PAC-learning on p. 136 of the book «Наука и искусство построения алгоритмов, которые извлекают знания из данных», Петер Флах (the Russian translation of Peter Flach's "Machine Learning"). On machinelearning.ru you can find brief and clear definitions.
Exams module 1
Consultation: Monday 30th of October, 9h30-11h50, classroom 435. I will answer questions from all interested students.
There are two exams.
Problems exam: Tuesday 31st of October, 12h10-15h00. The score of this exam has weight 0.2 in your final grade. You solve exercises similar to the ones in the seminars. You can bring lecture notes, handwritten notes, and pages from Chapter 3, Section 4.4, and Chapter 6 of the book "Foundations of Machine Learning" by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar.
Colloquium exam: This exam counts for 0.2 of your final grade. You will receive a lemma, proposition, or theorem from the lecture notes (the list also contains a few topics from the seminars). You write down the proof, and the teacher asks questions to check your understanding. A list of questions will be posted here. You can find your subgroup in the schedule below.
List of questions for the colloquium.
|Group||Date||Time||Room|
|БПМИ 141-1||Wednesday 1st of November||12h10-15h40||219|
|БПМИ 141-2||Wednesday 1st of November||13h40-16h10||219|
|БПМИ 142-1||Wednesday 1st of November||16h40-18h40||219|
|БПМИ 142-2||Wednesday 1st of November||17h40-19h40||219|
|БПМИ 143+145||Thursday 2nd of November||15h10-17h10||219|
|БПМИ 144||Thursday 2nd of November||16h40-18h40||219|
|3rd year||Friday 3rd of November||15h10-17h40||219|
Your homework score has weight 0.1 in your final grade. Activities in the second module have weight 0.5 in the final grade.
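For reference, the four weights sum to one. A minimal sketch of the final-grade formula, assuming the grade is a plain weighted sum with no additional rounding rules (not stated above):

\[ \text{final grade} = 0.2 \cdot \text{problems exam} + 0.2 \cdot \text{colloquium} + 0.1 \cdot \text{homework} + 0.5 \cdot \text{module 2 activities} \]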
|Date||Summary||Lecture notes||Problem list||Solutions|
|5 sept||PAC-learning and VC-dimension: definitions||1st and 2nd lecture (updated on 13th of Sept.)||Problem list 1||Solutions list 1|
|12 sept||PAC-learning and VC-dimension: proof of the fundamental theorem||||Problem list 2||Solutions list 2|
|19 sept||Sauer's lemma, neural networks, and agnostic PAC-learning||3rd lecture (updated on 23rd of Sept.)||Problem list 3|||
|26 sept||Measure concentration, agnostic PAC-learning, and computational learning theory||4th lecture||Problem list 4|||
|3 oct||Agnostic learning and the AdaBoost algorithm||5th lecture (part about computational learning and boosting added on 21st of Oct.)||Problem list 5|||
|10 oct||Boosting: risk bounds using Rademacher complexities||Mohri's book: p33-40; Talagrand's lemma; McDiarmid's inequality; 6th lecture (draft)||Problem list 6|||
|17 oct||Margin theory and a deep boosting algorithm||Mohri's book: p75-83, p131-136 (see the paper below)|||||
A gentle introduction to the material of the first three lectures, together with an overview of probability theory, can be found in chapters 1-6 and 11-12 of the following book: Sanjeev Kulkarni and Gilbert Harman, An Elementary Introduction to Statistical Learning Theory, 2012.
Foundations of Machine Learning, Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2012. These books can be downloaded from http://gen.lib.rus.ec/ .
(We will study a new boosting algorithm, based on the paper: Multi-class deep boosting, V. Kuznetsov, M. Mohri, and U. Syed, Advances in Neural Information Processing Systems, pp. 2501-2509, 2014. Notes will be provided.)
Office hours
|Bruno Bauwens||15:05–18:00||15:05–18:00||Room 620|