Statistical learning theory 2018–2019
Questions for the colloquium on 29 October. (Lectures 1-8 updated 24/10.)
Deadline homework 2: October 27th. Questions: see seminars 5-8 below.
Deadline homework 3: December 11th. Questions: see seminars 9-12 below.
Intermediate exams: October 29th.
Final exam: December 20th, same system as for the intermediate exams. Theory questions.
Consultation: December 17th (no lecture). Students can ask questions and request solutions to exercises.
|Date||Summary||Lecture notes||Problem list||Solutions|
|3 Sept||PAC-learning in the realizable setting: definitions||lecture1.pdf updated 23/09||Problem list 1||Solutions 1|
|10 Sept||VC-dimension and growth functions||lecture2.pdf updated 23/09||Problem list 2||Solutions 2|
|17 Sept||Proof that finite VC-dimension implies PAC-learnability (a numeric sketch of the sample bounds appears below the table)||lecture3.pdf updated 23/09||Problem list 3||Solutions 3|
|24 Sept||Applications to decision trees and threshold neural networks. Agnostic PAC-learnability.||lecture4.pdf||Problem list 4||Solutions 4|
|1 Oct||Agnostic PAC-learnability is equivalent to finite VC-dimension; structural risk minimization||lecture5.pdf updated 14/10||Problem list 5||Solutions 5|
|9 Oct||Boosting, Mohri's book pages 121-131 (a code sketch of AdaBoost appears below the table).||lecture6.pdf updated 23/10||Problem list 6||No solutions.|
|15 Oct||Rademacher complexity and the contraction lemma (= Talagrand's lemma), Mohri's book pages 33-41 and 78-79 (a code sketch appears below the table)||lecture7.pdf||Problem list 7||See lecture7.pdf|
|21 Oct||Margin theory and risk bounds for boosting.||lecture8.pdf||Problem list 8||See lecture6.pdf for ex. 8.6.|
|12 Nov||Deep boosting: we study the paper "Multi-class deep boosting", V. Kuznetsov, M. Mohri, and U. Syed, Advances in Neural Information Processing Systems, pp. 2501-2509, 2014.||Notes will be provided||Problem list 9||Solutions 9|
|19 Nov||Support vector machines: primal and dual optimization problems, risk bounds.||See chapter 5 of Mohri's book||Problem list 10|
|26 Nov||Kernels, reproducing kernel Hilbert spaces, the representer theorem, examples of kernels (a code sketch appears below the table)||lecture11.pdf||Problem set 11||Solutions: see lecture11.pdf|
|3 Dec||A polynomial-time improper learning algorithm for constant-depth L1-regularized neural networks, from this paper. Online algorithms: halving algorithm, weighted and exponentially weighted average algorithms (a code sketch appears below the table). See Mohri's book Sections 7.1 and 7.2.||lecture12.pdf||Problem list 12|
|10 Dec||We finish online learning and discuss the algorithm from this paper.||See previous list.|
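For lectures 1-3, here is a minimal numeric sketch (in Python) of the PAC sample-complexity bounds. The two finite-class bounds are the standard ones; the constants in the VC bound follow one common statement and differ between textbooks, so treat all numbers as illustrative. Function names are illustrative, not from the lecture notes.

```python
import math

def finite_class_realizable(h_size, eps, delta):
    # Realizable PAC bound for a finite class H:
    #   m >= (1/eps) * (ln|H| + ln(1/delta))
    return math.ceil((math.log(h_size) + math.log(1 / delta)) / eps)

def finite_class_agnostic(h_size, eps, delta):
    # Agnostic bound via Hoeffding's inequality + union bound:
    #   m >= (1/(2 eps^2)) * ln(2|H|/delta)
    return math.ceil(math.log(2 * h_size / delta) / (2 * eps ** 2))

def vc_realizable(d, eps, delta):
    # One common form of the VC sample bound (constants vary by textbook):
    #   m >= (4/eps) * (d * log2(12/eps) + log2(2/delta))
    return math.ceil((4 / eps) * (d * math.log2(12 / eps) + math.log2(2 / delta)))

# Print the three bounds for |H| = 2^20 and VC-dimension 10
print(finite_class_realizable(2 ** 20, eps=0.05, delta=0.01))
print(finite_class_agnostic(2 ** 20, eps=0.05, delta=0.01))
print(vc_realizable(10, eps=0.05, delta=0.01))
```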
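For lectures 6 and 8 (boosting and its margin-based risk bounds), a small NumPy sketch of AdaBoost with decision stumps, in the standard form treated in Mohri's book. This is a teaching sketch, not the course's official code; all names are illustrative.

```python
import numpy as np

def train_stump(X, y, w):
    """Best threshold stump under weights w; labels y in {-1, +1}.
    Returns (feature index, threshold, polarity, weighted error)."""
    best = (0, 0.0, 1, np.inf)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = pol * np.where(X[:, j] <= t, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, pol, err)
    return best

def adaboost(X, y, T=20):
    """AdaBoost: maintain a distribution over examples, repeatedly fit a
    weak learner, and reweight towards misclassified points."""
    m = len(y)
    w = np.full(m, 1 / m)
    ensemble = []
    for _ in range(T):
        j, t, pol, err = train_stump(X, y, w)
        err = max(err, 1e-12)                   # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)   # weight of this stump
        pred = pol * np.where(X[:, j] <= t, 1, -1)
        w *= np.exp(-alpha * y * pred)          # up-weight mistakes
        w /= w.sum()
        ensemble.append((alpha, j, t, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(alpha * pol * np.where(X[:, j] <= t, 1, -1)
                for alpha, j, t, pol in ensemble)
    return np.sign(score)

# Toy usage: a 1-D threshold concept is learned exactly
X = np.linspace(0, 1, 40).reshape(-1, 1)
y = np.where(X[:, 0] > 0.3, 1, -1)
print((predict(adaboost(X, y, T=5), X) == y).mean())  # 1.0
```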
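For lecture 7, a Monte Carlo estimate of the empirical Rademacher complexity of a finite hypothesis set, computed directly from the definition R_S(H) = E_sigma[sup_h (1/m) sum_i sigma_i h(x_i)]. A sketch with illustrative names, assuming hypotheses are given by their prediction vectors on the sample.

```python
import numpy as np

def empirical_rademacher(preds, n_trials=2000, seed=0):
    """Monte Carlo estimate of empirical Rademacher complexity for a finite
    class; preds is a (|H|, m) array with preds[j, i] = h_j(x_i) in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n_h, m = preds.shape
    total = 0.0
    for _ in range(n_trials):
        sigma = rng.choice([-1.0, 1.0], size=m)  # Rademacher signs
        total += np.max(preds @ sigma) / m       # sup over h of the correlation
    return total / n_trials

# Example: all 2^m sign patterns on m = 10 points (a maximally rich class);
# the supremum is attained with value 1 for every sigma, so the estimate is 1.
m = 10
preds = np.array([[1.0 if (j >> i) & 1 else -1.0 for i in range(m)]
                  for j in range(2 ** m)])
print(empirical_rademacher(preds))
```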
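For lecture 11, a sketch of kernel ridge regression with a Gaussian kernel, illustrating the representer theorem: the minimizer of the regularized risk is a linear combination of k(x_i, .), and the coefficients solve the linear system (K + lambda I) alpha = y. Names and parameter values are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian kernel matrix: K[i, j] = exp(-gamma * ||a_i - b_j||^2)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, lam=1e-2, gamma=1.0):
    # Representer theorem: f = sum_i alpha_i k(x_i, .), where alpha solves
    # (K + lam * I) alpha = y.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def kernel_ridge_predict(X_train, alpha, X_test, gamma=1.0):
    return rbf_kernel(X_test, X_train, gamma) @ alpha

# Toy example: fit a noisy sine curve
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
alpha = kernel_ridge_fit(X, y)
print(kernel_ridge_predict(X, alpha, np.array([[0.0], [1.5]])))
```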
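For lecture 12, a sketch of the exponentially weighted average forecaster from Mohri's Section 7.2, with the usual tuning eta = sqrt(8 ln N / T), under which the regret against the best expert grows like sqrt((T/2) ln N). Losses are assumed to lie in [0, 1]; all names are illustrative.

```python
import numpy as np

def exponentially_weighted(expert_losses, eta):
    """Exponentially weighted average forecaster. expert_losses is a (T, N)
    array of per-round losses in [0, 1]; returns the forecaster's expected
    loss at each round (play the weight-averaged expert, then update)."""
    T, N = expert_losses.shape
    log_w = np.zeros(N)                        # log-weights, start uniform
    fcast_loss = np.empty(T)
    for t in range(T):
        w = np.exp(log_w - log_w.max())        # stable exponentiation
        p = w / w.sum()                        # normalized weights
        fcast_loss[t] = p @ expert_losses[t]   # expected loss this round
        log_w -= eta * expert_losses[t]        # exponential weight update
    return fcast_loss

# Regret check: cumulative forecaster loss vs the best single expert
rng = np.random.default_rng(0)
L = rng.random((1000, 5))
fl = exponentially_weighted(L, eta=np.sqrt(8 * np.log(5) / 1000))
print(fl.sum(), L.sum(axis=0).min())
```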
A gentle introduction to the material of the first three lectures, together with an overview of probability theory, can be found in chapters 1-6 and 11-12 of the following book: Sanjeev Kulkarni and Gilbert Harman, An Elementary Introduction to Statistical Learning Theory, 2012.
Afterward, we hope to cover chapters 1-8 of the book Foundations of Machine Learning, Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2012. Both books can be downloaded from http://gen.lib.rus.ec/ .
|Bruno Bauwens||16:45–19:00||15:05–18:00||Room 620|
The following links might help students who have trouble with English. A lecture on VC-dimension was given by K. Vorontsov. A course on Statistical Learning Theory by Nikita Zhivotovsky is given at MIPT. A short description of PAC learning is on p. 136 of the book «Наука и искусство построения алгоритмов, которые извлекают знания из данных» by Peter Flach (the Russian translation of Machine Learning: The Art and Science of Algorithms that Make Sense of Data). On machinelearning.ru you can find brief and clear definitions.