Statistical learning theory 2018 2019

General Information

The syllabus

Questions for the colloquium on 29 October. (Lectures 1-8 updated 24/10.)

Deadline for homework 1: October 2nd. Questions: see seminars 3 and 4.

Deadline for homework 2: October 27th. Questions: see seminars 5-8 below.

Deadline for homework 3: December 11th. Questions: see seminars 9-12 below.

Marks

Intermediate exams: October 29th.

Final exam: December 20th, same system as for the intermediate exams. Theory questions.

Consultation: December 17th (no lecture). Students can ask questions and request solutions to exercises.


Course materials

{| class="wikitable"
! Date !! Summary !! Lecture notes !! Problem list !! Solutions
|-
| 3 Sept || PAC-learning in the realizable setting: definitions || lecture1.pdf (updated 23/09) || Problem list 1 || Solutions 1
|-
| 10 Sept || VC-dimension and growth functions || lecture2.pdf (updated 23/09) || Problem list 2 || Solutions 2
|-
| 17 Sept || Proof that finite VC-dimension implies PAC-learnability || lecture3.pdf (updated 23/09) || Problem list 3 || Solutions 3
|-
| 24 Sept || Applications to decision trees and threshold neural networks. Agnostic PAC-learnability. || lecture4.pdf || Problem list 4 || Solutions 4
|-
| 1 Oct || Agnostic PAC-learnability is equivalent to finite VC-dimension; structural risk minimization || lecture5.pdf (updated 14/10) || Problem list 5 || Solutions 5
|-
| 9 Oct || Boosting, Mohri's book pages 121-131 || lecture6.pdf (updated 23/10) || Problem list 6 || No solutions.
|-
| 15 Oct || Rademacher complexity and the contraction lemma (= Talagrand's lemma), Mohri's book pages 33-41 and 78-79; the definition is recalled below the table || lecture7.pdf || Problem list 7 || See lecture7.pdf
|-
| 21 Oct || Margin theory and risk bounds for boosting || lecture8.pdf || Problem list 8 || See lecture6.pdf for ex. 8.6.
|-
| 12 Nov || Deep boosting; we study the paper Multi-class deep boosting, V. Kuznetsov, M. Mohri, and U. Syed, Advances in Neural Information Processing Systems, pp. 2501-2509, 2014. Notes will be provided. || || Problem list 9 || Solutions
|-
| 19 Nov || Support vector machines, primal and dual optimization problems, risk bounds; the standard formulations are recalled below the table || See chapt. 5 of Mohri's book || Problem list 10 ||
|-
| 26 Nov || Kernels, reproducing kernel Hilbert spaces, the representer theorem, examples of kernels || lecture11.pdf || Problem set 11 || Solutions: see lecture11.pdf
|-
| 3 Dec || A polynomial-time improper learning algorithm for constant-depth L1-regularized neural networks, from this paper. [Notes will be written soon.] Online algorithms: halving algorithm, weighted and exponentially weighted average algorithms; a sketch of the halving algorithm appears below the table. || See Mohri's book, Sections 7.1 and 7.2 || Problem list 12 ||
|-
| 10 Dec || We finish online learning and discuss the algorithm from this paper. || || ||
|}
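As a quick reference for the 15 Oct material, here is the empirical Rademacher complexity as defined in Mohri's book (pages 33-41); notation may differ slightly from the lecture notes. For a sample S = (x_1, ..., x_m) and a class H of real-valued functions,

<math>
\widehat{\mathfrak{R}}_S(H) \;=\; \mathbb{E}_{\sigma}\!\left[\,\sup_{h\in H}\ \frac{1}{m}\sum_{i=1}^{m}\sigma_i\, h(x_i)\right],
</math>

where the <math>\sigma_i</math> are independent Rademacher variables, uniform on <math>\{-1,+1\}</math>. The contraction lemma (Talagrand's lemma) states that composing H with an L-Lipschitz function <math>\varphi</math> scales this complexity by at most L: <math>\widehat{\mathfrak{R}}_S(\varphi\circ H)\le L\,\widehat{\mathfrak{R}}_S(H)</math>.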
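For the 19 Nov topic, these are the usual textbook formulations of the soft-margin SVM primal and dual problems (as treated in chapter 5 of Mohri's book; constants and notation here are the standard ones and may differ slightly from the lecture):

<math>
\min_{w,\,b,\,\xi}\ \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{m}\xi_i
\quad\text{subject to}\quad y_i(w\cdot x_i + b)\ \ge\ 1-\xi_i,\quad \xi_i \ge 0,
</math>

<math>
\max_{\alpha}\ \sum_{i=1}^{m}\alpha_i \;-\; \frac{1}{2}\sum_{i,j=1}^{m}\alpha_i\alpha_j\, y_i y_j\,(x_i\cdot x_j)
\quad\text{subject to}\quad 0\le\alpha_i\le C,\quad \sum_{i=1}^{m}\alpha_i y_i = 0,
</math>

with the optimal weight vector recovered as <math>w=\sum_i \alpha_i y_i x_i</math>. Replacing the inner product <math>x_i\cdot x_j</math> by a kernel <math>K(x_i,x_j)</math> gives the kernelized version relevant to the 26 Nov lecture.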
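For the online-learning material of 3 and 10 Dec, below is a minimal sketch of the halving algorithm from Section 7.1 of Mohri's book, assuming a finite expert class that contains a perfect expert (the realizable setting). The threshold experts and the data stream in the usage example are made-up toys, not course material.

<syntaxhighlight lang="python">
from math import log2

def halving(experts, stream):
    """Halving algorithm: predict by majority vote over the experts still in
    the version space; on a mistake, discard every expert that erred.
    In the realizable setting this makes at most log2(len(experts)) mistakes,
    since each mistake removes at least half of the version space."""
    version_space = list(experts)
    mistakes = 0
    for x, y in stream:
        votes_for_one = sum(h(x) for h in version_space)
        prediction = 1 if 2 * votes_for_one >= len(version_space) else 0
        if prediction != y:
            mistakes += 1
            # At least half of the current experts just erred; drop them.
            version_space = [h for h in version_space if h(x) == y]
    return mistakes

# Toy usage: 10 threshold experts h_t(x) = 1 iff x >= t; labels come from
# the threshold-at-5 expert, so the setting is realizable.
experts = [lambda x, t=t: 1 if x >= t else 0 for t in range(10)]
stream = [(x, 1 if x >= 5 else 0) for x in [3, 7, 4, 9, 5, 1]]
print(halving(experts, stream), "<=", int(log2(len(experts))))  # prints: 2 <= 3
</syntaxhighlight>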

A gentle introduction to the material of the first three lectures, together with an overview of probability theory, can be found in chapters 1-6 and 11-12 of the following book: Sanjeev Kulkarni and Gilbert Harman, An Elementary Introduction to Statistical Learning Theory, 2012.

Afterwards, we hope to cover chapters 1-8 of the book: Foundations of Machine Learning, Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2012. Both books can be downloaded from http://gen.lib.rus.ec/ .


Office hours

{| class="wikitable"
! Person !! Monday !! Tuesday !! Wednesday !! Thursday !! Friday
|-
| Bruno Bauwens (Room 620) || 16:45–19:00 || 15:05–18:00 || || ||
|}


Russian texts

The following links might help students who have trouble with English. A lecture on VC-dimension was given by K. Vorontsov. A course on Statistical Learning Theory by Nikita Zhivotovsky is given at MIPT. A short description of PAC learning can be found on p. 136 of the book «Наука и искусство построения алгоритмов, которые извлекают знания из данных» by Peter Flach (the Russian translation of "Machine Learning: The Art and Science of Algorithms That Make Sense of Data"). On machinelearning.ru you can find brief and clear definitions.