Statistical learning theory 2018 2019


== Information ==

The [https://www.dropbox.com/s/8iivgt3a96yw308/syllabus_StatisticalLearning_Bach_2018_2019.pdf?dl=0 syllabus]


== General Information ==

[https://www.dropbox.com/s/r5u7gl33berpokv/syllabusStatisticalLearning.pdf?dl=0 Syllabus for the 1st module]

== Course materials ==

{| class="wikitable"
! Date !! Summary !! Lecture notes !! Problem list !! Solutions
|-
| 3 Sept || PAC-learning in the realizable setting: definitions (see below) ||  || [https://www.dropbox.com/s/4ic3ce71znglmu9/01sem.pdf?dl=0 Problem list 1] ||
|}
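
As a quick reference for the 3 Sept topic, here is one standard textbook formulation of PAC learning in the realizable setting; the notation is ours and may differ from the one used in the lecture. A class <math>\mathcal{H}</math> is PAC-learnable in the realizable setting if there exist a learner <math>A</math>, mapping samples to hypotheses in <math>\mathcal{H}</math>, and a sample-complexity function <math>m_{\mathcal{H}}(\varepsilon,\delta)</math> such that

<math>\forall \varepsilon,\delta\in(0,1),\ \forall D,\ \forall h^{*}\in\mathcal{H}:\quad m\ge m_{\mathcal{H}}(\varepsilon,\delta)\ \Rightarrow\ \Pr_{S\sim D^{m}}\big[\operatorname{err}_{D}(A(S))\le\varepsilon\big]\ge 1-\delta,</math>

where <math>S=((x_{1},h^{*}(x_{1})),\dots,(x_{m},h^{*}(x_{m})))</math> with <math>x_{i}\sim D</math> drawn i.i.d., and <math>\operatorname{err}_{D}(h)=\Pr_{x\sim D}[h(x)\ne h^{*}(x)]</math>.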

A gentle introduction to the material of the first three lectures, together with an overview of probability theory, can be found in chapters 1-6 and 11-12 of the following book: Sanjeev Kulkarni and Gilbert Harman, An Elementary Introduction to Statistical Learning Theory, 2012.

Afterwards, we hope to cover chapters 1-8 of the book: Foundations of Machine Learning, Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2012. Both books can be downloaded from http://gen.lib.rus.ec/ .


== Office hours ==

{| class="wikitable"
! Person !! Office hours !! Room
|-
| Bruno Bauwens || 16:45–19:00 and 15:05–18:00 || 620
|}


== Russian texts ==

The following links might help students who have trouble with English. A lecture on VC dimension was given by K. Vorontsov. A course on statistical learning theory by Nikita Zhivotovsky is given at MIPT. A short description of PAC learning can be found on p. 136 of the book «Наука и искусство построения алгоритмов, которые извлекают знания из данных» by Peter Flach (the Russian edition of "Machine Learning: The Art and Science of Algorithms that Make Sense of Data"). On machinelearning.ru you can find brief and clear definitions.