Statistical learning theory 2021

Teachers: [https://www.hse.ru/en/org/persons/160550073 Bruno Bauwens] and [https://www.hse.ru/en/org/persons/225553845 Nikita Lukianenko]

Lectures: Saturday 14:40 - 16:00. The lectures are given in Zoom.

Seminars: Tuesday 16:20 - 17:40. The seminars are held [https://meet.google.com/ber-yzns-hxz here] in Google Meet.

Practical information is shared in the [https://t.me/joinchat/IER2-8hc0wUxNDQ0 telegram group].
 
The course is similar to [http://wiki.cs.hse.ru/Statistical_learning_theory_2020 last year], except for the order of topics and part 3.

== Problems exam ==

Dec 22, 12:00 -- 15:30

During the exam:<br>
-- You may consult notes, books and search on the internet <br>
-- You may not interact with other humans (e.g. by phone, forums, etc.)

== Colloquium ==

Saturday December 11

[https://www.dropbox.com/s/u8hyo1omvaoujle/colloqQuest.pdf?dl=0 rules and list of questions] (version Dec 10)

== Homeworks ==

Email your solutions to brbauwens-at-gmail.com. Start the subject line with SLT-HW. [https://www.dropbox.com/s/prsmzhtr5p5uome/scores.pdf?dl=0 Results]

Deadlines are before the lecture, every other lecture:

25 Sept: see problem lists 1 and 2 <br>
09 Oct: see problem lists 3 and 4 <br>
29 Oct: see problem lists 5 and 6 <br>
13 Nov: see problem lists 7 and 8 <br>
30 Nov, 08:00 [extended]: see problem lists 9 and 10
== Course materials ==

{| class="wikitable"
! Video !! Summary !! Slides !! Lecture notes !! Problem list !! Solutions
|-
|
|| ''Part 1. Online learning''
|-
| [https://drive.google.com/file/d/1WL9LSNDD1B_q6LdpfDQ8BPluNfhjWrD9/view?usp=sharing 4 Sept]
|| Lecture: philosophy. Seminar: the online mistake bound model, the weighted majority, and perceptron algorithms [https://drive.google.com/drive/folders/1NXiLbhmO2Ml7jFmnLtjqhOgCoHg7yn9T?usp=sharing movies]
|| [https://www.dropbox.com/s/uk9awkfa827pmtf/01allSlides.pdf?dl=0 sl01]
|| [https://www.dropbox.com/s/uvsfzb997kantoa/00book_intro.pdf?dl=0 ch00] [https://www.dropbox.com/s/6ah70h5loyrz5lx/01book_onlineMistakeBound.pdf?dl=0 ch01]
|| [https://www.dropbox.com/s/aoma8ma8mkd3885/01sem.pdf?dl=0 01prob (9 Sept)]
|| [https://www.dropbox.com/s/sqzqlrtzr2nu8cq/01sol.pdf?dl=0 01sol]
|-
| [https://drive.google.com/file/d/16OoCqhh16BKQzyF-HM8RozigyJ3BBVxA/view?usp=sharing 11 Sept]
|| The perceptron algorithm in the agnostic setting. Kernels. The standard optimal algorithm.
|| [https://www.dropbox.com/s/sy959ee81mov5cr/02slides.pdf?dl=0 sl02]
|| [https://www.dropbox.com/s/0029k15cbnxj2v1/02book_sequentialOptimalAlgorithm.pdf?dl=0 ch02] [https://www.dropbox.com/s/eggk7kctgox8aza/03book_perceptron.pdf?dl=0 ch03]
|| [https://www.dropbox.com/s/415nws7qi589bme/02sem.pdf?dl=0 02prob (23 Sept)]
|| [https://www.dropbox.com/s/ofcctflbnxt0kx3/02sol.pdf?dl=0 02sol]
|-
| 18 Sept (rec to do)
|| Prediction with expert advice and the exponentially weighted majority algorithm. Recap probability theory.
|| [https://www.dropbox.com/s/a60p9b76cxusgqy/03slides.pdf?dl=0 sl03]
|| [https://www.dropbox.com/s/ytl6q83q6gkax3w/04book_predictionWithExperts.pdf?dl=0 ch04] [https://www.dropbox.com/s/l11afq1d0qn6za7/05book_introProbability.pdf?dl=0 ch05]
|| [https://www.dropbox.com/s/nsrcy3yxgey67lp/03sem.pdf?dl=0 03prob (30 Sept)]
|| [https://www.dropbox.com/s/bg9nd01h1fhzjsi/03sol.pdf?dl=0 03sol]
|-
|
|| ''Part 2. Risk bounds for binary classification''
|-
| 25 Sept
|| Sample complexity in the realizable setting, simple examples and bounds using VC-dimension
|| [https://www.dropbox.com/s/pi0f3wab1xna6d7/04slides.pdf?dl=0 sl04]
|| [https://www.dropbox.com/s/8xrgcugs4xv2r2p/06book_sampleComplexity.pdf?dl=0 ch06]
|| [https://www.dropbox.com/s/7qn2yz5fxc93rez/04sem.pdf?dl=0 04prob]
|| [https://www.dropbox.com/s/xm3nhgj5d6h49nz/04sol.pdf?dl=0 04sol]
|-
| [https://drive.google.com/drive/folders/1jjyJ3eIaed64ogpR11g8M44IOikt5Mj2?usp=sharing 2 Oct]
|| Growth functions, VC-dimension, and the characterization of sample complexity with VC-dimensions
|| [https://www.dropbox.com/s/rpnh6288rdb3j8m/05slides.pdf?dl=0 sl05]
|| [https://www.dropbox.com/s/ctc48w1d2vvyiyt/07book_growthFunctions.pdf?dl=0 ch07] [https://www.dropbox.com/s/jofixf9tstz0f8z/08book_VCdimension.pdf?dl=0 ch08]
|| [https://www.dropbox.com/s/zbyqxy3qp3pz79i/05sem.pdf?dl=0 05prob]
|| [https://www.dropbox.com/s/a8efm18dof2zeox/05sol.pdf?dl=0 05sol]
|-
| [https://drive.google.com/file/d/17zynIg_CZ6cCNBig5QXmBx7VFS8peyuU/view?usp=sharing 9 Oct]
|| Risk decomposition and the fundamental theorem of statistical learning theory
|| [https://www.dropbox.com/s/jxijka88vfanv5n/06slides.pdf?dl=0 sl06]
|| [https://www.dropbox.com/s/r44bwxz34qj98gg/09book_riskBounds.pdf?dl=0 ch09]
|| [https://www.dropbox.com/s/x87txc8v5v6u8vb/06sem.pdf?dl=0 06prob]
|| [https://www.dropbox.com/s/ydlqu8oce3xj6ix/06sol.pdf?dl=0 06sol]
|-
| 16 Oct
|| Bounded differences inequality and Rademacher complexity
|| [https://www.dropbox.com/s/kfithyq0dgcq6h8/07slides.pdf?dl=0 sl07]
|| [https://www.dropbox.com/s/5quc1jfkrvm3t71/10book_measureConcentration.pdf?dl=0 ch10] [https://www.dropbox.com/s/km0fns8n3aihauv/11book_RademacherComplexity.pdf?dl=0 ch11]
|| [https://www.dropbox.com/s/d1rsxceqmbk5llw/07sem.pdf?dl=0 07prob]
|| [https://www.dropbox.com/s/sftaa8b92ru3ii5/07sol.pdf?dl=0 07sol]
|-
| [https://drive.google.com/file/d/1L-BeDxhoHcoDrdlVTlfoMFwnWXKV46cr/view?usp=sharing 30 Oct]
|| Simple regression, support vector machines, margin risk bounds, and neural nets
|| [https://www.dropbox.com/s/0xrhe4732d0jshb/08slides.pdf?dl=0 sl08]
|| [https://www.dropbox.com/s/cvqlwst3e69709t/12book_regression.pdf?dl=0 ch12] [https://www.dropbox.com/s/dwwxgriiaj4efn0/13book_SVM.pdf?dl=0 ch13]
|| [https://www.dropbox.com/s/qqdbrh2ll0dv03a/08sem.pdf?dl=0 08prob]
|| [https://www.dropbox.com/s/9o8fyd0ff735hxu/08sol.pdf?dl=0 08sol]
|-
| [https://youtu.be/9FhFxLHR4eE 6 Nov]
|| Kernels: risk bounds, RKHS, representer theorem, design
|| [https://www.dropbox.com/s/nhqtbekclekf6k7/09slides.pdf?dl=0 sl09]
|| [https://www.dropbox.com/s/bpb9ijn2p7k19j3/14book_kernels.pdf?dl=0 ch14]
|| [https://www.dropbox.com/s/d2dmh017lw207ns/09sem.pdf?dl=0 09prob] (Nov 23)
|| [https://www.dropbox.com/s/2wq9mxrqchsqujr/09sol.pdf?dl=0 09sol]
|-
| [https://youtu.be/ZBHe5RhTuzI 13 Nov]
|| AdaBoost and risk bounds
|| [https://www.dropbox.com/s/umum3kd9439dt42/10slides.pdf?dl=0 sl10]
|| Mohri et al, chapt 7
|| [https://www.dropbox.com/s/j8s197e0mjv9qla/10sem.pdf?dl=0 10prob] (Nov 23)
|| [https://www.dropbox.com/s/7lw1u8750k7s8qt/10sol.pdf?dl=0 10sol]
|-
|
|| ''Part 3. Other topics''
|-
| [https://youtu.be/L4o7dXcaQrk 20 Nov]
|| Clustering
|| [https://www.dropbox.com/s/5a9flvg95iihz7m/11slides.pdf?dl=0 sl11]
|| Mohri et al, ch7; [https://people.csail.mit.edu/dsontag/courses/ml12/slides/lecture14.pdf lecture]
|| <!-- [https://www.dropbox.com/s/a9459keof3omav1/11sem.pdf?dl=0 11prob] -->
|| <!-- [https://www.dropbox.com/s/kredac52pbn7qvk/11sol.pdf?dl=0 11sol] -->
|-
| [https://youtu.be/FN6l4Ceq5lE 27 Nov]
|| Dimensionality reduction and the Johnson-Lindenstrauss lemma
|| [https://www.dropbox.com/s/wbgwwk7a9mjo1bv/12slides.pdf?dl=0 sl12]
|| Mohri et al, ch15; [https://ramanlab.wustl.edu/Lectures/Lecture12_LDA_CCA.pdf lecture]
|| [https://www.dropbox.com/s/c5anx2htaw9rslr/12sem.pdf?dl=0 12prob]
||
|-
| 4 Dec
|| No lecture
||
||
||
||
|-
| 11 Dec
|| Colloquium
||
||
||
||
|}
  
The lectures in October and November are based on the book: Foundations of machine learning, 2nd edition, Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. This book can be downloaded from http://gen.lib.rus.ec/ .
 
  
 
== Office hours ==

Bruno Bauwens: Zoom, 12h30-14h30 and 14h-20h; room S834, Pokrovkaya 11 <br>
Nikita Lukianenko: Telegram, 14h30-16h30; room S831, Pokrovkaya 11

It is always good to send an email in advance. Questions and feedback are welcome.
 