Statistical learning theory 2022
  
== General Information ==

Lectures: Friday 16h20 -- 17h40, [https://www.hse.ru/en/org/persons/160550073 Bruno Bauwens], [https://www.hse.ru/staff/mkaledin Maxim Kaledin], room M202 and on [https://us02web.zoom.us/j/82300259484?pwd=NWxXekxBeE5yMm9UTmwvLzNNNGlnUT09 zoom]

Seminars: Saturday 14h40 -- 16h00, [https://www.hse.ru/org/persons/225526439 Artur Goldman], room M202 and on zoom (the link will be in telegram)

To discuss the materials, join the [https://t.me/+G0VKOE2-nnkwNDE0 telegram group]. The course is similar to [http://wiki.cs.hse.ru/Statistical_learning_theory_2021 last year's edition].
 
  
== Problems exam ==

December 21, 13h-16h, computer room G403 ([https://us02web.zoom.us/j/82300259484?pwd=NWxXekxBeE5yMm9UTmwvLzNNNGlnUT09 zoom link] for students abroad)<br>
-- You may use handwritten notes, lecture materials from this wiki (either printed or on your PC), and Mohri's book <br>
-- You may not search on the internet or interact with other humans (e.g. by phone, forums, etc.)
  
  
== Course materials ==

{|
! Video !! Summary !! Slides !! Lecture notes !! Problem list !! Solutions
|-
|
|| ''Part 1. Online learning''
|-
| 02 Sept
|| Philosophy. The online mistake bound model. The halving and weighted majority algorithms. [https://drive.google.com/drive/folders/1NXiLbhmO2Ml7jFmnLtjqhOgCoHg7yn9T?usp=sharing movies]
|| [https://www.dropbox.com/s/ryvpnfqfrwyurjc/01slides.pdf?dl=0 sl01]
|| [https://www.dropbox.com/s/oncvg4mxulbt56d/00book_intro.pdf?dl=0 ch00] [https://www.dropbox.com/s/i9pc4kf0zsdeksb/01book_onlineMistakeBound.pdf?dl=0 ch01]
|| [https://www.dropbox.com/s/ztk3n9s5c0vuzd9/01sem.pdf?dl=0 list 1] <span style="color:red">update 05.09</span>
|| [https://www.dropbox.com/s/r528uroi60gow08/01sol.pdf?dl=0 solutions 1]
|-
| [https://drive.google.com/file/d/16OoCqhh16BKQzyF-HM8RozigyJ3BBVxA/view?usp=sharing 09 Sept]
|| The perceptron algorithm. The standard optimal algorithm.
|| [https://www.dropbox.com/s/sy959ee81mov5cr/02slides.pdf?dl=0 sl02]
|| [https://www.dropbox.com/s/p3auugqwc89132b/02book_sequentialOptimalAlgorithm.pdf?dl=0 ch02] [https://www.dropbox.com/s/b00dcqk1rob7rdz/03book_perceptron.pdf?dl=0 ch03]
|| [https://www.dropbox.com/s/88jgjvxo16zfrjs/02sem.pdf?dl=0 list 2] <span style="color:red">update 25.09</span>
|| [https://www.dropbox.com/s/pqblktfky8to5hr/02sol.pdf?dl=0 solutions 2]
|-
| [https://www.youtube.com/watch?v=xgyPvnDkyZs 16 Sept]
|| Kernels and the kernel perceptron algorithm. Prediction with expert advice. Recap of probability theory.
|| [https://www.dropbox.com/s/a60p9b76cxusgqy/03slides.pdf?dl=0 sl03]
|| [https://www.dropbox.com/s/3vtxvs4esnvbhlb/04book_predictionWithExperts.pdf?dl=0 ch04] [https://www.dropbox.com/s/l11afq1d0qn6za7/05book_introProbability.pdf?dl=0 ch05]
|| [https://www.dropbox.com/s/fnx2cl5wsgjbmel/03sem.pdf?dl=0 list 3]
|| [https://www.dropbox.com/s/ysg3nipuzryzqoc/03sol.pdf?dl=0 solutions 3]
|-
|
|| ''Part 2. Distribution independent risk bounds''
|-
| [https://www.youtube.com/watch?v=IatjLyN3dRk 23 Sept]
|| Sample complexity in the realizable setting, simple examples, and bounds using VC-dimension
|| [https://www.dropbox.com/s/pi0f3wab1xna6d7/04slides.pdf?dl=0 sl04]
|| [https://www.dropbox.com/s/nh4puyv7nst4ems/06book_sampleComplexity.pdf?dl=0 ch06]
|| [https://www.dropbox.com/s/u5dtpm69bu52fpn/04sem.pdf?dl=0 list 4]
|| [https://www.dropbox.com/s/yurv5s42w3kw5vv/04sol.pdf?dl=0 solutions 4]
|-
| [https://www.youtube.com/watch?v=8J5B9CCy-ws 30 Sept]
|| Growth functions, VC-dimension, and the characterization of sample complexity via the VC-dimension
|| [https://www.dropbox.com/s/rpnh6288rdb3j8m/05slides.pdf?dl=0 sl05]
|| [https://www.dropbox.com/s/eurz2vkvt1wa5zm/07book_growthFunctions.pdf?dl=0 ch07] [https://www.dropbox.com/s/m7xe7k39qzmzapv/08book_VCdimension.pdf?dl=0 ch08]
|| [https://www.dropbox.com/s/u1vpi28gwf0zig4/05sem.pdf?dl=0 list 5]
|| [https://www.dropbox.com/s/3sq7yzv7v4l9tbb/05sol.pdf?dl=0 solutions 5]
|-
| [https://drive.google.com/file/d/17zynIg_CZ6cCNBig5QXmBx7VFS8peyuU/view?usp=sharing 07 Oct]
|| Risk decomposition and the fundamental theorem of statistical learning theory
|| [https://www.dropbox.com/s/0p8r5wgjy1hlku2/06slides.pdf?dl=0 sl06]
|| [https://www.dropbox.com/s/8c87619ewkyod4f/09book_riskBounds.pdf?dl=0 ch09]
|| [https://www.dropbox.com/s/eyfczsuwz60moj7/06sem.pdf?dl=0 list 6]
|| [https://www.dropbox.com/s/1te4fzlwj72v6ph/06sol.pdf?dl=0 solutions 6]
|-
| [https://www.youtube.com/watch?v=yMsUH1brAs8 14 Oct]
|| Bounded differences inequality, Rademacher complexity, symmetrization, contraction lemma, [https://www.dropbox.com/s/8uravgo5shas55g/07quiz.pdf?dl=0 quiz]
|| [https://www.dropbox.com/s/kfithyq0dgcq6h8/07slides.pdf?dl=0 sl07]
|| [https://www.dropbox.com/s/fg4seoqjbeb7a5g/10book_measureConcentration.pdf?dl=0 ch10] [https://www.dropbox.com/s/hfrvhebbsskbk6g/11book_RademacherComplexity.pdf?dl=0 ch11]
|| [https://www.dropbox.com/s/qofutar8qy5y53i/07sem.pdf?dl=0 list 7] <span style="color:red">update 15.10</span>
|| [https://www.dropbox.com/s/w1ud2okt120vm5j/07sol.pdf?dl=0 solutions 7]
|-
|
|| ''Part 3. Margin risk bounds with applications''
|-
| [https://drive.google.com/file/d/1L-BeDxhoHcoDrdlVTlfoMFwnWXKV46cr/view?usp=sharing 21 Oct]
|| Simple regression, support vector machines, margin risk bounds, and neural nets
|| [https://www.dropbox.com/s/oo1qny9busp3axn/08slides.pdf?dl=0 sl08]
|| [https://www.dropbox.com/s/573a2vtjfx8qqo8/12book_regression.pdf?dl=0 ch12] [https://www.dropbox.com/s/jaym44fmif2uw05/13book_SVM.pdf?dl=0 ch13]
|| [https://www.dropbox.com/s/bzo6msrxcfa8tpp/08sem.pdf?dl=0 list 8]
|| [https://www.dropbox.com/s/pe7yctcr93yaw95/08sol.pdf?dl=0 solutions 8]
|-
| [https://youtu.be/9FhFxLHR4eE 04 Nov]
|| Kernels: RKHS, representer theorem, risk bounds
|| [https://www.dropbox.com/s/jst60ww8ev4ypie/09slides.pdf?dl=0 sl09]
|| [https://www.dropbox.com/s/ply602zthd7r3jv/14book_kernels.pdf?dl=0 ch14]
|| [https://www.dropbox.com/s/54xdufimavhd646/09sem.pdf?dl=0 list 9]
|| [https://www.dropbox.com/s/i3rx26ya6kvm5p2/09sol.pdf?dl=0 solutions 9]
|-
| [https://youtu.be/1oUXZy6Sqlk 11 Nov]
|| AdaBoost and the margin hypothesis
|| [https://www.dropbox.com/s/umum3kd9439dt42/10slides.pdf?dl=0 sl10]
|| [https://www.dropbox.com/s/e7m1cs7e8ulibsf/15book_AdaBoost.pdf?dl=0 ch15]
|| [https://www.dropbox.com/s/nu4a55qbfqlp3bl/10sem.pdf?dl=0 list 10]
|| [https://www.dropbox.com/s/t64mjapdzcm1313/10sol.pdf?dl=0 solutions 10]
|-
| [https://youtu.be/GL574ljefJ8 18 Nov]
|| Implicit regularization of stochastic gradient descent in neural nets
||
|| [https://www.dropbox.com/s/b4xac5uki7l1ysq/16book_implicitRegularization.pdf?dl=0 ch16]
|| no seminar
||
|-
|
|| ''Part 4. Other topics''
|-
| [https://youtu.be/kOXi_m9dBzE 25 Nov]
|| Regression I: fixed design with sub-Gaussian noise
||
|| [https://disk.yandex.ru/i/tI7NiGsvQP0Jww notes12]
|| [https://disk.yandex.ru/d/9fFxVlMw4kPfEQ list 12]
|| [https://disk.yandex.ru/i/5vBE2VC7zNC3Rg solutions 12]
|-
| [https://youtu.be/GEYT_IxXEX0 02 Dec]
|| Multi-armed bandits I
||
|| [https://disk.yandex.ru/i/lvqXofEbaFkfAA notes13]
|| [https://disk.yandex.ru/i/ZXXJbBiJUPNiOw list 13]
||
|-
| [https://youtu.be/Uybf6mCp2Es 09 Dec]
|| Multi-armed bandits II (optional)
||
|| [https://disk.yandex.ru/i/Nqy9-wmZ-g5o8g notes14]
|| [https://disk.yandex.ru/d/0Qupo2CNSjS_pQ notebook], [https://disk.yandex.ru/d/suY6d58SFf09Bg notebook (solved)]
||
|-
| 16 Dec
|| ''Colloquium''
||
||
||
||
 
 
|}
  
The lectures in October and November are based on the book:

Foundations of Machine Learning, 2nd ed., Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. The book can be downloaded from [https://libgen.is Library Genesis] (the link changes occasionally, and a VPN is sometimes needed).
  
== Grading formula ==

Final grade = 0.35 * [score of homeworks] + 0.35 * [score of colloquium] + 0.3 * [score on the exam] + bonus from quizzes.

All homework questions have the same weight. Each solved extra homework task increases the score of the final exam by 1 point.

There is no rounding except on the final grade: fractional grades above 5/10 are rounded up, and those below 5/10 are rounded down.

Autogrades: if you need only 4/10 on the exam to reach the maximal final grade, it will be given automatically. This may happen because of extra questions and bonuses from quizzes.

For students who want to pass with 4/10 with minimal effort: each year the exam asks to calculate the VC-dimension or Rademacher complexity of some class, so it should be easy to score 4/10 on the final exam. If you understand all the lecture notes, you pass the colloquium with the maximal score. Together this is enough. If only a few students fail and their grades are at least 3.8/10, then the failed students may resubmit a few homework tasks to pull up the grade. (This has happened in the last 3 years.)
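
For concreteness, the formula and rounding rules can be read as the small Python sketch below. The function name, the 0-10 scale of every component, and the cap at 10 are illustrative assumptions; only the weights, the extra-task bonus, and the rounding rule come from the rules above.

<syntaxhighlight lang="python">
import math

def final_grade(homeworks, colloquium, exam, quiz_bonus=0.0, extra_tasks=0):
    """Sketch of the grading formula above; all scores assumed on a 0-10 scale."""
    exam = min(exam + extra_tasks, 10)  # each solved extra homework task adds 1 point to the exam score
    raw = 0.35 * homeworks + 0.35 * colloquium + 0.3 * exam + quiz_bonus
    raw = min(raw, 10)                  # assumed cap at the maximal grade
    # Rounding applies only to the final grade: up above 5/10, down below 5/10.
    return math.ceil(raw) if raw > 5 else math.floor(raw)

# Example: 0.35*9 + 0.35*8 + 0.3*5 + 0.2 = 7.65, which rounds up to 8.
print(final_grade(homeworks=9, colloquium=8, exam=5, quiz_bonus=0.2))
</syntaxhighlight>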
  
 
  
== Colloquium ==

[https://www.dropbox.com/s/7djya6nc8ietd32/colloqQuest.pdf?dl=0 Rules and questions.] Update 12/12: added question 24 and corrected typos.


== Homeworks ==

Email to brbauwens-at-gmail.com. Start the subject line with SLT-HW.

Deadline: before the start of the lecture, every other lecture.

Sat. 17 Sept 18h10: problems 1.7, 1.8, 2.9, and 2.11 <br>
Sat. 01 Oct 18h10: see lists 3 and 4, and problem 2.10 <br>
Fri. 14 Oct 16h20: see problem lists 5 and 6 <br>
Sat. 05 Nov 20h00: see problem lists 7 and 8 <br>
Sat. 19 Nov 20h00: see problem lists 9 and 10 <br>
Sun. 04 Dec 23h59: see problem list 12; send it to maxkaledin@gmail.com with the subject line SLT-HW-Reg <YourName>_<YourSurname> <br>
 
  
 
== Office hours ==

{|
! Person !! Monday !! Tuesday !! Wednesday !! Thursday !! Friday
|-
| Bruno Bauwens || 15-20h || || || || 18-20h
|-
| Maxim Kaledin || colspan="5" | Write in Telegram; the time is flexible
|}

It is always good to send an email in advance. Questions and feedback are welcome.