Statistical learning theory 2024/25

== General Information ==
Lectures: on Tuesday 9h30--10h50 in room M302 and in [https://us02web.zoom.us/j/82300259484?pwd=NWxXekxBeE5yMm9UTmwvLzNNNGlnUT09 zoom] by [https://www.hse.ru/en/org/persons/160550073 Bruno Bauwens]
  
Seminars: online in [https://us06web.zoom.us/j/85239566702?pwd=y4uhpPrdjSVKOS2LkDIcKCzBXtCbFb.1 Zoom] by [https://www.hse.ru/org/persons/225553845/ Nikita Lukianenko].
  
Please join the [https://t.me/+1begXb8SomhmODI8 telegram group]. The course is similar to [http://wiki.cs.hse.ru/Statistical_learning_theory_2023/24 last year].
  
 
== Homeworks ==
 
Deadline every 2 weeks, before the lecture. The tasks are at the end of each problem list. (Problem lists will be updated, check the year.)
  
Before the 3rd lecture, submit homework from problem lists 1 and 2.
Before the 5th lecture, from lists 3 and 4. Etc.
  
Submit homeworks via [https://classroom.google.com/c/NzE5NzA4OTg1ODA4?cjc=imgrl43 Classroom]. You may submit in English or Russian, as LaTeX or as pictures. Results [https://docs.google.com/spreadsheets/d/1k9hivwCzCp3YcR-1n4WnQow94zyQBCjVcgWYaq-PHx8/edit?usp=sharing are here].
  
Late policy: 1 homework can be submitted at most 24 hours late without explanation.
  
 
== Course materials ==

{| class="wikitable"
! Video !! Summary !! Slides !! Lecture notes !! Problem list !! Solutions
|-
|
|| ''Part 1. Online learning''
|-
| [https://www.youtube.com/watch?v=N_JUBxw3sZo 21 Sep]
|| Philosophy. The online mistake bound model. The halving and weighted majority algorithms.
|| [https://www.dropbox.com/scl/fi/nbxsehlcl8hqodcaho7sg/01slides_all.pdf?rlkey=7u4smvn3jaofhscwrddh6mcoy&st=yb9esz0d&dl=0 sl01]
|| [https://www.dropbox.com/scl/fi/svgelu3iwijls092ehqqf/00book_intro.pdf?rlkey=jxdya4290kfc0hfl06b0y7k4b&st=lnv8chxf&dl=0 ch00] [https://www.dropbox.com/scl/fi/uqa9615215wy7ievgr50y/01book_onlineMistakeBound.pdf?rlkey=jiqzz84b5ipaw4t6cff7b17sl&st=mc354l04&dl=0 ch01]
|| [https://www.dropbox.com/scl/fi/luee4if0mrd4f440q69hd/01sem.pdf?rlkey=8702taq325mvb4ifh15stvvto&st=sq946cf3&dl=0 prob01]
|| [https://www.dropbox.com/scl/fi/kswtqmyxw3pv336g1vdd6/01sol.pdf?rlkey=bpwnrcsj6ru3nbo4xwq2lp6g0&st=hftnu87m&dl=0 sol01]
|-
| [https://www.youtube.com/watch?v=gQm1G3Ep-5s 24 Sep]
|| The standard optimal algorithm. The perceptron algorithm.
|| [https://www.dropbox.com/s/sy959ee81mov5cr/02slides.pdf?dl=0 sl02]
|| [https://www.dropbox.com/scl/fi/9016w6j87oclagapah8dt/02book_sequentialOptimalAlgorithm.pdf?rlkey=r729ir0a47ncqip8rooq9txxo&st=zx2tu8gp&dl=0 ch02] [https://www.dropbox.com/scl/fi/iwclbc321iv4k9fmljwpb/03book_perceptron.pdf?rlkey=9v27bt1b9qc2q382l6lwyrkic&st=ni0n8482&dl=0 ch03]
|| [https://www.dropbox.com/scl/fi/darlkflu0p8idh1smsvqc/02sem.pdf?rlkey=9rxky51dscu0d1pvh0h3iun1i&st=whkfpp78&dl=0 prob02]
|| [https://www.dropbox.com/scl/fi/d2wuka77bu18j9plivwl5/02sol.pdf?rlkey=yp2eprgxpc7r2antyidjd8qiw&dl=0 sol02]
|-
| [https://www.youtube.com/watch?v=Fk1-QI9PRAI 01 Oct]
|| Kernel perceptron algorithm. Prediction with expert advice. Recap probability theory (seminar).
|| [https://www.dropbox.com/s/a60p9b76cxusgqy/03slides.pdf?dl=0 sl03]
|| [https://www.dropbox.com/scl/fi/7pn3dyf2890p9zuyxleyl/04book_predictionWithExperts.pdf?rlkey=0capmeeu6pwp9wz2mhi0t5h58&st=f4c4n9wo&dl=0 ch04] [https://www.dropbox.com/scl/fi/cx7hsxzwg2f8ep4qcuefc/05book_introProbability.pdf?rlkey=rfq0y9cgzqvl1dlxkccc3qebv&dl=0 ch05]
|| [https://www.dropbox.com/scl/fi/bkuydm0u3xonnld8qlbl3/03sem.pdf?rlkey=xg2e9sbpe8c2071pxgcohlcab&st=ezxf2zgq&dl=0 prob03]
|| [https://www.dropbox.com/scl/fi/wjksi4t5r4ng894uiaj8b/03sol.pdf?rlkey=madshl3vupmwkuyzs44ut23ry&st=caroyl3r&dl=0 sol03]
|-
|
|| ''Part 2. Distribution independent risk bounds''
|-
| [https://www.youtube.com/watch?v=ycfYXvmKF0I 08 Oct]
|| Necessity of a hypothesis class. Sample complexity in the realizable setting, examples: threshold functions and finite classes.
|| [https://www.dropbox.com/s/pi0f3wab1xna6d7/04slides.pdf?dl=0 sl04]
|| [https://www.dropbox.com/s/nh4puyv7nst4ems/06book_sampleComplexity.pdf?dl=0 ch06]
|| [https://www.dropbox.com/scl/fi/x12se5y3heqtyfzo7qx30/04sem.pdf?rlkey=0hd5hphnbj90jc24nqsw63ka7&st=1zie6tp0&dl=0 prob04] ''update 12.10''
|| [https://www.dropbox.com/scl/fi/g6j0n39zhm1he8kfena8d/04sol.pdf?rlkey=hcg1cr6s4cca9ekqua67ehlhf&st=81bpsm1a&dl=0 sol04]
|-
| [https://www.youtube.com/watch?v=8J5B9CCy-ws 15 Oct]
|| Growth functions, VC-dimension and the characterization of sample complexity with VC-dimensions
|| [https://www.dropbox.com/s/rpnh6288rdb3j8m/05slides.pdf?dl=0 sl05]
|| [https://www.dropbox.com/s/eurz2vkvt1wa5zm/07book_growthFunctions.pdf?dl=0 ch07] [https://www.dropbox.com/scl/fi/50oxlmjkx59hjrq82yqvx/08book_VCdimension.pdf?rlkey=5dtlcis378kqu24ttko6s7zpf&dl=0 ch08]
|| [https://www.dropbox.com/scl/fi/1n9jdc70ia7vu957mls02/05sem.pdf?rlkey=8x89v3fkm1q61b4frirb9nqke&st=7pfvhuq6&dl=0 prob05]
|| <!-- [https://www.dropbox.com/scl/fi/jzm82hqbnzp7931gz8jd2/05sol.pdf?rlkey=o04gco2huwqo4m7rrtp0yd9gl&st=6f0uh0q4&dl=0 sol05] -->
|-
| [https://www.youtube.com/watch?v=zHau8Br_UFQ 22 Oct]
|| Risk decomposition and the fundamental theorem of statistical learning theory (previous [https://www.youtube.com/watch?v=zHau8Br_UFQ recording] covers more)
|| [https://www.dropbox.com/s/0p8r5wgjy1hlku2/06slides.pdf?dl=0 sl06]
|| [https://www.dropbox.com/scl/fi/th4r5t2gm29en4hejareq/09book_riskBounds.pdf?rlkey=4ox3f26kygxorxft8jlijuf0f&st=fg0fdyx2&dl=0 ch09]
|| [https://www.dropbox.com/scl/fi/15y2x2pq3pp77144nzee5/06sem.pdf?rlkey=72zoca4wgs472df4izvq2dd3t&st=5m9u4q2u&dl=0 prob06]
|| [https://www.dropbox.com/scl/fi/w8kc0izfc12sqjyd8hfou/06sol.pdf?rlkey=a09f6yx9e0ifohus9vt2ybthd&st=09qmm3m6&dl=0 sol06]
|-
| [https://youtube.com/live/G5fglRAaXMo 05 Nov]
|| Bounded differences inequality, Rademacher complexity, symmetrization, contraction lemma.
|| [https://www.dropbox.com/s/kfithyq0dgcq6h8/07slides.pdf?dl=0 sl07]
|| [https://www.dropbox.com/scl/fi/ohtmf1fwsu9c6vkrj6e5a/10book_measureConcentration.pdf?rlkey=dqsgskp8slui6xoq9c7tx680b&dl=0 ch10] [https://www.dropbox.com/s/hfrvhebbsskbk6g/11book_RademacherComplexity.pdf?dl=0 ch11]
|| [https://www.dropbox.com/scl/fi/701h3asvj5a6kj7d9p1tm/07sem.pdf?rlkey=dsnhc90gp0nd7jqgy3oicds4i&st=fu4nf10i&dl=0 prob07]
|| [https://www.dropbox.com/scl/fi/kd3osu95m7bmilv6z6bxm/07sol.pdf?rlkey=9ycz3obscp65uc05pg2dt3zww&st=9d8g3jkf&dl=0 sol07]
|-
|
|| ''Part 3. Margin risk bounds with applications''
|-
| [https://www.youtube.com/watch?v=oU2AzubDXeo 12 Nov]
|| Simple regression, support vector machines, margin risk bounds, and neural nets with dropout regularization
|| [https://www.dropbox.com/s/oo1qny9busp3axn/08slides.pdf?dl=0 sl08]
|| [https://www.dropbox.com/s/573a2vtjfx8qqo8/12book_regression.pdf?dl=0 ch12] [https://www.dropbox.com/scl/fi/hxeh5btc0bb2f52fnqh5f/13book_SVM.pdf?rlkey=dw3u2rtfstpsb8mi9hnuc8poy&dl=0 ch13]
|| [https://www.dropbox.com/scl/fi/rp2m0dvovdjbvzdl7t1bl/09sem.pdf?rlkey=v1jsm5dagh7tymci5pkqn5gox&dl=0 prob08]
|| <!-- [https://www.dropbox.com/scl/fi/e598w1t8tzqxfvn1d4ww1/09sol.pdf?rlkey=yr1gzu8kg2rdkubaelicljj46&dl=0 sol08] -->
|-
| [https://youtu.be/9FhFxLHR4eE 19 Nov]
|| Kernels: RKHS, representer theorem, risk bounds
|| [https://www.dropbox.com/s/jst60ww8ev4ypie/09slides.pdf?dl=0 sl09]
|| [https://www.dropbox.com/scl/fi/lozpqk5nnm8us77qfhn7x/14book_kernels.pdf?rlkey=s8e7a46rm3znkw13ubj3fzzz0&dl=0 ch14]
|| [https://www.dropbox.com/scl/fi/9mjmb6deu08ipf38s57bh/10sem.pdf?rlkey=z1khm4i8r39eeqmhargte24s4&dl=0 prob09]
|| <!-- [https://www.dropbox.com/scl/fi/a5c0buap9b1h1ojdbhp3u/10sol.pdf?rlkey=8ft5tjyy1sl5dkj4p4hh8phbc&dl=0 sol09] -->
|-
| [https://www.youtube.com/watch?v=OgiaWrWh_WA 26 Nov]
|| AdaBoost and the margin hypothesis
|| [https://www.dropbox.com/s/umum3kd9439dt42/10slides.pdf?dl=0 sl10]
|| [https://www.dropbox.com/s/e7m1cs7e8ulibsf/15book_AdaBoost.pdf?dl=0 ch15]
|| [https://www.dropbox.com/scl/fi/ykbzx314pdn3mn3jiehli/11sem.pdf?rlkey=hpmtks20a3k5zsvr8jm1iqc35&dl=0 prob10]
|| <!-- [https://www.dropbox.com/scl/fi/c805j4f54ioiozphvh9j0/11sol.pdf?rlkey=6rrxlweaiko1lm0z2ua4k7mqk&dl=0 sol10] -->
|-
| [https://youtu.be/GL574ljefJ8 03 Dec]
|| Implicit regularization of stochastic gradient descent in overparameterized neural nets ([https://www.youtube.com/watch?v=ygVHVW3y3wM recording] with many details about the Hessian)
||
|| ch16 ch17
||
||
|-
| [https://www.youtube.com/watch?v=RDTK7hBqiJY 10 Dec]
|| Part 2 of previous lecture: Hessian control and stability of the NTK.
||
||
||
||
|}

The lectures in October and November are based on the book: ''Foundations of Machine Learning'', 2nd ed., Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018.

A gentle introduction to the materials of the first 3 lectures, and an overview of probability theory, can be found in chapters 1-6 and 11-12 of the book: Sanjeev Kulkarni and Gilbert Harman, ''An Elementary Introduction to Statistical Learning Theory'', 2012.

== Grading formula ==

Final grade = 0.35 * [score of homeworks] + 0.35 * [score of colloquium] + 0.3 * [score on the exam] + bonus from quizzes.

All homework questions have the same weight. Each solved extra homework task increases the score of the final exam by 1 point. At the end of each lecture there is a short quiz in which you may earn 0.1 bonus points on the final non-rounded grade.

There is no rounding except for transforming the final grade to the official grade. Arithmetic rounding is used.

Autogrades: if you need only 6/10 on the exam to reach the maximal 10/10 for the course, the 10 is given automatically. This may happen because of extra homework tasks and bonuses from quizzes.
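For concreteness, a minimal unofficial sketch of how these rules combine, in Python. The function name is made up for illustration; the 0-10 scale of each component and the capping of the exam score at 10 are assumptions, not rules stated on this page.

<pre>
import math

def final_grade(homework, colloquium, exam, quiz_bonus=0.0, extra_hw_tasks=0):
    """Sketch of the grading formula; component scores assumed on a 0..10 scale."""
    # Each solved extra homework task adds 1 point to the exam score
    # (the cap at 10 is an assumption).
    exam = min(10.0, exam + extra_hw_tasks)
    raw = 0.35 * homework + 0.35 * colloquium + 0.3 * exam + quiz_bonus
    # Rounding happens only once, when converting the final grade to the
    # official grade; arithmetic rounding = round half up.
    official = min(10, math.floor(raw + 0.5))
    return raw, official

# Example: 8/10 homeworks, 7/10 colloquium, 9/10 exam, 0.2 quiz bonus
print(final_grade(8, 7, 9, quiz_bonus=0.2))  # about (8.15, 8)
</pre>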

== Colloquium ==

Rules and questions from last year.

Date: TBA

== Problems exam ==

TBA
-- You may use handwritten notes, lecture materials from this wiki (either printed or through your PC), and Mohri's book.
-- You may not search on the internet or interact with other humans (e.g. by phone, forums, etc.).

== Office hours ==

Bruno Bauwens: Tuesday 12h -- 20h. Wednesday 16h -- 18h. Friday 11h -- 17h. Better send me an email in advance.

Nikita Lukianenko: Write in Telegram; the time is flexible.