Statistical learning theory 2021
General Information
Teachers: Bruno Bauwens and Nikita Lukianenko
Lectures: Saturday, 14:40 - 16:00, on Zoom.
Seminars: Tuesday, 16:20 - 17:40, on Google Meet: https://meet.google.com/ber-yzns-hxz
Practical information is shared in a Telegram group.
The course is similar to last year's edition (http://wiki.cs.hse.ru/Statistical_learning_theory_2020), except for the order of topics and Part 3.
Problems exam
Dec 22, 12:00 - 15:30
During the exam:
-- You may consult notes, books, and search the internet.
-- You may not interact with other humans (e.g. by phone, forums, etc.).
Colloquium
Saturday, December 11
Rules and list of questions (version of Dec 10): https://www.dropbox.com/s/u8hyo1omvaoujle/colloqQuest.pdf?dl=0
Homeworks
Email your solutions to brbauwens-at-gmail.com; start the subject line with SLT-HW. Results: https://www.dropbox.com/s/prsmzhtr5p5uome/scores.pdf?dl=0
Deadlines are before the lecture, every other lecture.
25 Sept: see problem lists 1 and 2
09 Oct: see problem lists 3 and 4
29 Oct: see problem lists 5 and 6
13 Nov: see problem lists 7 and 8
30 Nov, 08:00 [extended]: see problem lists 9 and 10
Course materials
Video | Summary | Slides | Lecture notes | Problem list | Solutions |
---|---|---|---|---|---|
Part 1. Online learning | |||||
4 Sept | Lecture: philosophy. Seminar: the online mistake-bound model, the weighted majority, and perceptron algorithms (movies) | sl01 | ch00 ch01 | 01prob (9 Sept) | 01sol |
11 Sept | The perceptron algorithm in the agnostic setting. Kernels. The standard optimal algorithm. | sl02 | ch02 ch03 | 02prob (23 Sept) | 02sol |
18 Sept (rec to do) | Prediction with expert advice and the exponentially weighted majority algorithm. Recap of probability theory. | sl03 | ch04 ch05 | 03prob (30 Sept) | 03sol |
Part 2. Risk bounds for binary classification | |||||
25 Sept | Sample complexity in the realizable setting, simple examples and bounds using VC-dimension | sl04 | ch06 | 04prob | 04sol |
2 Oct | Growth functions, VC-dimension, and the characterization of sample complexity via the VC-dimension | sl05 | ch07 ch08 | 05prob | 05sol |
9 Oct | Risk decomposition and the fundamental theorem of statistical learning theory | sl06 | ch09 | 06prob | 06sol |
16 Oct | Bounded differences inequality and Rademacher complexity | sl07 | ch10 ch11 | 07prob | 07sol |
30 Oct | Simple regression, support vector machines, margin risk bounds, and neural nets | sl08 | ch12 ch13 | 08prob | 08sol |
6 Nov | Kernels: risk bounds, RKHS, representer theorem, design | sl09 | ch14 | 09prob (Nov 23) | 09sol |
13 Nov | AdaBoost and risk bounds | sl10 | Mohri et al., ch. 7 | 10prob (Nov 23) | 10sol |
Part 3. Other topics | |||||
20 Nov | Clustering | sl11 | Mohri et al., ch. 7; lecture | | |
27 Nov | Dimensionality reduction and the Johnson-Lindenstrauss lemma | sl12 | Mohri et al., ch. 15; lecture | 12prob | |
4 Dec | No lecture | ||||
11 Dec | Colloquium |
The lectures in October and November are based on the book: Foundations of Machine Learning, 2nd ed., Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. This book can be downloaded from http://gen.lib.rus.ec/.
Office hours
Person | Monday | Tuesday | Wednesday | Thursday | Friday |
---|---|---|---|---|---|
Bruno Bauwens, Zoom | 12h30-14h30 | 14h-20h | Room S834, Pokrovkaya 11 | | |
Nikita Lukianenko, Telegram | 14h30-16h30 | 14h30-16h30 | Room S831, Pokrovkaya 11 | | |
It is always good to send an email in advance. Questions and feedback are welcome.