Statistical learning theory 2024/25
General Information
First lecture: Saturday 21.09 at 10h00 in room R208 (and on the zoom link below).
Lectures: Tuesdays 9h30--10h50 in room M302 and on zoom, by Bruno Bauwens
Seminars: online by Nikita Lukianenko.
Please join the telegram group. The course is similar to last year's.
Homeworks
Deadlines are every 2 weeks, before the lecture. The tasks are at the end of each problem list. (Problem lists will be updated; check the year.)
Before the 3rd lecture: see problem lists 1 and 2. Before the 5th lecture: see problem lists 3 and 4. Etc.
Email homeworks to brbauwens-at-gmail.com. Start the subject line with SLT-HW. Results will be here.
Late policy: one homework can be submitted at most 24 hours late without explanation.
Course materials
Video | Summary | Slides | Lecture notes | Problem list | Solutions |
---|---|---|---|---|---|
Part 1. Online learning | |||||
?? Sept | Philosophy. The online mistake bound model. The halving and weighted majority algorithms. | sl01 | ch00 ch01 | prob01 | |
?? Sept | The perceptron algorithm. Kernels. The standard optimal algorithm. | sl02 | ch02 ch03 | prob02 | |
?? Sept | Prediction with expert advice. Recap probability theory (seminar). | sl03 | ch04 ch05 | prob03 | |
Part 2. Distribution independent risk bounds | |||||
?? Oct | Necessity of a hypothesis class. Sample complexity in the realizable setting, examples: threshold functions and finite classes. | sl04 | ch06 | prob05 | |
?? Oct | Growth functions, VC-dimension, and the characterization of sample complexity via the VC-dimension | sl05 | ch07 ch08 | prob06 | 
?? Oct | Risk decomposition and the fundamental theorem of statistical learning theory | sl06 | ch09 | prob07 | |
?? Oct | Bounded differences inequality, Rademacher complexity, symmetrization, contraction lemma. | sl07 | ch10 ch11 | prob08 | |
Part 3. Margin risk bounds with applications | |||||
?? Nov | Simple regression, support vector machines, margin risk bounds, and neural nets with dropout regularization | sl08 | ch12 ch13 | prob09 | |
?? Nov | Kernels: RKHS, representer theorem, risk bounds | sl09 | ch14 | prob10 | |
?? Nov | AdaBoost and the margin hypothesis | sl10 | ch15 | prob11 | |
?? Nov | Implicit regularization of stochastic gradient descent in overparameterized neural nets (recording with many details about the Hessian) | | ch16 ch17 | |
?? Dec | Part 2 of the previous lecture: Hessian control and stability of the NTK. | | | |
The lectures in October and November are based on the book:
Foundations of Machine Learning, 2nd ed., Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018.
A gentle introduction to the material of the first 3 lectures, together with an overview of probability theory, can be found in chapters 1-6 and 11-12 of the following book: Sanjeev Kulkarni and Gilbert Harman, An Elementary Introduction to Statistical Learning Theory, 2012.
Grading formula
Final grade = 0.35 * [score of homeworks] + 0.35 * [score of colloquium] + 0.3 * [score on the exam] + bonus from quizzes.
All homework questions have the same weight. Each solved extra homework task increases the score of the final exam by 1 point. At the end of each lecture there is a short quiz in which you may earn 0.1 bonus points on the final non-rounded grade.
There is no rounding except for transforming the final grade to the official grade. Arithmetic rounding is used.
Autogrades: if you only need 6/10 on the exam to have the maximal 10/10 for the course, this will be given automatically. This may happen because of extra homework questions and bonuses from quizzes.
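For concreteness, here is a minimal sketch in Python of how these rules combine. It is not an official grading script: the function name and signature are illustrative, and the cap of the exam score at 10 is an assumption.

```python
import math

def final_grade(homework: float, colloquium: float, exam: float,
                extra_hw_tasks: int = 0, quiz_bonus: float = 0.0) -> int:
    """Illustrative sketch of the grading rules above (not an official script).

    All scores are on a 0..10 scale. Each solved extra homework task adds
    1 point to the exam score; quiz_bonus is the accumulated bonus from
    quizzes (0.1 per quiz), added to the non-rounded grade.
    """
    exam_score = min(10.0, exam + extra_hw_tasks)  # cap at 10 is an assumption
    raw = 0.35 * homework + 0.35 * colloquium + 0.3 * exam_score + quiz_bonus
    raw = min(raw, 10.0)                  # the official grade cannot exceed 10
    return math.floor(raw + 0.5)          # arithmetic rounding: halves round up

# Example: 8/10 homework, 9/10 colloquium, 7/10 exam,
# one solved extra task, and two quiz bonuses of 0.1 each:
print(final_grade(8, 9, 7, extra_hw_tasks=1, quiz_bonus=0.2))  # -> 9
```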
Colloquium
Rules and questions from last year.
Date: TBA
Problems exam
TBA
-- You may use handwritten notes, lecture materials from this wiki (either printed or through your PC), and Mohri's book
-- You may not search on the internet or interact with other humans (e.g. by phone, forums, etc)
Office hours
Bruno Bauwens: TBA
Nikita Lukianenko: write in Telegram; the time is flexible