Statistical learning theory 2020

Practical information on the telegram group: https://t.me/joinchat/DZPMBRkbM5uX1l9e0pj_LQ
 
 
Results: https://www.dropbox.com/s/z5q5ib34abybj06/scores.ods?dl=0 (the page may not display correctly in Dropbox).
 
 
 
Final grades (PDF): https://www.dropbox.com/s/updek6bpztxwm0m/finalScores.pdf?dl=0
 
  
  


General Information

Grading

Teachers: Bruno Bauwens and Vladimir Podolskii

Lectures: Saturday 9h30 - 10h50, zoom https://zoom.us/j/96210489901

Seminars
- Group 1: Saturday 11h10 - 12h30, Bruno Bauwens and Vladimir Podolskii, zoom https://zoom.us/j/94186131884
- Group 2: Tuesday 18h, Nikita Lukyanenko, see ruz.hse.ru


Reexam

Dates: Sat 23 Jan and Sat 30 Jan, 14h

The reexam consists of a retake of the colloquium and of the problems exam. At some point during the first hour (depending on the teacher's availability) you redo the colloquium; in the remaining time you solve 4 or 5 problems similar to those in the exam of Dec 23rd.

In the calculation of the reexam grade, the homework results are dropped: the final grade is the average of the colloquium part and the problems part, with equal weight.
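As a small worked illustration of this rule (the grades 7 and 9 below are hypothetical examples, not actual course results):

\[ \text{final} = \tfrac{1}{2}\bigl(\text{colloquium} + \text{problems}\bigr), \qquad \text{e.g. } \tfrac{1}{2}(7 + 9) = 8. \]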

Zoom link 30 Jan: [1]


Colloquium

Saturday 12 Dec and Tuesday 15 Dec, online. Choose your timeslot

Rules and questions (version of 06/12). Q&A.

Course materials

Date | Summary | Lecture notes | Slides | Video | Problem list | Solutions
12 Sept | Introduction and sample complexity in the realizable setting | lecture1.pdf | slides1.pdf | | Problem list 1 (update 26.09, prob 1.7) | Solutions 1
19 Sept | VC-dimension and sample complexity | lecture2.pdf | slides2.pdf | Chapt 2,3 | Problem list 2 | Solutions 2
26 Sept | Risk bounds and the fundamental theorem of statistical learning theory | lecture3.pdf | slides3.pdf | | Problem list 3 | Solutions 3
03 Oct | Rademacher complexity | lecture4.pdf | slides4.pdf | | Problem list 4 (update 23.10, prob 4.1d) | Solutions 4
10 Oct | Support vector machines and risk bounds | Chapt 5, Mohri et al (see below) | slides5.pdf | | Problem list 5 (update 29.10, typo 5.8) | Solutions 5
17 Oct | Support vector machines and recap | Chapt 5, Mohri et al | slides6.pdf | | Problem list 6 (update 10.11) | Solutions 6
31 Oct | Kernels | lecture7.pdf | slides7.pdf | | Problem list 7 (update 11.11, prob 7.6) | Solutions 7
07 Nov | AdaBoost | Chapt 6, Mohri et al | slides8.pdf | | Problem list 8 | Solutions 8
14 Nov | Online learning 1: Littlestone dimension, weighted majority algorithm | Chapt 7, Mohri et al, and Zhivotovskiy | slides9.pdf | | Problem list 9 (update 08.12, prob 9.4) | Solutions 9
21 Nov | Online learning 2: exponentially weighted average algorithm, perceptron | Chapt 7, Mohri et al | slides9.pdf | | Problem list 10 | Solutions 10
28 Nov | Online learning 3: perceptron, Winnow, and online-to-batch conversion | Chapt 7, Mohri et al | slides11.pdf | | Problem list 11 | Solutions 11
5 Dec | Recap of requested topics, Q&A | Q&A | | | |


The lectures in October and November are based on the book Foundations of Machine Learning, 2nd edition, by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2018. The book can be downloaded from http://gen.lib.rus.ec/ .

For online learning, we also study a few topics from lecture notes by N. K. Zhivotovskiy.

Office hours

Person | Monday | Tuesday | Wednesday | Thursday | Friday
Bruno Bauwens, Zoom (email in advance) | 14h-18h | 16h15-20h | Room S834, Pokrovka 11 | |

It is always good to send an email in advance. Questions are welcome.