Neurobayesian models 2019

Material from the Faculty of Computer Science Wiki
'''Lector:''' [https://www.hse.ru/en/staff/dvetrov Dmitry Vetrov]

'''Tutors:''' [https://www.hse.ru/en/org/persons/190884100 Alexander Grishin], [https://www.hse.ru/en/org/persons/165140955 Kirill Struminsky], [https://www.hse.ru/en/org/persons/205487138 Dmitry Molchanov], [https://www.hse.ru/en/org/persons/191263008 Kirill Neklyudov], [http://artem.sobolev.name/ Artem Sobolev], [https://ars-ashuha.ru/ Arsenii Ashukha], [https://bayesgroup.ru/people/oleg-ivanov/ Oleg Ivanov], [https://www.hse.ru/en/org/persons/131072080 Ekaterina Lobacheva].

'''Contacts:''' All questions should be addressed to '''bayesml@gmail.com'''. The subject line of any letter must contain the tag '''[HSE NBM19]'''. Letters without the tag will most probably be lost in the inbox.

We also have a chat in Telegram. Its main language is English. All important news will be announced in the chat.
  
 
===Course description===  
 
This course is devoted to Bayesian reasoning in application to deep learning models. Attendees will learn how to use probabilistic modeling to construct neural generative and discriminative models, how to use the paradigm of generative adversarial networks to perform approximate Bayesian inference, and how to model the uncertainty about the weights of neural networks. Selected open problems in the field of deep learning will also be discussed. The practical assignments will cover the implementation of several modern Bayesian deep learning models.

===Course syllabus===
  
 
=== News ===
 
* The first assignment has been uploaded to anytask. Deadline: 22 February, 23:00.
* The second assignment has been uploaded to anytask. Deadline: 8 March, 23:00.
* The third assignment has been uploaded to anytask. Deadline: 15 March, 23:00.
* A hard deadline for all assignments: 20 March, 23:00.
  
 
===Grading System===
 
The assessment consists of 3 practical assignments and a final oral exam. The practical assignments consist of programming some models/methods from the course in Python and analysing their behavior: VAE, normalizing flows, Sparse Variational Dropout. At the final exam, students have to demonstrate knowledge of the material covered during the entire course.
  
 
The final course grade is obtained from the following formula:

О_final = 0.7 * О_cumulative + 0.3 * О_exam,
 
where О_cumulative is the average grade for the practical assignments.

All grades are on a ten-point scale. If О_cumulative or О_final has a fractional part greater than or equal to 0.5, it is rounded up.
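For concreteness, the averaging and rounding rules above can be sketched in Python (the function names are ours, not part of the course rules; rounding О_cumulative before it enters the formula is our reading of the rule):

```python
def round_half_up(x: float) -> int:
    """Round up when the fractional part is >= 0.5, per the course rule."""
    i = int(x)
    return i + 1 if x - i >= 0.5 else i

def final_grade(assignment_grades: list, exam_grade: float) -> int:
    """O_final = 0.7 * O_cumulative + 0.3 * O_exam, all on a ten-point scale."""
    o_cumulative = round_half_up(sum(assignment_grades) / len(assignment_grades))
    return round_half_up(0.7 * o_cumulative + 0.3 * exam_grade)

print(final_grade([8, 9, 7], 9))  # 0.7 * 8 + 0.3 * 9 = 8.3 -> prints 8
```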
  
 
===Assignments  ===
 
* The course contains three practical assignments. Solutions should be submitted to [https://anytask.org/course/444 anytask]. To get an invite, please write to the course e-mail. The site's interface is only in Russian, so non-Russian-speaking students may submit their solutions to the course e-mail. In this case, the subject line of the letter, in addition to the tag, should contain your name, surname and assignment number.
* All assignments should be coded in Python 3 using PyTorch.
* Students have to complete all assignments by themselves. Using the code of your colleagues or code from open implementations is prohibited and will be considered plagiarism. All involved students (including those who shared their solutions) will be severely punished.
* Assignments are scored up to 10 points. Each assignment has a deadline; a penalty of 0.3 points is charged for each day of delay, but not more than 6 points in total. Usually you will have 2 weeks to solve an assignment. Some assignments may contain bonus parts.

Approximate upload dates for the assignments: 7 February, 14 February, 28 February.

A hard deadline for all assignments: 20 March, 23:00.
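The late-submission penalty above is simple arithmetic; a short sketch (function names are illustrative, not part of the course materials):

```python
def late_penalty(days_late: int) -> float:
    """0.3 points per day of delay, but not more than 6 points in total."""
    return min(0.3 * days_late, 6.0)

def assignment_score(raw_points: float, days_late: int) -> float:
    """Score after the late penalty, floored at zero."""
    return max(raw_points - late_penalty(days_late), 0.0)

print(assignment_score(10, 5))   # 10 - 1.5 = 8.5
print(assignment_score(10, 30))  # penalty capped at 6 points -> 4.0
```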
  
 
===Exam  ===
 
Exam questions are published [https://goo.gl/hNjh5M here].

At the beginning of the exam, we will give you a random question from the main part of the exam questions list. You will have one hour to prepare your answer. During this hour you may use any materials (including the ones on your laptop).

Then you discuss your answer with an examiner, answer questions from the theoretical minimum, answer additional questions on the course, and solve problems. At this part of the exam you CANNOT use any materials. Please pay attention to the theoretical minimum: inability to answer any question from it automatically entails an unsatisfactory mark for the exam.
  
 
=== Course Plan ===
 
{| class="wikitable"
|-
! '''Session''' !! '''Date''' !! '''Theme'''
|-
| rowspan="2" | 1 || 24 January || Lecture: Stochastic Variational Inference
|-
| 31 January || Seminar: Application of SVI to the Latent Dirichlet Allocation model
|-
| rowspan="2" | 2 || 31 January || Lecture: Doubly Stochastic Variational Inference
|-
| 7 February || Seminar: Doubly Stochastic Variational Inference
|-
| rowspan="2" | 3 || 7 February || Lecture: Variational autoencoders (VAE) and normalizing flows (NF)
|-
| 14 February || Seminar: Importance Weighted Autoencoders + more complex NF
|-
| rowspan="2" | 4 || 14 February || Lecture: Implicit Variational Inference using Adversarial Training
|-
| 21 February || Seminar: f-GAN
|-
| rowspan="2" | 5 || 21 February || Lecture: Bayesian neural networks
|-
| 28 February || Seminar: Local reparametrization trick
|-
| 6 || 28 February || Lecture: Bayesian compression of neural networks
|-
| rowspan="2" | 7 || 7 March || Lecture: Discrete Latent Variables and Variance Reduction
|-
| 7 March || Seminar: Discrete Latent Variables and Variance Reduction
|-
| 8 || 14 March || Seminar: Deep Markov chain Monte Carlo (MCMC)
|-
| rowspan="2" | 9 || 14 March || Lecture: Semi-implicit variational inference
|-
| 21 March || Seminar: VampPrior
|}
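Several lecture topics above (SVI, VAE, semi-implicit variational inference) rest on the reparameterization trick for low-variance gradient estimates. A minimal NumPy sketch of the pathwise estimator (ours, not course-issued code; the objective f(z) = z² and all names are illustrative):

```python
import numpy as np

# Reparameterization: a sample z ~ N(mu, sigma^2) is written as
# z = mu + sigma * eps with eps ~ N(0, 1), so the gradient of an
# expectation E[f(z)] w.r.t. mu can be estimated pathwise.
rng = np.random.default_rng(0)
mu, sigma = 1.5, 0.8

eps = rng.standard_normal(200_000)
z = mu + sigma * eps

# For f(z) = z^2, the true gradient d E[f(z)] / d mu equals 2 * mu.
grad_estimate = float(np.mean(2.0 * z))
print(grad_estimate)  # close to 2 * mu = 3.0
```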
=== Course Materials ===
[https://docs.google.com/document/d/1Mqzo89OeX5gteQ9qYF5rhz6Ap00JQEyoiq2WKo54vA8/edit?usp=sharing List of all materials (relevant papers, blogposts, etc.)]
  
 
====Reading List  ====
 
* Murphy K.P. Machine Learning: A Probabilistic Perspective. The MIT Press, 2012.
* Bishop C.M. Pattern Recognition and Machine Learning. Springer, 2006.
* MacKay D.J.C. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.
* Goodfellow I., Bengio Y., Courville A. Deep Learning. MIT Press, 2016.
 
====Useful links  ====
 
[The same course at CS MSU] (contains more materials in Russian)<br />
[http://bayesgroup.ru BayesGroup page]

Current version as of 14:23, 15 April 2019
