Neurobayesian models 2019
The page is not ready yet!
Lecturer: Dmitry Vetrov
Tutors: Alexander Grishin, Kirill Struminsky, Dmitry Molchanov, Kirill Neklyudov, Artem Sobolev, Arsenii Ashukha, Oleg Ivanov, Ekaterina Lobacheva.
Contacts: All questions should be addressed to bayesml@gmail.com. The subject line of every letter must contain the tag [HSE NBM19]. Letters without the tag will most probably be lost in the inbox.
We also have a Telegram chat (the link was sent to the group e-mail). Its main language is Russian, but questions asked in English will be answered in English. All important news will be announced in English in the chat and also sent to the group e-mail.
Course description
This course is devoted to Bayesian reasoning as applied to deep learning models. Attendees will learn how to use probabilistic modeling to construct neural generative and discriminative models, how to use the paradigm of generative adversarial networks to perform approximate Bayesian inference, and how to model the uncertainty about the weights of neural networks. Selected open problems in the field of deep learning will also be discussed. The practical assignments will cover the implementation of several modern Bayesian deep learning models.
News
Grading System
The assessment consists of 3 practical assignments and a final oral exam. The practical assignments involve implementing some of the models/methods from the course in Python and analysing their behavior: VAE, normalizing flows, Sparse Variational Dropout. At the final exam, students have to demonstrate knowledge of the material covered during the entire course.
The final course grade is computed by the following formula:
O_final = 0.7 * O_cumulative + 0.3 * O_exam,
where O_cumulative is the average grade for the practical assignments.
All grades are on a ten-point scale. If O_cumulative or O_final has a fractional part greater than or equal to 0.5, it is rounded up.
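For concreteness, here is a minimal Python sketch of this rule (the function names are ours, and we assume the cumulative grade is rounded before it enters the formula, which is one possible reading of the rounding rule above):

```python
def round_half_up(grade):
    """Round up when the fractional part is >= 0.5, as in the rule above."""
    return int(grade) + 1 if grade - int(grade) >= 0.5 else int(grade)

def final_grade(assignment_grades, exam_grade):
    """Combine the cumulative and exam grades on the ten-point scale."""
    o_cumulative = round_half_up(sum(assignment_grades) / len(assignment_grades))
    return round_half_up(0.7 * o_cumulative + 0.3 * exam_grade)

# Example: assignment grades 9, 9, 10 and an exam grade of 8 give a final grade of 9.
print(final_grade([9, 9, 10], 8))
```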
Assignments
There are three practical assignments. They are normally submitted via [anytask](https://anytask.org/course/444). To get an invite, please write to the course e-mail. The site has a Russian-only interface, so foreign students may instead submit their solutions to the course e-mail; in that case, the subject line should contain your name, surname, and the assignment number.

All assignments should be coded in Python 3.

Students must complete all assignments on their own. If a solution was discussed with someone else, or if any third-party code or materials were used, this must be stated in the report. Otherwise, "similar" solutions will be considered plagiarism, and all students involved (including those who shared their solutions) will be severely penalized.

Assignments are scored up to 10 points. Each practical assignment has a deadline; a penalty of 0.3 points is charged for each day of delay, but no more than 6 points in total (see the sketch below). Some assignments may contain a bonus part.

Approximate dates for the homework assignments (they may change!): TBA

At the end of the module, before the exam, there will be a hard deadline for all assignments! The exact date will be announced later.
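As an illustration of the late-penalty rule, a minimal sketch (the function name is ours):

```python
def penalized_score(raw_score, days_late):
    """Late penalty from the rules above: 0.3 points per day of delay, capped at 6 points in total."""
    penalty = min(0.3 * days_late, 6.0)
    return raw_score - penalty

# Example: a 10-point solution submitted 5 days late scores 10 - 1.5 = 8.5 points.
print(penalized_score(10.0, 5))
```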
Exam
TBA
Course Plan
| Session | Date | Title | Materials |
|---|---|---|---|
| 1 | 24 September | Lecture: Stochastic Variational Inference | [article](http://jmlr.org/papers/volume14/hoffman13a/hoffman13a.pdf) |
| 2 | 1 October | Seminar: Application of SVI to the Latent Dirichlet Allocation model | [article](http://jmlr.org/papers/volume14/hoffman13a/hoffman13a.pdf) |
| 2 | 1 October | Lecture: Doubly Stochastic Variational Inference | TBA |
| 3 | 7 October | Seminar: Doubly Stochastic Variational Inference | TBA |
| 3 | 7 October | Lecture: Variational autoencoders (VAE) and normalizing flows (NF) | [VAE article](https://arxiv.org/abs/1312.6114), [NF article](https://arxiv.org/abs/1505.05770) |
| 4 | 14 October | Seminar: Importance Weighted Autoencoders + more complex NF | TBA |
| 4 | 14 October | Lecture: Density ratio estimation + alpha-GAN | [article](https://arxiv.org/abs/1701.04722) |
| 5 | 21 October | Seminar: f-GAN | [article](https://arxiv.org/abs/1606.00709) |
| 5 | 21 October | Lecture: Bayesian neural networks | [article](https://arxiv.org/abs/1505.05424), [article](http://proceedings.mlr.press/v28/wang13a.pdf), [article](https://arxiv.org/abs/1703.01961) |
| 6 | 28 October | Seminar: Local reparametrization trick | [article](https://arxiv.org/abs/1506.02557) |
| 6 | 28 October | Lecture: Bayesian compression of neural networks | [article](https://arxiv.org/abs/1701.05369), [article](https://arxiv.org/abs/1702.04008) |
| 7 | 7 November | Seminar: Deep Markov chain Monte Carlo (MCMC) | [article](https://arxiv.org/abs/1706.07561), [article](https://arxiv.org/abs/1711.09268) |
| 7 | 7 November | Lecture: Variance Reduction | [article](https://arxiv.org/abs/1711.00123) |
| 8 | 14 November | Seminar: Discrete latent variables | [article](https://arxiv.org/abs/1611.01144), [article](https://arxiv.org/abs/1611.00712), [article](https://arxiv.org/abs/1711.00123) |
| 8 | 14 November | Lecture: Semi-implicit variational inference | [article](https://arxiv.org/abs/1805.11183), [article](https://arxiv.org/abs/1810.02789) |
| 9 | 21 November | Seminar: VampPrior | [article](https://arxiv.org/abs/1705.07120), [article](https://arxiv.org/abs/1809.05284) |
Reading List
- Murphy K.P. Machine Learning: A Probabilistic Perspective. The MIT Press, 2012.
- Bishop C.M. Pattern Recognition and Machine Learning. Springer, 2006.
- MacKay D.J.C. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.
- Goodfellow I., Bengio Y., Courville A. Deep Learning. MIT Press, 2016.
Useful links
The same course in Russian at MSU (contains more materials in Russian).
BayesGroup page.