Stochastic Analysis 2020/2021
Contents
Lecturers and Seminarists
Lecturer | Alexey Naumov | anaumov@hse.ru | T924
Seminarist | Sergey Samsonov | svsamsonov@hse.ru | T926
About the course
This page contains the materials for the Stochastic Analysis course in the 2020/2021 academic year, which is mandatory for first-year master's students of the Statistical Learning Theory program (HSE and Skoltech).
Grading formula
The final grade consists of three components (each a real number from 0 to 10, with no intermediate rounding):
- OHW — the grade for the hometasks
- OMid-term — the grade for the midterm exam
- OExam — the grade for the final exam
The formula for the final grade is
- OFinal = 0.3*OHW + 0.3*OMid-term + 0.4*OExam
with the usual (arithmetic) rounding applied to the result.
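To make the formula concrete, here is a minimal sketch (in Python) of how the final grade could be computed; the component values in the example are hypothetical and the function name is not part of the course materials.

```python
def final_grade(o_hw: float, o_midterm: float, o_exam: float) -> int:
    """Combine the three components (each on a 0-10 scale) into the final grade.

    Weights follow the course formula OFinal = 0.3*OHW + 0.3*OMid-term + 0.4*OExam,
    with the usual arithmetic rounding applied only at the very end.
    """
    o_final = 0.3 * o_hw + 0.3 * o_midterm + 0.4 * o_exam
    # Arithmetic rounding (0.5 rounds up), unlike Python's default banker's rounding.
    return int(o_final + 0.5)


# Hypothetical example: OHW = 7.5, OMid-term = 8, OExam = 6
print(final_grade(7.5, 8.0, 6.0))  # 0.3*7.5 + 0.3*8 + 0.4*6 = 7.05 -> 7
```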
Lectures
Seminars
- To be filled
Midterm
Exam
The exam will be held on 21.12.2019 at 10:30 in the following mixed form. First, you will have 1.5 hours to solve 4 problems; during this part you may use any resources (books, notes, laptops, etc.). Then the oral part begins: you go to the examiner with your solutions and answer additional questions on the course. You are expected to answer without preparation, so no proofs are required, but you need to know the definitions and the statements of the main results and be able to explain the key concepts. The grade for the problems contributes 20% and the oral answer another 20% towards the final grade.
Hometasks
- Homework №1, deadline: 12.10.2019, 23:59
- Homework №2, deadline: 06.11.2019, 23:59
- Homework №3, deadline: 01.12.2019, 23:59; explicit recurrence for №4 (do not open unless you really need it)
- Homework №4, deadline: 22.12.2019, 23:59 (deadline postponed by 1 day)
- Bonus hometask, deadline: 22.12.2019, 23:59; dataset
Grades and results
Recommended literature (1st term)
- http://www.statslab.cam.ac.uk/~james/Markov/ - Cambridge lecture notes on discrete-time Markov Chains
- https://link.springer.com/book/10.1007%2F978-3-319-97704-1 - book by E. Moulines et al.; you are mostly interested in Chapters 1, 2, 7, and 9 (the book is accessible for download through the HSE network)
- https://link.springer.com/book/10.1007%2F978-3-319-62226-2 - Stochastic Calculus by P. Baldi, a good overview of conditional probabilities and expectations (Part 4; also accessible through the HSE network)
- https://link.springer.com/book/10.1007%2F978-1-4419-9634-3 - Probability for Statistics and Machine Learning by A. Dasgupta, Chapter 19 (MCMC); also accessible through the HSE network
Recommended literature (2nd term)
- https://link.springer.com/book/10.1007%2F978-1-4419-9634-3 - Probability for Statistics and Machine Learning by A. Dasgupta, Chapters 12 and 14
- https://link.springer.com/book/10.1007/978-3-540-68829-7 - Theory of Probability and Random Processes by L. Koralov and Y. Sinai, Lecture 13 (conditional expectations and martingales)
- https://link.springer.com/book/10.1007%2F978-3-319-62226-2 - Stochastic Calculus by P. Baldi, Chapters 7 and 8 (Denis followed this book);
- http://th.if.uj.edu.pl/~gudowska/dydaktyka/Oksendal.pdf - Stochastic Differential Equations by B. Oksendal (another exposition of stochastic calculus), Chapters 3-5;
- https://web.math.princeton.edu/~rvan/APC550.pdf - Ramon van Handel, Probability in High Dimension, Chapter 2 (I strongly recommend this book for your future course with Q. Paris, it is amazing);
- https://www.springer.com/gp/book/9783319002262 - Bakry, Gentil, Ledoux, a classical book on Markovian semigroups; not very easy to read.