Icef-dse-2024-fall
General course info
Fall grade = 0.2 Small HAs + 0.2 Group project + 0.3 Midterm + 0.3 Final
We expect 3 practice HAs and 3 theory HAs.
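A minimal sketch of how these weights combine; the component scores below are hypothetical, and a 10-point scale is an assumption:

<syntaxhighlight lang="python">
# Fall grade = 0.2 * Small HAs + 0.2 * Group project + 0.3 * Midterm + 0.3 * Final.
# All component scores below are hypothetical.
weights = {"small_has": 0.2, "group_project": 0.2, "midterm": 0.3, "final": 0.3}
scores = {"small_has": 8, "group_project": 7, "midterm": 6, "final": 9}
fall_grade = sum(weights[k] * scores[k] for k in weights)
print(fall_grade)  # 0.2*8 + 0.2*7 + 0.3*6 + 0.3*9 = 7.5
</syntaxhighlight>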
Lecturer: Boris Demeshev
Class teachers: Yana Khassan, Shuana Pirbudagova
Lecture video recordings
Telegram group
Log Book or Tentative Plan
2024-09-05, lecture 1: Entropy, conditional entropy, joint entropy, mutual information, cross-entropy. A numeric sketch of these quantities follows the reading list below.
- Christopher Olah, Visual Information Theory, https://colah.github.io/posts/2015-09-Visual-Information/
- Grant Sanderson, Solving Wordle using information theory, youtube.
- Artem Kirsanov, Key equation behind probability, youtube: https://www.youtube.com/watch?v=KHVR587oW8I. Be careful: Artem uses the notation H(P, Q) for cross-entropy (we use CE(P||Q)).
- Notes (in Russian) of a similar lecture at the HSE Faculty of Computer Science: https://exuberant-arthropod-be8.notion.site/1-02-09-5e107ea1c4054594b8f37d955db8a2b0
- Keith Conrad, Maximal entropy distributions, https://kconrad.math.uconn.edu/blurbs/analysis/entropypost.pdf
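A minimal numpy sketch of the lecture-1 quantities; the 2x2 joint distribution below is made up for illustration and is not the lecture's own example:

<syntaxhighlight lang="python">
import numpy as np

# Joint distribution p(x, y) of two binary random variables X (rows) and Y (columns).
pxy = np.array([[0.25, 0.25],
                [0.10, 0.40]])

def H(p):
    """Shannon entropy in bits of a distribution given as an array of probabilities."""
    p = p[p > 0]                              # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

px, py = pxy.sum(axis=1), pxy.sum(axis=0)     # marginal distributions
H_X, H_Y = H(px), H(py)
H_XY = H(pxy.ravel())                         # joint entropy H(X, Y)
H_Y_given_X = H_XY - H_X                      # chain rule: H(Y|X) = H(X, Y) - H(X)
I_XY = H_X + H_Y - H_XY                       # mutual information I(X; Y)

# Cross-entropy CE(P||Q) = -sum_x p(x) log q(x), here between the two rows of the table.
p, q = pxy[0] / pxy[0].sum(), pxy[1] / pxy[1].sum()
CE = -np.sum(p * np.log2(q))
print(H_X, H_Y, H_XY, H_Y_given_X, I_XY, CE)
</syntaxhighlight>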
2024-09-12, lecture 2: Expected value of log-likelihood is zero. Kullback-Leibler divergence definition. Expected value calculation example. Optimizing long-run profit. Horse betting: optimal bet under private signal. A Kelly-betting sketch follows the reading list below.
- Marcin Anforowicz, Just one more paradox, youtube.
- Wikipedia, Kelly criterion, https://en.wikipedia.org/wiki/Kelly_criterion: a good article.
- Kelly, A new interpretation of information rate: original paper, very well written
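A minimal sketch of Kelly (proportional) betting on a two-horse race, in the standard setup where the whole bankroll is split across the horses every race; the probabilities and odds below are made up:

<syntaxhighlight lang="python">
import numpy as np

# The long-run growth rate of wealth is E[log return]; betting in proportion
# to the win probabilities (f = p, the Kelly bet) maximizes it over sum(f) = 1.
p = np.array([0.6, 0.4])       # bettor's win probabilities (e.g. after a private signal)
odds = np.array([2.0, 2.0])    # payoff per unit bet on each horse

def growth_rate(f):
    """Expected log growth when fraction f[i] of wealth is bet on horse i."""
    return np.sum(p * np.log(f * odds))

for f in [np.array([0.5, 0.5]), p]:   # naive even split vs. the Kelly bet f = p
    print(f, growth_rate(f))
# The Kelly bet attains sum_i p_i * log(p_i * odds_i); with fair odds 1/q_i this
# growth rate equals the Kullback-Leibler divergence KL(p||q).
</syntaxhighlight>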
2024-09-19, lecture 3: Horse betting: optimal bet under signal. Optimal long-term interest rate as entropy difference. How to build a tree? Entropy drop as splitting criterion. Dealing with missing values. How to stop? Tree pruning. A splitting-criterion sketch follows the reference below.
- R2D3, Visual introduction to machine learning: decision tree
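A minimal sketch of the entropy-drop (information-gain) splitting criterion; the labels and the candidate split are made up:

<syntaxhighlight lang="python">
import numpy as np

def entropy(y):
    """Entropy in bits of a vector of class labels."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

y = np.array([0, 0, 0, 1, 1, 1, 1, 1])   # labels at the current node
left, right = y[:3], y[3:]                # a candidate split of the node

n, nl, nr = len(y), len(left), len(right)
gain = entropy(y) - (nl / n) * entropy(left) - (nr / n) * entropy(right)
print(gain)  # this split separates the classes perfectly, so the gain equals entropy(y)
</syntaxhighlight>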
2024-09-26, lecture 4: Random forest. A bagging sketch follows the reference below.
- R2D3, Visual introduction to machine learning-2: bias-variance tradeoff and many trees
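A minimal sketch of the random-forest idea, bagging many trees grown on bootstrap samples with a random feature subset per split; the data are synthetic, and sklearn's built-in RandomForestClassifier implements the same scheme with more options:

<syntaxhighlight lang="python">
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # synthetic labels

trees, n_trees = [], 50
for _ in range(n_trees):
    idx = rng.integers(0, len(X), size=len(X))          # bootstrap sample of the rows
    tree = DecisionTreeClassifier(max_features="sqrt")  # random feature subset per split
    trees.append(tree.fit(X[idx], y[idx]))

votes = np.mean([t.predict(X) for t in trees], axis=0)  # averaging reduces variance
forest_pred = (votes > 0.5).astype(int)
print("train accuracy:", (forest_pred == y).mean())
</syntaxhighlight>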
2024-10-03, lecture 5: Bootstrap: naive bootstrap, t-stat bootstrap, bootstrap in bootstrap. A resampling sketch follows the reference below.
- Tim Hesterberg, What teachers should know about the bootstrap.
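A minimal sketch of the naive (percentile) bootstrap and the bootstrap-t interval for a sample mean, on synthetic data; the double ("bootstrap in bootstrap") scheme is not shown:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100)   # the original (synthetic) sample
B = 10_000                                 # number of bootstrap resamples

# Naive bootstrap: resample the mean and take percentiles of its distribution.
boot_means = np.array([rng.choice(x, size=len(x), replace=True).mean()
                       for _ in range(B)])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"naive 95% CI: [{lo:.3f}, {hi:.3f}]")

# Bootstrap-t: resample the t-statistic (xbar* - xbar) / se* instead of the mean.
xbar, se = x.mean(), x.std(ddof=1) / np.sqrt(len(x))
t_stars = []
for _ in range(B):
    xs = rng.choice(x, size=len(x), replace=True)
    t_stars.append((xs.mean() - xbar) / (xs.std(ddof=1) / np.sqrt(len(xs))))
q_lo, q_hi = np.percentile(t_stars, [2.5, 97.5])
print(f"bootstrap-t 95% CI: [{xbar - q_hi * se:.3f}, {xbar - q_lo * se:.3f}]")
</syntaxhighlight>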
2024-10-10, lecture 6: Gradient boosting for regression. Residual vector as minus gradient. Properties of logistic function.
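A minimal sketch of gradient boosting for regression under squared loss: each new shallow tree is fit to the current residuals, which are exactly minus the gradient of L = 0.5 * sum((y - F)^2) with respect to the predictions F. The data are synthetic and sklearn stumps stand in for the base learners; the logistic-function properties from the lecture (e.g. s'(x) = s(x)(1 - s(x))) only matter once the same scheme is applied to classification, which is not covered here.

<syntaxhighlight lang="python">
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=300)   # synthetic regression data

F = np.full_like(y, y.mean())        # start from the best constant prediction
learning_rate, n_rounds = 0.1, 100
for _ in range(n_rounds):
    residuals = y - F                # minus gradient of 0.5 * sum((y - F)**2) w.r.t. F
    stump = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    F += learning_rate * stump.predict(X)

print("in-sample MSE after boosting:", np.mean((y - F) ** 2))
</syntaxhighlight>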