Icef-dse-2024-fall
Version of 10:15, 6 September 2024
General course info
Fall grade = 0.2 Small HAs + 0.2 Group project + 0.3 Midterm + 0.3 Final
We expect 3 practice HAs and 3 theory HAs.
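The grading formula above is a weighted average; as a minimal sketch (the component scores below are hypothetical, only the weights come from the course page):

```python
# Hypothetical component scores on a 0-10 scale; weights are from the course grading formula.
small_has = 8.0   # average of small homework assignments (assumed score)
project = 9.0     # group project (assumed score)
midterm = 7.0     # midterm exam (assumed score)
final = 6.0       # final exam (assumed score)

fall_grade = 0.2 * small_has + 0.2 * project + 0.3 * midterm + 0.3 * final
print(fall_grade)  # 7.3
```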
Lecturer: Boris Demeshev
Class teachers: Yana Khassan, Shuana Pirbudagova
Lecture video recordings
Telegram group
Log Book or Tentative Plan
2024-09-05, lecture 1: Entropy, conditional entropy, joint entropy, mutual information, cross-entropy.
Christopher Olah, Visual Information Theory, https://colah.github.io/posts/2015-09-Visual-Information/
Grant Sanderson, Solving Wordle using information theory, https://www.youtube.com/watch?v=v68zYyaEmEA
Artem Kirsanov, Key equation behind probability, https://www.youtube.com/watch?v=KHVR587oW8I. Be careful: Artem uses the notation H(P, Q) for cross-entropy (we use CE(P||Q)).
Notes of a similar lecture at the HSE Faculty of Computer Science (in Russian): https://exuberant-arthropod-be8.notion.site/1-02-09-5e107ea1c4054594b8f37d955db8a2b0
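The lecture 1 quantities can be sketched in a few lines of Python; this is an illustrative toy implementation (the example distributions are made up), not course material:

```python
import math

def entropy(p):
    # Shannon entropy H(P) = -sum_i p_i * log2(p_i), in bits; terms with p_i = 0 are skipped
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # CE(P||Q) = -sum_i p_i * log2(q_i); this is what Kirsanov denotes H(P, Q)
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    # KL(P||Q) = CE(P||Q) - H(P), always non-negative
    return cross_entropy(p, q) - entropy(p)

def mutual_information(joint):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint distribution given as a 2D table
    px = [sum(row) for row in joint]              # marginal of X
    py = [sum(col) for col in zip(*joint)]        # marginal of Y
    flat = [p for row in joint for p in row]      # joint as a flat list for H(X,Y)
    return entropy(px) + entropy(py) - entropy(flat)

p = [0.5, 0.25, 0.25]
q = [0.25, 0.25, 0.5]
print(entropy(p))                                  # 1.5 bits
print(cross_entropy(p, q))                         # 1.75 bits
print(kl_divergence(p, q))                         # 0.25 bits
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0 bit: X determines Y
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0: independent
```

Note the identity used in `kl_divergence`: CE(P||Q) = H(P) + KL(P||Q), which is exactly why the two notations H(P, Q) and CE(P||Q) are worth keeping apart.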