Icef-dse-2024-fall
Version of 10:16, 6 September 2024
General course info
Fall grade = 0.2 Small HAs + 0.2 Group project + 0.3 Midterm + 0.3 Final
We expect three practice HAs and three theory HAs.
Lecturer: Boris Demeshev
Class teachers: Yana Khassan, Shuana Pirbudagova
Lecture video recordings
Telegram group
Log Book or Tentative Plan
2024-09-05, lecture 1: Entropy, conditional entropy, joint entropy, mutual information, cross-entropy.
- Christopher Olah, Visual Information Theory, https://colah.github.io/posts/2015-09-Visual-Information/
- Grant Sanderson, Solving Wordle using information theory, https://www.youtube.com/watch?v=v68zYyaEmEA
- Artem Kirsanov, Key equation behind probability, https://www.youtube.com/watch?v=KHVR587oW8I. Be careful: Artem uses the notation H(P, Q) for cross-entropy (we use CE(P||Q)).
- Notes (in Russian) from an analogous lecture at the FCS: https://exuberant-arthropod-be8.notion.site/1-02-09-5e107ea1c4054594b8f37d955db8a2b0
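The lecture-1 quantities above can be checked numerically for small discrete distributions. A minimal sketch (not course material, just an illustration using the course's CE(P||Q) notation; the distributions p and q are made-up examples):

```python
import math

def entropy(p):
    """Shannon entropy H(P) in bits: -sum_x P(x) log2 P(x)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross-entropy CE(P||Q) = -sum_x P(x) log2 Q(x).
    Note: Kirsanov's video writes this as H(P, Q)."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """KL(P||Q) = CE(P||Q) - H(P); always non-negative."""
    return cross_entropy(p, q) - entropy(p)

p = [0.5, 0.25, 0.25]   # example source distribution
q = [1/3, 1/3, 1/3]     # uniform model distribution
print(entropy(p))           # 1.5 bits
print(cross_entropy(p, q))  # log2(3) ≈ 1.585 bits
print(kl_divergence(p, q))  # ≈ 0.085 bits, the extra cost of coding P with Q
```

Against the uniform q, the cross-entropy is always log2(3) regardless of p, and CE(P||P) collapses to H(P), so KL(P||P) = 0.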