Intro to DL Blended
Course program:
https://www.hse.ru/data/2018/06/05/1150113338/program-2129241367-JndYcQjSAq.pdf
Grading:
Cumulative grade = 80% online course + 20% additional project
Final grade = 75% cumulative grade + 25% final exam
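As a quick illustration of how the weights combine (the scores below are made up for the example and assume the usual 10-point scale, which is not stated on this page):

Cumulative grade = 0.80 * 9 + 0.20 * 10 = 7.2 + 2.0 = 9.2
Final grade = 0.75 * 9.2 + 0.25 * 8 = 6.9 + 2.0 = 8.9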
Additional project:
Homework with Kaggle competition
Exam:
Written exam with theoretical questions, for instance:
- SGD variations: Momentum, RMSProp, Adam, with explanations (update rules sketched below the list)
- Description of backprop and proof of its efficiency
- Gradient of a dense layer in matrix notation (with proof)
- Typical CNN architecture, purpose of each layer, how to do backprop
- Inception V3 architecture choices
- Gradient of RNN cell (with proof)
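For the first item above, a minimal sketch of the standard update rules in their common textbook form (the notation and hyperparameter symbols here are my own choice, not taken from the course materials); g_t = \nabla_\theta L(\theta_{t-1}) denotes the current gradient:

Momentum:  v_t = \gamma v_{t-1} + \eta g_t,\quad \theta_t = \theta_{t-1} - v_t
RMSProp:   s_t = \rho s_{t-1} + (1-\rho) g_t^2,\quad \theta_t = \theta_{t-1} - \eta\, g_t / (\sqrt{s_t} + \epsilon)
Adam:      m_t = \beta_1 m_{t-1} + (1-\beta_1) g_t,\quad v_t = \beta_2 v_{t-1} + (1-\beta_2) g_t^2,
           \hat{m}_t = m_t / (1-\beta_1^t),\quad \hat{v}_t = v_t / (1-\beta_2^t),
           \theta_t = \theta_{t-1} - \eta\, \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)

At the exam the explanation should also cover why each modification helps (Momentum damps oscillations along steep directions, RMSProp normalizes per-parameter step sizes, Adam combines both with bias correction of the moment estimates).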