Intro to DL Blended

Course program:

https://www.hse.ru/data/2018/06/05/1150113338/program-2129241367-JndYcQjSAq.pdf

Grading:

Cumulative grade = 80% online course + 20% additional project

Final grade = 75% cumulative grade + 25% final exam
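
A minimal sketch of how these weights combine (the 10-point scale and the example scores below are assumptions for illustration, not part of the official grading rules):

  def cumulative_grade(online_course, additional_project):
      # Cumulative grade = 80% online course + 20% additional project
      return 0.8 * online_course + 0.2 * additional_project

  def final_grade(online_course, additional_project, final_exam):
      # Final grade = 75% cumulative grade + 25% final exam
      cumulative = cumulative_grade(online_course, additional_project)
      return 0.75 * cumulative + 0.25 * final_exam

  # Example (assumed 10-point scale): online course 8, project 10, exam 6
  # cumulative = 0.8*8 + 0.2*10 = 8.4; final = 0.75*8.4 + 0.25*6 = 7.8
  print(round(final_grade(8, 10, 6), 2))  # 7.8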

Additional project:

Homework with Kaggle competition: https://docs.google.com/document/d/1kTMYq21UFqZOqftjKAPq8G7RRkO7kX3MomsVIVhW830/edit?usp=sharing

Release date: 18-02-2020 00:00

Deadline: 03-03-2020 00:00

Exam:

Written exam with theoretical questions, for example:

  1. SGD variations: Momentum, RMSProp, Adam with explanation (see the sketch after this list)
  2. Description of backprop and proof of its efficiency (linear running time)
  3. Gradient of a dense layer in matrix notation (with proof)
  4. Typical CNN architecture, purpose of each layer, how to do backprop
  5. Inception V3 architecture choices
  6. Description of auto-encoder, application to images
  7. Gradient of RNN cell (with proof)
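
A minimal sketch of the update rules behind question 1, referenced in the list above (textbook formulations in NumPy; the hyperparameter defaults are illustrative, not taken from the course materials):

  import numpy as np

  def sgd_momentum(w, grad, v, lr=0.01, beta=0.9):
      # One common momentum formulation: v_t = beta*v_{t-1} + grad; w_t = w_{t-1} - lr*v_t
      v = beta * v + grad
      return w - lr * v, v

  def rmsprop(w, grad, s, lr=0.01, beta=0.9, eps=1e-8):
      # Running average of squared gradients scales the step per parameter
      s = beta * s + (1 - beta) * grad ** 2
      return w - lr * grad / np.sqrt(s + eps), s

  def adam(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
      # First and second moment estimates with bias correction (t is the 1-based step count)
      m = beta1 * m + (1 - beta1) * grad
      v = beta2 * v + (1 - beta2) * grad ** 2
      m_hat = m / (1 - beta1 ** t)
      v_hat = v / (1 - beta2 ** t)
      return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

All three keep running per-parameter statistics; Adam combines the momentum-style first moment with the RMSProp-style second moment and adds bias correction.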