Data analysis (Software Engineering) 2018
Current version as of 05:39, 17 June 2018
Class email: cshse.ml@gmail.com
Anonymous feedback form: here
Scores
Previous Course Page
Course repo
Telegram Group
Course description
In this class we consider the main problems of data mining and machine learning: classification, clustering, regression, dimensionality reduction, ranking, and collaborative filtering. We will also study the mathematical methods and concepts on which data analysis is based, as well as the formal assumptions behind them and various aspects of their implementation.
Significant attention is given to the practical skills of data analysis, which will be developed in seminars through the Python programming language and relevant libraries for scientific computing.
Knowledge of linear algebra, real analysis and probability theory is required.
The class consists of:
- Lectures and seminars
- Practical and theoretical homework assignments
- A machine learning competition (more information will be available later)
- Midterm theoretical colloquium
- Final exam
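The seminars build on Python's scientific stack (NumPy, Pandas, scikit-learn; see the links at the bottom of this page). As a minimal illustrative sketch of the kind of workflow practiced in class — the dataset and parameters here are chosen for illustration only and are not part of any assignment:

```python
# A minimal scikit-learn workflow: load data, split, fit, evaluate.
# Illustrative only; not part of any course assignment.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Toy dataset and a metric method (cf. Lecture 2)
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = KNeighborsClassifier(n_neighbors=5)
clf.fit(X_train, y_train)
print(accuracy_score(y_test, clf.predict(X_test)))
```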
Final Exam
The final exam will be held on the 22nd of June at 10:30 in room 301.
The list of questions is available here.
Colloquium
Colloquium will be held on the 6th of April.
You may not use any materials during the colloquium except a single A4 sheet prepared before the exam and handwritten personally by you (both sides). You will receive 2 questions from the question list with 20 minutes for preparation, and you may be asked additional questions or tasks.
Kaggle
You should send presentations before June 3, 23:59. Presentations should be uploaded here. On the title page of the presentation you should list all team participants.
Presentations should be in pdf or ppt format and include all components listed in the competition rules. Code in py or ipynb format should also be attached to the email (it may consist of several files).
Lecture materials
Lecture 1. Introduction to data science and machine learning.
Slides
Additional materials: 1, 2
Lecture 2. Metric methods.
Slides
Lecture 3. Decision trees.
Slides
Lecture 4. Linear regression. Gradient descent.
Slides
Lecture 5. Linear classification. Logistic Regression
Slides
Lecture 6. Supervised learning quality measures
Slides
Lecture 7. Support Vector Machines. Kernel Trick
Slides
Lecture 8. Dim. Reduction. PCA, t-SNE
Slides
Lecture 9. Ensembles. Bagging, Stacking, Blending
Slides
Lecture 10. Ensembles. Boosting
Slides
Lecture 11. Neural Networks 1
Slides
Lecture 12. Neural Networks 2
Slides
Lecture 13. Clustering
Slides
Lecture 14. Clustering 2
Slides
Lecture 15. Intro to Recsys
Slides
Seminars
Seminar 1. Introduction to Data Analysis in Python
Python Intro, NumPy Tutorial, Pandas Tutorial
Complete the tutorials by the 28th of January 23:59 and submit an archive (with your name on it) to this link
Seminar 2. Metric Methods
Practice in class, partially filled
Theoretical task 1. Due date: February 2 23:59
Seminar 3. Decision Trees
Practice in class, partially filled
Theoretical task 2. Due date: February 10 23:59
Practical task 2, dataset. Due date: February 17 23:59
Seminar 4. Linear regression
Practice in class, partially filled
Theoretical task 3. Due date: February 17 23:59
Seminar 5. Linear classification
Practice in class, partially filled
Theoretical task 4. Due date: February 25 23:59
Practical task 3, first_dataset, diabetes. Due date: March 11 23:59
Seminar 6. Quality measures
Practice in class, partially filled
Theoretical task 5. Due date: March 11 23:59
Seminar 7. SVM
Practice in class, partially filled
Theoretical task 6. Due date: March 25 23:59
Seminar 8.1. Feature selection
Practice in class, partially filled
No theoretical task
Seminar 8.2. PCA, t-SNE
Practice in class, partially filled
Theoretical task 7. Due date: April 23 23:59
Seminar 9. Ensembles. Bagging, Stacking
Practice in class, partially filled
Practical task 4. Due date: April 29 23:59. Link to load solution
Seminar 10. Ensembles. Boosting
Practice in class, partially filled
Theoretical task 8. Due date: May 13 23:59
Practical task 5. Due date: May 20 23:59. Data file
Seminar 11. Neural Networks 1
Practice in class, partially filled
Theoretical task 9. Due date: May 20 23:59
Seminar 12. Neural Networks 2
Practice in class
Seminar 13. Clustering
Practice in class, partially filled
Theoretical task 10. Due date: June 12 23:59
Seminar 14. Kaggle Presentations
Seminar 15. Intro to recsys
Practice in class, partially filled
Evaluation criteria
The course lasts during the 3rd and 4th modules. Students' knowledge is assessed through home assignments and exams. Home assignments are divided into theoretical and practical tasks. There are two exams during the course – after the 3rd module and after the 4th module respectively. Each exam evaluates theoretical knowledge and understanding of the material studied during the respective module.
The grade takes values 4, 5, …, 10. Grades 1, 2, and 3 are considered unsatisfactory. The exact grade is calculated using the following rule:
- score ≥ 35% => 4,
- score ≥ 45% => 5,
- ...
- score ≥ 95% => 10,
where score is calculated using the following rule:
score = 0.7 * S_cumulative + 0.3 * S_exam2
S_cumulative = 0.8 * S_homework + 0.2 * S_exam1 + 0.2 * S_competition
- S_homework – proportion of correctly solved homework,
- S_exam1 – proportion of successfully answered theoretical questions during the exam after module 3,
- S_exam2 – proportion of successfully answered theoretical questions during the exam after module 4,
- S_competition – score for the competition in machine learning (it's also from 0 to 1).
Participation in machine learning competition is optional and can give students extra points.
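Putting the scoring rules above together, here is a minimal Python sketch of the grade computation. The 10%-wide score bands between the stated 35% and 95% thresholds are an assumption filling in the "…" in the rule above:

```python
def final_grade(s_homework, s_exam1, s_exam2, s_competition=0.0):
    """Compute the course grade from component scores, each in [0, 1].

    The 10%-wide bands between the stated 35% and 95% thresholds are an
    assumption; only the endpoints are given explicitly in the rules.
    """
    # Competition score is optional extra credit, hence the weights sum above 1.
    s_cumulative = 0.8 * s_homework + 0.2 * s_exam1 + 0.2 * s_competition
    score = 0.7 * s_cumulative + 0.3 * s_exam2
    # Map the score to the 4..10 grade scale, highest band first.
    for grade, threshold in zip(range(10, 3, -1),
                                (0.95, 0.85, 0.75, 0.65, 0.55, 0.45, 0.35)):
        if score >= threshold:
            return grade
    return 3  # unsatisfactory band


# Example: strong homework, average exams, no competition -> grade 8
print(final_grade(0.9, 0.6, 0.7))
```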
Automatic passing of the course based on the cumulative score may be granted.
Plagiarism
If plagiarism is discovered, zero points will be given for the home assignment – for both works found to be identical. In case of repeated plagiarism by the same person, a report will be made to the dean's office.
Deadlines
Assignments sent after the late deadline will not be scored (they receive a zero score) in the absence of legitimate reasons for late submission, which do not include a high load in other classes.
Structure of emails and homework submissions
Practical assignments must be implemented in Jupyter notebook format, theoretical ones in pdf. Practical assignments must use Python 2 (or Python 2 compatible code). Use your surname as the filename for assignments (e.g. Ivanov.ipynb). Do not archive your assignments.
Assignments can be performed in either Russian or English.
Assignments can be submitted only once!
Useful links
Machine learning
- Machine learning course from Evgeny Sokolov on Github
- machinelearning.ru
- Video-lectures of K. Vorontsov on machine learning
- One of the classic ML books: Elements of Statistical Learning (Trevor Hastie, Robert Tibshirani, Jerome Friedman)
Python
- Official website
- Libraries: NumPy, Pandas, SciKit-Learn, Matplotlib.
- For beginners: a short guide with Python 2 examples
- Python from scratch: A Crash Course in Python for Scientists
- Lectures on Scientific Python
- A book: Wes McKinney «Python for Data Analysis»
- A collection of interesting IPython notebooks