Introduction to Machine Learning and Data Mining

Version as of 15:45, 30 May 2019

Lecturer: Dmitry Ignatov

TAs: Ivan Zaputliaev (Modules 3 and 4), Alexander Korabelnikov (Module 4).


Homeworks

Homework 1: Spam classification.

Soft deadline (up to 10 points): March 19 (extended from March 9)

Hard deadline (-2 points): March 25 (extended from March 15)

Lecture on 23.01.2019

Intro slides.

Practice: demonstration with Orange.

Lecture on 06.02.2019

Slides: Introduction to classification techniques (1-rule, kNN, Naive Bayes, Logistic Regression).

Practice: demonstration with Orange and scikit-learn.

Lecture on 22.02.2019

Practice with scikit-learn (kNN, Naive Bayes, Logistic Regression, basic quality metrics, cross-validation, error plots)
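A minimal sketch of the cross-validation idea from the practice session, here evaluating a 1-nearest-neighbour classifier by hand on toy 1-D data (the data and fold count are invented for illustration; in the seminar the same loop is done with scikit-learn's built-in utilities):

```python
# Minimal k-fold cross-validation of a 1-NN classifier (toy example).
def k_fold_indices(n, k):
    """Split indices 0..n-1 into k consecutive folds."""
    fold_size = n // k
    return [list(range(i * fold_size, (i + 1) * fold_size)) for i in range(k)]

def one_nn_predict(train_x, train_y, x):
    """Label of the closest training point."""
    best = min(range(len(train_x)), key=lambda i: abs(train_x[i] - x))
    return train_y[best]

def cross_val_accuracy(xs, ys, k=5):
    """Average held-out accuracy over k folds."""
    accs = []
    for fold in k_fold_indices(len(xs), k):
        train_x = [x for i, x in enumerate(xs) if i not in fold]
        train_y = [y for i, y in enumerate(ys) if i not in fold]
        correct = sum(one_nn_predict(train_x, train_y, xs[i]) == ys[i] for i in fold)
        accs.append(correct / len(fold))
    return sum(accs) / len(accs)

# Two well-separated 1-D clusters: 1-NN classifies every held-out point correctly.
xs = [0.0, 0.1, 0.2, 0.3, 0.4, 5.0, 5.1, 5.2, 5.3, 5.4]
ys = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
print(cross_val_accuracy(xs, ys))  # 1.0
```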

Slides: Decision trees. Entropy and information gain. ID3 algorithm. Gini impurity. Tree pruning.
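The entropy and information-gain computations behind ID3 can be sketched in a few lines; the counts below are the textbook play-tennis example (9 positive / 5 negative examples, split by humidity), chosen here just to have a checkable number:

```python
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    probs = [labels.count(c) / n for c in set(labels)]
    return -sum(p * log2(p) for p in probs)

def information_gain(labels, split):
    """Entropy reduction when `labels` is partitioned into the lists in `split`."""
    n = len(labels)
    return entropy(labels) - sum(len(part) / n * entropy(part) for part in split)

# 9 positive / 5 negative examples, split by an attribute into (3+,4-) and (6+,1-).
labels = [1] * 9 + [0] * 5
split = [[1] * 3 + [0] * 4, [1] * 6 + [0] * 1]
print(round(information_gain(labels, split), 3))  # 0.152
```

ID3 simply evaluates this gain for every available attribute and splits on the largest one.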

Lecture on 06.03.2019

Slides: 1. Clustering. K-means, k-medoids, fuzzy c-means. The number of clusters problem and related heuristics. Hierarchical clustering. Density-based clustering: DBscan and Mean-shift.
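K-means (Lloyd's algorithm) alternates an assignment step and a centroid-update step; a minimal sketch on made-up 2-D data, with initial centroids picked by hand for the illustration:

```python
def kmeans(points, centroids, iters=10):
    """Lloyd's algorithm for 2-D points; `centroids` is the initial guess."""
    for _ in range(iters):
        # Assignment step: index of the nearest centroid for each point.
        assign = [min(range(len(centroids)),
                      key=lambda c: (p[0] - centroids[c][0]) ** 2 +
                                    (p[1] - centroids[c][1]) ** 2)
                  for p in points]
        # Update step: each centroid moves to the mean of its assigned points.
        for c in range(len(centroids)):
            members = [p for p, a in zip(points, assign) if a == c]
            if members:
                centroids[c] = (sum(p[0] for p in members) / len(members),
                                sum(p[1] for p in members) / len(members))
    return assign, centroids

# Two obvious blobs; initial centroids chosen by hand for this sketch.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
assign, cents = kmeans(pts, [(0.0, 0.0), (10.0, 10.0)])
print(assign)  # [0, 0, 0, 1, 1, 1]
```

In practice the initialization and the choice of k are the hard parts, which is what the heuristics in the slides address.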

2. Spectral Clustering for graph partition. Min-cut, Laplace matrix, Fiedler vector. Bipartite spectral clustering.

Lecture on 20.03.2019

Frequent itemsets and association rules. Apriori and FP-growth algorithms. Interestingness measures. Compact representations of frequent itemsets: closed itemsets and association rules. Applications: taxonomies of web-site users and contextual advertisement.
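The level-wise idea of Apriori can be sketched in pure Python: count candidates of size k, keep the frequent ones, and generate size-(k+1) candidates only from frequent itemsets (the Apriori property). The baskets and threshold below are invented for illustration:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return all itemsets whose support (fraction of transactions) >= min_support."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    frequent, size = {}, 1
    candidates = [frozenset([i]) for i in items]
    while candidates:
        # Count support of each candidate in one pass over the data.
        counts = {c: sum(c <= t for t in transactions) for c in candidates}
        level = {c: counts[c] / n for c in candidates if counts[c] / n >= min_support}
        frequent.update(level)
        # Apriori property: every subset of a frequent itemset must be frequent.
        size += 1
        candidates = list({a | b for a in level for b in level
                           if len(a | b) == size
                           and all(frozenset(s) in level
                                   for s in combinations(a | b, size - 1))})
    return frequent

baskets = [frozenset(t) for t in
           [{'bread', 'milk'}, {'bread', 'butter'}, {'bread', 'milk', 'butter'},
            {'milk', 'butter'}, {'bread', 'milk'}]]
freq = apriori(baskets, min_support=0.6)
print(freq[frozenset({'bread', 'milk'})])  # 0.6
```

FP-growth reaches the same frequent itemsets without candidate generation, by compressing the transactions into an FP-tree.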

Lecture on 15.05.2019

Linear regression (simple and multivariate regression), RMS solution, gradient-descent solution, logistic regression, multilayer neural nets, the chain rule, cross-entropy as a loss function, introduction to convolutional neural nets (convolution, pooling).
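The gradient-descent solution for simple linear regression can be sketched in a few lines: repeatedly step the parameters against the gradient of the mean squared error. The data, learning rate, and step count below are made up for the illustration:

```python
def fit_linear(xs, ys, lr=0.05, steps=2000):
    """Fit y ~ w*x + b by gradient descent on mean squared error."""
    w = b = 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of MSE = (1/n) * sum((w*x + b - y)^2) w.r.t. w and b.
        errs = [w * x + b - y for x, y in zip(xs, ys)]
        w -= lr * (2 / n) * sum(e * x for e, x in zip(errs, xs))
        b -= lr * (2 / n) * sum(errs)
    return w, b

# Noise-free line y = 2x + 1: gradient descent recovers the coefficients.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]
w, b = fit_linear(xs, ys)
print(round(w, 2), round(b, 2))  # 2.0 1.0
```

Backpropagation in a multilayer net is the same idea, with the chain rule supplying the gradients layer by layer.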

Applications: general purpose regression and classification tasks, computer vision.

Lecture on 29.05.2019

ConvNet regularization (L1/L2 weight decay, soft labels, early stopping), ConvNet debugging (monitoring metrics and tuning the learning rate, a debugging checklist), image augmentations, advanced tips & tricks (pseudo-labeling, test-time augmentation, pretraining, ensembles of nets trained with SGD), common image-specific problems (semantic and instance segmentation, detection, identification; their metrics: IoU, mAP).
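The IoU metric mentioned above is just intersection area over union area; a minimal sketch for axis-aligned boxes, with the example boxes invented so the result is easy to check:

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix = max(0.0, min(box_a[2], box_b[2]) - max(box_a[0], box_b[0]))
    iy = max(0.0, min(box_a[3], box_b[3]) - max(box_a[1], box_b[1]))
    inter = ix * iy
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# Two 2x2 boxes overlapping in a 1x1 square: IoU = 1 / (4 + 4 - 1) = 1/7.
print(round(iou((0, 0, 2, 2), (1, 1, 3, 3)), 4))  # 0.1429
```

mAP builds on this: a detection counts as correct when its IoU with a ground-truth box exceeds a threshold, and precision is then averaged over recall levels and classes.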

Applications: computer vision.