Intro to Data Mining and Machine Learning 2020 2021
Lecturer: Dmitry Ignatov
TA: Stefan Nikolić
Homeworks
- Homework 1: Spectral Clustering
- Homework 2:
- Homework 3: Recommender Systems
Lecture on 16 January 2020
Intro slides. Course plan. Assessment criteria. ML&DM libraries. What to read and watch?
Practice: demonstration with Orange.
Lecture on 26 January 2020
Classification (continued). Quality metrics. ROC curves.
Practice: demonstration with Orange.
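As a reference alongside the Orange demonstration, here is a minimal scikit-learn sketch of the quality metrics and ROC curve from this lecture; the synthetic dataset and the logistic-regression classifier are illustrative assumptions, not part of the course materials.
# Hedged sketch: ROC curve and AUC for a simple classifier on synthetic data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score, accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]          # probability of the positive class

fpr, tpr, thresholds = roc_curve(y_te, scores)  # points of the ROC curve
print("Accuracy:", accuracy_score(y_te, clf.predict(X_te)))
print("ROC AUC :", roc_auc_score(y_te, scores))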
Lecture on 2 February 2021
Introduction to Clustering. Taxonomy of clustering methods. K-means. K-medoids. Fuzzy C-means. Types of distance metrics. Hierarchical clustering. DBScan.
Practice: DBScan Demo.
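A minimal sketch of what such a DBScan demo might look like in Python, assuming scikit-learn's DBSCAN and the two-moons toy dataset (both are illustrative choices).
# Hedged sketch of a DBSCAN demo on the two-moons dataset.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_moons

X, _ = make_moons(n_samples=300, noise=0.05, random_state=0)

db = DBSCAN(eps=0.2, min_samples=5).fit(X)      # eps = neighbourhood radius
labels = db.labels_                             # -1 marks noise points
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print("Clusters found:", n_clusters, "| noise points:", np.sum(labels == -1))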
Lecture on 9 February 2020
Introduction to Clustering (continued). Density-based techniques. DBScan and Mean-shift.
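For the mean-shift part, a hedged scikit-learn sketch on synthetic blobs; the dataset and the bandwidth estimation are illustrative assumptions.
# Hedged sketch: mean-shift clustering with a bandwidth estimated from the data.
from sklearn.cluster import MeanShift, estimate_bandwidth
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.7, random_state=0)
bandwidth = estimate_bandwidth(X, quantile=0.2)  # kernel width controls cluster granularity
ms = MeanShift(bandwidth=bandwidth).fit(X)
print("Cluster centres:\n", ms.cluster_centers_)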
Practice on 16 February 2020
Clustering with scikit-learn (k-means, hierarchical clustering, DBScan, MeanShift, Spectral Clustering).
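A compact sketch of how the listed scikit-learn clusterers can be run side by side on one synthetic dataset; the dataset, parameter values, and the adjusted Rand index used for comparison are assumptions for illustration.
# Hedged sketch: several scikit-learn clusterers on the same synthetic data.
from sklearn.cluster import KMeans, AgglomerativeClustering, DBSCAN, MeanShift, SpectralClustering
from sklearn.datasets import make_blobs
from sklearn.metrics import adjusted_rand_score

X, y_true = make_blobs(n_samples=300, centers=4, random_state=42)

models = {
    "k-means": KMeans(n_clusters=4, n_init=10, random_state=42),
    "hierarchical": AgglomerativeClustering(n_clusters=4),
    "DBSCAN": DBSCAN(eps=1.0, min_samples=5),
    "MeanShift": MeanShift(),
    "spectral": SpectralClustering(n_clusters=4, random_state=42),
}
for name, model in models.items():
    labels = model.fit_predict(X)
    print(f"{name:12s} ARI = {adjusted_rand_score(y_true, labels):.2f}")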
Lecture on 6 February 2020
Graph and spectral clustering. Min-cuts and normalized cuts. Laplacian matrix. Fiedler vector. Applications.
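To make the Laplacian matrix and the Fiedler vector concrete, a small NumPy sketch on a toy graph of two triangles joined by a single edge (the graph itself is an illustrative assumption).
# Hedged sketch: graph Laplacian and Fiedler vector for a toy graph.
import numpy as np

# Adjacency matrix: nodes 0-2 and 3-5 form triangles, joined by the edge (2, 3).
A = np.zeros((6, 6))
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
for i, j in edges:
    A[i, j] = A[j, i] = 1

D = np.diag(A.sum(axis=1))        # degree matrix
L = D - A                         # unnormalized graph Laplacian

eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]           # eigenvector of the second-smallest eigenvalue
print("Fiedler vector:", np.round(fiedler, 2))
print("Partition by sign:", (fiedler > 0).astype(int))  # separates the two triangles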
Practice on 13 February 2020
Clustering with scikit-learn (k-means, hierarchical clustering, DBScan, MeanShift, Spectral Clustering), continued. Parameter tuning and evaluation of the results.
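A hedged sketch of one way such tuning and evaluation can look: sweeping DBSCAN's eps on a toy dataset and scoring each run with the silhouette coefficient (the dataset and parameter grid are assumptions).
# Hedged sketch: parameter sweep for DBSCAN's eps, scored with the silhouette coefficient.
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_moons
from sklearn.metrics import silhouette_score

X, _ = make_moons(n_samples=300, noise=0.05, random_state=0)

for eps in (0.1, 0.2, 0.3, 0.4):
    labels = DBSCAN(eps=eps, min_samples=5).fit_predict(X)
    n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
    if n_clusters >= 2:                                   # silhouette needs at least 2 clusters
        print(f"eps={eps:.1f}: {n_clusters} clusters, silhouette={silhouette_score(X, labels):.2f}")
    else:
        print(f"eps={eps:.1f}: {n_clusters} cluster(s), silhouette undefined")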