Optimization methods
Current version as of 13:55, 9 April 2021
About the course
The course gives a comprehensive foundation for the theory, methods, and algorithms of mathematical optimization. The prerequisites are linear algebra and calculus. [https://www.hse.ru/ba/data/courses/376703665.html Official HSE course page]
Lecturer: [https://www.hse.ru/org/persons/435958676 Oleg Khamisov]
The course will cover:
1. One-dimensional optimization: unimodal functions, convex and quasiconvex functions, zero- and first-order methods, local and global minima.
2. Existence of solutions: continuous and lower semicontinuous functions, coercive functions, the Weierstrass theorem, unique and nonunique solutions.
3. Linear optimization: primal and dual linear optimization problems, the simplex method, interior-point methods, post-optimal analysis.
4. Theory of optimality conditions: the Fermat principle, the Hessian matrix, positive and negative semidefinite matrices, the Lagrange function and Lagrange multipliers, the Karush-Kuhn-Tucker conditions, regularity, complementarity constraints, stationary points.
5. First-order optimization methods: the steepest descent method, conjugate directions, gradient-based methods.
6. Second-order optimization methods: Newton's method and its modifications, trust-region methods.
7. Convex optimization: optimality conditions, duality, subgradients and the subdifferential, cutting planes and bundle methods, the complexity of convex optimization.
8. Decomposition: Dantzig-Wolfe decomposition, Benders decomposition, distributed optimization.
9. Conic programming: conic quadratic programming, semidefinite programming, interior-point polynomial-time methods for conic programming.
10. Nonconvex optimization: weakly convex and d.c. functions, convex envelopes and underestimators, the branch-and-bound technique.
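As an illustration of topic 1, a zero-order method for a unimodal function can be sketched with golden-section search. This is a generic sketch, not course material; the test function and interval are made up for the example:

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden-section search.

    Each iteration shrinks the bracket by the factor 1/phi ~ 0.618,
    reusing one interior function value, so only one new evaluation
    of f is needed per step.
    """
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi
    x1 = b - inv_phi * (b - a)
    x2 = a + inv_phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:
            # minimum lies in [a, x2]; old x1 becomes the new x2
            b, x2, f2 = x2, x1, f1
            x1 = b - inv_phi * (b - a)
            f1 = f(x1)
        else:
            # minimum lies in [x1, b]; old x2 becomes the new x1
            a, x1, f1 = x1, x2, f2
            x2 = a + inv_phi * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# example: the unimodal function (x - 2)^2 on [0, 5]
x_star = golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Unlike first-order methods, this uses only function values, which is what "zero-order" refers to in the topic list.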
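For topic 5, the steepest descent method with a fixed step size can be sketched as follows (a minimal illustration; the quadratic test function and step size are chosen for the example, and a real implementation would use a line search):

```python
def steepest_descent(grad_f, x0, step=0.1, tol=1e-8, max_iter=10000):
    """Minimize a smooth function by moving against its gradient.

    Iterates x <- x - step * grad_f(x) until the gradient norm
    falls below tol or max_iter is reached.
    """
    x = list(x0)
    for _ in range(max_iter):
        g = grad_f(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# example: f(x, y) = (x - 1)^2 + 2*(y + 3)^2, minimized at (1, -3)
grad = lambda v: [2 * (v[0] - 1), 4 * (v[1] + 3)]
x_min = steepest_descent(grad, [0.0, 0.0])
```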
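For topic 6, a one-dimensional version of Newton's method shows the role of second derivatives (again just a sketch with a made-up test function, here f(x) = x - ln x, whose minimizer is x = 1):

```python
def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=50):
    """Newton's method for 1-D minimization: x <- x - f'(x) / f''(x).

    Near a minimizer with f'' > 0 the iteration converges
    quadratically, in contrast to the linear rate of gradient descent.
    """
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# example: f(x) = x - ln(x) on x > 0, so f'(x) = 1 - 1/x, f''(x) = 1/x^2
x_opt = newton_minimize(lambda x: 1 - 1 / x, lambda x: 1 / x ** 2, 0.5)
```

The modifications mentioned in the topic list (damping, Hessian corrections, trust regions) address the cases where the plain iteration above diverges or the second derivative is not positive.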
Grading system
Grade = 0.6 · Control assignments + 0.4 · Exam
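The weighted sum above works out as in this small example (the scores are invented for illustration):

```python
def final_grade(control, exam):
    """Course grade as the weighted sum stated above."""
    return 0.6 * control + 0.4 * exam

# e.g. 8 for control assignments and 6 for the exam:
# 0.6 * 8 + 0.4 * 6 = 7.2
g = final_grade(8, 6)
```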
Result Sheet
to be published
Recommended Core Bibliography
- Bazaraa M. S., Sherali H. D., Shetty C. M. Nonlinear Programming: Theory and Algorithms, 3rd Edition, John Wiley & Sons, 2006
- Beck A. First-Order Methods in Optimization, MOS-SIAM Series on Optimization, 2017
- Ben-Tal A., Nemirovski A. Lectures on Modern Convex Optimization: Analysis, Algorithms, and Engineering Applications, MOS-SIAM Series on Optimization, 2001
- Nesterov Yu. Introductory Lectures on Convex Optimization, Springer US, 2004