Optimization methods

== About the course ==
  
The course gives a comprehensive foundation in the theory, methods, and algorithms of mathematical optimization. The prerequisites are linear algebra and calculus. [https://www.hse.ru/ba/data/courses/376703665.html Official HSE course page]
  
'''Lecturer: [https://www.hse.ru/org/persons/435958676 Oleg Khamisov]'''

The course will cover:<br/>
1. One-dimensional optimization: unimodal functions, convex and quasiconvex functions, zero- and first-order methods, local and global minima (a golden-section sketch follows this list).<br/>
2. Existence of solutions: continuous and lower semicontinuous functions, coercive functions, the Weierstrass theorem, unique and nonunique solutions.<br/>
3. Linear optimization: primal and dual linear optimization problems, the simplex method, interior-point methods, post-optimal analysis.<br/>
4. Theory of optimality conditions: the Fermat principle, the Hessian matrix, positive and negative semidefinite matrices, the Lagrange function and Lagrange multipliers, the Karush-Kuhn-Tucker conditions, regularity, complementarity constraints, stationary points.<br/>
5. First-order optimization methods: the steepest descent method, conjugate directions, gradient-based methods (a gradient descent sketch follows this list).<br/>
6. Second-order optimization methods: Newton's method and its modifications, trust-region methods.<br/>
7. Convex optimization: optimality conditions, duality, subgradients and the subdifferential, cutting planes and bundle methods, the complexity of convex optimization.<br/>
8. Decomposition: Dantzig-Wolfe decomposition, Benders decomposition, distributed optimization.<br/>
9. Conic programming: conic quadratic programming, semidefinite programming, interior-point polynomial-time methods for conic programming.<br/>
10. Nonconvex optimization: weakly convex and d.c. functions, convex envelopes and underestimators, the branch-and-bound technique.<br/>
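
As an illustration of the zero-order methods in topic 1, here is a minimal golden-section search sketch in Python; the test function, interval, and tolerance are illustrative assumptions, not course material.

<pre>
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by shrinking the bracket."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0  # 1/phi, about 0.618
    x1 = b - inv_phi * (b - a)  # left interior point
    x2 = a + inv_phi * (b - a)  # right interior point
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:  # the minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - inv_phi * (b - a)
            f1 = f(x1)
        else:        # the minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + inv_phi * (b - a)
            f2 = f(x2)
    return (a + b) / 2.0

# f(x) = (x - 2)^2 is unimodal on [0, 5]; prints approximately 2.0
print(golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0))
</pre>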
 
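For topic 5, here is a minimal sketch of gradient (steepest) descent with a fixed step size; the quadratic test problem, step size, and stopping rule are illustrative assumptions rather than settings taken from the course. With a fixed step below 2/L, where L is the largest eigenvalue of A (about 3.62 here), the iteration converges; classic steepest descent instead picks the step by line search.

<pre>
import numpy as np

def gradient_descent(grad, x0, step=0.1, max_iter=1000, tol=1e-8):
    """Iterate x <- x - step * grad(x) until the gradient norm is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

# Minimize f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b;
# the minimizer solves A x = b, here approximately [0.2, 0.4].
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
print(gradient_descent(lambda x: A @ x - b, np.zeros(2)))
</pre>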
[https://www.dropbox.com/s/83rudu241umptnz/OptimizationMethods.txt?dl=0 Course plan (txt)]

== Grading system ==

Grade = 0.6 · Control assignments + 0.4 · Exam
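
For example, assuming HSE's usual 10-point scale (an assumption, not stated above), control assignments graded 8 and an exam graded 7 give Grade = 0.6 · 8 + 0.4 · 7 = 4.8 + 2.8 = 7.6.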

== Result Sheet ==

To be published.

== Recommended Core Bibliography ==

- Bazaraa M. S., Sherali H. D., Shetty C. M. Nonlinear Programming: Theory and Algorithms, 3rd ed. John Wiley & Sons, 2006.

- Beck A. First-Order Methods in Optimization. MOS-SIAM Series on Optimization, 2017.

- Ben-Tal A., Nemirovski A. Lectures on Modern Convex Optimization: Analysis, Algorithms, and Engineering Applications. MOS-SIAM Series on Optimization, 2001.

- Nesterov Yu. Introductory Lectures on Convex Optimization. Springer US, 2004.
