Optimization Methods 2022
Version of 10:55, 14 February 2022
Abstract
The course gives a comprehensive foundation for the theory, methods and algorithms of mathematical optimization. The prerequisites are linear algebra and calculus.
Learning Objectives
- Students will study the main concepts of optimization theory and develop a methodology for theoretical investigation of optimization problems.
- Students will obtain an understanding of the creation, effectiveness and practical application of optimization methods and algorithms.
- The course will give students the ability to solve standard and nonstandard mathematical problems related to finding optimal solutions.
Expected Learning Outcomes
- Students should be able to classify optimization problems according to their mathematical properties.
- Students should be able to perform a theoretical investigation of a given optimization problem in order to assess its complexity.
- Students should be able to write down first- and second-order optimality conditions.
- Students should be able to provide a dual analysis of linear and convex optimization problems.
- Students should be able to assess the rate of convergence of first- and second-order optimization methods.
- Students should be able to solve simple optimization problems without a computer.
- Students should be able to implement different optimization codes in a computer environment.
- Students should be able to analyse the obtained solutions.
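As an illustration of the optimality conditions mentioned in the outcomes, for unconstrained minimization of a twice continuously differentiable function f, they can be written as:

```latex
% Necessary conditions at a local minimizer x^*:
\nabla f(x^*) = 0, \qquad \nabla^2 f(x^*) \succeq 0.
% Sufficient conditions: if \nabla f(x^*) = 0 and \nabla^2 f(x^*) \succ 0,
% then x^* is a strict local minimizer.
```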
Course content
- Univariate optimization: unimodal (invex) functions, golden section method, Lipschitzian optimization.
- Unconstrained quadratic optimization: algebraic approach, complete characterization of stationary points, gradient method with exact line search, conjugate gradient method.
- General unconstrained optimization: first- and second-order optimality conditions, descent directions, steepest descent method, conjugate gradient methods, Newton's method, quasi-Newton methods, inexact line search, rate of convergence.
- Optimization problems with linear equality constraints: first- and second-order optimality conditions, Lagrange function.
- General smooth optimization problems with equality and inequality constraints: first- and second-order optimality conditions, dual problems, penalty and augmented Lagrangian methods, sequential quadratic programming.
- Convex optimization: optimality conditions, duality, subgradients and subdifferential, cutting planes and bundle methods, the complexity of convex optimization.
- Linear programming: duality, decomposition.
- Interior-point methods.
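To give a flavor of the univariate topic, here is a minimal sketch of the golden section method for a unimodal function on an interval; the function name `golden_section` and the test function are illustrative, not part of the course materials.

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b] by golden-section search:
    the interval shrinks by the factor 1/phi ~ 0.618 per iteration."""
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):          # minimizer lies in [a, d]
            a, b = a, d
        else:                    # minimizer lies in [c, b]
            a, b = c, b
        c = b - invphi * (b - a)
        d = a + invphi * (b - a)
    return (a + b) / 2

x = golden_section(lambda t: (t - 2.0) ** 2, 0.0, 5.0)
# x is close to the true minimizer 2.0
```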
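For the quadratic-optimization topic, the gradient method with exact line search admits a closed-form step on a quadratic objective. A sketch under the assumption that Q is symmetric positive definite (the function name and data are illustrative):

```python
import numpy as np

def gradient_exact_line_search(Q, b, x0, tol=1e-10, max_iter=10_000):
    """Minimize f(x) = 0.5 x^T Q x - b^T x by steepest descent with
    the exact step alpha_k = g^T g / (g^T Q g) along -g."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = Q @ x - b                     # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (Q @ g))   # exact one-dimensional minimizer
        x = x - alpha * g
    return x

Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star = gradient_exact_line_search(Q, b, np.zeros(2))
# at the minimizer the gradient vanishes, i.e. Q x_star = b
```

On a quadratic this method converges linearly with a rate governed by the condition number of Q, which is exactly the behavior the course analyzes.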
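Newton's method from the unconstrained-optimization topic can be sketched in a few lines; the strictly convex test function below is an illustrative choice, not taken from the course.

```python
import numpy as np

def newton(grad, hess, x0, tol=1e-10, max_iter=50):
    """Pure Newton iteration: x_{k+1} = x_k - H(x_k)^{-1} g(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - np.linalg.solve(hess(x), g)  # Newton step
    return x

# Test function f(x) = exp(x1 + x2) + x1^2 + x2^2 (strictly convex)
grad = lambda x: np.exp(x[0] + x[1]) + 2 * x           # broadcasts to a vector
hess = lambda x: np.exp(x[0] + x[1]) * np.ones((2, 2)) + 2 * np.eye(2)
x_star = newton(grad, hess, np.zeros(2))
# the gradient at x_star is (numerically) zero
```

Near the solution the iteration converges quadratically, illustrating the rate-of-convergence results covered in the course.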
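Finally, the quadratic penalty method from the constrained-optimization topic can be demonstrated on a problem whose penalized subproblems are solvable in closed form. The problem instance and function name below are illustrative assumptions:

```python
import numpy as np

def quadratic_penalty_path(mus):
    """Quadratic penalty method for: min x1^2 + x2^2  s.t.  x1 + x2 = 1.
    The penalized subproblem min_x x^T x + (mu/2)(a^T x - 1)^2 is an
    unconstrained quadratic, so its minimizer solves the linear system
    (2I + mu a a^T) x = mu a exactly."""
    a = np.ones(2)
    xs = []
    for mu in mus:
        A = 2 * np.eye(2) + mu * np.outer(a, a)
        xs.append(np.linalg.solve(A, mu * a))
    return xs

path = quadratic_penalty_path([1.0, 10.0, 100.0, 1e6])
# the iterates approach the constrained solution (0.5, 0.5) as mu grows
```

This shows the characteristic behavior of penalty methods: the subproblem minimizers are infeasible for finite mu and converge to the constrained optimum only as the penalty parameter tends to infinity.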
Recommended Bibliography
- Nocedal J., Wright S. J. Numerical Optimization, 2nd Edition, Springer, 2006
- Bazaraa M. S., Sherali H. D., Shetty C. M. Nonlinear Programming: Theory and Algorithms, 3rd Edition, John Wiley & Sons, 2006
- Beck A. First-Order Methods in Optimization, MOS-SIAM Series on Optimization, 2017
- Ben-Tal A., Nemirovski A. Lectures on Modern Convex Optimization: Analysis, Algorithms, and Engineering Applications, MOS-SIAM Series on Optimization, 2001
- Nesterov Yu. Introductory Lectures on Convex Optimization, Springer US, 2004