Instructor: Hasan A. Poonawala
Mechanical and Aerospace Engineering
University of Kentucky, Lexington, KY, USA
Topics:
Optimization
Types of Problems
Algorithms
Examples
We have a complex decision problem to solve.
Philosophy
Approach the problem of selecting values for a number of interrelated decision variables by focusing attention on a single objective function that quantifies the quality of the decision.
Important
This single objective is optimized subject to the constraints that may limit the selection of decision variable values.
$x \in \mathbb{R}^n$ : decision variables
$f(x)$ : objective function
$\Omega$ : admissible/allowable options for $x$
$h(x) = 0$ : equality constraints
$g(x) \le 0$ : inequality constraints
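Putting these pieces together, the general problem can be written in the standard form below (a sketch using the usual textbook notation, matching the definitions above):
\[
\begin{aligned}
\min_{x \in \Omega} \quad & f(x) \\
\text{subject to} \quad & h(x) = 0, \\
& g(x) \le 0.
\end{aligned}
\]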
The types of the functions $f$, $h$, and $g$ determine the "type" of optimization problem.
Most optimization books disagree on notation.
A quadratic program (QP)
A linearly constrained quadratic program (QP)
A linear program (LP)
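For reference, common standard forms of these classes (a sketch assuming the usual textbook definitions; $Q$, $A$, $b$, $c$ denote problem data):
\[
\text{LP:}\ \min_{x} \ c^\top x \ \text{ s.t. } Ax \le b,
\qquad
\text{QP:}\ \min_{x} \ \tfrac{1}{2}\, x^\top Q x + c^\top x \ \text{ s.t. } Ax \le b.
\]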
Conic Linear Programming (CLP) is a natural extension of linear programming.
Constraints form a vector in the nonnegative orthant cone.
A cone is a set $C$ such that $\alpha x \in C$ for all $x \in C$ and all $\alpha \ge 0$.
Constraints form a vector in a cone shaped like an ice-cream cone, called a second-order cone.
Constraints form an $n$-dimensional symmetric matrix required to be positive semidefinite, or equivalently to lie in the semidefinite cone.
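These three cones can be written explicitly (standard definitions, stated here for reference):
\[
\mathbb{R}^n_{+} = \{\, x \in \mathbb{R}^n : x_i \ge 0 \,\}, \qquad
\mathcal{Q}^{n+1} = \{\, (x, t) : \|x\|_2 \le t \,\}, \qquad
\mathcal{S}^n_{+} = \{\, X = X^\top : X \succeq 0 \,\}.
\]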
A convex nonlinear problem
Important
Convex vs non-convex is more important than linear vs nonlinear.
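A one-dimensional illustration (an added example, not from the slides): $f(x) = x^2$ is convex, so its only local minimizer $x = 0$ is automatically global, while $f(x) = x^4 - x^2 + 0.2x$ is nonconvex and has a local minimizer (near $x \approx 0.65$) whose value is worse than that of the global minimizer (near $x \approx -0.75$); a descent method started on the wrong side can get stuck there even though the function is smooth.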
A nonlinear program (NLP)
One point has an objective value that is higher than that of any of its "nearby" feasible points (a local optimizer).
In contrast, another point has an objective value that is the best among all feasible points (a global optimizer).
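In the usual minimization convention (stated here for reference; the example above uses maximization), $x^*$ is a local minimizer if $f(x^*) \le f(x)$ for all feasible $x$ in some neighborhood of $x^*$, and a global minimizer if $f(x^*) \le f(x)$ for all feasible $x$.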
Scene / Data
Task
Model the problem of choosing how much product to ship from each factory to each outlet so that the company meets demand at the least cost.
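A minimal sketch of one way to set this up numerically with scipy.optimize.linprog; the factory/outlet counts, costs, supplies, and demands below are hypothetical placeholders, not data from the lecture:

import numpy as np
from scipy.optimize import linprog

n_factories, n_outlets = 2, 3
cost = np.array([[4.0, 6.0, 9.0],        # cost[i, j]: cost per unit shipped, factory i -> outlet j
                 [5.0, 4.0, 7.0]])
supply = np.array([250.0, 300.0])        # units available at each factory
demand = np.array([150.0, 200.0, 100.0]) # units required at each outlet

c = cost.ravel()                         # decision variables x[i, j], flattened row by row

# Supply constraints: sum_j x[i, j] <= supply[i]
A_supply = np.kron(np.eye(n_factories), np.ones(n_outlets))
# Demand constraints: sum_i x[i, j] >= demand[j], written as -sum_i x[i, j] <= -demand[j]
A_demand = -np.kron(np.ones(n_factories), np.eye(n_outlets))

res = linprog(c,
              A_ub=np.vstack([A_supply, A_demand]),
              b_ub=np.concatenate([supply, -demand]),
              bounds=(0, None))          # x[i, j] >= 0
print(res.x.reshape(n_factories, n_outlets))  # optimal shipping plan
print(res.fun)                                # minimum total cost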
Task
Modify the problem to reflect that the company has one truck for each store, and the trucks can make only one round-trip journey to a factory each week.
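One possible way to capture the one-truck restriction (an assumption about the intended model, not from the slides): introduce binary variables $y_{ij} \in \{0, 1\}$ indicating whether outlet $j$'s truck makes its weekly trip to factory $i$, and require
\[
\sum_i y_{ij} = 1 \ \text{for each outlet } j, \qquad x_{ij} \le M\, y_{ij},
\]
where $M$ is a large constant such as the truck capacity. The linear program then becomes a mixed-integer linear program.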
Basic optimization theory builds necessary and sufficient optimality conditions that end up being a set of equations or inequalities.
Most algorithms are programs that find decision variables that satisfy ("solve") the conditions.
Computers perform repetitive operations efficiently, so optimization algorithms are usually iterative in nature.
An initial variable $x_0$ is selected and the algorithm generates an improved variable $x_1$.
This iteration is repeated to get a still better variable $x_2$.
Continuing this way, a sequence of ever-improving variables $x_0, x_1, \ldots, x_k, \ldots$ is found that approaches a solution point $x^*$.
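A minimal sketch of this iterative pattern in Python, using gradient descent on a simple quadratic; the objective, step size, and stopping tolerance are illustrative choices, not part of the lecture:

import numpy as np

def grad_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Iterate x_{k+1} = x_k - step * grad(x_k) until the gradient is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # (approximate) optimality condition satisfied
            break
        x = x - step * g              # the "improved variable" at each iteration
    return x

# Example: minimize f(x) = (x1 - 1)^2 + 2*(x2 + 3)^2, whose minimizer is (1, -3).
x_star = grad_descent(lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 3.0)]),
                      x0=[0.0, 0.0])
print(x_star)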
For linear programming this sequence may be made finite: simplex method.
Algorithms must be analyzed to check if they will converge to a solution point.
The rate at which they converge (if they do) is also analyzed: complexity analysis.
In some cases, convergence at a slow rate is acceptable; usually it is not.
Size: # of unknown variables and/or # of constraints.
Complexity theory studies how fast the computation time grows as the number of variables and constraints increases.
Python, Matlab, and Julia are widely used; C++ or Rust if you are comfortable with them.