Guide for Week 9
-
Reading Assignment:
Homework Assignment:
Vocabulary Words
- Optimality Conditions
- Weierstrass extreme value theorem
- coercive functions
- The coercivity and compactness theorem (proof not required)
- The coercivity and existence theorem (proof required)
- The Basic first-order optimality result (proof not required)
- The first-order necessary conditions for optimality
- the second-order necessary and sufficient conditions for optimality
- Tangent cone
- Basic Constrained 1st-Order Optimality Conditions
- Nonlinear Programming
- Regularity and constraint qualifications LICQ and MFCQ
- the Lagrangian and Lagrange multipliers
- Constrained 1st-Order Optimality Conditions
- Karush-Kuhn-Tucker (KKT) conditions
- Constrained 2nd-Order Sufficient Conditions
- Convexity
- convex set
- epigraph and domain of a function
- convex function
- closed or lsc function
- convex hull
- support function
- local = global min
- sublinearity of the directional derivative
- subdifferential inequality
- first-order necessary and sufficient condition for optimality
- tangent and normal cones
- test for convexity
- convex NLP
- Slater CQ
- necessary and sufficient first-order optimality in convex NLP
- saddle points
- the Lagrangian
- the primal and dual problems
- Lagrangian duality
-
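The "test for convexity" above is most often the second-order test: a twice-differentiable function is convex on a convex set exactly when its Hessian is positive semidefinite there. A minimal numerical sketch of that test for the quadratic case (the helper name `is_convex_quadratic` is illustrative, not from the course notes):

```python
import numpy as np

# Second-order convexity test, quadratic case: for
# f(x) = 0.5*x'Qx + c'x the Hessian is the constant matrix Q,
# so convexity reduces to Q (or rather its symmetric part,
# which is all that affects f) being positive semidefinite.

def is_convex_quadratic(Q, tol=1e-10):
    """True iff f(x) = 0.5*x'Qx + c'x is convex, i.e. sym(Q) is PSD."""
    Qs = 0.5 * (Q + Q.T)                       # symmetrize
    return bool(np.all(np.linalg.eigvalsh(Qs) >= -tol))

print(is_convex_quadratic(np.array([[2.0, 0.0], [0.0, 1.0]])))   # True
print(is_convex_quadratic(np.array([[1.0, 0.0], [0.0, -1.0]])))  # False
```

For general smooth functions the same idea applies pointwise to the Hessian over the domain.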
- Line Search Methods
- search direction
- descent direction
- direction of strict descent
- direction of steepest descent
- the Cauchy direction
- The Newton direction
- step length, step size
- Curry step length
- Armijo-Goldstein inequality
- backtracking line search
- The Basic Backtracking Algorithm
- Convergence Theorem for the backtracking line search
- the weak Wolfe conditions
- the strong Wolfe conditions
- the Goldstein conditions
- The Bisection Method for the Weak Wolfe Conditions
- Search Directions
- Steepest Descent direction
- Newton's method for solving equations
- Newton-Like methods
- Q-convergence rates
- linear, superlinear, and quadratic convergence
- Newton's method for minimization
- The rate of convergence of Newton's method
- Matrix secant equation
- Broyden's method (both direct and inverse updates)
- the BFGS update (both direct and inverse updates)
-
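The backtracking items above fit in a few lines of code. This is a hedged Python sketch of the basic backtracking algorithm with the Armijo-Goldstein inequality as the acceptance test (the function and parameter names are illustrative, not from the course notes):

```python
import numpy as np

def backtracking(f, grad, x, d, c=1e-4, gamma=0.5, t0=1.0):
    """Basic backtracking line search: shrink the step length t by the
    factor gamma until the Armijo-Goldstein inequality
        f(x + t*d) <= f(x) + c*t*grad(x)'d
    holds.  d must be a descent direction (grad(x)'d < 0) for this
    loop to terminate."""
    t = t0
    fx = f(x)
    slope = grad(x) @ d            # directional derivative; < 0 for descent
    while f(x + t * d) > fx + c * t * slope:
        t *= gamma
    return t

# Usage on f(x) = 0.5*||x||^2 with the steepest-descent (Cauchy)
# direction d = -grad f(x); here the full step t = 1 is accepted.
f = lambda x: 0.5 * (x @ x)
grad = lambda x: x
x0 = np.array([2.0, -1.0])
t = backtracking(f, grad, x0, -grad(x0))
print(t)                           # 1.0
```

The bisection method for the weak Wolfe conditions has the same outer structure but also maintains a lower bracket so that too-short steps (which would violate the curvature condition) get lengthened rather than only shrunk.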
Key Concepts:
- Convexity
- the epigraphical perspective
- local = global
- sublinearity of the directional derivative
- test for convexity
- Slater CQ
- Lagrangian Duality
- Line Search Methods
- Goldstein conditions
- Wolfe conditions
- Basic Backtracking line search
- Bisection method for the weak Wolfe conditions
- Search Directions
- Steepest descent
- Newton's method for equations and minimization
- Broyden's method
- BFGS
-
Skills to Master:
- testing for convexity
- testing for optimality
- testing the Slater CQ
- constructing Lagrangian duals
- executing the backtracking and bisection line searches.
- Implementing Newton's method
- Implementing Broyden and BFGS in Matlab
-
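For the quasi-Newton implementation skills, the key formula is the BFGS inverse update, which modifies the inverse-Hessian approximation so that the matrix secant equation H*y = s holds for the newest pair. The course implementations are in Matlab; this is an analogous Python sketch, an assumption-laden illustration rather than the course's code:

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """BFGS update of the inverse Hessian approximation H, where
    s = x_{k+1} - x_k and y = grad f(x_{k+1}) - grad f(x_k).
    Uses the standard formula
        H+ = (I - rho*s*y')*H*(I - rho*y*s') + rho*s*s',
    with rho = 1/(y's); requires the curvature condition y's > 0."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Usage: on a quadratic f(x) = 0.5*x'Ax the secant pair satisfies
# y = A*s, and the updated H+ reproduces s from y (secant equation).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
s = np.array([1.0, -1.0])
y = A @ s
H_new = bfgs_inverse_update(np.eye(2), s, y)
print(np.allclose(H_new @ y, s))   # True: H+ satisfies the secant equation
```

Broyden's inverse update has the same shape but is a rank-one correction and does not preserve symmetry or positive definiteness, which is why BFGS is preferred for minimization.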
Quiz:
-
This quiz will focus on convexity. The first question will be more
theoretical, focusing on verifying whether a class of sets and/or
functions is convex or strictly convex, optimality conditions,
establishing polarity for convex cones, or checking that the Slater
constraint qualification is satisfied. The second question is
computational, for example checking whether a specific set and/or
function is convex, computing a KKT point, and verifying that
optimality conditions are, or are not, satisfied. You will not be
asked to compute a Lagrangian dual.
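As a warm-up for the computational question, here is a hypothetical KKT computation (this specific problem is illustrative only, not a quiz question): minimize x1^2 + x2^2 subject to x1 + x2 >= 1. Writing the constraint as g(x) = 1 - x1 - x2 <= 0, stationarity of the Lagrangian L(x, mu) = x1^2 + x2^2 + mu*(1 - x1 - x2) gives 2*x1 = mu and 2*x2 = mu; taking the constraint active yields x = (1/2, 1/2) with multiplier mu = 1, and the four KKT conditions can be checked directly:

```python
import numpy as np

# Candidate KKT point for: min x1^2 + x2^2  s.t.  g(x) = 1 - x1 - x2 <= 0
x = np.array([0.5, 0.5])
mu = 1.0
grad_f = 2 * x                        # gradient of the objective at x
grad_g = np.array([-1.0, -1.0])       # gradient of g (constant)

assert np.allclose(grad_f + mu * grad_g, 0.0)   # stationarity
assert 1 - x.sum() <= 1e-12                     # primal feasibility
assert mu >= 0                                  # dual feasibility
assert abs(mu * (1 - x.sum())) <= 1e-12         # complementary slackness
print("KKT conditions hold at x =", x, "with mu =", mu)
```

Since the objective is convex and the constraint is affine (so the Slater CQ holds trivially), the KKT conditions here are also sufficient, and x = (1/2, 1/2) is the global minimizer.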