Instructor: James Burke
Lecture: MW 9:00-10:20 AM in a Zoom chatroom
Office hours: MW after class and T 1:00-2:00 PM in a Zoom chatroom
Email: jvburke at uw edu
TA: Ravil Mussabayev
Office hours: Friday 6-7pm in a Zoom chatroom
Email: ravmus at uw edu
Both the lecture and the office hours will take place through Zoom. You have a free subscription to Zoom through the University of Washington. I am also enabling a discussion board for the course through Piazza. You will receive an email with instructions on how to register.
This is an introductory course in convex analysis and nonsmooth optimization. We will cover elements of convex geometry and analysis, (stochastic) first-order methods for convex optimization, introductory variational analysis, and algorithms for nonsmooth and nonconvex optimization problems.
We will primarily use the evolving course notes (written by Dmitriy Drusvyatskiy) posted on this webpage:
Some relevant textbooks are the following.
Linear Algebra:
Topics in Matrix Analysis by Horn and Johnson.
Convex Analysis:
First-order methods for convex optimization:
Variational Analysis:
Problem sets are due every two weeks and account for 100% of the grade. Solutions must be written in LaTeX and submitted by email to both the TA and me.
You may work together on problem sets, but you must write up your own solutions. You must also cite any resources which helped you obtain your solutions.
The following is a very rough schedule for the course that will evolve as the term progresses.
Lectures 1-2: Background
Inner products and linear maps
Norms
Eigenvalue and singular value decompositions of matrices
Set operations: addition, scaling, image/preimage
Differentiability
Fundamental theorem of calculus and accuracy in approximation
Optimality conditions for smooth unconstrained minimization (see the example after this list)
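As a preview of this block, here is one standard form of these facts, assuming f is C^1 with L-Lipschitz gradient:

\[ \bigl| f(y) - f(x) - \langle \nabla f(x), y - x \rangle \bigr| \le \frac{L}{2} \|y - x\|^2, \]

and at any unconstrained local minimizer \bar{x} of a differentiable f, \nabla f(\bar{x}) = 0. The course notes may state these with different constants or under weaker hypotheses.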
Lectures 3-5: Convex Geometry
Operations preserving convexity
Convex hull
Affine hull and relative interior
Separation theorem
Cones and polarity
Tangent and normal cones (see the definitions after this list)
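For orientation, one common normalization of two objects from this block, for a cone K and a convex set C with \bar{x} \in C:

\[ K^\circ = \{ y : \langle y, x \rangle \le 0 \text{ for all } x \in K \}, \qquad N_C(\bar{x}) = \{ v : \langle v, x - \bar{x} \rangle \le 0 \text{ for all } x \in C \}. \]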
Lectures 6-11: Convex Analysis
Convex functions: basic operations and continuity
The Fenchel conjugate
Differential properties
Directional derivative
Monotonicity of the subdifferential
Convexity and smoothness
The optimal value function, duality, and subdifferential calculus
Moreau envelope and the proximal map (definitions follow this list)
Convex functions of eigenvalues
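Two central definitions from this block, in one standard formulation with parameter \lambda > 0:

\[ f^*(y) = \sup_x \{ \langle y, x \rangle - f(x) \}, \qquad e_\lambda f(x) = \inf_y \Bigl\{ f(y) + \tfrac{1}{2\lambda} \|y - x\|^2 \Bigr\}, \]

with \mathrm{prox}_{\lambda f}(x) denoting the minimizer defining e_\lambda f(x) (unique when f is closed and convex).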
Lectures 12-15: First-order algorithms for convex optimization
The classical viewpoint: gradient and subgradient methods
Model-based viewpoint for composite minimization
Proximal gradient and acceleration (a NumPy sketch follows this list)
Proximal subgradient
Smoothing-based methods
Primal-dual algorithms for saddle point problems
Stochastic (sub)gradient methods
Coordinate descent
Variance reduction
Non-Euclidean extensions
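To make the composite viewpoint concrete, here is a minimal NumPy sketch of the proximal gradient method for min_x f(x) + g(x), run on a hypothetical LASSO instance; the data, penalty lam, step size, and iteration count are illustrative assumptions, not the course's prescribed implementation.

import numpy as np

def prox_l1(z, t):
    # Proximal map of t * ||.||_1: soft-thresholding.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def proximal_gradient(grad_f, prox_g, x0, step, iters):
    # Iterate x+ = prox_{step * g}(x - step * grad_f(x)).
    x = x0
    for _ in range(iters):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Hypothetical LASSO instance: f(x) = 0.5 * ||Ax - b||^2, g = lam * ||.||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = rng.standard_normal(100) * (rng.random(100) < 0.1)  # sparse signal
b = A @ x_true
lam = 0.5
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad f (squared spectral norm)
x_hat = proximal_gradient(
    grad_f=lambda x: A.T @ (A @ x - b),
    prox_g=lambda z, t: prox_l1(z, lam * t),
    x0=np.zeros(100),
    step=1.0 / L,
    iters=1000,
)

The constant step 1/L is the classical choice for an L-smooth f; acceleration and line searches refine it.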
Lectures 16-18: Introduction to variational analysis
Normal and tangent cones
Transversality
Slope, subderivatives, and the subdifferential
The descent principle and error bounds
The extremal principle
Subdifferential calculus
Metric regularity and the normal criterion
The radius of metric regularity
Tame variational analysis: desingularization and the Kurdyka-Łojasiewicz (KL) inequality, chain rule along paths, and Sard's theorem (a formulation of the KL inequality follows this list).
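For reference, one common formulation of the KL inequality: f satisfies it at \bar{x} if there exist \epsilon > 0 and a concave desingularizer \varphi \in C^1 with \varphi(0) = 0 and \varphi' > 0 such that

\[ \varphi'\bigl( f(x) - f(\bar{x}) \bigr) \, \mathrm{dist}\bigl( 0, \partial f(x) \bigr) \ge 1 \]

for all x near \bar{x} with f(\bar{x}) < f(x) < f(\bar{x}) + \epsilon. The notes may use a different normalization.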
Lectures 18-20: Algorithms for nonsmooth and nonconvex optimization
Weakly convex functions and compositional problems
Model-based algorithms as approximate proximal point methods: (stochastic) Gauss-Newton and proximal (sub)gradient methods
Iterate convergence of descent methods under the KL inequality
Asymptotic convergence of the subgradient algorithm: the ODE method
The Goldstein subdifferential and random search (a subgradient-method sketch follows this list)
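As a preview of this block, here is a minimal NumPy sketch of the subgradient method with geometrically decaying steps on a hypothetical weakly convex composite (robust phase retrieval); the instance, initialization, and step schedule are illustrative assumptions.

import numpy as np

def subgradient_method(subgrad, x0, steps):
    # Iterate x+ = x - t * g with g a subgradient at x.
    x = x0
    for t in steps:
        x = x - t * subgrad(x)
    return x

# Hypothetical instance: f(x) = (1/m) * sum_i |<a_i, x>^2 - b_i|,
# a weakly convex composition of |.| with a smooth map.
rng = np.random.default_rng(1)
m, n = 200, 20
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = (A @ x_true) ** 2

def subgrad(x):
    r = (A @ x) ** 2 - b
    return (2.0 / m) * (A.T @ (np.sign(r) * (A @ x)))  # a subgradient of f at x

x0 = x_true + 0.1 * rng.standard_normal(n)   # warm start near a minimizer
steps = 0.01 * 0.99 ** np.arange(2000)       # geometrically decaying step sizes
x_hat = subgradient_method(subgrad, x0, steps)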