Instructor: Dmitriy Drusvyatskiy
Lecture: MW 11:00-12:20 PM in a Zoom chatroom (links sent by email)
Office hours: 10:00-11:00 AM Tuesdays in a Zoom chatroom (same link as lecture)
Email: ddrusv at uw.edu
Lectures will take place over Zoom. I am also setting up a discussion board for the course on Piazza; you will receive an email with instructions on how to register.
This is an introductory course in convex analysis and nonsmooth optimization. We will cover elements of convex geometry and analysis, (stochastic) first-order methods for convex optimization, introductory variational analysis, and algorithms for nonsmooth and nonconvex optimization problems.
We will primarily use the evolving course notes (written by the instructor) posted on this webpage:
Some relevant textbooks are the following.
Convex Analysis:
First-order methods for convex optimization:
Variational Analysis:
Problem sets are due every two weeks. Solutions must be written in LaTeX (100% of the grade). You can slide them under my office door. You get extra credit if you solve a problem, ask a large language model to solve it, and find a mistake in its solution. In that case, attach the conversation to your submitted homework.
You may work together on problem sets, but you must write up your own solutions. You must also cite any resources which helped you obtain your solutions.
The following is a very rough schedule for the course that will evolve as the term progresses.
Lectures 1-3: Background
Inner products and linear maps
Norms
Eigenvalue and singular value decompositions of matrices
Set operations: addition, scaling, image/preimage
Differentiability
Fundamental theorem of calculus and accuracy in approximation
Optimality conditions for smooth unconstrained minimization
Lectures 4-8: Convex Geometry
Operations preserving convexity
Convex hull
Affine hull and relative interior
Separation theorem
Cones and polarity
Tangent and normal cones
Lectures 9-14: Convex Analysis
Convex functions: basic operations and continuity
The Fenchel conjugate
Differential properties
Directional derivative
Monotonicity of the subdifferential
Convexity and smoothness
The optimal value function, duality, and subdifferential calculus
Moreau envelope and the proximal map
Convex functions of eigenvalues
Lectures 15-20: First-order algorithms for convex optimization
The classical viewpoint: gradient and subgradient methods
Model-based viewpoint for composite minimization
Proximal gradient and acceleration
Proximal subgradient
Smoothing-based methods
Primal-dual algorithms for saddle point problems
Stochastic (sub)gradient methods
Coordinate descent
Variance reduction
Non-Euclidean extensions