Instructor: Dmitriy Drusvyatskiy
Lecture: MW 9:00-10:20 AM in a Zoom chatroom until October 20th and in CMU B006 afterwards
Office hours: 10:30-11:30 AM Mondays in a Zoom chatroom
Email: ddrusv at uw.edu
TA: Catherine Babecki
Office hours: 4-5 PM on Thursday in a Zoom chatroom
Email: cbabecki at uw.edu
Lectures will take place over Zoom until October 20th and in CMU B006 afterwards. Office hours will be held over Zoom. You have a free subscription to Zoom through U. Washington. Those registered for the course will receive an email on September 28th with instructions for how to join the lectures and office hours. If you are registered for the course and did not receive the email, or if you are not registered and would like to audit, please send me an email. I am also enabling a discussion board for the course through Piazza. You will receive an email with instructions on how to register.
This is an introductory course in convex analysis and nonsmooth optimization. We will cover elements of convex geometry and analysis, (stochastic) first-order methods for convex optimization, introductory variational analysis, and algorithms for nonsmooth and nonconvex optimization problems.
We will primarily use the evolving course notes (written by the instructor) posted on this webpage:
Some relevant textbooks are the following.
Convex Analysis:
First-order methods for convex optimization:
Variational Analysis:
Problem sets are due every two weeks and account for 100% of the grade. Solutions must be written in LaTeX and submitted through Gradescope. Make sure to sign up for the course on the Gradescope webpage using the passcode ERPDY8.
You may work together on problem sets, but you must write up your own solutions. You must also cite any resources that helped you obtain your solutions.
The following is a very rough schedule for the course that will evolve as the term progresses.
Lectures 1-3: Background
Inner products and linear maps
Norms
Eigenvalue and singular value decompositions of matrices
Set operations: addition, scaling, image/preimage
Differentiability
Fundamental theorem of calculus and accuracy in approximation
Optimality conditions for smooth unconstrained minimization
Lectures 4-8: Convex Geometry
Operations preserving convexity
Convex hull
Affine hull and relative interior
Separation theorem
Cones and polarity
Tangent and normal cones
Lectures 9-14: Convex Analysis
Convex functions: basic operations and continuity
The Fenchel conjugate
Differential properties
Directional derivative
Monotonicity of the subdifferential
Convexity and smoothness
The optimal value function, duality, and subdifferential calculus
Moreau envelope and the proximal map
Convex functions of eigenvalues
Lectures 15-20: First-order algorithms for convex optimization
The classical viewpoint: gradient and subgradient methods
Model-based viewpoint for composite minimization
Proximal gradient and acceleration
Proximal subgradient
Smoothing-based methods
Primal-dual algorithms for saddle point problems
Stochastic (sub)gradient methods
Coordinate descent
Variance reduction
Non-Euclidean extensions