OPTIMIZATION SEMINAR

Tuesday, November 14, 2:35--3:20pm

Padelford Hall Room C-036


A Coordinate Gradient Descent Method for Nonsmooth Separable Minimization and Support Vector Machines

Sangwoon Yun

Graduate Student, UW, Mathematics

In the first part of the talk, we consider the problem of minimizing the sum of a smooth function and a separable convex function. This problem includes as special cases bound-constrained smooth optimization and smooth optimization with L1-regularization. We propose a (block) coordinate gradient descent (abbreviated as CGD) method for solving this class of nonsmooth separable problems. The method is simple, highly parallelizable, and suited for large-scale applications in signal/image denoising, regression, and data mining/classification. We establish global convergence and, under a local Lipschitzian error bound assumption, a local linear rate of convergence for this method. We compare the CGD method with L-BFGS-B and MINOS on L1-regularized versions of large nonlinear least squares problems from Moré et al.'s test set.
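For concreteness, here is a minimal sketch (not the speakers' implementation) of a cyclic coordinate gradient descent loop for the L1-regularized least squares special case, minimize 0.5*||Ax - b||^2 + lam*||x||_1, where each coordinate update is a gradient step followed by soft-thresholding; the function names and step scaling below are illustrative assumptions.

import numpy as np

def soft_threshold(z, t):
    # Closed-form minimizer of 0.5*(x - z)^2 + t*|x|.
    return np.sign(z) * max(abs(z) - t, 0.0)

def cgd_lasso(A, b, lam, n_iter=100):
    # Cyclic coordinate gradient descent sketch for
    # 0.5*||Ax - b||^2 + lam*||x||_1 (illustrative only).
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                      # residual, kept up to date
    col_sq = (A ** 2).sum(axis=0)      # diagonal of A'A, used as step scaling
    for _ in range(n_iter):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            g = A[:, j] @ r            # j-th partial derivative of the smooth part
            z = x[j] - g / col_sq[j]   # coordinate gradient step
            x_new = soft_threshold(z, lam / col_sq[j])
            if x_new != x[j]:
                r += (x_new - x[j]) * A[:, j]
                x[j] = x_new
    return x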

In the second part of the talk, we discuss extensions of the CGD method and its analysis to linearly constrained smooth optimization, with application to Support Vector Machine training.
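As background for the SVM application (the notation here is the standard dual formulation, assumed rather than taken from the talk), SVM training is an instance of linearly constrained smooth optimization:

\min_{\alpha}\ \tfrac{1}{2}\,\alpha^{\top} Q \alpha - e^{\top}\alpha
\quad \text{subject to} \quad y^{\top}\alpha = 0,\ \ 0 \le \alpha_i \le C,
\qquad Q_{ij} = y_i y_j K(x_i, x_j).

The single linear equality constraint together with the bound constraints makes block-coordinate (working-set) updates a natural fit.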

Joint work with Paul Tseng.


Mathematics Department, University of Washington