My name is Courtney Paquette (née Kempton), and I received my Ph.D. from the University of Washington Department of Mathematics in August 2017. Currently, I am a Ross Assistant Professor (a postdoctoral position) at Ohio State University. Starting in January 2018, I will be a postdoctoral researcher in the Department of Industrial and Systems Engineering at Lehigh University, co-advised by Prof. Katya Scheinberg, Prof. Frank E. Curtis, and Prof. Martin Takáč.
I study optimization, in particular continuous optimization and first-order methods. My Ph.D. advisor at the University of Washington was Prof. Dmitriy Drusvyatskiy.
Both UW and Lehigh University have strong optimization groups that span many departments: Math, Stats, CSE, EE, and ISE. If you are interested in optimization talks at either university, check out the following seminars:
EMAIL: yumiko88(at)uw(dot)edu or cop318(at)lehigh(dot)edu
OFFICE: Lehigh University, Department of Industrial and Systems Engineering, Mohler, 474
I study continuous optimization, with an emphasis on convex optimization and its connections to practical applications, particularly machine learning. I work in variational analysis, which generalizes the concepts of differential calculus to functions that lack differentiability, and I also pursue research on first-order algorithms for efficiently minimizing large sums and compositions of functions. My current work focuses on applying first-order methods to structured non-convex and non-smooth problems.
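As a concrete illustration of the kind of problem first-order methods handle well, here is a minimal proximal-gradient sketch for the lasso problem min_x ½‖Ax − b‖² + λ‖x‖₁, a standard composite model in machine learning. The data and parameters below are purely illustrative, not taken from my papers.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by alternating a gradient
    step on the smooth part with the prox of the nonsmooth part."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                    # gradient of smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Illustrative synthetic data: a sparse signal observed through a random matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2              # 1/L with L = ||A||_2^2
x_hat = proximal_gradient(A, b, lam=0.1, step=step)
```

The step size 1/L, with L the Lipschitz constant of the gradient of the smooth term, guarantees monotone decrease of the objective; the prox step is what lets the method handle the nonsmooth ℓ₁ penalty exactly.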
You can view my CV here if you are interested in more details.
You can view my thesis titled: Structure and complexity in non-convex and nonsmooth optimization.
I have written the following papers:
- D. Davis, D. Drusvyatskiy, K. MacPhee, and C. Paquette. Subgradient methods for sharp weakly convex functions. arXiv (2018)
- D. Davis, D. Drusvyatskiy, and C. Paquette. The nonsmooth landscape of phase retrieval. arXiv (2017)
- C. Paquette, H. Lin, D. Drusvyatskiy, J. Mairal, and Z. Harchaoui. Acceleration for Gradient-Based Non-Convex Optimization. AISTATS (2017)
- D. Drusvyatskiy and C. Paquette. Efficiency of minimizing compositions of convex functions and smooth maps. arXiv (submitted to Math. Prog., 2nd round) (2016)
- D. Drusvyatskiy and C. Paquette. Variational analysis of spectral functions simplified. J. Convex Anal. 25(1), 2018.
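The convex-composite setting studied in several of the papers above concerns minimizing h(c(x)) for a convex function h and a smooth map c. A hedged toy sketch of the prox-linear idea: linearize c, keep h, and add a proximal term. With the illustrative choice h = ½‖·‖², each subproblem has a closed form (a Levenberg–Marquardt-type step); the example map and all parameters below are made up for demonstration.

```python
import numpy as np

def prox_linear(c, jac, x0, t=10.0, iters=100):
    """Prox-linear method for min 0.5*||c(x)||^2 (i.e. h = 0.5||.||^2):
    at x_k, minimize 0.5*||c(x_k) + J(x - x_k)||^2 + (1/2t)*||x - x_k||^2,
    whose closed-form solution is a damped Gauss-Newton step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r, J = c(x), jac(x)
        step = np.linalg.solve(J.T @ J + np.eye(x.size) / t, J.T @ r)
        x = x - step
    return x

# Illustrative smooth map: intersect the unit circle with the line x1 = x2.
c = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 1.0, x[0] - x[1]])
jac = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]])
x_star = prox_linear(c, jac, x0=[1.0, 0.5])
```

For general convex h (e.g. a norm, as in the phase-retrieval paper above) the subproblem no longer has a closed form and is itself solved with a first-order method, which is where the efficiency analysis comes in.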
I have given talks on the research above at the following conferences:
- Generic Acceleration Schema Beyond Convexity, INFORMS Annual Meeting, Houston, TX (Oct. 2017); My slides can be found here
- Minimization of convex composite, Lehigh University Optimization Seminar, Bethlehem, PA (Sept 2017); My slides can be found here
- Proximal methods for minimizing convex compositions, SIAM-optimization, Vancouver, BC (May 2017)
- Catalyst for Gradient-based Nonconvex Optimization, Inria-Grenoble Seminar, Grenoble (April 2017)
- Generic acceleration schema beyond convexity, Optimization and Statistical Learning, Les Houches (April 2017)
- Proximal methods for minimizing convex compositions, West Coast Optimization Meeting, University of British Columbia (September 2016); My slides can be found here
I am currently teaching Math 1152: Calculus II, Autumn 2017; course website
I have taught the following courses:
- Math 125 BC/BD: Calculus II Quiz Section, Winter 2017; course webpage
- Math 307 E: Intro to Differential Equations, Winter 2016
- Math 124 CC: Calculus I, Autumn 2015
- Math 307 I: Intro to Differential Equations, Spring 2015
- Math 125 BA/BC: Calculus II, Winter 2015
- Math 307 K: Intro to Differential Equations, Autumn 2014
- Math 307 L: Intro to Differential Equations, Spring 2014