Damek Davis

I am an associate professor of operations research at Cornell University. I received my Ph.D. in mathematics from the University of California, Los Angeles, in 2015, where I was advised by Wotao Yin and Stefano Soatto.


I am broadly interested in the mathematics of data science, particularly the interplay of optimization, signal processing, statistics, and machine learning.


My research has received several awards, including the INFORMS Optimization Society Young Researchers Prize (2019), a Sloan Research Fellowship in Mathematics (2020), an NSF CAREER Award (2021), and the SIAM Activity Group on Optimization Best Paper Prize (2023).


CV | Email | Github | Google Scholar


Note: I am on sabbatical until July 2023.


Expository

Subgradient methods under weak convexity and tame geometry

Damek Davis, Dmitriy Drusvyatskiy

SIAG/OPT Views and News (2020)


Selected Talks

Avoiding saddle points in nonsmooth optimization

Updated (11/2021) | video


Stochastic subgradient method converges on tame functions

Updated (8/2019) | abstract


Nonsmooth and nonconvex optimization under statistical assumptions

Updated (4/2019) | abstract


Selected Papers

A nearly linearly convergent first-order method for nonsmooth functions with quadratic growth

Damek Davis, Liwei Jiang

Manuscript (2022)


A superlinearly convergent subgradient method for sharp semismooth problems

Vasileios Charisopoulos, Damek Davis

Mathematics of Operations Research (2023) | code


Asymptotic normality and optimality in nonsmooth stochastic approximation

Damek Davis, Dmitriy Drusvyatskiy, Liwei Jiang

Manuscript (2023)


Active manifolds, stratifications, and convergence to local minima in nonsmooth optimization

Damek Davis, Dmitriy Drusvyatskiy, Liwei Jiang

Manuscript (2022)


Clustering a Mixture of Gaussians with Unknown Covariance

Damek Davis, Mateo Díaz, Kaizheng Wang

Manuscript (2021)


Proximal methods avoid active strict saddles of weakly convex functions

Damek Davis, Dmitriy Drusvyatskiy

Foundations of Computational Mathematics (2021)


Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence

Vasileios Charisopoulos, Yudong Chen, Damek Davis, Mateo Díaz, Lijun Ding, Dmitriy Drusvyatskiy

Foundations of Computational Mathematics (2019) | code


Stochastic model-based minimization of weakly convex functions

Damek Davis, Dmitriy Drusvyatskiy

SIAM Journal on Optimization (2018) | blog


Stochastic subgradient method converges on tame functions

Damek Davis, Dmitriy Drusvyatskiy, Sham Kakade, Jason D. Lee

Foundations of Computational Mathematics (2018)


A Three-Operator Splitting Scheme and its Optimization Applications

Damek Davis, Wotao Yin

Set-Valued and Variational Analysis (2017)


Convergence rate analysis of several splitting schemes

Damek Davis, Wotao Yin

Splitting Methods in Communication and Imaging, Science and Engineering (2017)


Lecture Notes

Optimization: Structure, Duality, Calculus, and Algorithms

Draft of Fall 2019 notes for my course ORIE 6300

(Last Update: 1/2020)


PhD Students

Current

Vasilis Charisopoulos

Liwei Jiang

Tao Jiang


Graduated

Mateo Díaz

Ben Grimmer