Damek Davis

I’m an Associate Professor in Wharton’s Department of Statistics and Data Science. I was previously an Associate Professor at Cornell ORIE, an NSF Postdoctoral Fellow, and a PhD student in Math at UCLA under Wotao Yin (Alibaba) and Stefano Soatto (AWS AI). I was a long-term visitor at the Simons Institute in Fall 2017 (Bridging Discrete and Continuous Optimization program) and Fall 2024 (LLM program).

Research Interests. Optimization and machine learning.

Selected Works. I recently developed exponential accelerations of gradient descent, semismooth Newton, and the subgradient method. Read more about my research here.

Selected Awards. I received a Sloan Research Fellowship in Mathematics, an NSF CAREER Award, and the SIAM Activity Group on Optimization Best Paper Prize.

Students. I’ve advised five PhD students. If you are a Penn student and wish to discuss advising or collaboration, send me a concise, informative email to set up a meeting. I am an active advisor; students who work best with me tend to have energy levels that match or exceed mine.

Current: Tao Jiang (Cornell) → Meta (Postdoc)

Graduated PhD Students from Cornell:

Teaching. I’m teaching STAT 4830: “Optimization in PyTorch” in Spring 2025.

Service. I am currently an associate editor at Mathematical Programming and Foundations of Computational Mathematics.

Please reserve email for correspondence related to research questions, teaching, or other professional inquiries.

Publications

Preprints

Gradient descent with adaptive stepsize converges (nearly) linearly under fourth-order growth Damek Davis, Dmitriy Drusvyatskiy, Liwei Jiang Manuscript (2024)

Conference papers

Aiming towards the minimizers: fast convergence of SGD for overparametrized problems Chaoyue Liu, Dmitriy Drusvyatskiy, Mikhail Belkin, Damek Davis, Yi-An Ma NeurIPS (2023)

A gradient sampling method with complexity guarantees for Lipschitz functions in high and low dimensions Damek Davis, Dmitriy Drusvyatskiy, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye NeurIPS (2022) Oral Presentation (top ~1%)

High probability guarantees for stochastic convex optimization Damek Davis, Dmitriy Drusvyatskiy In Conference on Learning Theory (2020)

Global Convergence of EM Algorithm for Mixtures of Two Component Linear Regression Jeongyeol Kwon, Wei Qian, Constantine Caramanis, Yudong Chen, Damek Davis Conference on Learning Theory (2019)

The Sound of APALM Clapping: Faster Nonsmooth Nonconvex Optimization with Stochastic Asynchronous PALM Damek Davis, Brent Edmunds, Madeleine Udell Neural Information Processing Systems (2016) | report

Multiview Feature Engineering and Learning Jingming Dong, Nikos Karianakis, Damek Davis, Joshua Hernandez, Jonathan Balzer and Stefano Soatto In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (2015)

Asymmetric sparse kernel approximations for large-scale visual search. Damek Davis, Jonathan Balzer, Stefano Soatto In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (2014)

Journal papers

Active manifolds, stratifications, and convergence to local minima in nonsmooth optimization Damek Davis, Dmitriy Drusvyatskiy, Liwei Jiang Foundations of Computational Mathematics (to appear)

Stochastic optimization over proximally smooth sets Damek Davis, Dmitriy Drusvyatskiy, Zhan Shi SIAM Journal on Optimization (to appear)

Computational Microscopy beyond Perfect Lenses Xingyuan Lu, Minh Pham, Elisa Negrini, Damek Davis, Stanley J. Osher, Jianwei Miao Physical Review E (to appear)

Global Optimality of the EM Algorithm for Mixtures of Two-Component Linear Regressions Jeongyeol Kwon, Wei Qian, Constantine Caramanis, Yudong Chen, Damek Davis, Nhat Ho: IEEE Transactions on Information Theory (2024)

Clustering a Mixture of Gaussians with Unknown Covariance Damek Davis, Mateo Diaz, Kaizheng Wang Bernoulli (to appear)

Asymptotic normality and optimality in nonsmooth stochastic approximation Damek Davis, Dmitriy Drusvyatskiy, Liwei Jiang The Annals of Statistics (to appear) Second Place in INFORMS Optimization Society 2024 Student Paper Prize

A nearly linearly convergent first-order method for nonsmooth functions with quadratic growth Damek Davis, Liwei Jiang Foundations of Computational Mathematics (to appear) | code | Twitter thread

Stochastic algorithms with geometric step decay converge linearly on sharp functions Damek Davis, Dmitriy Drusvyatskiy, Vasileios Charisopoulos Mathematical Programming (to appear) | code

A superlinearly convergent subgradient method for sharp semismooth problems Vasileios Charisopoulos, Damek Davis Mathematics of Operations Research (2023) | code | Twitter thread

Escaping strict saddle points of the Moreau envelope in nonsmooth optimization Damek Davis, Mateo Díaz, Dmitriy Drusvyatskiy SIAM Journal on Optimization (2022)

Variance reduction for root-finding problems Damek Davis Mathematical Programming (to appear)

Conservative and semismooth derivatives are equivalent for semialgebraic maps Damek Davis, Dmitriy Drusvyatskiy Set-Valued and Variational Analysis (to appear)

From low probability to high confidence in stochastic convex optimization Damek Davis, Dmitriy Drusvyatskiy, Lin Xiao, Junyu Zhang Journal of Machine Learning Research (to appear)

Proximal methods avoid active strict saddles of weakly convex functions Damek Davis, Dmitriy Drusvyatskiy Foundations of Computational Mathematics (2021)

Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence Vasileios Charisopoulos, Yudong Chen, Damek Davis, Mateo Díaz, Lijun Ding, Dmitriy Drusvyatskiy Foundations of Computational Mathematics (to appear) | code

Composite optimization for robust rank one bilinear sensing Vasileios Charisopoulos, Damek Davis, Mateo Diaz, Dmitriy Drusvyatskiy Information and Inference: A Journal of the IMA (2020) | code

Graphical Convergence of Subgradients in Nonconvex Optimization and Learning Damek Davis, Dmitriy Drusvyatskiy Mathematics of Operations Research (to appear)

Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems. Damek Davis, Benjamin Grimmer SIAM Journal on Optimization (to appear) | code

Trimmed Statistical Estimation via Variance Reduction Aleksandr Aravkin, Damek Davis Mathematics of Operations Research (2019) | video

Stochastic subgradient method converges on tame functions. Damek Davis, Dmitriy Drusvyatskiy, Sham Kakade, Jason D. Lee Foundations of Computational Mathematics (to appear) Finalist for the Best Paper Prize for Young Researchers in Continuous Optimization (2019)

The nonsmooth landscape of phase retrieval Damek Davis, Dmitriy Drusvyatskiy, Courtney Paquette IMA Journal of Numerical Analysis (2018)

Stochastic model-based minimization of weakly convex functions Damek Davis, Dmitriy Drusvyatskiy SIAM Journal on Optimization (2019) | blog
Combines the arXiv preprints arXiv:1802.02988 and arXiv:1803.06523. Supplementary technical note: Complexity of finding near-stationary points of convex functions stochastically. Related report on nonsmooth nonconvex mirror descent: Stochastic model-based minimization under high-order growth (2018). INFORMS Optimization Society Young Researchers Prize (2019)

Subgradient methods for sharp weakly convex functions Damek Davis, Dmitriy Drusvyatskiy, Kellie J. MacPhee, Courtney Paquette Journal of Optimization Theory and Applications (2018)

Forward-Backward-Half Forward Algorithm for Solving Monotone Inclusions Luis M. Briceño-Arias, Damek Davis SIAM Journal on Optimization (2018)

Convergence rate analysis of the forward-Douglas-Rachford splitting scheme. Damek Davis SIAM Journal on Optimization (2015)

Convergence rate analysis of primal-dual splitting schemes Damek Davis SIAM Journal on Optimization (2015)

Faster convergence rates of relaxed Peaceman-Rachford and ADMM under regularity assumptions Damek Davis, Wotao Yin Mathematics of Operations Research (2016)

A Three-Operator Splitting Scheme and its Optimization Applications. Damek Davis, Wotao Yin Set-Valued and Variational Analysis (2017) | code | slides

Beating level-set methods for 5D seismic data interpolation: a primal-dual alternating approach Rajiv Kumar, Oscar López, Damek Davis, Aleksandr Y. Aravkin, Felix J. Herrmann IEEE Transactions on Computational Imaging (2017)

Tactical Scheduling for Precision Air Traffic Operations: Past Research and Current Problems Douglas R. Isaacson, Alexander V. Sadovsky, Damek Davis Journal of Aerospace Information Systems 11(4): 234-257 (2014)

Efficient computation of separation-compliant speed advisories for air traffic arriving in terminal airspace. Alexander V. Sadovsky, Damek Davis, Douglas R. Isaacson. Journal of Dynamic Systems Measurement and Control 136(4), 041027 (2014)

Separation-compliant, optimal routing and control of scheduled arrivals in a terminal airspace. Alexander V. Sadovsky, Damek Davis, and Douglas R. Isaacson. Transportation Research Part C: Emerging Technologies 37 (2013): 157-176

Factorial and Noetherian Subrings of Power Series Rings. Damek Davis, Daqing Wan Proceedings of the American Mathematical Society 139 (2011), no. 3, 823-834

Book chapters

Convergence rate analysis of several splitting schemes Damek Davis, Wotao Yin Splitting Methods in Communication and Imaging, Science and Engineering (2017) | video | slides | summary Winner of the 2014 INFORMS Optimization Society Best Student Paper Prize

Expository

A Short Course on Convex Analysis and First-Order Methods Damek Davis Manuscript (2023)

Subgradient methods under weak convexity and tame geometry Damek Davis, Dmitriy Drusvyatskiy SIAG/OPT News and Views (2020)

Convergence Rate Analysis of Several Splitting Schemes Damek Davis INFORMS OS Today (2015)

Technical reports

A linearly convergent Gauss-Newton subgradient method for ill-conditioned problems Damek Davis, Tao Jiang Technical report (2023) | code

Stochastic model-based minimization under high-order growth. Damek Davis, Dmitriy Drusvyatskiy, Kellie J. MacPhee Technical Report (2018)

An O(n log n) algorithm for projecting onto the ordered weighted ℓ1 norm ball Damek Davis UCLA CAM report 15-32 (2015) | code

SMART: The Stochastic Monotone Aggregated Root-Finding Algorithm Damek Davis Manuscript (2015) | slides | video