Publications

Preprints in Review

  1. A nearly linearly convergent first-order method for nonsmooth functions with quadratic growth
    Damek Davis, Liwei Jiang
    Manuscript (2022)

  2. A gradient sampling method with complexity guarantees for Lipschitz functions in high and low dimensions
    Damek Davis, Dmitriy Drusvyatskiy, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye
    Manuscript (2022)

  3. A superlinearly convergent subgradient method for sharp semismooth problems
    Vasileios Charisopoulos, Damek Davis
    Manuscript (2021) [code]

  4. Subgradient methods near active manifolds: saddle point avoidance, local convergence, and asymptotic normality
    Damek Davis, Dmitriy Drusvyatskiy, Liwei Jiang
    Manuscript (2021)

  5. Clustering a Mixture of Gaussians with Unknown Covariance
    Damek Davis, Mateo Díaz, Kaizheng Wang
    Manuscript (2021)

  6. Escaping strict saddle points of the Moreau envelope in nonsmooth optimization
    Damek Davis, Mateo Díaz, Dmitriy Drusvyatskiy
    Manuscript (2021)

  7. Stochastic optimization over proximally smooth sets
    Damek Davis, Dmitriy Drusvyatskiy, Zhan Shi
    Manuscript (2020)

  8. Stochastic algorithms with geometric step decay converge linearly on sharp functions
    Damek Davis, Dmitriy Drusvyatskiy, Vasileios Charisopoulos
    Manuscript (2019) [code]

Journal Publications (Accepted or to Appear)

  1. Variance reduction for root-finding problems
    Damek Davis
    Mathematical Programming (to appear)

  2. Conservative and semismooth derivatives are equivalent for semialgebraic maps
    Damek Davis, Dmitriy Drusvyatskiy
    Set-Valued and Variational Analysis (to appear)

  3. From low probability to high confidence in stochastic convex optimization
    Damek Davis, Dmitriy Drusvyatskiy, Lin Xiao, Junyu Zhang
    Journal of Machine Learning Research (to appear)

  4. Proximal methods avoid active strict saddles of weakly convex functions
    Damek Davis, Dmitriy Drusvyatskiy
    Foundations of Computational Mathematics (2021)

  5. Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
    Vasileios Charisopoulos, Yudong Chen, Damek Davis, Mateo Díaz, Lijun Ding, Dmitriy Drusvyatskiy
    Foundations of Computational Mathematics (to appear) [code]

  6. Composite optimization for robust rank one bilinear sensing
    Vasileios Charisopoulos, Damek Davis, Mateo Díaz, Dmitriy Drusvyatskiy
    Information and Inference: A Journal of the IMA (2020) [code]

  7. Graphical Convergence of Subgradients in Nonconvex Optimization and Learning
    Damek Davis, Dmitriy Drusvyatskiy
    Mathematics of Operations Research (to appear)

  7. Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems
    Damek Davis, Benjamin Grimmer
    SIAM Journal on Optimization (to appear) [code]

  9. Trimmed Statistical Estimation via Variance Reduction
    Aleksandr Aravkin, Damek Davis
    Mathematics of Operations Research (2019) [video]

  10. Stochastic subgradient method converges on tame functions
    Damek Davis, Dmitriy Drusvyatskiy, Sham Kakade, Jason D. Lee
    Foundations of Computational Mathematics (to appear)
    Finalist for the Best Paper Prize for Young Researchers in Continuous Optimization (2019)

  11. The nonsmooth landscape of phase retrieval
    Damek Davis, Dmitriy Drusvyatskiy, Courtney Paquette
    IMA Journal of Numerical Analysis (to appear)

  12. Stochastic model-based minimization of weakly convex functions
    Damek Davis, Dmitriy Drusvyatskiy
    SIAM Journal on Optimization (2019) [blog]
    This paper combines the two arXiv preprints arXiv:1802.02988 and arXiv:1803.06523.
    Supplementary technical note: Complexity of finding near-stationary points of convex functions stochastically
    Related technical report on nonsmooth nonconvex mirror descent: Stochastic model-based minimization under high-order growth (2018)
    INFORMS Optimization Society Young Researchers Prize (2019)

  13. Subgradient methods for sharp weakly convex functions
    Damek Davis, Dmitriy Drusvyatskiy, Kellie J. MacPhee, Courtney Paquette
    Journal of Optimization Theory and Applications (2018)

  14. Forward-Backward-Half Forward Algorithm for Solving Monotone Inclusions
    Luis M. Briceño-Arias, Damek Davis
    SIAM Journal on Optimization (2018)

  15. Convergence rate analysis of the forward-Douglas-Rachford splitting scheme
    Damek Davis
    SIAM Journal on Optimization (2015)

  16. Convergence rate analysis of primal-dual splitting schemes
    Damek Davis
    SIAM Journal on Optimization (2015)

  17. Faster convergence rates of relaxed Peaceman-Rachford and ADMM under regularity assumptions
    Damek Davis, Wotao Yin
    Mathematics of Operations Research (2016)

  18. A Three-Operator Splitting Scheme and its Optimization Applications
    Damek Davis, Wotao Yin
    Set-Valued and Variational Analysis (2017) [code] [slides]

  19. Beating level-set methods for 5D seismic data interpolation: a primal-dual alternating approach
    Rajiv Kumar, Oscar López, Damek Davis, Aleksandr Y. Aravkin, Felix J. Herrmann
    IEEE Transactions on Computational Imaging (2017)

  20. Tactical Scheduling for Precision Air Traffic Operations: Past Research and Current Problems
    Douglas R. Isaacson, Alexander V. Sadovsky, Damek Davis
    Journal of Aerospace Information Systems 11(4), 234-257 (2014)

  21. Efficient computation of separation-compliant speed advisories for air traffic arriving in terminal airspace
    Alexander V. Sadovsky, Damek Davis, Douglas R. Isaacson
    Journal of Dynamic Systems, Measurement, and Control 136(4), 041027 (2014)

  22. Separation-compliant, optimal routing and control of scheduled arrivals in a terminal airspace
    Alexander V. Sadovsky, Damek Davis, Douglas R. Isaacson
    Transportation Research Part C: Emerging Technologies 37, 157-176 (2013)

  23. Factorial and Noetherian Subrings of Power Series Rings
    Damek Davis, Daqing Wan
    Proceedings of the American Mathematical Society 139(3), 823-834 (2011)

Conference Proceedings (Accepted or to Appear)

  1. High probability guarantees for stochastic convex optimization
    Damek Davis, Dmitriy Drusvyatskiy
    Conference on Learning Theory (2020)

  2. Global Convergence of the EM Algorithm for Mixtures of Two Component Linear Regression
    Jeongyeol Kwon, Wei Qian, Constantine Caramanis, Yudong Chen, Damek Davis
    Conference on Learning Theory (2019)

  3. The Sound of APALM Clapping: Faster Nonsmooth Nonconvex Optimization with Stochastic Asynchronous PALM
    Damek Davis, Brent Edmunds, Madeleine Udell
    Neural Information Processing Systems (2016) [report]

  4. Multiview Feature Engineering and Learning
    Jingming Dong, Nikos Karianakis, Damek Davis, Joshua Hernandez, Jonathan Balzer, Stefano Soatto
    In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (2015)

  5. Asymmetric sparse kernel approximations for large-scale visual search
    Damek Davis, Jonathan Balzer, Stefano Soatto
    In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (2014)

Book Chapters

  1. Convergence rate analysis of several splitting schemes
    Damek Davis, Wotao Yin
    Splitting Methods in Communication and Imaging, Science and Engineering (2017) [video] [slides] [summary]
    Winner of the INFORMS Optimization Society Best Student Paper Prize (2014)

Expository

  1. Subgradient methods under weak convexity and tame geometry
    Damek Davis, Dmitriy Drusvyatskiy
    SIAG/OPT Views and News (2020)

  2. Convergence Rate Analysis of Several Splitting Schemes
    Damek Davis
    INFORMS OS Today (2015)

Technical Reports

  1. Stochastic model-based minimization under high-order growth
    Damek Davis, Dmitriy Drusvyatskiy, Kellie J. MacPhee
    Technical Report (2018)

  2. An \(O(n\log(n))\) algorithm for projecting onto the ordered weighted \(\ell_1\) norm ball
    Damek Davis
    UCLA CAM report 15-32 (2015) [code]

  3. SMART: The Stochastic Monotone Aggregated Root-Finding Algorithm
    Damek Davis
    Manuscript (2015) [slides] [video]