A growing line of work has shown that a variety of signal and matrix recovery problems become tractable under suitable statistical assumptions. In the last two decades, such problems have increasingly been solved through convex relaxations based on semidefinite programming. In the very high-dimensional regime, however, these relaxations become intractable, since they often square the ambient dimension of the problem. Consequently, focus has shifted back to optimizing nonconvex formulations directly with iterative methods. For such methods, complexity guarantees rely on a fine-grained analysis of the problem's geometry, particularly the structure of the Hessian. This traditional approach fails in the presence of nonsmooth geometry, which is much more difficult to characterize. In this talk, we show that nonsmooth formulations of many problems, such as phase retrieval, matrix completion, robust PCA, and blind deconvolution, exhibit sharp growth away from their solution sets and can be convexified by adding a small quadratic. These two properties in turn lead to rapidly convergent algorithms for robust signal and low-rank matrix recovery at near-optimal sample complexity.
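
As a minimal illustrative sketch (the specific objective and notation below are assumptions on my part, chosen to make the two properties concrete; phase retrieval with measurements $(a_i, b_i)$ is one of the problems the abstract names):
\[
  f(x) \;=\; \frac{1}{m}\sum_{i=1}^{m} \bigl|\,\langle a_i, x\rangle^2 - b_i\,\bigr|.
\]
% Sharp growth: f grows at least linearly with the distance to the solution set X*.
\[
  f(x) - \min f \;\ge\; \mu \cdot \operatorname{dist}(x, X^*) \quad \text{for some } \mu > 0.
\]
% Weak convexity: adding a small quadratic renders f convex.
\[
  x \;\mapsto\; f(x) + \tfrac{\rho}{2}\,\|x\|^2 \quad \text{is convex for some } \rho > 0.
\]
Under these two conditions, subgradient-type methods initialized within a neighborhood of $X^*$ can converge rapidly, which is one standard way the literature makes the abstract's claim precise.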