Precise Error Analysis of Regularized M-estimators in High-dimensions
2016
A popular approach for estimating an unknown signal from noisy, linear measurements is to solve a so-called \emph{regularized M-estimator}, which minimizes a weighted combination of a convex loss function and a convex (typically non-smooth) regularizer. We accurately predict the squared-error performance of such estimators in the high-dimensional proportional regime. The random measurement matrix is assumed to have i.i.d. Gaussian entries; only minimal and rather mild regularity conditions are imposed on the loss function, the regularizer, and the noise and signal distributions. We show that the error converges in probability to a nontrivial limit given as the solution to a minimax convex-concave optimization problem over four scalar variables. We identify a new summary parameter, termed the Expected Moreau envelope, which plays a central role in the error characterization. The \emph{precise} nature of the results permits an accurate performance comparison between different instances of regularized M-estimators and allows one to optimally tune the involved parameters (e.g., the regularization parameter and the number of measurements). The key ingredient of our proof is the \emph{Convex Gaussian Min-max Theorem} (CGMT), a tight and strengthened version of a classical Gaussian comparison inequality proved by Gordon in 1988.
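For concreteness, the following is a minimal sketch, in notation not fixed by this abstract, of the estimator class described above and of the Moreau envelope underlying the summary parameter; the specific symbols ($\mathcal{L}$, $f$, $\lambda$, $\tau$, $A$, $y$) are illustrative assumptions rather than the paper's own notation.

\[
  \hat{x} \;=\; \arg\min_{x}\; \mathcal{L}\bigl(y - A x\bigr) \;+\; \lambda\, f(x),
\qquad
  M_{f}(v;\tau) \;=\; \min_{u}\; \frac{1}{2\tau}\,\|v - u\|_2^{2} \;+\; f(u).
\]

Roughly speaking, the Expected Moreau envelope refers to the expectation of such an envelope of the loss and of the regularizer over the randomness in the problem, and it is through these two scalar functions that the loss, regularizer, and signal/noise distributions enter the limiting minimax characterization of the error.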