Half-Space Proximal Stochastic Gradient Method for Group-Sparsity Regularized Problem
2020
Optimizing with group sparsity is significant for enhancing model interpretability in machine learning applications, e.g., model compression. However, for large-scale training problems, fast convergence and effective group-sparsity exploration are hard to achieve simultaneously in stochastic settings. In particular, existing state-of-the-art methods, e.g., Prox-SG, RDA, Prox-SVRG and Prox-Spider, typically produce only dense solutions. To overcome this shortcoming, we propose a novel stochastic method -- the Half-Space Proximal Stochastic Gradient Method (HSProx-SG) -- which promotes the group sparsity of the solutions while maintaining convergence guarantees. In general, the HSProx-SG method consists of two steps: (i) the proximal stochastic gradient step searches for a near-optimal but non-sparse solution estimate; and (ii) the half-space step substantially boosts the sparsity level. Numerically, HSProx-SG demonstrates its superiority in both convex settings and non-convex deep neural networks, e.g., VGG16 and ResNet18, by achieving solutions of much higher group sparsity with competitive objective values or generalization accuracy.
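To make the two-step structure concrete, below is a minimal sketch in NumPy of one HSProx-SG-style iteration for a group-lasso regularizer lambda * sum_g ||x_g||_2. The block soft-thresholding operator `prox_group_l1` is the standard proximal operator for this regularizer; the half-space test shown here (zeroing a group when the trial point falls outside a half-space anchored at the current group iterate) is an illustrative stand-in for the paper's rule, and all names, step sizes, and the threshold `eps` are assumptions for illustration, not the authors' exact algorithm.

```python
import numpy as np

def prox_group_l1(x, groups, step, lam):
    """Block soft-thresholding: proximal operator of lam * sum_g ||x_g||_2."""
    out = x.copy()
    for g in groups:
        norm_g = np.linalg.norm(x[g])
        # Shrink the whole group toward zero; zero it if its norm is small.
        scale = max(0.0, 1.0 - step * lam / norm_g) if norm_g > 0 else 0.0
        out[g] = scale * x[g]
    return out

def hsprox_sg_step(x, stoch_grad, groups, step, lam, eps):
    """One illustrative HSProx-SG-style iteration (hypothetical sketch):
    (i) proximal stochastic gradient step toward a near-optimal estimate;
    (ii) half-space step that zeros groups likely to be zero at a solution."""
    # (i) proximal stochastic gradient step
    x_trial = prox_group_l1(x - step * stoch_grad, groups, step, lam)
    # (ii) half-space step (assumed test): keep a group only if the trial
    # point stays in the half-space {z : <x_g, z_g> >= eps * ||x_g||^2}
    for g in groups:
        if np.dot(x[g], x_trial[g]) < eps * np.dot(x[g], x[g]):
            x_trial[g] = 0.0
    return x_trial

# Usage on a toy least-squares problem with 10 groups of 5 coordinates.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(200, 50)), rng.normal(size=200)
groups = [range(i, i + 5) for i in range(0, 50, 5)]
x = np.zeros(50)
for t in range(1000):
    i = rng.integers(200)             # sample one row: stochastic gradient
    g = A[i] * (A[i] @ x - b[i])      # gradient of 0.5 * (a_i^T x - b_i)^2
    x = hsprox_sg_step(x, g, groups, step=0.01, lam=0.1, eps=0.0)
```

Under this reading, step (i) alone behaves like Prox-SG and rarely lands exactly on group-sparse points because of gradient noise, while step (ii) aggressively projects near-zero groups to exactly zero, which is how the method boosts the sparsity level beyond what the proximal step achieves.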