Rescaling nonsmooth optimization using BFGS and Shor updates
2018
The BFGS quasi-Newton methodology, popular for smooth minimization, has also proved surprisingly effective in nonsmooth optimization. Through a variety of simple examples and computational experiments, we explore how the BFGS matrix update improves the local metric associated with a convex function even in the absence of smoothness and without using a line search. We compare the behavior of the BFGS and Shor r-algorithm updates.
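The BFGS matrix update discussed in the abstract is the standard inverse-Hessian formula, applied here without a line search, as the paper's experiments do. Below is a minimal sketch, not the authors' code: the toy objective, the fixed step length, and the iteration count are illustrative assumptions.

```python
import numpy as np

def bfgs_update(H, s, y):
    """Standard inverse-Hessian BFGS update.

    H is the current inverse-Hessian approximation, s the step taken,
    y the change in (sub)gradients. Skipped when curvature y^T s <= 0,
    which preserves positive definiteness of H.
    """
    sy = s @ y
    if sy <= 1e-12:
        return H
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # The updated H satisfies the secant condition H_new @ y == s.
    return V @ H @ V.T + rho * np.outer(s, s)

# Illustrative nonsmooth convex objective f(x) = |x1| + 3|x2|,
# driven by subgradients with a fixed step and no line search.
f = lambda x: abs(x[0]) + 3.0 * abs(x[1])
subgrad = lambda x: np.array([np.sign(x[0]), 3.0 * np.sign(x[1])])

x = np.array([1.0, 1.0])
H = np.eye(2)
g = subgrad(x)
for _ in range(20):
    x_new = x - 0.1 * H @ g          # quasi-Newton step, constant step size
    g_new = subgrad(x_new)
    H = bfgs_update(H, x_new - x, g_new - g)
    x, g = x_new, g_new
```

Even on this nonsmooth example, the update reshapes the local metric encoded in H: directions of larger subgradient change are progressively contracted, which is the rescaling effect the paper studies.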