Regularization methods for separable nonlinear models

2019 
Separable nonlinear models frequently arise in system identification, signal analysis, electrical engineering, and machine learning. Their parameter optimization belongs to the class of separable nonlinear least squares (SNLLS) problems. Applying the classical variable projection algorithm to SNLLS problems may yield poor generalization. To handle complexity control and ill-conditioned nonlinear least squares problems, we consider in this paper two \(L_2\) regularization algorithms for SNLLS problems. The first approach directly adds a Tikhonov penalty to the objective function of the SNLLS problem. The second approach replaces the ordinary linear least squares subproblem in the SNLLS problem with a Tikhonov-regularized one. We explain their difference from a Bayesian perspective. Numerical experiments are also presented to compare the performance of the two regularized algorithms. The results show that the first regularization method is more robust than the second.
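
To make the distinction between the two approaches concrete, they can be sketched as follows. The notation is assumed for illustration and is not fixed by the abstract: data \(\mathbf{y}\), basis matrix \(\Phi(\boldsymbol{\alpha})\) depending on the nonlinear parameters \(\boldsymbol{\alpha}\), linear parameters \(\mathbf{c}\), and regularization weight \(\lambda \ge 0\); in particular, whether the first penalty acts on both parameter blocks or only on \(\mathbf{c}\) is an assumption here. The separable model is
\[
\mathbf{y} \approx \Phi(\boldsymbol{\alpha})\,\mathbf{c}.
\]
First approach (Tikhonov penalty added directly to the SNLLS objective):
\[
\min_{\boldsymbol{\alpha},\,\mathbf{c}} \;
  \lVert \mathbf{y} - \Phi(\boldsymbol{\alpha})\,\mathbf{c} \rVert_2^2
  + \lambda \bigl( \lVert \mathbf{c} \rVert_2^2 + \lVert \boldsymbol{\alpha} \rVert_2^2 \bigr).
\]
Second approach (the inner linear least squares step of variable projection replaced by a Tikhonov one):
\[
\hat{\mathbf{c}}(\boldsymbol{\alpha}) =
  \bigl( \Phi(\boldsymbol{\alpha})^{\top} \Phi(\boldsymbol{\alpha}) + \lambda I \bigr)^{-1}
  \Phi(\boldsymbol{\alpha})^{\top} \mathbf{y},
\qquad
\min_{\boldsymbol{\alpha}} \;
  \lVert \mathbf{y} - \Phi(\boldsymbol{\alpha})\,\hat{\mathbf{c}}(\boldsymbol{\alpha}) \rVert_2^2 .
\]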