Convergence and performance analysis of kernel regularized robust recursive least squares.

2020 
Abstract Kernel recursive least squares (KRLS) is highly sensitive to non-Gaussian noise, and robust extensions have therefore been proposed using the maximum correntropy criterion or generalized maximum correntropy. However, because of the complex form of the resulting model, no theoretical convergence analysis exists for these filters. In this paper, we propose a new alternative: Kernel Regularized Robust RLS (KR3LS). It uses the half-quadratic technique to simplify the form of the loss function. Our major contribution is then proving the convergence of the filter to the target weights and desired output. Bounds on the regularization factor are also obtained. KR3LS is tested experimentally on synthetic and real data and is shown to outperform other robust alternatives.
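The half-quadratic idea mentioned in the abstract can be illustrated in a simplified setting: a correntropy-style loss is minimized by alternating between computing Gaussian-kernel auxiliary weights and solving a weighted, regularized least-squares problem. The sketch below is a hypothetical linear-model illustration of this reweighting principle, not the paper's KR3LS algorithm; the function name, step count, and regularization value are assumptions.

```python
import numpy as np

def half_quadratic_irls(X, y, sigma=1.0, iters=20, reg=1e-8):
    """Illustrative half-quadratic solver for a correntropy-style loss.

    Alternates between (1) auxiliary weights a_i = exp(-e_i^2 / (2 sigma^2)),
    which downweight large (outlier) residuals, and (2) a regularized
    weighted least-squares update. This is a generic sketch, not KR3LS.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        e = y - X @ w                          # current residuals
        a = np.exp(-e**2 / (2.0 * sigma**2))   # half-quadratic auxiliary weights
        # Regularized weighted normal equations: (X^T A X + reg I) w = X^T A y
        w = np.linalg.solve(X.T @ (a[:, None] * X) + reg * np.eye(d),
                            X.T @ (a * y))
    return w

# Clean linear data y = 2x with one gross outlier; the outlier's weight
# collapses toward zero, so the recovered slope stays near 2.
x = np.arange(1, 21).reshape(-1, 1) / 10.0
y = 2.0 * x.ravel()
y[0] += 50.0  # impulsive (non-Gaussian) noise on one sample
w = half_quadratic_irls(x, y)
```

Because each auxiliary-weight step makes the objective quadratic in the parameters, every iteration has a closed-form update, which is what makes convergence analysis tractable compared with working on the correntropy loss directly.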