$H^2$-Convergence of least-squares kernel collocation methods

2018 
The strong-form asymmetric kernel-based collocation method, commonly referred to as the Kansa method, is easy to implement and hence is widely used for solving engineering problems and partial differential equations despite the lack of theoretical support. The simple least-squares (LS) formulation, on the other hand, makes the study of its solvability and convergence rather nontrivial. In this paper, we focus on general second-order linear elliptic differential equations in $\Omega \subset \mathbb{R}^d$ under Dirichlet boundary conditions. With kernels that reproduce $H^m(\Omega)$ and some smoothness assumptions on the solution, we provide denseness conditions for a constrained least-squares method and a class of weighted least-squares algorithms to be convergent. Theoretically, we identify some $H^2(\Omega)$-convergent LS formulations that have an optimal error behavior like $h^{m-2}$. We also demonstrate the effects of various collocation settings on the respective convergence rates, as well as how these formulations perform with high order kernels and when coupled with the stable evaluation technique for the Gaussian kernel.
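To make the setting concrete, the following is a minimal sketch (not the authors' implementation) of a weighted least-squares Kansa-type collocation solve for the model problem $-\Delta u = f$ on $(0,1)^2$ with Dirichlet data, using an oversampled set of collocation points and a Gaussian kernel. The shape parameter, point counts, and boundary weight are illustrative assumptions, not the scalings derived in the paper.

```python
# Weighted least-squares kernel collocation sketch for -Lap(u) = f on (0,1)^2,
# u = g on the boundary. All parameter choices below are assumptions for
# illustration only.
import numpy as np

eps = 3.0  # Gaussian shape parameter (assumed)

def gauss(r2):            # Gaussian kernel, argument is squared distance
    return np.exp(-eps**2 * r2)

def gauss_lap(r2):        # Laplacian of the Gaussian kernel in 2D
    return np.exp(-eps**2 * r2) * (4 * eps**4 * r2 - 4 * eps**2)

def sq_dist(X, Z):        # pairwise squared distances between two point sets
    return ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)

# Trial centers (coarse grid) and collocation points (finer, oversampled grid).
nz, nx = 8, 16
Z = np.array([(i, j) for i in np.linspace(0, 1, nz) for j in np.linspace(0, 1, nz)])
G = np.array([(i, j) for i in np.linspace(0, 1, nx) for j in np.linspace(0, 1, nx)])
on_bdy = (G[:, 0] == 0) | (G[:, 0] == 1) | (G[:, 1] == 0) | (G[:, 1] == 1)
Xi, Xb = G[~on_bdy], G[on_bdy]          # interior / boundary collocation points

# Manufactured solution u = sin(pi x) sin(pi y), so f = 2 pi^2 u and g = 0.
u_exact = lambda P: np.sin(np.pi * P[:, 0]) * np.sin(np.pi * P[:, 1])
f = 2 * np.pi**2 * u_exact(Xi)
g = np.zeros(len(Xb))

# Overdetermined Kansa system: PDE rows at interior points, Dirichlet rows on
# the boundary, with a boundary weight standing in for a weighted LS formulation.
w_bdy = 10.0  # assumed weight
A = np.vstack([-gauss_lap(sq_dist(Xi, Z)), w_bdy * gauss(sq_dist(Xb, Z))])
b = np.concatenate([f, w_bdy * g])
coef, *_ = np.linalg.lstsq(A, b, rcond=None)

# Evaluate the kernel expansion at random test points and report the error.
T = np.random.default_rng(0).random((500, 2))
u_num = gauss(sq_dist(T, Z)) @ coef
print("max error:", np.abs(u_num - u_exact(T)).max())
```

Refining both point sets while keeping the oversampling ratio fixed is one way to observe convergence rates of the kind analyzed in the paper; the stable evaluation technique mentioned in the abstract would replace the direct Gaussian basis above when the shape parameter is small.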