Information bound for bandwidth selection in kernel estimation of density derivatives

2000 
Based on a random sample of size n from an unknown density f on the real line, several data-driven methods for selecting the bandwidth in kernel estimation of f^(k), k = 0, 1, ..., have recently been proposed that have a very fast asymptotic rate of convergence to the optimal bandwidth, where f^(k) denotes the kth derivative of f. In particular, for all k and sufficiently smooth f, the best possible relative rate of convergence is O_p(n^{-1/2}). For k = 0, Fan and Marron (1992) employed semiparametric arguments to obtain the best possible constant coefficient, that is, an analog of the usual Fisher information bound, in this convergence. The purpose of this paper is to show that their arguments can be extended to establish information bounds for all k. The extension from the special case k = 0 to the case of general k requires some nontrivial work and gives a significant benchmark for how well a bandwidth selector can hope to perform in kernel estimation of f^(k).
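
For context, a minimal sketch of the setup the abstract refers to: the standard kernel estimator of f^(k) and the relative-rate criterion for a data-driven bandwidth. The estimator form and the symbols K (kernel), ĥ (selected bandwidth), and h_0 (optimal bandwidth) are standard assumptions here, not taken verbatim from the paper.

```latex
\documentclass{article}
\begin{document}
% Standard kernel estimator of the kth derivative of f, with kernel K and
% bandwidth h, and the relative rate at which a data-driven bandwidth
% \hat{h} can approach the optimal bandwidth h_0 (assumed standard setup):
\[
  \hat{f}_h^{(k)}(x) = \frac{1}{n\,h^{k+1}} \sum_{i=1}^{n}
      K^{(k)}\!\left(\frac{x - X_i}{h}\right),
  \qquad
  \frac{\hat{h} - h_0}{h_0} = O_p\!\left(n^{-1/2}\right).
\]
\end{document}
```

The information bound studied in the paper concerns the best possible constant in this O_p(n^{-1/2}) relative convergence, in analogy with the Fisher information bound in parametric estimation.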