Uniform inference for value functions.

2020 
This paper develops a novel approach to uniform inference for the optimal value function, that is, the function that results from optimizing an objective function marginally over one of its arguments. The marginal optimization map is nonlinear and nondifferentiable, which complicates inference, since statistical inference methods for nonlinear maps usually rely on regularity in the form of some type of differentiability. We show that the map from the objective function to uniform test statistics applied to the value function - such as Kolmogorov-Smirnov or Cramér-von Mises statistics - is directionally differentiable. We establish consistency and weak convergence of nonparametric plug-in estimates of the test statistics and show how they can be used to conduct uniform inference. Because the limiting distribution of sample value functions is not generally tractable, we develop detailed resampling techniques for practical inference that combine a bootstrap procedure with estimates of the directional derivatives. In addition, we formally establish uniform size control of the resampling procedure for testing. Monte Carlo simulations assess the finite-sample properties of the proposed methods and show accurate empirical size of the procedures. Finally, we apply our methods to the evaluation of a job training program using bounds for the distribution function of treatment effects.
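To make the objects in the abstract concrete, the following is a minimal sketch, not the paper's actual estimator. It assumes a hypothetical objective f(x, t) observed with noise on a grid, forms the plug-in value function by optimizing out x, and computes a Kolmogorov-Smirnov-type uniform statistic comparing it to the true value function:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical objective: f(x, t) = -(x - t)^2 + t, so the value function
# v(t) = max_x f(x, t) = t after optimizing marginally over x.
x_grid = np.linspace(-2.0, 2.0, 81)
t_grid = np.linspace(0.0, 1.0, 51)

def objective(x, t):
    return -(x - t) ** 2 + t

# Noisy estimate of the objective on the grid (a stand-in for a
# nonparametric estimator), then the plug-in value function.
noise = 0.01 * rng.standard_normal((x_grid.size, t_grid.size))
f_hat = objective(x_grid[:, None], t_grid[None, :]) + noise
v_hat = f_hat.max(axis=0)   # estimated value function: t -> max_x f_hat(x, t)
v_true = t_grid             # true value function for this objective

# Uniform (Kolmogorov-Smirnov-type) statistic over the grid of t values.
ks_stat = np.max(np.abs(v_hat - v_true))
print(ks_stat)
```

Note that `v_hat` inherits a bias from the max operation (noise maxima push it upward), which is precisely the kind of nondifferentiability issue the paper's directional-derivative resampling corrects for.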