Online Ordinal Optimization under Model Misspecification

2021 
We consider an ordinal optimization problem in which a decision maker learns the statistical characteristics of a number of systems using sequential sampling in order to ultimately determine the "best" one (with high probability). In so doing, the decision maker postulates a parametric model which may not precisely represent the true underlying system structure. We show that this misspecification, if not managed properly, can lead to suboptimal performance in the ordinal optimization problem due to a phenomenon identified as sample-selection endogeneity. To address this, we propose online sampling strategies that judiciously learn the unknown model parameters on the fly, and at the same time eliminate the adverse effects of misspecification as the number of samples grows large. The proposed sampling strategies require no knowledge of whether the model is misspecified; they enjoy strong performance guarantees regardless, and are computationally tractable.
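To make the problem setup concrete, the following is a minimal illustrative sketch of ordinal optimization via sequential sampling: a fixed budget is split equally across candidate systems and the one with the highest sample mean is selected. This equal-allocation baseline and all names in it are hypothetical illustrations of the general setting, not the paper's proposed adaptive, misspecification-robust strategies.

```python
import random
import statistics

def select_best(sample_fns, budget, seed=0):
    """Toy ordinal optimization: spread the sampling budget equally
    across the candidate systems, then pick the system whose sample
    mean is largest. (Equal allocation is a naive baseline; the paper
    studies adaptive online strategies instead.)"""
    rng = random.Random(seed)
    k = len(sample_fns)
    per_system = budget // k
    sample_means = []
    for draw in sample_fns:
        observations = [draw(rng) for _ in range(per_system)]
        sample_means.append(statistics.fmean(observations))
    # Return the index of the empirically best system.
    return max(range(k), key=lambda i: sample_means[i])

# Three hypothetical systems with true means 0.0, 0.5, 1.0 and unit noise;
# with enough samples the procedure identifies system 2 as best.
systems = [
    lambda rng: rng.gauss(0.0, 1.0),
    lambda rng: rng.gauss(0.5, 1.0),
    lambda rng: rng.gauss(1.0, 1.0),
]
best = select_best(systems, budget=3000, seed=42)
```

With 1,000 samples per system the standard error of each sample mean is roughly 0.03, so the best system is identified with high probability; the misspecification issue the paper addresses arises when the postulated parametric model behind such estimates is wrong.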