Uncertainty Quantification for Accelerated Gradient Methods on Strongly Convex Functions

2021 
We consider the problem of minimising a strongly convex function that depends on an uncertain parameter $\theta$. Because of this dependence, the optimum is itself a function of $\theta$, and we aim to learn this function and its statistics. We use chaos expansions, a technique previously applied to stochastic approximation, and use gradient methods to compute the optimal expansion coefficients. We give the first non-asymptotic rates for this problem under gradient descent, and we also give an accelerated method that attains the optimal convergence rate. Our method is faster than the state of the art and additionally acts as a form of variance reduction.
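The setup described above can be sketched in code. The snippet below is a minimal illustration, not the paper's algorithm: it uses a hypothetical quadratic objective $f(x,\theta) = \tfrac{1}{2}a(\theta)x^2 - b(\theta)x$ (strongly convex in $x$), expands the decision variable in a truncated Legendre chaos basis $x(\theta) = \sum_k c_k P_k(\theta)$ for $\theta \sim \mathrm{Uniform}(-1,1)$, and runs plain gradient descent on the coefficients $c$ via Monte Carlo estimates of $\mathbb{E}_\theta[f]$. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical strongly convex toy objective f(x, theta); the paper's
# setting is a general strongly convex f, not this specific quadratic.
def a(theta):                 # curvature, >= 1 on [-1, 1]
    return 2.0 + theta

def b(theta):
    return np.sin(np.pi * theta)

def grad_x(x, theta):         # df/dx for the toy objective
    return a(theta) * x - b(theta)

K = 6                         # chaos-expansion truncation order (assumed)
def basis(theta):             # Legendre polynomials P_0 .. P_{K-1}
    return np.polynomial.legendre.legvander(theta, K - 1)  # shape (n, K)

# Fixed Monte Carlo sample set for theta ~ Uniform(-1, 1)
thetas = rng.uniform(-1.0, 1.0, size=2000)
Phi = basis(thetas)

c = np.zeros(K)               # expansion coefficients to optimise
lr = 0.1
for _ in range(500):          # gradient descent on E_theta[f(x(theta), theta)]
    x = Phi @ c               # x(theta) = sum_k c_k P_k(theta)
    # chain rule: dJ/dc = E[ df/dx * P(theta) ]
    g = Phi.T @ grad_x(x, thetas) / len(thetas)
    c -= lr * g

# Compare against the exact optimum x*(theta) = b(theta) / a(theta)
x_star = b(thetas) / a(thetas)
err = np.max(np.abs(Phi @ c - x_star))
print(f"max approximation error over samples: {err:.3f}")
```

The converged coefficients give a polynomial surrogate for the map $\theta \mapsto x^*(\theta)$, from which statistics of the optimum (mean, variance) can be read off cheaply; the paper's contribution concerns the convergence rates of this kind of coefficient iteration, including an accelerated variant.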