A Projected Extrapolated Gradient Method with Larger Step Size for Monotone Variational Inequalities

2021 
A projected extrapolated gradient method is designed for solving monotone variational inequalities in Hilbert spaces. Requiring only local Lipschitz continuity of the operator, the proposed method improves the value of the extrapolation parameter and admits larger step sizes, which are predicted from local information about the involved operator and corrected by bounding the distance between each pair of successive iterates. The correction is applied only when this distance exceeds a given constant, and its main cost is a projection onto the feasible set. In particular, when the operator is the gradient of a convex function, the correction step is unnecessary. We establish convergence and an ergodic convergence rate under this larger range of parameters. Numerical experiments illustrate the gains in efficiency from the larger step sizes.
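To make the setting concrete, the sketch below shows a classical projected extragradient iteration (in the style of Korpelevich) for a monotone variational inequality over a convex set: find x* in C with ⟨F(x*), x − x*⟩ ≥ 0 for all x in C. This is only a generic baseline illustrating the projection/extrapolation pattern, not the paper's method; the fixed step size, the ball constraint, and the skew-symmetric test operator are all illustrative assumptions.

```python
import numpy as np

def project_ball(x, radius=1.0):
    # Euclidean projection onto C = {x : ||x|| <= radius}.
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def extragradient(F, proj, x0, step=0.5, iters=500, tol=1e-8):
    """Korpelevich-style projected extragradient iteration:
         y_k     = P_C(x_k - step * F(x_k))   (extrapolation step)
         x_{k+1} = P_C(x_k - step * F(y_k))   (correction step)
       Converges for monotone, L-Lipschitz F when step < 1/L."""
    x = x0
    for _ in range(iters):
        y = proj(x - step * F(x))
        x_new = proj(x - step * F(y))
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative monotone (rotation) operator: F(x) = A x with A
# skew-symmetric, so F is monotone with Lipschitz constant ||A|| = 1
# and the VI over the unit ball has the unique solution x* = 0.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
F = lambda x: A @ x
sol = extragradient(F, project_ball, x0=np.array([0.9, -0.4]))
print(np.linalg.norm(sol))  # distance to the solution x* = 0
```

A plain projected gradient step would fail to converge on this rotation example; the extrapolation (evaluating F at the predicted point y before updating x) is what restores convergence for merely monotone operators, and the paper's contribution concerns how large the step size in such schemes can be taken.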