Proximity Operators of Discrete Information Divergences

2016 
Information divergences allow one to assess how close two distributions are to each other. Among the large panel of available measures, special attention has been paid to convex $\varphi$-divergences, such as the Kullback-Leibler, Jeffreys-Kullback, Hellinger, Chi-square, I$_{\alpha}$, and Rényi divergences. While $\varphi$-divergences have been extensively studied in convex analysis, optimization problems involving them often remain challenging. In this regard, one of the main shortcomings of existing methods is that the minimization of $\varphi$-divergences is usually performed with respect to only one of their arguments, possibly within alternating optimization techniques. In this paper, we overcome this limitation by deriving new closed-form expressions for the proximity operator of such two-variable functions. This makes it possible to employ standard proximal methods for efficiently solving a wide range of convex optimization problems involving $\varphi$-divergences. In addition, we show that these proximity operators are useful for computing the epigraphical projection of several functions of practical interest. The proposed proximal tools are numerically validated in the context of optimal query execution within database management systems, where the problem of selectivity estimation plays a central role. Experiments are carried out on small- to large-scale scenarios.
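For context, here is a minimal sketch of the objects the abstract refers to, written with standard convex-analytic definitions rather than the paper's own notation: the discrete $\varphi$-divergence between two nonnegative vectors $p$ and $q$, and the proximity operator of that divergence taken jointly in both arguments, which is the two-variable quantity for which closed-form expressions are derived.

```latex
% Standard definitions (assumed notation; the paper's conventions may differ).
% Discrete phi-divergence, with phi convex and phi(1) = 0;
% e.g. phi(t) = t*log(t) yields the Kullback-Leibler divergence.
\[
  D_\varphi(p,q) \;=\; \sum_{i} q_i \,\varphi\!\left(\frac{p_i}{q_i}\right),
  \qquad \varphi \ \text{convex},\ \ \varphi(1)=0 .
\]
% Proximity operator of gamma * D_phi taken jointly in BOTH arguments
% (the "two-variable" prox mentioned in the abstract), for gamma > 0:
\[
  \operatorname{prox}_{\gamma D_\varphi}(x,y)
  \;=\; \operatorname*{argmin}_{(p,q)}\;
  \gamma\, D_\varphi(p,q)
  \;+\; \tfrac{1}{2}\,\|p-x\|^{2}
  \;+\; \tfrac{1}{2}\,\|q-y\|^{2} .
\]
```

Having this joint prox in closed form is what allows the divergence to enter standard proximal splitting algorithms directly, instead of being minimized one argument at a time within an alternating scheme.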