Diffeomorphism learning via relative entropy constrained optimal transport

2016 
Performing inference on the Lie group of diffeomorphisms of Euclidean space has many applications, including computer vision, computational anatomy, and density estimation. Computational tools for finding such diffeomorphisms typically draw on dynamical systems and computational fluid mechanics. Here we consider the problem in which we are given IID samples from a distribution P and want to learn a diffeomorphism that transforms them into samples from a known distribution Q. Using optimal transport theory, properties of relative entropy, and convex optimization, we demonstrate that when the density of Q is log-concave, efficient and scalable convex optimization algorithms can learn the diffeomorphism. We demonstrate an application to density estimation for probabilistic sleep staging, where the learned diffeomorphism improves classification performance.
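
To illustrate why log-concavity of Q makes the learning problem convex, the following is a minimal sketch, not the paper's algorithm: it assumes Q is a standard Gaussian and restricts the diffeomorphism to a lower-triangular affine map T(x) = A x + b. Under these assumptions the change-of-variables log-likelihood is concave in (A, b), so the fit is a convex program; the cvxpy formulation below is a hypothetical illustration of that structure.

```python
import numpy as np
import cvxpy as cp

# Minimal sketch (assumptions: Q is a standard Gaussian, and the learned
# diffeomorphism is restricted to a lower-triangular affine map T(x) = A x + b).
# Under these assumptions the change-of-variables log-likelihood
#   sum_i log q(T(x_i)) + n * log|det A|
# is concave in (A, b), so fitting T is a convex optimization problem.

rng = np.random.default_rng(0)
d, n = 2, 500
# Synthetic IID samples standing in for the unknown distribution P.
X = rng.normal(size=(n, d)) @ np.array([[2.0, 0.0], [0.7, 0.5]]) + 1.0

A = cp.Variable((d, d))  # linear part of the map
b = cp.Variable(d)       # shift

# Images of the samples under the candidate map, Y_i = A x_i + b.
Y = X @ A.T + np.ones((n, 1)) @ cp.reshape(b, (1, d), order="C")

# Force A to be lower triangular; cp.log keeps its diagonal positive,
# so T is guaranteed to be invertible, hence a (linear) diffeomorphism.
constraints = [A[i, j] == 0 for i in range(d) for j in range(i + 1, d)]

# Gaussian target: log q(y) = -0.5 * ||y||^2 up to a constant;
# the Jacobian of the affine map contributes n * sum(log(diag(A))).
loglik = -0.5 * cp.sum_squares(Y) + n * cp.sum(cp.log(cp.diag(A)))

prob = cp.Problem(cp.Maximize(loglik), constraints)
prob.solve()

print("A =\n", A.value)
print("b =", b.value)
# The fitted map gives a density estimate for P via
# log p(x) = log q(A x + b) + log|det A|.
```

With a richer parametrization of T the same change-of-variables objective remains concave as long as T is affine in its parameters and Q is log-concave, which is the structural point the abstract relies on.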