Kac meets Johnson and Lindenstrauss: a memory-optimal, fast Johnson-Lindenstrauss transform.

2020 
Based on the Kac random walk on the orthogonal group, we present a fast Johnson-Lindenstrauss transform: given a set $X$ of $n$ points in $\mathbb{R}^{d}$ and an error parameter $\epsilon$, this is a linear transformation $\Psi: \mathbb{R}^{d} \to \mathbb{R}^{O(\epsilon^{-2}\log{n})}$ such that $\|\Psi x\|_{2} \in (1-\epsilon, 1+\epsilon)\cdot \|x\|_{2}$ for all $x\in X$, and such that for each $x\in X$, $\Psi x$ can be computed in time $O(d\log{d} + \min\{d\log{n},\ \epsilon^{-2}\log^{3}{n}\log^{3}(\epsilon^{-1}\log{n})\})$ with only a constant amount of memory overhead. In some parameter regimes, our algorithm is the best known, and it essentially confirms a conjecture of Ailon and Chazelle.
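To make the construction concrete, here is a minimal sketch of a Kac-walk-style transform: each step applies a random Givens rotation to a uniformly chosen coordinate pair, and the final map keeps the first $k$ coordinates, rescaled to preserve the squared norm in expectation. This is an illustrative simplification, not the paper's exact algorithm (in particular, the function name `kac_jl` and the fixed step count are assumptions; the paper's analysis prescribes the number of steps and a more refined projection):

```python
import math
import random

def kac_jl(x, k, num_steps, seed=0):
    """Illustrative Kac-walk-based JL sketch (hypothetical helper, not the
    paper's exact algorithm): apply num_steps random Givens rotations to x,
    then keep the first k coordinates, rescaled so that the squared norm is
    preserved in expectation."""
    rng = random.Random(seed)
    y = list(x)
    d = len(y)
    for _ in range(num_steps):
        # One Kac-walk step: pick a random coordinate pair and rotate it
        # by a uniformly random angle. Each step costs O(1) time and uses
        # O(1) extra memory, matching the constant-overhead claim.
        i, j = rng.sample(range(d), 2)
        theta = rng.uniform(0.0, 2.0 * math.pi)
        c, s = math.cos(theta), math.sin(theta)
        y[i], y[j] = c * y[i] + s * y[j], -s * y[i] + c * y[j]
    # Project onto the first k coordinates; the sqrt(d/k) scaling makes the
    # expected squared norm of the output equal the squared norm of x.
    scale = math.sqrt(d / k)
    return [scale * v for v in y[:k]]
```

Because every Givens rotation is orthogonal, the walk itself preserves $\|x\|_2$ exactly; all distortion comes from the final coordinate projection, whose concentration is what the paper's analysis of the Kac walk controls.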