Complexity of the Fourier transform on the Johnson graph

2017 
The set $X$ of $k$-subsets of an $n$-set has a natural graph structure where two $k$-subsets are connected if and only if the size of their intersection is $k-1$. This is known as the Johnson graph. The symmetric group $S_n$ acts on the space of complex functions on $X$, and this space has a multiplicity-free decomposition as a sum of irreducible representations of $S_n$, so it has a Gelfand-Tsetlin basis that is well defined up to scalars. The Fourier transform on the Johnson graph is defined as the change of basis matrix from the delta function basis to the Gelfand-Tsetlin basis. The direct application of this matrix to a generic vector requires $\binom{n}{k}^2$ arithmetic operations. We show that, in analogy with the classical Fast Fourier Transform on the discrete circle, this matrix can be factorized as a product of $n-1$ orthogonal matrices, each with at most two nonzero entries per column. This factorization shows that the number of arithmetic operations required to apply this matrix to a generic vector is bounded above by $2(n-1) \binom{n}{k}$. As a consequence, we show that the problem of computing all the weights of the irreducible components of a given function can be solved in $O(n \binom{n}{k})$ operations, improving the previous bound $O(k^2 \binom{n}{k})$ when $k$ asymptotically dominates $\sqrt{n}$, in a non-uniform model of computation. The same improvement is achieved for the problem of computing the isotypic projection onto a single component. The proof is based on the construction of $n-1$ intermediate bases, each parametrized by certain pairs consisting of a standard Young tableau and a word. The parametrization of each basis is obtained via the Robinson-Schensted insertion algorithm.
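As a quick illustration of the objects and operation counts described above, the following Python sketch builds the Johnson graph from its definition (vertices are $k$-subsets, edges join subsets whose intersection has size $k-1$) and compares the dense cost $\binom{n}{k}^2$ with the bound $2(n-1)\binom{n}{k}$ implied by the sparse factorization. The factorization itself, constructed in the paper via the intermediate bases, is not reproduced here; the function names are illustrative only.

```python
from itertools import combinations
from math import comb


def johnson_graph(n, k):
    """Vertices are the k-subsets of {0, ..., n-1}; two subsets are
    adjacent exactly when their intersection has size k - 1."""
    vertices = [frozenset(c) for c in combinations(range(n), k)]
    edges = [(u, v) for u, v in combinations(vertices, 2)
             if len(u & v) == k - 1]
    return vertices, edges


def operation_counts(n, k):
    """Compare the cost of applying the dense change-of-basis matrix
    with the upper bound given by the factorization into n - 1
    orthogonal factors having at most two nonzero entries per column."""
    size = comb(n, k)
    dense = size ** 2               # direct application: binom(n,k)^2 operations
    factored = 2 * (n - 1) * size   # bound from the sparse factorization
    return dense, factored


if __name__ == "__main__":
    vertices, edges = johnson_graph(5, 2)
    print(len(vertices), "vertices,", len(edges), "edges")  # 10 vertices, 30 edges
    print(operation_counts(20, 10))
```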
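The abstract also refers to the Robinson-Schensted insertion algorithm, which parametrizes the intermediate bases by pairs of a standard Young tableau and a word. For reference, here is a minimal sketch of classical Robinson-Schensted row insertion; how the resulting tableaux index the intermediate bases is part of the paper's construction and is not reproduced here.

```python
from bisect import bisect_right


def rs_insert(word):
    """Classical Robinson-Schensted row insertion of a word.

    Returns the insertion tableau P and the recording tableau Q,
    each represented as a list of rows.
    """
    P, Q = [], []
    for step, x in enumerate(word, start=1):
        row = 0
        while True:
            if row == len(P):                    # start a new row at the bottom
                P.append([x])
                Q.append([step])
                break
            pos = bisect_right(P[row], x)        # first entry strictly greater than x
            if pos == len(P[row]):               # x fits at the end of this row
                P[row].append(x)
                Q[row].append(step)
                break
            P[row][pos], x = x, P[row][pos]      # bump the displaced entry to the next row
            row += 1
    return P, Q


if __name__ == "__main__":
    P, Q = rs_insert([2, 1, 3, 1])
    print(P)  # [[1, 1], [2, 3]]
    print(Q)  # [[1, 3], [2, 4]]
```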