Sparse random tensors: Concentration, regularization and applications
2021
We prove a non-asymptotic concentration inequality for sparse inhomogeneous random tensors under the spectral norm. For an order-$k$ inhomogeneous random tensor $T$ with sparsity $p_{\max}\geq \frac{c\log n}{n}$, we show that $\|T-\mathbb E T\|=O(\sqrt{n p_{\max}}\log^{k-2}(n))$ with high probability. An information-theoretic lower bound shows that this bound is optimal up to polylogarithmic factors. By tensor matricization, we extend the sparsity range to $p_{\max}\geq \frac{c\log n}{n^{k-1}}$ and obtain $\|T-\mathbb E T\|=O(\sqrt{n^{k-1} p_{\max}})$ with high probability. We also provide a simple way to regularize $T$ such that $O(\sqrt{n^{k-1}p_{\max}})$ concentration still holds down to sparsity $p_{\max}\geq \frac{c}{n^{k-1}}$. We illustrate our concentration and regularization results with two applications: (i) a randomized construction of hypergraphs of bounded degrees with good expander mixing properties, and (ii) concentration of sparsified tensors under uniform sampling.
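As an illustrative sketch (not the authors' method), the matricization step described above can be simulated numerically: generate an order-$k$ inhomogeneous Bernoulli tensor with entrywise probabilities at most $p_{\max}$, unfold it into an $n \times n^{k-1}$ matrix, and compare the spectral norm of the centered unfolding against the $\sqrt{n^{k-1} p_{\max}}$ scale. The parameter choices ($n$, the seed, the uniform sampling of probabilities) are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 30, 3
# Sparsity level in the matricization regime p_max >= c log(n) / n^{k-1}
p_max = np.log(n) / n ** (k - 1)

# Inhomogeneous Bernoulli tensor: T_ijk ~ Bernoulli(P_ijk) with P_ijk <= p_max
# (entrywise probabilities P drawn uniformly in [0, p_max] for illustration)
P = rng.uniform(0, p_max, size=(n,) * k)
T = (rng.uniform(size=(n,) * k) < P).astype(float)

# Mode-1 matricization: collapse the last k-1 modes into columns, giving n x n^{k-1}
M = (T - P).reshape(n, n ** (k - 1))

# Spectral norm (largest singular value) of the centered, matricized tensor,
# compared against the sqrt(n^{k-1} p_max) scale from the concentration bound
spec = np.linalg.norm(M, 2)
scale = np.sqrt(n ** (k - 1) * p_max)
print(f"||T - E T|| (matricized) = {spec:.4f}, sqrt(n^(k-1) p_max) = {scale:.4f}")
```

Since the tensor spectral norm is bounded by the spectral norm of any matricization, this gives a quick empirical sanity check of the $O(\sqrt{n^{k-1} p_{\max}})$ scaling.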