Graph Metric Learning via Gershgorin Disc Alignment
2020
We propose a general projection-free metric learning framework, where the minimization objective min_{M∈S} Q(M) is a convex differentiable function of the metric matrix M, and M resides in the set S of generalized graph Laplacian matrices for connected graphs with positive edge weights and node degrees. Unlike the low-rank metric matrices common in the literature, S includes the important positive-diagonal-only matrices as a special case in the limit. The key idea for fast optimization is to rewrite the positive definite cone constraint in S as signal-adaptive linear constraints via Gershgorin disc alignment, so that the alternating optimization of the diagonal and off-diagonal terms in M can be solved efficiently as linear programs via Frank-Wolfe iterations. We prove that the left-ends of the Gershgorin discs can be aligned perfectly using the first eigenvector v of M, which we update iteratively using Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) with warm start as the diagonal / off-diagonal terms are optimized. Experiments show that our efficiently computed graph metric matrices outperform metrics learned using competing methods on classification tasks.
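The abstract's central claim — that the left-ends of the Gershgorin discs can be aligned perfectly using the first eigenvector v of M — can be verified numerically. The sketch below (NumPy-based, not the authors' code; the example graph and its edge weights are invented for illustration) builds a small generalized graph Laplacian M for a connected graph with positive edge weights, applies the similarity transform S⁻¹MS with S = diag(v), and checks that every disc left-end lands exactly at the smallest eigenvalue of M:

```python
import numpy as np

# Generalized graph Laplacian for a small connected 3-node graph with
# positive edge weights: M = degree matrix - adjacency + diagonal loading.
# Off-diagonal entries of M are negative, so (by Perron-Frobenius) the
# first eigenvector of M can be taken strictly positive.
W = np.array([[0.0, 1.0, 0.5],
              [1.0, 0.0, 2.0],
              [0.5, 2.0, 0.0]])
M = np.diag(W.sum(axis=1)) - W + 0.1 * np.eye(3)

# First eigenvector (the paper uses LOBPCG with warm start for speed;
# a dense eigendecomposition suffices for this toy example).
lam, vecs = np.linalg.eigh(M)
v = vecs[:, 0]
v = v if v[0] > 0 else -v  # fix the sign so v is strictly positive

# Similarity transform B = S^{-1} M S preserves the spectrum of M.
B = np.diag(1.0 / v) @ M @ np.diag(v)

# Gershgorin disc left-ends of B: B_ii - sum_{j != i} |B_ij|.
left_ends = np.diag(B) - (np.abs(B).sum(axis=1) - np.abs(np.diag(B)))

# All left-ends coincide with lambda_min(M), so the linear constraint
# "aligned left-ends >= epsilon" certifies positive definiteness.
print(np.allclose(left_ends, lam[0]))  # -> True
```

Intuitively, this is why the positive definite cone constraint can be replaced by linear constraints: after alignment, requiring every disc left-end to sit at or above a small positive threshold is a set of linear inequalities in the entries of M, which is what makes the Frank-Wolfe linear-program subproblems possible.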