Structured adaptive and random spinners for fast machine learning computations
2017
We consider an efficient computational framework for speeding up several machine learning algorithms with almost no loss of accuracy. The framework relies on projections via structured matrices that we call Structured Spinners, formed as products of three structured matrix-blocks that incorporate rotations. The approach is highly generic: (i) the structured matrices under consideration can be either fully randomized or learned, (ii) our structured family contains as special cases all previously considered structured schemes, (iii) the setting extends to the non-linear case, where the projections are followed by non-linear functions, and (iv) the method finds numerous applications, including kernel approximations via random feature maps, dimensionality reduction algorithms, new fast cross-polytope LSH techniques, deep learning, convex optimization via Newton sketches, quantization with random projection trees, and more. The framework comes with theoretical guarantees characterizing the capacity of the structured model relative to its unstructured counterpart, based on a general theoretical principle that we describe in the paper. As a consequence of our theoretical analysis, we provide the first theoretical guarantees for one of the most efficient existing LSH algorithms, based on the HD3HD2HD1 structured matrix [Andoni et al., 2015]. An exhaustive experimental evaluation confirms the accuracy and efficiency of Structured Spinners across a variety of applications.
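To make the HD3HD2HD1 construction concrete, the following is a minimal sketch (not the authors' reference implementation) of a structured spinner built from three Hadamard blocks, each preceded by an independent random diagonal sign matrix. The function names (hd3hd2hd1) and the explicit matrix construction are illustrative assumptions; in practice the product would be applied via fast Walsh-Hadamard transforms rather than materialized.

import numpy as np
from scipy.linalg import hadamard

def hd3hd2hd1(dim, rng=np.random.default_rng(0)):
    """Sketch of M = H D3 H D2 H D1 for a dimension that is a power of 2."""
    H = hadamard(dim) / np.sqrt(dim)  # normalized (orthogonal) Hadamard block
    # three independent diagonal matrices with random +/-1 entries
    diags = [np.diag(rng.choice([-1.0, 1.0], size=dim)) for _ in range(3)]
    M = np.eye(dim)
    for D in diags:
        M = H @ D @ M  # compose one H D block at a time
    return M

x = np.random.randn(16)
M = hd3hd2hd1(16)
y = M @ x  # structured pseudo-rotation of x
print(np.allclose(np.linalg.norm(y), np.linalg.norm(x)))  # True: the spinner preserves norms

Because each block is orthogonal, the composed matrix acts as a rotation, which is what allows it to stand in for an unstructured Gaussian projection in the applications listed above.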
Keywords:
- Quantization (signal processing)
- Deep learning
- Machine learning
- Mathematical optimization
- Structured support vector machine
- Mathematics
- Dimensionality reduction
- Matrix (mathematics)
- Artificial intelligence
- Structured prediction
- Convex optimization
- Random projection
- Kernel (linear algebra)
- Theoretical computer science
- Computer science