A Kernel Support Vector Machine Trained Using Approximate Global and Exhaustive Local Sampling

2017 
AGEL-SVM is an extension of the kernel Support Vector Machine (SVM) designed for distributed computing using Approximate Global, Exhaustive Local (AGEL) sampling. The dual form of the SVM is typically solved with sequential minimal optimization (SMO), which iterates quickly when the full kernel matrix fits in a computer's memory. AGEL-SVM partitions the feature space into subproblems and approximates the data outside each partition, so that each subproblem's kernel matrix fits in memory. AGEL-SVM achieves Cohen's Kappa and accuracy similar to those of the underlying SMO implementation, and its training times decreased greatly when running on a 128-worker MATLAB pool on Amazon EC2. Predictor evaluation is also faster because each partition retains fewer support vectors.
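The sketch below illustrates the general partition-and-approximate idea described in the abstract; it is not the paper's implementation. All names are hypothetical, scikit-learn's SVC stands in for the SMO solver, k-means stands in for the feature-space partitioning, and random subsampling stands in for the global approximation.

```python
# Illustrative sketch only (hypothetical names): train one local SVM per partition
# on all points inside the partition ("exhaustive local") plus a small random
# sample of the points outside it ("approximate global"). Partitions are
# independent, so they could be trained in parallel on a worker pool.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def train_agel_like(X, y, n_partitions=4, global_fraction=0.05, rng_seed=0):
    rng = np.random.default_rng(rng_seed)
    km = KMeans(n_clusters=n_partitions, n_init=10, random_state=rng_seed).fit(X)
    models = []
    for p in range(n_partitions):
        local = km.labels_ == p
        outside = np.flatnonzero(~local)
        # Approximate the data outside the partition with a small subsample,
        # keeping each subproblem's kernel matrix small enough for memory.
        n_samp = min(outside.size, max(1, int(global_fraction * outside.size)))
        sampled = rng.choice(outside, size=n_samp, replace=False)
        idx = np.concatenate([np.flatnonzero(local), sampled])
        models.append(SVC(kernel="rbf").fit(X[idx], y[idx]))
    return km, models

def predict_agel_like(km, models, X):
    # Route each test point to the SVM of its nearest partition; fewer support
    # vectors per partition makes each prediction cheaper.
    labels = km.predict(X)
    y_pred = np.empty(len(X), dtype=object)
    for p, model in enumerate(models):
        mask = labels == p
        if mask.any():
            y_pred[mask] = model.predict(X[mask])
    return y_pred
```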