A Kernel Support Vector Machine Trained Using Approximate Global and Exhaustive Local Sampling


Conference paper


Benjamin Bryant, H. Sari-Sarraf, L. R. Long, Sameer Kiran Antani
BDCAT, 2017

Cite

APA
Bryant, B., Sari-Sarraf, H., Long, L. R., & Antani, S. K. (2017). A Kernel Support Vector Machine Trained Using Approximate Global and Exhaustive Local Sampling. BDCAT.


Chicago/Turabian
Bryant, Benjamin, H. Sari-Sarraf, L. R. Long, and Sameer Kiran Antani. “A Kernel Support Vector Machine Trained Using Approximate Global and Exhaustive Local Sampling.” BDCAT (2017).


MLA
Bryant, Benjamin, et al. “A Kernel Support Vector Machine Trained Using Approximate Global and Exhaustive Local Sampling.” BDCAT, 2017.


BibTeX

@inproceedings{benjamin2017a,
  title = {A Kernel Support Vector Machine Trained Using Approximate Global and Exhaustive Local Sampling},
  year = {2017},
  booktitle = {BDCAT},
  author = {Bryant, Benjamin and Sari-Sarraf, H. and Long, L. R. and Antani, Sameer Kiran}
}

Abstract

Approximate Global and Exhaustive Local sampling SVM (AGEL-SVM) is an extension of the kernel Support Vector Machine (SVM) designed for distributed computing. The dual form of the SVM is typically solved with sequential minimal optimization (SMO), which iterates quickly when the full kernel matrix fits in a computer's memory. AGEL-SVM partitions the feature space into subproblems such that each partition's kernel matrix fits in memory, approximating the data that lies outside each partition. AGEL-SVM achieves Cohen's kappa and accuracy metrics similar to those of the underlying SMO implementation, and its training times decreased greatly when running on a 128-worker MATLAB pool on Amazon EC2. Predictor evaluation is also faster because each partition retains fewer support vectors.
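The abstract describes the partition-train-route idea but not its implementation details. Below is a minimal Python sketch of that idea, assuming k-means both defines the feature-space partitions and summarizes the out-of-partition data as per-class centroids, and using scikit-learn's libsvm-backed SVC (an SMO-style dual solver) as the per-partition learner. The function names and the centroid-based approximation are hypothetical stand-ins, not the paper's exact method.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC


def train_agel_svm(X, y, n_partitions=4, n_global_centroids=20, **svc_kwargs):
    """Train one kernel SVM per feature-space partition.

    Each sub-problem sees its own points exactly (exhaustive local) plus
    centroid summaries of the remaining data (approximate global), so the
    per-partition kernel matrix stays small enough to fit in memory.
    """
    # Partition the feature space (assumption: k-means defines the partitions).
    part = KMeans(n_clusters=n_partitions, n_init=10, random_state=0).fit(X)
    models = []
    for p in range(n_partitions):
        local = part.labels_ == p
        X_parts, y_parts = [X[local]], [y[local]]
        # Approximate the data outside this partition by per-class centroids
        # (a hypothetical scheme; the paper's approximation may differ).
        for c in np.unique(y[~local]):
            mask = ~local & (y == c)
            k = min(n_global_centroids, int(mask.sum()))
            km = KMeans(n_clusters=k, n_init=5, random_state=0).fit(X[mask])
            X_parts.append(km.cluster_centers_)
            y_parts.append(np.full(k, c))
        X_sub, y_sub = np.vstack(X_parts), np.concatenate(y_parts)
        # SVC's libsvm backend is an SMO-style solver for the SVM dual.
        models.append(SVC(kernel="rbf", **svc_kwargs).fit(X_sub, y_sub))
    return part, models


def predict_agel_svm(part, models, X):
    """Route each query point to the SVM of the partition it falls in."""
    labels = part.predict(X)
    out = np.empty(len(X), dtype=models[0].classes_.dtype)
    for p, model in enumerate(models):
        sel = labels == p
        if sel.any():
            out[sel] = model.predict(X[sel])
    return out

Because each subproblem contains only its local points plus a handful of centroids, the loop over partitions is embarrassingly parallel, and distributing its iterations across workers (e.g., a process pool or compute cluster) mirrors the 128-worker parallelism described in the abstract.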