Design and evaluation of a fast half-against-half support vector machine classifier
Citations: 0 · References: 6 · Related Papers: 10
Abstract:
SVM classifiers with the Half Against Half (HAH) architecture are reported to be the fastest among the SVM classification architectures in the literature. This work further enhances the speed of the HAH SVM classifier; the resulting classifier is named the Fast HAH (F-HAH) classifier. The performance of the proposed F-HAH classifier is evaluated on speaker-dependent and multi-speaker-dependent isolated digits from the TI46 database. A 1.05×-1.61× improvement in speed is achieved with the F-HAH classifier across the various experiments carried out.
Keywords: Margin classifier
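As a rough illustration of the HAH idea (not the paper's implementation), the sketch below builds a binary tree in which each node trains one SVM to separate one half of the remaining classes from the other half, so a prediction needs only about log2(K) binary decisions. It assumes scikit-learn is available, splits classes naively by label order (the paper may use a data-driven grouping), and substitutes the bundled digits dataset for TI46.

```python
# Minimal Half-Against-Half (HAH) multi-class SVM sketch.
# Assumptions: scikit-learn available, classes split by label order,
# sklearn digits used as a stand-in for the TI46 isolated digits.
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import load_digits

class HAHNode:
    def __init__(self, classes):
        self.classes = list(classes)
        self.svm = None
        self.left = None   # handles the first half of the classes
        self.right = None  # handles the second half of the classes

def build_hah(X, y, classes):
    node = HAHNode(classes)
    if len(classes) == 1:
        return node
    half = len(classes) // 2
    left_cls, right_cls = classes[:half], classes[half:]
    mask = np.isin(y, classes)
    # Binary target: 0 = first half of the classes, 1 = second half.
    target = np.isin(y[mask], right_cls).astype(int)
    node.svm = SVC(kernel="rbf", gamma="scale").fit(X[mask], target)
    node.left = build_hah(X, y, left_cls)
    node.right = build_hah(X, y, right_cls)
    return node

def predict_hah(node, x):
    # Descend the tree: each internal node answers one
    # "first half vs. second half" question.
    while len(node.classes) > 1:
        side = node.svm.predict(x.reshape(1, -1))[0]
        node = node.right if side == 1 else node.left
    return node.classes[0]

X, y = load_digits(return_X_y=True)
root = build_hah(X[:1500], y[:1500], sorted(set(y)))
print(predict_hah(root, X[1501]), y[1501])
```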
Because the standard support vector machine (SVM) learning algorithm is time-consuming and relatively inefficient, an incremental vector support vector machine (IV-SVM) learning algorithm is proposed. A primary SVM classifier is obtained from the selected increment vectors in the kernel space. Based on the Karush-Kuhn-Tucker (KKT) conditions, the original training samples are pruned using the primary SVM classifier, and the final SVM classifier is obtained by retraining on the reduced sample set. Simulation experiments show that, compared with the standard support vector machine, IV-SVM reduces the training time of support vector machines on large data samples.
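The following sketch illustrates the general pruning idea described above, not the paper's exact IV-SVM procedure: train a primary SVM on a small subset, discard training points that already satisfy the KKT condition y_i * f(x_i) >= 1 for that classifier, and retrain only on the remaining candidate support vectors. The subset size, kernel, and synthetic data are arbitrary choices.

```python
# KKT-based sample pruning sketch (generic, not the paper's IV-SVM code).
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
y_signed = np.where(y == 1, 1, -1)

rng = np.random.default_rng(0)
subset = rng.choice(len(X), size=500, replace=False)
primary = SVC(kernel="rbf", gamma="scale").fit(X[subset], y[subset])

# Keep only samples inside or on the margin of the primary classifier
# (KKT violators, i.e. the candidate support vectors).
margin = y_signed * primary.decision_function(X)
keep = margin <= 1.0
final = SVC(kernel="rbf", gamma="scale").fit(X[keep], y[keep])
print(f"retrained on {keep.sum()} of {len(X)} samples")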
Topics: Margin classifier, Karush–Kuhn–Tucker conditions, Relevance vector machine, Ranking SVM · Citations: 1
Support vector machines (SVMs) are powerful tools for classification and function approximation problems. A comparison among four classification methods is conducted: the Lagrangian support vector machine (LSVM), the finite Newton Lagrangian support vector machine (NLSVM), the smooth support vector machine (SSVM) and the finite Newton support vector machine (NSVM). Their algorithms for generating a linear or nonlinear kernel classifier, their accuracy and their computational complexity are compared. The study provides guidelines for choosing an appropriate method among the four SVM classification methods for a given classification problem.
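LSVM, NLSVM, SSVM and NSVM have no off-the-shelf scikit-learn implementations, so the sketch below only shows the shape of such an accuracy-versus-training-time comparison, with standard linear and kernel SVM solvers standing in for the four methods; the dataset and parameters are placeholders.

```python
# Generic accuracy / training-time comparison harness (stand-in solvers).
import time
from sklearn.svm import SVC, LinearSVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

candidates = {
    "linear (LinearSVC)": LinearSVC(dual=True, max_iter=10000),
    "linear kernel (SVC)": SVC(kernel="linear"),
    "RBF kernel (SVC)": SVC(kernel="rbf", gamma="scale"),
}
for name, clf in candidates.items():
    t0 = time.perf_counter()
    clf.fit(X_tr, y_tr)
    print(f"{name}: acc={clf.score(X_te, y_te):.3f}, "
          f"train time={time.perf_counter() - t0:.3f}s")
```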
Topics: Relevance vector machine, Linear classifier, Kernel (algebra), Margin classifier, Statistical classification · Citations: 25
For complicated recognition problems the number of support vectors is large and recognition is slow, in part because some samples are partitioned incorrectly. To address this, a method is proposed to simplify support vector machines based on the minimal misestimate margin idea. Experiments show that the new support vector machine not only reduces the number of support vectors and the recognition time but also achieves the same accuracy as (or even better than) the traditional support vector machine.
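The snippet below is not the paper's minimal-misestimate-margin method; it is only a generic illustration of the same goal, shrinking the support-vector set while checking that accuracy is preserved. Here the support vectors with the smallest |dual coefficient| are dropped and the SVM is retrained on the rest; the fraction kept and the dataset are arbitrary.

```python
# Generic support-vector reduction sketch (keep the most influential SVs).
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=3000, n_features=20, flip_y=0.05,
                           random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

full = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
print("full model:", len(full.support_), "SVs, acc", full.score(X_te, y_te))

# Rank support vectors by |alpha_i| and keep the most influential half.
order = np.argsort(np.abs(full.dual_coef_[0]))[::-1]
keep = full.support_[order[: len(order) // 2]]
small = SVC(kernel="rbf", gamma="scale").fit(X_tr[keep], y_tr[keep])
print("reduced model:", len(small.support_), "SVs, acc",
      small.score(X_te, y_te))
```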
Topics: Relevance vector machine, Margin classifier, Sequential minimal optimization, Margin (machine learning), Sample (material) · Citations: 3
Nonlinear classification algorithms based on the support vector machine (SVM) and the least squares support vector machine (LS-SVM) are discussed and compared, and transformer fault diagnosis based on a multi-classification binary-tree model of support vector machines is proposed. An SVM classifier and an LS-SVM classifier are used for transformer fault diagnosis, with the SVM parameters determined by grid search and cross-validation, achieving high accuracy. Experimental verification shows that the model has high application potential in transformer fault diagnosis.
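The parameter-selection step described above can be sketched as a grid search over (C, gamma) with cross-validation for an RBF-kernel SVM. The transformer fault data are not public here, so a synthetic multi-class dataset and an arbitrary parameter grid stand in.

```python
# Grid search with cross-validation for SVM parameter selection (sketch).
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=600, n_features=10, n_informative=6,
                           n_classes=4, random_state=0)

param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.001, 0.01, 0.1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
print("best parameters:", search.best_params_)
print("cross-validated accuracy:", round(search.best_score_, 3))
```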
Topics: Margin classifier, Binary classification · Citations: 0
Networks are increasingly widespread in present-day society. The least squares support vector machine (LS-SVM) is a modified support vector machine for classification whose training reduces to solving a set of linear equations rather than the full convex quadratic programming problem of the standard SVM. LS-SVM is applied here to network intrusion detection. The KDDCUP99 experimental data from the MIT Lincoln Laboratory are used to study the classification performance of the LS-SVM classifier, with the standard support vector machine and a BP neural network used for comparison. The experiments indicate that the LS-SVM detection method has higher detection accuracy than the support vector machine and the BP neural network.
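As a companion sketch, a bare-bones LS-SVM binary classifier in the standard Suykens-style formulation is shown below: it solves the single bordered linear system for (b, alpha) instead of a QP. This is written from the textbook formulation, not from the paper's code; the regularization gamma, the RBF width, and the synthetic data are arbitrary stand-ins for the KDDCUP99 setup.

```python
# Minimal LS-SVM binary classifier: solve the linear system
# [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1].
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

def rbf_kernel(A, B, sigma=1.0):
    # Pairwise squared distances, then Gaussian kernel values.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=1.0, sigma=1.0):
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual variables alpha

def lssvm_predict(X_tr, y_tr, b, alpha, X_new, sigma=1.0):
    K = rbf_kernel(X_new, X_tr, sigma)
    return np.sign(K @ (alpha * y_tr) + b)

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
y = np.where(y == 1, 1.0, -1.0)                     # labels in {-1, +1}
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
b, alpha = lssvm_fit(X_tr, y_tr)
acc = (lssvm_predict(X_tr, y_tr, b, alpha, X_te) == y_te).mean()
print("LS-SVM test accuracy:", round(acc, 3))
```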
Topics: Relevance vector machine, Margin classifier, Quadratic classifier, Ranking SVM, Least-squares function approximation · Citations: 12
Topics: Margin classifier, Relevance vector machine, Ranking SVM, Discriminative model · Citations: 4
The paper presents a multi-class support vector machine classifier and its application to hypothyroid detection and classification. Support vector machines (SVMs) are a well-known machine-learning method for binary classification problems; multi-class SVMs (MCSVM) are usually implemented by combining several binary SVMs. The objective of this work is to show, first, the robustness of various kernels for the multi-class SVM classifier; second, a comparison of different construction methods for multi-class SVM, such as One-Against-One and One-Against-All; and finally a comparison of the accuracy of the multi-class SVM classifier against AdaBoost and decision-tree classifiers. The simulation results show that One-Against-All support vector machines (OAASVM) are superior to One-Against-One support vector machines (OAOSVM) with polynomial kernels. The accuracy of OAASVM is also higher than that of the AdaBoost and decision-tree classifiers on the hypothyroid disease dataset from the UCI machine learning repository.
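The One-Against-All versus One-Against-One comparison described above can be sketched with scikit-learn's multiclass meta-estimators wrapping a polynomial-kernel SVM. The UCI hypothyroid data are not bundled with scikit-learn, so the digits dataset stands in, and the kernel degree is an arbitrary choice.

```python
# One-Against-All vs. One-Against-One multi-class SVM comparison (sketch).
from sklearn.svm import SVC
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier, OneVsOneClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

base = SVC(kernel="poly", degree=3, gamma="scale")
oaa = OneVsRestClassifier(base).fit(X_tr, y_tr)   # K binary SVMs
oao = OneVsOneClassifier(base).fit(X_tr, y_tr)    # K(K-1)/2 binary SVMs
print("One-Against-All accuracy:", round(oaa.score(X_te, y_te), 3))
print("One-Against-One accuracy:", round(oao.score(X_te, y_te), 3))
```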
Topics: Margin classifier, AdaBoost, Relevance vector machine, Linear classifier, Quadratic classifier, Binary classification, Robustness · Citations: 124
This chapter contains sections titled: Motivation for Margin-Based Loss; Margin-Based Loss, Robustness, and Complexity Control; Optimal Separating Hyperplane; High-Dimensional Mapping and Inner Product Kernels; Support Vector Machine for Classification; Support Vector Implementations; Support Vector Regression; SVM Model Selection; Support Vector Machines and Regularization; Single-Class SVM and Novelty Detection; Summary and Discussion.
Topics: Hyperplane, Relevance vector machine, Margin classifier, Robustness, Novelty Detection, Margin (machine learning), Regularization · Citations: 16
In pattern recognition, the support vector machine, together with neural networks, has been proposed as a highly effective method for two-group classification problems. In this study, the support vector machine and neural networks are compared and analysed, confirming the superiority of the support vector machine, and a 'weighted support vector machine' system is implemented by introducing into the support vector machine a method that applies different penalty functions to the upper bounds of the two groups. Various experimental results show that this weighted support vector machine is an appropriate and efficient way to improve the recognition rate of one particular group.
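The per-class penalty idea above can be illustrated with scikit-learn, where the soft-margin upper bound C is scaled per class via class_weight so that errors on the class of interest cost more. The imbalanced synthetic data and the weight of 10 are arbitrary choices, not values from the study.

```python
# Class-weighted SVM sketch: different penalty (C) per class.
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

plain = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
weighted = SVC(kernel="rbf", gamma="scale",
               class_weight={0: 1.0, 1: 10.0}).fit(X_tr, y_tr)
print("recall on class 1, plain:   ", recall_score(y_te, plain.predict(X_te)))
print("recall on class 1, weighted:", recall_score(y_te, weighted.predict(X_te)))
```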
Topics: Relevance vector machine, Margin classifier, Sequential minimal optimization · Citations: 0
The Support Vector Machine (SVM) is one of the best techniques for classifying data into multiple classes, and in recent years it has been used extensively for classification problems. The main advantage of support vector machines is their good generalization ability on high-dimensional data. This paper reviews the formulation of some important variants of the SVM: the hard-margin SVM, the soft-margin SVM, the Least Squares Support Vector Machine (LSSVM), the Twin Support Vector Machine (TWSVM) and the Least Squares Twin Support Vector Machine (LS-TWSVM). To check the effectiveness of these methods, numerical experiments are performed on artificial and real-world datasets.
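As a small companion to the formulation review above, the hard-margin case can be approximated by making the soft-margin penalty C very large (training errors nearly forbidden) and compared against an ordinary soft-margin SVM on noisy data. This is only an empirical illustration of the hard- versus soft-margin distinction, with arbitrary data and C values, not the paper's experiments.

```python
# Hard-margin (approximated by large C) vs. soft-margin SVM on noisy data.
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, flip_y=0.1,
                           random_state=2)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=2)

hard = SVC(kernel="rbf", gamma="scale", C=1e6).fit(X_tr, y_tr)   # ~hard margin
soft = SVC(kernel="rbf", gamma="scale", C=1.0).fit(X_tr, y_tr)   # soft margin
print("hard-margin-like SVM test accuracy:", round(hard.score(X_te, y_te), 3))
print("soft-margin SVM test accuracy:     ", round(soft.score(X_te, y_te), 3))
```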
Topics: Relevance vector machine, Margin (machine learning), Margin classifier, Least-squares function approximation · Citations: 1