Multi-class Support Vector Machine (SVM) Classifiers -- An Application in Hypothyroid Detection and Classification
124 Citations · 19 References · 10 Related Papers
Abstract:
The paper presents a multi-class Support Vector Machine classifier and its application to hypothyroid detection and classification. Support Vector Machines (SVMs) are a well-known method in the machine learning community for binary classification problems. Multi-class SVMs (MCSVMs) are usually implemented by combining several binary SVMs. The objective of this work is to show: first, the robustness of various kinds of kernels for the multi-class SVM classifier; second, a comparison of different construction methods for multi-class SVMs, such as One-Against-One and One-Against-All; and finally, a comparison of the accuracy of the multi-class SVM classifier with AdaBoost and Decision Tree classifiers. The simulation results show that One-Against-All Support Vector Machines (OAASVM) are superior to One-Against-One Support Vector Machines (OAOSVM) with polynomial kernels. The accuracy of OAASVM is also higher than that of AdaBoost and Decision Tree classifiers on the hypothyroid disease dataset from the UCI machine learning repository.
Keywords:
Margin classifier
AdaBoost
Relevance vector machine
Linear classifier
Quadratic classifier
Binary classification
Robustness
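The One-Against-All and One-Against-One constructions compared in the abstract can be sketched with a toy base learner. The sketch below substitutes a simple perceptron for a real binary SVM (an assumption made for brevity; any binary classifier slots in): One-Against-All trains one machine per class and takes the highest decision score, while One-Against-One trains one machine per class pair and takes a majority vote.

```python
import itertools

def train_perceptron(X, y, epochs=100):
    """Train a binary linear classifier (labels +1/-1) with the perceptron rule.
    Stands in for a binary SVM in this sketch."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        mistakes = 0
        for x, t in zip(X, y):
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if t * score <= 0:          # misclassified (or on the boundary)
                w = [wi + t * xi for wi, xi in zip(w, x)]
                b += t
                mistakes += 1
        if mistakes == 0:
            break
    return w, b

def decision(wb, x):
    w, b = wb
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def fit_one_vs_all(X, y, classes):
    """One binary machine per class: class k vs. everything else."""
    return {k: train_perceptron(X, [1 if t == k else -1 for t in y])
            for k in classes}

def predict_one_vs_all(models, x):
    # Winner is the class whose machine reports the highest score.
    return max(models, key=lambda k: decision(models[k], x))

def fit_one_vs_one(X, y, classes):
    """One binary machine per unordered pair of classes."""
    models = {}
    for a, bcls in itertools.combinations(classes, 2):
        idx = [i for i, t in enumerate(y) if t in (a, bcls)]
        Xs = [X[i] for i in idx]
        ys = [1 if y[i] == a else -1 for i in idx]
        models[(a, bcls)] = train_perceptron(Xs, ys)
    return models

def predict_one_vs_one(models, x):
    # Each pairwise machine casts one vote; majority wins.
    votes = {}
    for (a, bcls), wb in models.items():
        winner = a if decision(wb, x) > 0 else bcls
        votes[winner] = votes.get(winner, 0) + 1
    return max(votes, key=votes.get)
```

On a separable three-class toy set both schemes recover the training labels; they differ in the number of machines trained (K vs. K(K-1)/2) and in the data each machine sees.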
Since the common support vector machine (SVM) learning algorithm is time-consuming and inefficient, an incremental vector support vector machine (IV-SVM) learning algorithm is put forward here. A primary SVM classifier is acquired based on the selected increment vectors in the kernel space. According to the Karush-Kuhn-Tucker (KKT) conditions, the original training samples are pruned through the primary SVM classifier. The final SVM classifier is obtained by training the primary SVM classifier with the reduced samples. Simulation experiments show that, compared with the common support vector machine, IV-SVM can reduce the training time of support vector machines on large-capacity data samples.
Margin classifier
Karush–Kuhn–Tucker conditions
Relevance vector machine
Ranking SVM
Citations (1)
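The KKT-based pruning step described in the IV-SVM abstract can be illustrated in isolation. Under the standard soft-margin formulation, a sample whose functional margin strictly exceeds 1 satisfies the KKT conditions with a zero multiplier, so it cannot be a support vector of the current classifier and can be dropped before retraining. The helper below is an illustrative sketch of that filter, not the paper's IV-SVM implementation:

```python
def prune_by_kkt(X, y, w, b, tol=1e-6):
    """Keep only samples that could still be support vectors.

    For a trained soft-margin linear SVM (w, b), a sample with
    functional margin y_i * (w . x_i + b) > 1 satisfies the KKT
    conditions with alpha_i = 0, so it is discarded; samples on or
    inside the margin are retained for the next training pass.
    """
    return [(x, t) for x, t in zip(X, y)
            if t * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 1 + tol]
```

The pruned set is typically much smaller than the original sample, which is where the claimed training-time reduction comes from.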
The support vector machine is a machine learning technique developed from the mid-1990s. Unlike traditional neural networks, it is based on the structural risk minimization principle. In this paper, a classifier-modification method is proposed, which can improve the performance of a support vector machine classifier by a conformal mapping. However, the modification is accomplished by two optimization steps. In order to improve the speed of the support vector machine, the training patterns are reduced through pre-extraction of support vectors. The experimental results show that the separability between classes is increased and the speed is substantially improved.
Margin classifier
Structural risk minimization
Relevance vector machine
Quadratic classifier
Citations (1)
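The conformal-mapping idea can be sketched as a kernel transformation: multiplying a Mercer kernel by a positive factor c(x)c(y) keeps it a valid kernel (it just rescales the feature map phi(x) by c(x)) while reshaping the feature-space metric; choosing c large near the decision boundary magnifies the metric there. The particular factor c used in the test is purely illustrative and is not the paper's mapping:

```python
import math

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) Mercer kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def conformal_kernel(k, c):
    """Conformally modified kernel k~(x, y) = c(x) c(y) k(x, y).

    For any positive function c this is again a valid Mercer kernel,
    since it corresponds to the rescaled feature map c(x) * phi(x).
    """
    return lambda x, y: c(x) * c(y) * k(x, y)
```

The modified kernel stays symmetric and positive, so it can be dropped into any kernel machine unchanged.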
The support vector machine is a novel learning technique, based on statistical learning theory, which uses Mercer kernels to perform computations efficiently in high-dimensional spaces. In pattern recognition, the support vector algorithm constructs nonlinear decision functions by training a classifier to perform a linear separation in some high-dimensional space which is nonlinearly related to the input space.
Statistical learning theory
Relevance vector machine
Margin classifier
Linear classifier
Citations (3)
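The "linear separation in a high-dimensional space" idea can be made concrete with a kernel perceptron, used here as a minimal stand-in for the full SVM optimization (an assumption for brevity): the decision function f(x) = sum_i alpha_i y_i k(x_i, x) is linear in the kernel-induced feature space but nonlinear in the input space, so it separates the XOR pattern that no linear classifier can.

```python
import math

def rbf(x, y, gamma=1.0):
    """Gaussian Mercer kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def kernel_perceptron(X, y, k, epochs=100):
    """Learn dual coefficients alpha so that
    f(x) = sum_i alpha_i y_i k(x_i, x) separates the training data:
    a linear separation in feature space, nonlinear in input space."""
    alpha = [0] * len(X)
    for _ in range(epochs):
        mistakes = 0
        for j, (xj, tj) in enumerate(zip(X, y)):
            f = sum(a * t * k(xi, xj) for a, t, xi in zip(alpha, y, X))
            if tj * f <= 0:          # mistake: strengthen this sample's weight
                alpha[j] += 1
                mistakes += 1
        if mistakes == 0:
            break
    return alpha

def decision(alpha, X, y, k, x):
    """Nonlinear decision function built from the training samples."""
    return sum(a * t * k(xi, x) for a, t, xi in zip(alpha, y, X))
```

On the XOR labels below the Gaussian Gram matrix is full rank, so the data are separable in feature space and the dual updates converge.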
For complicated recognition problems, the number of support vectors is large and the recognition speed is low, because some samples are misassigned to the wrong regions. To solve this problem, a method is proposed to simplify support vector machines based on the idea of a minimal misestimation margin. Experiments show that this new support vector machine not only reduces the number of support vectors and the recognition time but also achieves the same accuracy as (or even better than) the traditional support vector machine.
Relevance vector machine
Margin classifier
Sequential minimal optimization
Margin (machine learning)
Sample (material)
Citations (3)
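The speed/size tradeoff this paper targets is easy to see: kernel-SVM prediction cost is proportional to the number of nonzero dual weights, so shrinking that set directly cuts recognition time. The sketch below is a deliberately crude reduction (keep the largest alphas, zero the rest), not the paper's minimal-misestimation-margin method; it only illustrates the quantity being reduced:

```python
def prune_support_vectors(alpha, keep_fraction=0.5):
    """Crude reduction sketch: keep only the support vectors with the
    largest dual weights alpha_i and zero out the rest. Each zeroed
    alpha removes one kernel evaluation from every future prediction,
    at the cost of perturbing the decision function."""
    order = sorted(range(len(alpha)), key=lambda i: alpha[i], reverse=True)
    keep = set(order[:max(1, int(len(alpha) * keep_fraction))])
    return [a if i in keep else 0.0 for i, a in enumerate(alpha)]
```

Principled reduction methods choose which vectors to drop (or merge) so that the decision boundary moves as little as possible, rather than ranking by weight alone.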
Margin classifier
Relevance vector machine
Ranking SVM
Discriminative model
Citations (4)
Margin classifier
Quadratic classifier
Linear classifier
Decision boundary
Citations (0)
Combining the advantages of Fisher discriminant analysis and support vector machines, this paper develops an improved classification algorithm called the Fisher Support Vector Classifier. The central idea is to find the vector w* of the optimal hyperplane along which the samples are projected such that the margin is maximized while the within-class scatter is kept as small as possible. In the linear case, the problem can be converted to a traditional Support Vector Machine (SVM) and solved without designing new algorithms. In the nonlinear case, a new algorithm is produced by reproducing kernel theory. The test results show that the Fisher Support Vector Classifier has high accuracy and reliability.
Margin classifier
Hyperplane
Fisher kernel
Quadratic classifier
Relevance vector machine
Linear classifier
Statistical learning theory
Citations (4)
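The within-class-scatter ingredient of the Fisher Support Vector Classifier can be shown in isolation: the classical Fisher discriminant direction w = S_w^-1 (m1 - m2) projects samples so that the class means separate while within-class scatter stays small. A minimal two-dimensional sketch of that classical computation (not the paper's combined SVM objective):

```python
def fisher_direction(X1, X2):
    """Fisher discriminant direction w = S_w^-1 (m1 - m2) for 2-D data."""
    def mean(X):
        n = len(X)
        return [sum(x[0] for x in X) / n, sum(x[1] for x in X) / n]

    def scatter(X, m):
        # Within-class scatter: sum of outer products of centered points.
        s = [[0.0, 0.0], [0.0, 0.0]]
        for x in X:
            d = [x[0] - m[0], x[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
        return s

    m1, m2 = mean(X1), mean(X2)
    s1, s2 = scatter(X1, m1), scatter(X2, m2)
    sw = [[s1[i][j] + s2[i][j] for j in range(2)] for i in range(2)]
    # Closed-form 2x2 inverse of the pooled scatter matrix.
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    diff = [m1[0] - m2[0], m1[1] - m2[1]]
    return [inv[0][0] * diff[0] + inv[0][1] * diff[1],
            inv[1][0] * diff[0] + inv[1][1] * diff[1]]
```

The Fisher-SVC idea is to bias the SVM's choice of w toward directions like this one, so the margin is maximized along a projection with small within-class spread.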
The support vector classifier is a tool for solving classification problems, giving the classification boundary as a linear combination of the training samples. In non-separable problems with highly overlapped classes, the resulting classifiers are oversized. In this paper, we propose to replace the support vector classifier's penalty function with a hyperbolic tangent one, obtaining as a result of the training phase a reduced support vector classifier with the same performance as the original one.
Margin classifier
Quadratic classifier
Decision boundary
Linear classifier
Citations (15)
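The effect of swapping the linear (hinge-style) penalty for a hyperbolic tangent one can be seen by comparing the two penalty shapes: the tanh version saturates, so deeply misclassified outliers contribute a bounded cost instead of a cost that grows without limit, which is what lets the trained machine shed support vectors. The exact functional form below, including the beta parameter, is an illustrative assumption rather than the paper's formula:

```python
import math

def hinge_penalty(margin):
    """Standard soft-margin penalty: zero for margin >= 1,
    then growing linearly with the violation."""
    return max(0.0, 1.0 - margin)

def tanh_penalty(margin, beta=1.0):
    """Saturating penalty (sketch of the paper's idea): same zero
    region as the hinge, but bounded by 1 for badly misclassified
    points, so deep outliers stop dominating the fit."""
    return math.tanh(beta * max(0.0, 1.0 - margin))
```

For small violations the two penalties nearly agree (tanh(u) ~ u near 0); they diverge only for points far on the wrong side of the boundary, exactly the points that bloat the classifier under the linear penalty.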
The proximal support vector machine is a variant of the standard support vector machine and can be trained extremely efficiently for binary classification. In many application fields, however, multi-class classification and incremental learning must be supported. An incremental linear proximal support vector classifier for multi-class classification has been developed in recent years, but only its performance in the "one-against-all" manner has been investigated, and the application of the proximal support vector machine to nonlinear multi-class classification has not been studied. In order to apply the proximal support vector machine to more fields, this paper compares three multi-class classification policies ("one-against-all", "one-against-one", and "DAGSVM") applied to the incremental linear proximal support vector classifier, and investigates an incremental nonlinear proximal support vector classifier for multi-class classification based on the Gaussian kernel. The experiments indicate that the "one-against-all" policy is best for the incremental linear proximal support vector classifier according to the tradeoff between computational complexity and correctness, and that the introduced incremental nonlinear proximal support vector classifier is effective in the "one-against-all" manner when the reduction rate is below 0.6.
Margin classifier
Linear classifier
Quadratic classifier
Relevance vector machine
Binary classification
Citations (8)
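The efficiency claim for the proximal SVM comes from its closed-form training: instead of a quadratic program, the linear PSVM (Fung and Mangasarian's formulation) solves one small linear system (I/nu + H^T H) [w; gamma] = H^T D e, with H = [A, -e], where A stacks the samples, D holds the labels on its diagonal, and e is the all-ones vector. A minimal pure-Python sketch for the binary linear case:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def train_psvm(X, y, nu=1.0):
    """Linear proximal SVM: one linear solve instead of a QP.
    Solves (I/nu + H^T H) [w; gamma] = H^T D e with H = [A, -e]."""
    rows = [list(x) + [-1.0] for x in X]                     # H
    n = len(rows[0])
    HtH = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    for i in range(n):
        HtH[i][i] += 1.0 / nu                                # regularizer I/nu
    rhs = [sum(t * r[i] for r, t in zip(rows, y)) for i in range(n)]
    u = solve(HtH, rhs)
    return u[:-1], u[-1]                                     # w, gamma

def psvm_predict(w, gamma, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) - gamma > 0 else -1
```

The system is (d+1) x (d+1) regardless of the sample count, which is why the classifier trains "extremely efficiently" and updates cheaply in incremental settings.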
A two-layer support vector classifier model with a rejection feature is proposed in this paper. First, the sphere support vectors of each class, which describe the distribution of the samples, are obtained by searching for the sphere boundaries containing the samples of each class. Input patterns from non-object classes can then be rejected by the first-layer support vector domain description (SVDD). If a pattern is accepted by the first SVDD, the second layer, a support vector classifier (SVC) with maximum margin between two classes, is used for classification. In addition, instead of traditional quadratic programming, a multiplicative iterative update rule is used to solve the optimization problems in the first-layer SVDD and the second-layer SVC. Compared to the traditional support vector machine algorithm, the new method greatly improves the computational speed of the optimization. Experimental results demonstrate that the two-layer support vector classifier with rejection feature is feasible and applicable in many real pattern recognition fields.
Margin classifier
Quadratic classifier
Relevance vector machine
Linear classifier
Feature vector
Citations (0)
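The two-layer accept/reject structure can be sketched with a crude stand-in for SVDD: model each class region as an enclosing sphere, reject any pattern that falls outside every sphere, and hand accepted patterns to the second-layer classifier. The centroid-based sphere and the slack threshold below are assumptions for illustration, not the paper's optimized sphere boundaries:

```python
import math

def fit_sphere(X):
    """Crude stand-in for SVDD: the class centroid plus the radius
    needed to cover every training point of that class."""
    n, d = len(X), len(X[0])
    c = [sum(x[i] for x in X) / n for i in range(d)]
    r = max(math.dist(x, c) for x in X)
    return c, r

def classify_with_reject(spheres, second_stage, x, slack=1.2):
    """Layer 1 (rejection): if x lies outside every class sphere it is
    treated as a non-object pattern and rejected (returns None).
    Layer 2: otherwise x is handed to the two-class classifier."""
    if all(math.dist(x, c) > r * slack for c, r in spheres):
        return None
    return second_stage(x)
```

The rejection layer keeps patterns that belong to neither class from being forced into one of them, which a plain two-class SVC would otherwise always do.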