Network intrusion detection method by least squares support vector machine classifier
Citations: 12 | References: 5 | Related Papers: 10
Abstract:
Networks are increasingly pervasive in modern society. The least squares support vector machine (LS-SVM) is a modified support vector machine for classification that replaces the standard SVM's convex quadratic programming problem with a set of linear equations. This paper applies the LS-SVM to network intrusion detection. We use the KDDCUP99 data set from the MIT Lincoln Laboratory to study the classification performance of the LS-SVM classifier, comparing the proposed method against the standard support vector machine and a BP neural network. The experiments indicate that the LS-SVM detection method achieves higher detection accuracy than either the support vector machine or the BP neural network.
Keywords:
Relevance vector machine, Margin classifier, Quadratic classifier, Ranking SVM, Least-squares function approximation
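The LS-SVM training step described in the abstract, solving one linear system rather than a QP, can be sketched in a few lines of numpy. This is a minimal illustration only: the toy data stands in for KDDCUP99 features, and the RBF kernel with parameters C and gamma are illustrative choices, not values from the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_fit(X, y, C=10.0, gamma=0.5):
    """Train an LS-SVM classifier by solving one linear system.

    Unlike the standard SVM (a convex QP), the LS-SVM uses equality
    constraints, so training reduces to the (n+1) x (n+1) system
        [0   1^T     ] [b    ]   [0]
        [1   K + I/C ] [alpha] = [y]
    """
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, multipliers alpha

def lssvm_predict(X_train, b, alpha, X_new, gamma=0.5):
    """Decision rule: sign(sum_i alpha_i K(x, x_i) + b)."""
    return np.sign(rbf_kernel(X_new, X_train, gamma) @ alpha + b)

# Toy two-class data (labels +1 / -1), standing in for network features.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [4., 4.], [4., 5.], [5., 4.]])
y = np.array([-1., -1., -1., 1., 1., 1.])
b, alpha = lssvm_fit(X, y)
print(lssvm_predict(X, b, alpha, X))   # recovers the training labels here
```

Note that every alpha is nonzero, which is the lack-of-sparseness issue raised by several of the related papers below.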
Given that the common support vector machine (SVM) learning algorithm is time-consuming and inefficient on large data sets, an incremental vector support vector machine (IV-SVM) learning algorithm is put forward here. A primary SVM classifier is acquired from the selected increment vectors in the kernel space. According to the Karush-Kuhn-Tucker (KKT) conditions, the original training samples are pruned through the primary SVM classifier. The final SVM classifier is then obtained by retraining on the reduced sample set. Simulation experiments show that, compared with the common support vector machine, IV-SVM reduces the training time on large-capacity data samples.
Keywords: Margin classifier, Karush–Kuhn–Tucker conditions, Relevance vector machine, Ranking SVM
Citations (1)
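The KKT-based pruning step in the IV-SVM abstract can be sketched as follows. The rule that samples with functional margin above 1 have zero Lagrange multipliers (and therefore cannot be support vectors) follows from the KKT conditions of the soft-margin SVM; the linear primary classifier (w, b) and the data are hypothetical stand-ins.

```python
import numpy as np

def kkt_prune(X, y, w, b, tol=1e-6):
    """Keep only samples whose KKT condition allows a nonzero multiplier.

    For a trained soft-margin SVM with decision f(x) = w.x + b, a sample
    with margin y_i * f(x_i) > 1 satisfies the KKT conditions with
    alpha_i = 0, so it cannot be a support vector and may be pruned.
    Samples on or inside the margin (y_i * f(x_i) <= 1) are retained
    for the final training pass.
    """
    margins = y * (X @ w + b)
    keep = margins <= 1.0 + tol
    return X[keep], y[keep]

# Hypothetical primary classifier (in the paper this comes from training
# on the selected increment vectors in the kernel space).
w, b = np.array([1.0, 1.0]), -3.0
X = np.array([[0., 0.], [1., 1.5], [2., 1.2], [4., 4.], [2.2, 1.4], [5., 6.]])
y = np.array([-1., -1., -1., 1., 1., 1.])
X_red, y_red = kkt_prune(X, y, w, b)
print(len(X_red), "of", len(X), "samples kept for the final training pass")
```

Only the near-margin samples survive, which is what shrinks the final training set.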
The support vector machine (SVM) has to solve a quadratic programming problem, while the least squares support vector machine (LS-SVM) only needs to solve a set of linear equations. However, the defect of the LS-SVM is its lack of sparseness. In this paper, a method named the sparse least squares support vector machine classifier (SLS-SVM) is presented to remedy this defect. It works by pre-extracting margin vectors with the center distance ratio method to form the original training samples, and then adding the samples misclassified in the first training pass as new training samples. The proposed method not only remedies the defect of the LS-SVM but also speeds up training and classification. Furthermore, it can correct the classifier's bias on unbalanced training data without hurting classification ability. The good performance of the SLS-SVM is verified on several data sets.
Keywords: Margin classifier, Quadratic classifier, Least-squares function approximation, Training set
Citations (1)
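The center-distance-ratio pre-extraction step in the SLS-SVM abstract can be sketched as below, assuming the idea is that a sample whose distance to its own class centre is comparable to its distance to the opposite class centre lies near the boundary. The cut-off of 0.6 is an illustrative choice, not a value from the paper.

```python
import numpy as np

def margin_candidates(X, y, ratio_lo=0.6):
    """Pre-extract likely margin vectors by the center distance ratio.

    For each sample, compare its distance to its own class centre with
    its distance to the opposite class centre; samples whose ratio
    d_own / d_other is large lie near the class boundary and are kept
    as the initial (sparse) training set.
    """
    c_pos = X[y == 1].mean(axis=0)
    c_neg = X[y == -1].mean(axis=0)
    d_pos = np.linalg.norm(X - c_pos, axis=1)
    d_neg = np.linalg.norm(X - c_neg, axis=1)
    d_own = np.where(y == 1, d_pos, d_neg)
    d_other = np.where(y == 1, d_neg, d_pos)
    return d_own / (d_other + 1e-12) >= ratio_lo

# Two clusters whose innermost points (2,2) and (2.5,2.5) face each other.
X = np.array([[0., 0.], [0.5, 0.5], [2., 2.], [4., 4.], [4.5, 4.5], [2.5, 2.5]])
y = np.array([-1., -1., -1., 1., 1., 1.])
keep = margin_candidates(X, y)
print(keep)   # only the two boundary-facing samples are selected
```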
Motivated by support vector data description (a classical one-class support vector machine) and the twin support vector machine classifier, this paper formulates a twin support vector hypersphere (TSVH) classifier, a novel binary SVM classifier that determines a pair of hyperspheres by solving two related SVM-type quadratic programming problems. Each of these problems is smaller than that of a conventional SVM, which makes the TSVH more efficient than the classical SVM. In addition, the TSVH avoids the matrix inversion required by the twin support vector machine, which means learning algorithms for the SVM can be easily extended to the TSVH. Computational results on several synthetic and benchmark data sets indicate that the proposed TSVH is not only faster but also generalizes better.
Keywords: Hypersphere, Margin classifier, Quadratic classifier, Relevance vector machine, Ranking SVM
Citations (13)
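The two-hypersphere geometry behind the TSVH can be illustrated with a deliberately simplified toy: one bounding sphere per class (class mean as centre, farthest member as radius) and classification by relative distance. The actual TSVH instead obtains each hypersphere from an SVM-type quadratic program; this sketch only conveys the geometric idea.

```python
import numpy as np

def fit_spheres(X, y):
    """One bounding hypersphere per class: centre = class mean,
    radius = distance to the farthest class member. A geometric toy,
    not the paper's QP-based formulation."""
    spheres = {}
    for label in (-1, 1):
        pts = X[y == label]
        c = pts.mean(axis=0)
        r = np.linalg.norm(pts - c, axis=1).max()
        spheres[label] = (c, r)
    return spheres

def predict(spheres, X_new):
    """Assign each point to the class whose sphere it is relatively
    closest to (distance to centre divided by sphere radius)."""
    scores = {label: np.linalg.norm(X_new - c, axis=1) / (r + 1e-12)
              for label, (c, r) in spheres.items()}
    return np.where(scores[1] <= scores[-1], 1, -1)

X = np.array([[0., 0.], [1., 0.], [0., 1.], [5., 5.], [6., 5.], [5., 6.]])
y = np.array([-1, -1, -1, 1, 1, 1])
spheres = fit_spheres(X, y)
print(predict(spheres, X))
```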
The proximal support vector machine is a variation of the standard support vector machine that can be trained extremely efficiently for binary classification. In many application fields, however, multi-class classification and incremental learning must be supported. An incremental linear proximal support vector classifier for multi-class classification has been developed in recent years, but only its performance in the "one-against-all" manner has been investigated, and the application of the proximal support vector machine to nonlinear multi-class classification has not been studied. To apply the proximal support vector machine to more fields, this paper compares three multi-class classification policies ("one-against-all", "one-against-one", and "DAGSVM") for the incremental linear proximal support vector classifier, and investigates an incremental nonlinear proximal support vector classifier for multi-class classification based on the Gaussian kernel. The experiments indicate that the "one-against-all" policy is best for the incremental linear proximal support vector classifier in terms of the trade-off between computational complexity and correctness, and that the introduced incremental nonlinear proximal support vector classifier is effective in the "one-against-all" manner when the reduction rate is below 0.6.
Keywords: Margin classifier, Linear classifier, Quadratic classifier, Relevance vector machine, Binary classification
Citations (8)
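The "one-against-all" policy discussed above can be sketched with a proximal-SVM-style binary learner: ridge regression to +/-1 targets, which, like the proximal SVM, trains by solving a small linear system rather than a QP. The data and the parameter C are illustrative, not from the paper.

```python
import numpy as np

def fit_binary(X, t, C=10.0):
    """Proximal-SVM-style binary fit: solve (A^T A + I/C) u = A^T t
    with A = [X, 1], i.e. regularized least squares to +/-1 targets."""
    A = np.hstack([X, np.ones((len(X), 1))])
    return np.linalg.solve(A.T @ A + np.eye(A.shape[1]) / C, A.T @ t)

def one_vs_all_fit(X, y):
    """Train one binary classifier per class: class k against the rest."""
    return {k: fit_binary(X, np.where(y == k, 1.0, -1.0))
            for k in np.unique(y)}

def one_vs_all_predict(models, X_new):
    """Pick the class whose classifier gives the largest decision value."""
    A = np.hstack([X_new, np.ones((len(X_new), 1))])
    classes = sorted(models)
    scores = np.column_stack([A @ models[k] for k in classes])
    return np.array(classes)[scores.argmax(axis=1)]

# Three well-separated clusters with labels 0, 1, 2.
X = np.array([[0., 0.], [0., 1.], [5., 0.], [5., 1.], [2.5, 5.], [2.5, 6.]])
y = np.array([0, 0, 1, 1, 2, 2])
models = one_vs_all_fit(X, y)
print(one_vs_all_predict(models, X))
```

With k classes this trains k binary machines, versus k(k-1)/2 for "one-against-one", which is the computational trade-off the abstract weighs.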
In recent years, machine learning techniques such as the support vector machine (SVM) and the radial basis function (RBF) network have been widely employed in engineering fields. The SVM in particular is a powerful classifier, but its formulation is somewhat complex, and it requires solving a quadratic programming (QP) problem. This paper introduces the least-squares support vector machine (LS-SVM). In the LS-SVM, equality constraints are used, so the solution can be obtained by solving a set of linear equations instead of a QP problem. Extending the classical SVM to regression (SVR) is more involved because the epsilon-insensitive loss function must be introduced, whereas the LS-SVM classifier extends very easily to the regression version.
Keywords: Ranking SVM, Relevance vector machine, Least-squares function approximation, Margin classifier
Citations (15)
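The regression extension mentioned at the end of the abstract is indeed direct: the same (n+1) x (n+1) linear system used for the LS-SVM classifier is solved with real-valued targets, and no epsilon-insensitive loss is needed. A minimal sketch with illustrative C and gamma:

```python
import numpy as np

def lssvr_fit(X, y, C=100.0, gamma=1.0):
    """LS-SVM regression: the same linear system as the classifier,
    but y holds real-valued targets instead of +/-1 labels."""
    n = len(y)
    K = np.exp(-gamma * (X[:, None] - X[None, :]) ** 2)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, multipliers alpha

def lssvr_predict(X_train, b, alpha, X_new, gamma=1.0):
    """f(x) = sum_i alpha_i K(x, x_i) + b."""
    K = np.exp(-gamma * (X_new[:, None] - X_train[None, :]) ** 2)
    return K @ alpha + b

# Fit a smooth 1-D function from a handful of samples.
X = np.linspace(0.0, 3.0, 15)
y = np.sin(X)
b, alpha = lssvr_fit(X, y)
err = np.abs(lssvr_predict(X, b, alpha, X) - y).max()
print("max training error:", round(float(err), 6))
```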
As an extension of the standard support vector machine, the least squares support vector machine loses the sparseness of the standard SVM, which hurts the efficiency of subsequent learning. To address this problem, this article proposes an improved least squares support vector machine incremental learning method that prunes the samples with a self-adaptive strategy: after each training round, the pruning threshold and the sample increment size are set according to the performance of the current classifier. If the classifier performs well, the pruning threshold and the sample increment are large; if it performs poorly, they are small. This improves the efficiency of least squares support vector machine training while addressing the sparseness problem. Simulation results verify that the proposed algorithm is feasible.
Keywords: Margin classifier, Relevance vector machine, Pruning, Least-squares function approximation, Training set
Citations (16)
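The self-adaptive rule in the abstract (a better-performing classifier allows a larger pruning threshold and a larger sample increment) can be sketched as below. The linear scaling by accuracy and the base values are illustrative assumptions, as is pruning by small |alpha| to restore approximate sparseness.

```python
import numpy as np

def adapt_schedule(accuracy, base_thresh=0.01, base_inc=50):
    """Set the next round's pruning threshold and sample increment from
    the current classifier's accuracy: good performance -> aggressive
    pruning and a big batch of new samples; poor performance -> the
    opposite. The linear scaling is an illustrative choice."""
    thresh = base_thresh * accuracy        # prune alphas below this magnitude
    inc = int(base_inc * accuracy) + 1     # samples to add next round
    return thresh, inc

def prune_by_alpha(alpha, thresh):
    """The LS-SVM lacks sparseness: every alpha is nonzero. Dropping
    samples with small |alpha| yields an approximately sparse model."""
    return np.abs(alpha) >= thresh

# One round of the adaptive loop, with hypothetical multipliers.
alpha = np.array([0.5, -0.002, 0.3, 0.004, -0.8])
thresh, inc = adapt_schedule(accuracy=0.9)
keep = prune_by_alpha(alpha, thresh)
print("threshold:", thresh, "| next increment:", inc, "| kept:", keep)
```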
This paper proposes a new parameterless support vector machine classifier based on quadratic programming, which avoids shortcomings of the standard SVM such as the need to choose a regularization parameter. Its formulation is simple and easy to implement. Numerical results illustrate that the new SVM classifier is feasible and effective.
Keywords: Quadratic classifier, Margin classifier, Relevance vector machine
Citations (0)