    Semi-Supervised Remote Sensing Image Classification based on Clustering and the Mean Map Kernel
Citations: 14 | References: 17 | Related Papers: 10
    Abstract:
    This paper presents a semi-supervised classifier based on the combination of the expectation-maximization (EM) algorithm for Gaussian mixture models (GMM) and the mean map kernel. The proposed method uses the most reliable samples in terms of maximum likelihood to compute a kernel function that accurately reflects the similarity between clusters in the kernel space. The proposed method improves classification accuracy in situations where the available labeled information does not properly describe the classes in the test image.
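As a rough illustration of the combination described above, the sketch below clusters unlabeled samples with an EM-fitted Gaussian mixture, keeps the most likely samples of each cluster, and compares two clusters with a mean map kernel (the average of a base kernel over all sample pairs). The toy data, the RBF base kernel, and parameters such as n_reliable and gamma are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch (not the authors' code): EM/GMM clustering plus a mean map
# kernel between clusters, using an RBF base kernel. The number of "reliable"
# samples kept per cluster and the RBF width are illustrative choices.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))                      # stand-in for unlabeled image samples

gmm = GaussianMixture(n_components=5, random_state=0).fit(X)
labels = gmm.predict(X)
log_lik = gmm.score_samples(X)                     # per-sample log-likelihood under the GMM

def reliable_samples(X, labels, log_lik, cluster, n_reliable=30):
    """Keep the n_reliable most likely samples of one cluster (maximum-likelihood criterion)."""
    idx = np.where(labels == cluster)[0]
    best = idx[np.argsort(log_lik[idx])[::-1][:n_reliable]]
    return X[best]

def mean_map_kernel(A, B, gamma=0.5):
    """Mean map kernel between two sample sets: average of the base kernel over all pairs."""
    return rbf_kernel(A, B, gamma=gamma).mean()

C0 = reliable_samples(X, labels, log_lik, cluster=0)
C1 = reliable_samples(X, labels, log_lik, cluster=1)
print("similarity between clusters 0 and 1:", mean_map_kernel(C0, C1))
```

In the paper's setting, such cluster-to-cluster similarities would feed the semi-supervised classifier; only the kernel computation is sketched here.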
    Keywords:
    Kernel (algebra)
    Mean-shift
    Similarity (geometry)
This paper discusses classification with nonlinear support vector machines and kernel functions. A stretching ratio is defined in order to analyze the performance of a kernel function. A new kernel function is introduced by modifying the Gaussian kernel, and its properties are as good as or better than those of the Gaussian kernel. For example, the map of the new kernel magnifies distances between nearby vectors, because its stretching ratio is always larger than one, without enlarging the radius of the circumscribed hypersphere that encloses all mapped vectors in feature space, which yields a larger margin. Two criteria are proposed to choose a good spread parameter for a given kernel function approximately but easily. Experiments compare the classification performance of the proposed kernel function with that of the Gaussian kernel.
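The abstract does not reproduce the modified kernel itself, so the sketch below only illustrates one natural reading of a "stretching ratio": the feature-space distance induced by the kernel divided by the input-space distance, evaluated here for the plain Gaussian kernel via ||phi(x) - phi(y)||^2 = 2(1 - k(x, y)). The bandwidth gamma and the test points are arbitrary, and the paper's exact definition may differ.

```python
# Hedged sketch: a stretching ratio computed for the plain Gaussian kernel,
# i.e. feature-space distance divided by input-space distance. The proposed
# modified kernel from the paper is not reproduced here.
import numpy as np

def gaussian_kernel(x, y, gamma=0.5):
    return np.exp(-gamma * np.sum((x - y) ** 2))

def stretching_ratio(x, y, gamma=0.5):
    d_input = np.linalg.norm(x - y)
    d_feature = np.sqrt(2.0 * (1.0 - gaussian_kernel(x, y, gamma)))  # ||phi(x)-phi(y)||
    return d_feature / d_input

x, y = np.array([0.0, 0.0]), np.array([0.1, 0.0])
# Approximately sqrt(2 * gamma) for nearby points, shrinking as they move apart;
# the proposed kernel is designed to keep this ratio above one locally.
print(stretching_ratio(x, y))
```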
    Kernel (algebra)
    Kernel smoother
    String kernel
    Hypersphere
    Citations (3)
Selecting the kernel function and its parameters is a key problem in support vector machine research. After discussing the influence of kernel parameters and the error penalty factor on the support vector machine, a new kernel function, CombKer, is proposed and constructed. CombKer is a combination kernel that blends the Gaussian RBF kernel, which has a local characteristic, with the linear kernel, which has a global characteristic. Experiments on data from different domains with a support vector machine built on the CombKer kernel show improved predictive ability and validate the CombKer kernel function.
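A minimal sketch of a CombKer-style kernel, assuming a simple weighted sum of the Gaussian RBF and linear kernels used as a precomputed Gram matrix with scikit-learn's SVC; the weight lam, gamma, and the synthetic data are illustrative, and the paper's exact combination rule may differ.

```python
# Hedged sketch of a CombKer-style combined kernel: a weighted sum of a Gaussian
# RBF kernel (local) and a linear kernel (global), used as a precomputed Gram.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

def comb_kernel(A, B, lam=0.5, gamma=0.1):
    """Combination kernel: lam * RBF + (1 - lam) * linear (both PSD, so the sum is PSD)."""
    return lam * rbf_kernel(A, B, gamma=gamma) + (1.0 - lam) * linear_kernel(A, B)

K_train = comb_kernel(X, X)
clf = SVC(kernel="precomputed", C=1.0).fit(K_train, y)
print("training accuracy:", clf.score(K_train, y))
```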
    Kernel (algebra)
    Tree kernel
    String kernel
    Relevance vector machine
    Kernel smoother
    Citations (38)
This paper studies kernel function selection and parameter optimization for the Support Vector Machine (SVM). Exploiting the complementary features of local and global kernel functions, we mix the Gaussian kernel and the polynomial kernel to propose a new multi-kernel function. We then apply the multi-kernel function to face recognition and show that it achieves a higher recognition rate.
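A sketch of the Gaussian-plus-polynomial multi-kernel, passed to scikit-learn's SVC as a callable kernel; the mixing weight alpha, gamma, degree, and the digits dataset (standing in for face images) are assumptions, not the paper's setup.

```python
# Hedged sketch: Gaussian + polynomial multi-kernel supplied to SVC as a callable.
from sklearn.datasets import load_digits
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def multi_kernel(A, B, alpha=0.7, gamma=0.01, degree=2):
    """alpha * Gaussian kernel + (1 - alpha) * polynomial kernel."""
    return (alpha * rbf_kernel(A, B, gamma=gamma)
            + (1 - alpha) * polynomial_kernel(A, B, degree=degree))

X, y = load_digits(return_X_y=True)       # stand-in data; the paper uses face images
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel=multi_kernel, C=1.0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```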
    Kernel (algebra)
    String kernel
    Tree kernel
    Kernel smoother
    Citations (0)
In the last few years, various machine learning algorithms, such as the Support Vector Machine (SVM), Support Vector Regression (SVR), and Non-negative Matrix Factorization (NMF), have been introduced. The kernel approach is an effective method for increasing the classification accuracy of machine learning algorithms. This paper introduces a family of one-parameter kernel functions for improving the accuracy of SVM classification. The proposed kernel function contains a trigonometric term and differs from all existing kernel functions. We show that this function is a positive definite kernel. Finally, we evaluate the SVM based on the new trigonometric kernel, the Gaussian kernel, the polynomial kernel, and a convex combination of the new kernel and the Gaussian kernel on various types of datasets. Empirical results show that the SVMs based on the new trigonometric kernel and the mixed kernel achieve the best classification accuracy. Moreover, some numerical results of applying SVR with the new trigonometric kernel and the mixed kernel are presented.
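The paper's trigonometric kernel is not given in the abstract, so the sketch below uses a known positive definite stand-in, the product over dimensions of cos(x_i - y_i), purely to illustrate the two steps mentioned: forming a convex combination with the Gaussian kernel and checking positive semi-definiteness numerically through the Gram matrix's eigenvalues.

```python
# Hedged sketch: a stand-in trigonometric kernel (not the paper's kernel) mixed
# with a Gaussian kernel, followed by a numerical PSD check of the Gram matrix.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 3))

def trig_kernel(A, B):
    """Stand-in trigonometric kernel: product over dimensions of cos(a_i - b_i), which is PSD."""
    diffs = A[:, None, :] - B[None, :, :]
    return np.cos(diffs).prod(axis=2)

def mixed_kernel(A, B, alpha=0.5, gamma=1.0):
    """Convex combination of the trigonometric stand-in and a Gaussian kernel."""
    return alpha * trig_kernel(A, B) + (1 - alpha) * rbf_kernel(A, B, gamma=gamma)

K = mixed_kernel(X, X)
eigvals = np.linalg.eigvalsh(K)
print("smallest eigenvalue:", eigvals.min())   # non-negative up to round-off, i.e. K is PSD
```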
    Kernel (algebra)
    Kernel smoother
    Citations (0)
A kernel function implicitly maps data from its original space to a higher-dimensional feature space. Kernel-based machine learning algorithms are typically applied to data that is not linearly separable in its original space. Although kernel methods are among the most elegant parts of machine learning, it is challenging for users to define or select a proper kernel function with optimized parameter settings for their data. In this paper, we propose a novel method called Deep Kernel that can automatically learn a kernel function from data using deep learning. The deep kernel is applied here to classification, dimension reduction, and visualization. For the classification task, we evaluate the deep kernel method by comparing its performance with optimized Gaussian kernels, both using support vector machines as the decision model, on different types of datasets. The experimental results show that the proposed deep kernel method outperforms the traditional methods with Gaussian kernels on most of the datasets. For the dimension reduction and visualization task, the deep kernel is used along with kernel PCA. The results are also compared and contrasted with using the RBF kernel with multiple parameters. The deep kernel is shown to be more powerful in dimension reduction and visualization than the RBF kernel.
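A minimal sketch of the deep-kernel idea, k(x, y) = <phi(x), phi(y)> with phi a neural embedding. The training of phi by deep learning is omitted here (random weights stand in for a network learned from data), and the two-layer architecture and toy data are assumptions, so this only shows how such a kernel plugs into an SVM as a precomputed Gram matrix.

```python
# Hedged sketch: a kernel defined as the inner product of a network embedding,
# used with SVC via a precomputed Gram matrix. Weights are untrained stand-ins.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

W1, W2 = rng.normal(size=(16, 32)), rng.normal(size=(32, 8))

def phi(X):
    """Two-layer embedding; in the deep-kernel setting these weights would be learned."""
    return np.tanh(np.tanh(X @ W1) @ W2)

def deep_kernel(A, B):
    return phi(A) @ phi(B).T

K = deep_kernel(X, X)
clf = SVC(kernel="precomputed").fit(K, y)
print("training accuracy:", clf.score(K, y))
```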
    Kernel (algebra)
    Tree kernel
    String kernel
    Citations (21)
Sparse representation classification (SRC) and kernel methods have been used successfully in pattern recognition. To overcome the limitations of a single kernel function, we propose a multiple kernel sparse classification method for face recognition that improves the recognition rate. The Power kernel offers good stability and the Gaussian kernel good practicality, so the two are linearly combined. By transforming the data through the different kernel spaces, we effectively extract the nonlinear structure of faces. Extensive experiments show that the multiple kernel sparse representation classification algorithm based on the Power and Gaussian kernels achieves a higher recognition rate than sparse representation classification with a single kernel.
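A sketch of the linear Power-plus-Gaussian combination, assuming the common form k(x, y) = -||x - y||^beta for the Power kernel; the weight w, beta, gamma, and the random data are illustrative, and the kernel sparse-representation coding step itself is not reproduced.

```python
# Hedged sketch: building the combined Power + Gaussian Gram matrix that a
# kernelised sparse representation classifier would consume.
import numpy as np
from sklearn.metrics.pairwise import euclidean_distances, rbf_kernel

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))                      # stand-in for vectorised face images

def power_kernel(A, B, beta=1.0):
    """Common form of the Power kernel: -||a - b||^beta (conditionally positive definite)."""
    return -euclidean_distances(A, B) ** beta

def combined_kernel(A, B, w=0.5, beta=1.0, gamma=0.05):
    """Linear combination of the Power kernel and the Gaussian kernel."""
    return w * power_kernel(A, B, beta) + (1 - w) * rbf_kernel(A, B, gamma=gamma)

K = combined_kernel(X, X)
print(K.shape)                                     # Gram matrix fed to the kernel SRC solver
```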
    Kernel (algebra)
    String kernel
    Tree kernel
    Citations (0)
This paper studies kernel function selection for the Relevance Vector Machine (RVM). An improved Gaussian kernel function is proposed, its characteristics are compared with those of the standard Gaussian kernel, and its performance gain is validated. Beyond improving a single kernel, a multi-kernel RVM is also investigated: the local Gaussian kernel and the global polynomial kernel are combined into a multi-kernel function and used in the RVM. Comparison experiments with the various kernel functions on different datasets validate the performance of the improved Gaussian kernel and the mixture kernel.
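scikit-learn ships no RVM, so the sketch below uses ARDRegression (sparse Bayesian regression with automatic relevance determination, the mechanism underlying the RVM) as a stand-in, applied to a multi-kernel design matrix that mixes a local Gaussian kernel with a global polynomial kernel. The improved Gaussian kernel itself is not specified in the abstract and is not reproduced; weights, gamma, degree, and the toy data are illustrative.

```python
# Hedged sketch: RVM-style sparse Bayesian learning (via ARDRegression) over a
# multi-kernel design matrix combining Gaussian and polynomial kernels.
import numpy as np
from sklearn.linear_model import ARDRegression
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(150, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=150)

def multi_kernel(A, B, w=0.6, gamma=0.5, degree=3):
    """w * Gaussian (local) + (1 - w) * polynomial (global)."""
    return w * rbf_kernel(A, B, gamma=gamma) + (1 - w) * polynomial_kernel(A, B, degree=degree)

Phi = multi_kernel(X, X)                 # each column k(., x_j) acts as a basis function
model = ARDRegression().fit(Phi, y)
n_relevant = np.sum(np.abs(model.coef_) > 1e-3)   # heuristic count of retained basis functions
print("relevance-vector-like basis functions kept:", n_relevant)
```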
    Kernel (algebra)
    Kernel smoother
    Tree kernel
    Relevance vector machine
    Citations (5)