Network constraints and multi-objective optimization for one-class classification

1996 
Abstract

This paper introduces a constrained second-order network with a multiple-objective learning algorithm that forms closed hyperellipsoidal decision boundaries for one-class classification. The network architecture has uncoupled constraints that give independent control over each decision boundary's size, shape, position, and orientation. The architecture, together with the learning algorithm, guarantees positive definite eigenvalues and hence closed hyperellipsoidal decision boundaries. The learning algorithm incorporates two criteria: one that seeks to minimize classification mapping error, and another that seeks to minimize the size of the decision boundaries. We consider both additive and multiplicative combinations of the individual criteria, and we present empirical evidence for choosing functional forms of the individual objectives that are bounded and normalized. The resulting multiple-objective criterion allows the decision boundaries to grow or shrink as necessary to achieve both within-class and out-of-class generalization without requiring non-target patterns in the training set; the network thus learns compact closed decision boundaries when trained with target data only. We show results of applying the network to the Iris data set (Fisher, 1936, Annals of Eugenics, 7(2), 179–188). Advantages of this approach include its inherent capacity for one-class generalization, freedom from having to characterize the non-target class, and the ability to form closed decision boundaries around multi-modal classes that are more complex than hyperspheres, without requiring inversion of large matrices.
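To make the two-term objective concrete, the sketch below is a minimal illustration, not the paper's exact algorithm or parameterization. It fits a closed hyperellipsoid to target-class data only, enforcing positive definiteness with a simple factorization M = A^T A + eps*I (the paper instead uses uncoupled constraints for independent control of size, shape, position, and orientation). The names J_map, J_size, beta, and lam, the sigmoid mapping error, and the normalized size term are illustrative assumptions; an additive combination is shown, and a multiplicative combination of the bounded objectives is equally possible.

```python
# Minimal one-class hyperellipsoid sketch (assumptions: PyTorch, additive
# combination of bounded objectives; not the authors' exact formulation).
import torch

torch.manual_seed(0)
# Target-class training data only; no non-target patterns are used.
X = torch.randn(200, 2) * torch.tensor([1.5, 0.5]) + torch.tensor([3.0, -1.0])

d = X.shape[1]
c = X.mean(dim=0).clone().requires_grad_(True)  # boundary position (center)
A = torch.eye(d, requires_grad=True)            # shape/orientation factor

opt = torch.optim.Adam([c, A], lr=0.05)
beta, lam, eps = 4.0, 0.5, 1e-4                 # illustrative hyperparameters

for step in range(500):
    # M = A^T A + eps*I is positive definite, so {x : (x-c)^T M (x-c) <= 1}
    # is always a closed hyperellipsoid.
    M = A.T @ A + eps * torch.eye(d)
    diff = X - c
    dist = torch.einsum('ni,ij,nj->n', diff, M, diff)  # boundary is dist = 1
    # Bounded, normalized mapping error: small when targets lie inside.
    J_map = torch.sigmoid(beta * (dist - 1.0)).mean()
    # Ellipsoid volume is proportional to det(M)^(-1/2); squash it into
    # [0, 1) so the size objective is also bounded and normalized.
    vol = torch.det(M).clamp_min(eps) ** -0.5
    J_size = vol / (1.0 + vol)
    # Additive combination; a multiplicative form of the bounded
    # objectives could be substituted here.
    loss = J_map + lam * J_size
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    M = A.T @ A + eps * torch.eye(d)
    diff = X - c
    inside = torch.einsum('ni,ij,nj->n', diff, M, diff) <= 1.0
    print(f"targets enclosed: {inside.float().mean():.1%}")
```

The trade-off in the loss mirrors the abstract's description: the mapping-error term grows the boundary until it encloses the target data, while the size term shrinks it, so the ellipsoid settles at a compact closed boundary without any characterization of the non-target class.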