Evolutionary-based generation of rotation and scale invariant texture descriptors from SIFT keypoints

2021 
To describe image data, prominent keypoints are commonly detected before an extraction process generates a feature vector. However, producing a reliable set of features is difficult and often requires human intervention. Images can undergo various changes that affect the extracted features and degrade classification performance. To overcome these challenges, many approaches have focused on constructing image descriptors that are invariant to transformations such as scale, illumination and rotation. However, most of these solutions treat the image information in a single way, and they suffer from their reliance on human intervention and large training sets. In this study, we propose a genetic programming (GP)-based method that evolves a rotation- and scale-invariant set of image descriptors. The generated feature vectors are then used to classify texture images using a limited number of training instances. To automatically evolve a descriptor that handles illumination, scale and rotation changes, the proposed method combines two approaches that treat the image information differently: it uses genetic programming together with the SIFT descriptor, extracting prominent scale-invariant keypoints before generating the feature vector. The performance of the proposed method has been validated on five datasets containing scale and rotation variations. Results show that the method significantly outperforms comparable low-level and GP-based descriptors.
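The following is a minimal, hypothetical sketch of the kind of pipeline the abstract describes, not the authors' implementation: SIFT keypoints and descriptors are extracted with OpenCV, pooled into a fixed-length vector (a simple mean/std pooling stands in here for the GP-evolved combination), and classified with nearest-neighbour matching, which suits a limited number of training instances. The function names and the 1-NN classifier are illustrative assumptions.

```python
# Hypothetical sketch only; the actual descriptor combination in the paper is
# evolved by genetic programming rather than hand-coded as below.
import numpy as np
import cv2  # requires an OpenCV build with SIFT support (opencv-python >= 4.4)


def sift_feature_vector(gray_img: np.ndarray) -> np.ndarray:
    """Extract SIFT descriptors and pool them into one fixed-length vector."""
    sift = cv2.SIFT_create()
    _keypoints, descriptors = sift.detectAndCompute(gray_img, None)
    if descriptors is None:  # no keypoints detected in this image
        return np.zeros(256, dtype=np.float32)
    # Placeholder pooling: mean and std of the 128-D SIFT descriptors.
    return np.concatenate([descriptors.mean(axis=0), descriptors.std(axis=0)])


def classify_1nn(train_vecs, train_labels, test_vec):
    """1-nearest-neighbour classification over a small set of training vectors."""
    dists = [np.linalg.norm(test_vec - v) for v in train_vecs]
    return train_labels[int(np.argmin(dists))]
```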