Touch Gesture Recognition System based on 1D Convolutional Neural Network with Two Touch Sensor Orientation Settings.

2019 
Touch is regarded as an important channel in human-robot interaction. This paper presents a touch gesture recognition system that can be applied to hard-skinned robots. Related studies have relied on traditional machine learning methods with hand-crafted features, making it difficult for developers to find optimal features beyond those they can devise by hand. To address this, our proposed touch gesture recognition system uses a 1D convolutional neural network (1D CNN) that learns features directly from the data. The recognition system classifies four touch patterns: hit, pat, push, and rub. The results show an average recognition rate of 90.5%, which is higher than that reported in a related study. Additionally, we verify the effect of touch sensor orientation on recognition performance. Many studies report accuracy only for a touch sensor installed in a single orientation. In this study, we experimentally confirm that a classifier trained with data from a vertically installed touch sensor shows degraded performance on test data from a horizontally installed touch sensor, and vice versa. To achieve high recognition accuracy for both orientations, the network is retrained with data from both vertically and horizontally installed sensors. The retrained model achieves 88.5% and 89.1% accuracy on the vertical and horizontal test data, respectively. That is, the model performs reliably in both orientations, whereas classifiers trained with data from a single orientation perform poorly on test data from the other orientation.
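The abstract does not specify the network architecture, so as an illustration only, the following is a minimal NumPy sketch of the kind of 1D-CNN pipeline described: a 1D convolution over a touch-sensor time series, ReLU activation, global max pooling, and a linear layer scoring the four gesture classes (hit, pat, push, rub). All shapes, filter sizes, and weights here are hypothetical, not taken from the paper.

```python
import numpy as np

def conv1d(x, kernels):
    """Valid 1D convolution.
    x: (in_channels, time) sensor signal; kernels: (out_channels, in_channels, k)."""
    out_ch, in_ch, k = kernels.shape
    t_out = x.shape[1] - k + 1
    out = np.zeros((out_ch, t_out))
    for o in range(out_ch):
        for t in range(t_out):
            out[o, t] = np.sum(kernels[o] * x[:, t:t + k])
    return out

def classify_gesture(x, kernels, w, b):
    """Toy forward pass: conv -> ReLU -> global max pool -> linear scores.
    Returns the index of the predicted class (0..3 for hit/pat/push/rub)."""
    h = np.maximum(conv1d(x, kernels), 0.0)  # learned temporal features
    pooled = h.max(axis=1)                   # global max pooling over time
    scores = w @ pooled + b                  # linear classifier, 4 classes
    return int(np.argmax(scores))

# Example with random (untrained) weights, purely to show the data flow:
rng = np.random.default_rng(0)
signal = rng.standard_normal((1, 50))        # 1 sensor channel, 50 time steps
kernels = rng.standard_normal((8, 1, 5))     # 8 filters of width 5
w = rng.standard_normal((4, 8))              # 4 gesture classes
b = np.zeros(4)
pred = classify_gesture(signal, kernels, w, b)
```

In a real system these weights would be learned from labeled touch sequences; training data from both sensor orientations would be pooled, per the paper's finding, so the learned filters generalize across mounting directions.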