Sign boundary and hand articulation feature recognition in Sign Language videos

2021 
In this paper we present a recommendation system for (semi-)automatic annotation of sign language videos that exploits deep learning techniques to handle handshape recognition in continuous signing data. The major tools in our approach are the keypoint output of OpenPose and the use of HamNoSys in the sign annotation of the training data. Prior to application on signed phrases, we tested our method on recognition of handshape, hand location and palm orientation in isolated signs using two lexical datasets. The system was trained on the Danish Sign Language lexicon and was also applied to POLYTROPON, a lexicon of the Greek Sign Language (GSL), for which we obtained satisfactory recognition results. Experimentation with the POLYTROPON corpus of GSL phrases provided results which verify that our approach achieves satisfactory accuracy. It can therefore be exploited in a recommendation system for semi-automatic annotation of isolated signs and signed phrases in large SL video collections, also contributing to the development of further datasets for machine learning training.
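As a minimal sketch of the kind of pipeline the abstract describes, the snippet below reads the 21 hand keypoints that OpenPose's JSON writer emits per hand (flat lists of x, y, confidence triples) and normalizes them into a translation- and scale-invariant feature vector. The normalization scheme is an assumption for illustration; the paper's exact feature design is not specified in the abstract.

```python
def hand_keypoints(person, hand="right"):
    """Extract the 21 OpenPose hand keypoints as (x, y, confidence) triples.

    OpenPose's JSON output stores each detected hand as a flat list of
    63 numbers (x, y, confidence for 21 joints) under the keys
    'hand_right_keypoints_2d' / 'hand_left_keypoints_2d'.
    """
    flat = person[f"hand_{hand}_keypoints_2d"]
    return [tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)]


def normalize(points):
    """Translate keypoints so the wrist (joint 0) is the origin and scale
    by the largest coordinate magnitude, yielding a feature vector that a
    handshape classifier could consume (a hypothetical preprocessing step,
    not the paper's documented method).
    """
    wx, wy, _ = points[0]
    shifted = [(x - wx, y - wy) for x, y, _ in points]
    scale = max(max(abs(dx), abs(dy)) for dx, dy in shifted) or 1.0
    return [(dx / scale, dy / scale) for dx, dy in shifted]


# Usage with one person entry from an OpenPose frame JSON:
frame = {"hand_right_keypoints_2d": [float(i) for i in range(63)]}
features = normalize(hand_keypoints(frame))
```

In practice such per-frame feature vectors would be fed to a sequence or frame-level classifier trained against the HamNoSys-annotated lexicon entries.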