Learned Hand Gesture Classification Through Synthetically Generated Training Samples

2018 
Hand gestures are a natural component of human-human communication. Simple hand gestures are intuitive and can exhibit great lexical variety. It stands to reason that such a user input mechanism can have many benefits, including seamless interaction, intuitive control, and robustness to physical constraints and to ambient electrical, light, and sound interference. However, while semantic and logical information encoded via hand gestures is readily decoded by humans, leveraging this communication channel in human-machine interfaces remains a challenge. Recent data-driven deep learning approaches are promising for uncovering abstract and complex relationships that manual, rule-based classification schemes fail to discover. Such an approach is well suited to hand gesture recognition, but it requires a large volume of training data, which is conventionally collected through physical user experiments. This process, however, is onerous and tedious, so a streamlined approach with less overhead is sought. To that end, this work presents a novel method of synthetic hand gesture dataset generation that leverages modern gaming engines. Furthermore, preliminary results indicate that the dataset, despite being synthetic and requiring no physical data collection, is both accurate and rich enough to train a real-world hand gesture classifier that operates in real-time.
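The core idea of the abstract, training on synthetic samples and deploying on real-world input, can be illustrated with a deliberately simplified sketch. This is not the paper's pipeline: Gaussian feature vectors stand in for game-engine renders of hand poses, and a nearest-centroid classifier stands in for a deep network. All names and parameters below are hypothetical.

```python
# Hypothetical sketch of synthetic-to-real training (not the paper's method):
# fit a classifier purely on "synthetic" samples, then evaluate on a noisier
# held-out set standing in for real-world captures.
import numpy as np

rng = np.random.default_rng(0)
NUM_CLASSES, DIM = 5, 16

# Class prototypes play the role of distinct hand gestures.
prototypes = rng.normal(size=(NUM_CLASSES, DIM))

def synthesize(n_per_class, noise):
    """Generate labeled samples by perturbing each prototype -- the stand-in
    for rendering varied hand poses/lighting in a game engine."""
    X = np.concatenate([p + noise * rng.normal(size=(n_per_class, DIM))
                        for p in prototypes])
    y = np.repeat(np.arange(NUM_CLASSES), n_per_class)
    return X, y

# "Synthetic" training set; a noisier "real-world" test set.
X_train, y_train = synthesize(200, noise=0.3)
X_test, y_test = synthesize(50, noise=0.5)

# Nearest-centroid classifier trained only on the synthetic data.
centroids = np.stack([X_train[y_train == c].mean(axis=0)
                      for c in range(NUM_CLASSES)])
pred = np.argmin(((X_test[:, None, :] - centroids) ** 2).sum(axis=-1), axis=1)
accuracy = (pred == y_test).mean()
print(f"test accuracy on 'real' samples: {accuracy:.2f}")
```

The sketch captures only the premise that a model fit on synthetic data can generalize to a differently distributed test set; the paper's actual contribution is generating realistic hand-image datasets with a gaming engine and training a real-time classifier on them.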