Semantic matchmaking as a way for attitude discovery

2019 
Powerful data analysis techniques are currently applied to 3D motion sensing devices like Microsoft Kinect for posture and gesture recognition. Though effective, they are computationally intensive and require complex training. This paper proposes an approach for on-the-fly automated posture and gesture recognition, exploiting the Kinect and treating detection as a semantic-based resource discovery problem. A dedicated data model and ontology support the annotation of body postures and gestures. The proposed system automatically annotates Kinect data with a Semantic Web standard logic formalism and then attempts to recognize postures by applying semantic-based matchmaking between the resulting descriptions and reference body poses stored in a Knowledge Base. Sequences of postures are then compared in order to recognize gestures. The paper presents details of a prototype implementing the framework, as well as an early experimental evaluation on a public dataset, to assess the feasibility of both the ideas and the algorithms.
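The abstract does not detail the matchmaking formalism itself. As a rough illustration of the overall pipeline it describes (symbolic annotation of Kinect postures, matchmaking against reference poses in a Knowledge Base, and gesture recognition over posture sequences), the Python sketch below approximates the semantic annotations as feature sets and the matchmaking step as a simple overlap score. All feature names, reference poses, gesture templates, and thresholds are hypothetical and not taken from the paper.

```python
# Illustrative sketch only: the paper annotates Kinect skeleton data with a
# Semantic Web logic formalism and matches it against reference poses in a
# Knowledge Base. Here annotations are approximated as sets of symbolic
# features and matchmaking as set overlap; all vocabulary is hypothetical.

from typing import Dict, FrozenSet, List, Optional

# A posture annotation: symbolic features derived from skeleton joints,
# e.g. "left_elbow:bent", "torso:upright" (hypothetical vocabulary).
Posture = FrozenSet[str]

# Reference Knowledge Base of annotated body poses (hypothetical entries).
KB: Dict[str, Posture] = {
    "standing":    frozenset({"torso:upright", "left_knee:straight", "right_knee:straight"}),
    "sitting":     frozenset({"torso:upright", "left_knee:bent", "right_knee:bent"}),
    "arms_raised": frozenset({"torso:upright", "left_elbow:raised", "right_elbow:raised"}),
}

def match_posture(observed: Posture, threshold: float = 0.8) -> Optional[str]:
    """Return the best-matching reference pose, or None if no pose is close enough.

    The score is the fraction of a reference pose's features satisfied by the
    observation -- a crude stand-in for the paper's semantic matchmaking.
    """
    best_name, best_score = None, 0.0
    for name, reference in KB.items():
        score = len(reference & observed) / len(reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

def recognize_gesture(frames: List[Posture],
                      templates: Dict[str, List[str]]) -> Optional[str]:
    """Recognize a gesture as an ordered sequence of recognized postures.

    `templates` maps gesture names to the posture sequence they require
    (hypothetical definitions); consecutive duplicate postures are collapsed.
    """
    sequence: List[str] = []
    for frame in frames:
        pose = match_posture(frame)
        if pose and (not sequence or sequence[-1] != pose):
            sequence.append(pose)
    for gesture, expected in templates.items():
        if sequence == expected:
            return gesture
    return None

if __name__ == "__main__":
    frames = [
        frozenset({"torso:upright", "left_knee:straight", "right_knee:straight"}),
        frozenset({"torso:upright", "left_elbow:raised", "right_elbow:raised"}),
    ]
    print(recognize_gesture(frames, {"raise_arms": ["standing", "arms_raised"]}))
```

In the actual framework, the overlap score above would be replaced by logic-based inference over the ontology-backed annotations, but the control flow (annotate each frame, match it against the Knowledge Base, then match the resulting posture sequence) mirrors the process outlined in the abstract.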