Project MultiLeap: Fusing Data from Multiple Leap Motion Sensors

2021 
Finding a simple and precise way to control a virtual environment is a goal of much human-computer interaction research. One approach uses the Leap Motion optical sensor, which provides hand and finger tracking without the need for a hand-held device. However, the Leap Motion system currently supports only one sensor at a time. To overcome this limitation, we proposed a set of algorithms that combine the data from multiple Leap Motion sensors to increase the precision and usability of hand tracking. First, we suggested a way to improve the calibration of the hand pose alignment currently proposed by Leap Motion. Then, we proposed an approach to fuse the tracking data from multiple Leap Motion sensors to provide more precise interaction with the virtual world. For this, we implemented our own algorithm for computing the confidence level of the tracking data, which can be used to determine which Leap Motion sensor detects the tracked hands best. We implemented these algorithms in our MultiLeap library. We also created two demo scenes that we used to validate the correctness of our work – one for evaluating the fusing algorithms and one for mimicking interaction with control panels in a helicopter cockpit.
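To illustrate the idea of confidence-based fusion described above, the following sketch blends hand-joint positions from several sensors, weighted by a per-sensor confidence value. All names (`HandSample`, `fuse_hands`) and the weighted-average scheme are hypothetical assumptions for illustration; the actual MultiLeap algorithms are not reproduced here.

```python
# Illustrative sketch only: confidence-weighted fusion of hand-joint
# positions reported by several trackers. The data layout and the
# simple weighted average are assumptions, not the MultiLeap algorithm.
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class HandSample:
    joints: List[Vec3]   # joint positions reported by one sensor
    confidence: float    # that sensor's tracking confidence, in [0, 1]

def fuse_hands(samples: List[HandSample]) -> List[Vec3]:
    """Blend joint positions from all sensors, weighted by confidence."""
    total = sum(s.confidence for s in samples)
    if total == 0:
        raise ValueError("no sensor reports non-zero confidence")
    n_joints = len(samples[0].joints)
    fused = []
    for j in range(n_joints):
        x = sum(s.confidence * s.joints[j][0] for s in samples) / total
        y = sum(s.confidence * s.joints[j][1] for s in samples) / total
        z = sum(s.confidence * s.joints[j][2] for s in samples) / total
        fused.append((x, y, z))
    return fused
```

A sensor with zero confidence contributes nothing to the result, so a hand that only one sensor sees well is effectively tracked by that sensor alone, which matches the stated goal of using the sensor that detects the hands best.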