A Virtual Menu Using Gesture Recognition for 3D Object Manipulation in Mixed Reality

2021 
For years, research on human-computer interaction in mixed reality has explored more natural and efficient ways of interacting. As a common interaction component, the traditional menu of a tangible user interface is not well suited to virtual interactive scenes in mixed reality, and the virtual menu represents a new category of efficient interaction methods in mixed reality. Recently, the combination of gesture recognition with the virtual menu has attracted little attention from researchers. To achieve natural, intuitive, and efficient interaction, this work proposes a virtual menu using gesture recognition for 3D object manipulation in mixed reality. In particular, several gesture states corresponding to the different stages of menu interaction are first defined; then, gestures on the screen are mapped to 3D virtual objects through a coordinate transformation; finally, the style and interaction logic of the menu are introduced. In the experiment, users' interaction results with different menus are evaluated by comparing the time needed and the accuracy achieved on the same tasks. The experimental results show that our proposal effectively reduces the number of interaction operations and improves the efficiency of the interaction process.
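As a rough illustration of the two technical ingredients named in the abstract, the sketch below shows (a) a small set of gesture states tied to stages of menu interaction and (b) a screen-to-world unprojection that maps a 2D gesture point onto a ray in the 3D scene. Every name, the state set, and the matrix conventions are assumptions for illustration only; they are not the authors' implementation.

```python
# Minimal sketch, assuming OpenGL-style NDC and column-vector view/projection
# matrices. Gesture states and function names are hypothetical.
from enum import Enum, auto

import numpy as np


class GestureState(Enum):
    """Example gesture states for successive stages of menu interaction (assumed)."""
    IDLE = auto()        # no hand detected
    HOVER = auto()       # hand detected, a menu item is highlighted
    SELECT = auto()      # pinch/tap gesture confirms the highlighted item
    MANIPULATE = auto()  # the selected 3D object is being moved/rotated/scaled


def screen_point_to_world_ray(screen_xy, screen_size, view, proj):
    """Unproject a 2D screen point into a world-space ray (origin, direction).

    screen_xy   -- (x, y) pixel coordinates of the recognized gesture
    screen_size -- (width, height) of the screen in pixels
    view, proj  -- 4x4 camera view and projection matrices
    """
    x, y = screen_xy
    w, h = screen_size
    # Pixel coordinates -> normalized device coordinates in [-1, 1]
    nx, ny = 2.0 * x / w - 1.0, 1.0 - 2.0 * y / h

    inv_vp = np.linalg.inv(proj @ view)
    # Unproject points on the near (z = -1) and far (z = +1) clip planes
    near = inv_vp @ np.array([nx, ny, -1.0, 1.0])
    far = inv_vp @ np.array([nx, ny, 1.0, 1.0])
    near, far = near[:3] / near[3], far[:3] / far[3]

    direction = far - near
    return near, direction / np.linalg.norm(direction)
```

Intersecting the returned ray with menu panels or object bounding volumes would then identify which virtual object or menu item the gesture targets; that selection step is omitted here.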