Face Commands - User-Defined Facial Gestures for Smart Glasses

2020 
We propose the use of face-related gestures, involving movements of the face, eyes, and head, for augmented reality (AR). This technique enables hands-free, discreet interaction with computer systems. In this paper, we present an elicitation study, conducted via Amazon Mechanical Turk (N=37), that explores the appropriate use of facial gestures for daily tasks in the context of a smart home. Based on the proposed gestures, we report usage scenarios and complexity, proposed gesture-task associations, a user-defined gesture set, and insights from the participants. We also conducted a technical feasibility study (N=13) in which participants wore smart eyewear, equipped with 16 optical sensors and an inertial measurement unit (IMU), to assess its use in daily life. The system could potentially be integrated into optical see-through displays or other smart glasses. The results demonstrate that the device can detect eight temporal face-related gestures with a mean F1 score of 0.911 using a convolutional neural network (CNN). We also report the results of user-independent training and a one-hour recording in which the experimenter tested two of the gestures.
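The abstract does not specify the CNN architecture, so the following is only a minimal sketch of what a windowed 1D-CNN classifier over this kind of multichannel sensor stream might look like. The IMU channel count (6 axes), the window length, the implied sampling rate, and all layer sizes are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

NUM_CHANNELS = 16 + 6   # 16 optical sensors + assumed 6-axis IMU
WINDOW = 200            # hypothetical 2 s window at an assumed 100 Hz
NUM_GESTURES = 8        # eight temporal face-related gestures

class GestureCNN(nn.Module):
    """Minimal 1D-CNN classifier over multichannel sensor windows."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(NUM_CHANNELS, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # global average over time
        )
        self.classifier = nn.Linear(64, NUM_GESTURES)

    def forward(self, x):              # x: (batch, channels, time)
        z = self.features(x).squeeze(-1)
        return self.classifier(z)      # raw logits; softmax for probabilities

# Example forward pass on a batch of random sensor windows.
model = GestureCNN()
logits = model(torch.randn(4, NUM_CHANNELS, WINDOW))
print(logits.shape)  # torch.Size([4, 8])
```

Given ground-truth labels and predictions, a mean F1 score like the reported 0.911 could be computed with sklearn.metrics.f1_score(y_true, y_pred, average='macro'); the macro average over the eight classes is an assumption, as the abstract does not state the averaging method.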