Look Ma, No Hands: A Wearable Neck-Mounted Interface

2021 
Touch-screen interactions have been linked to repetitive strain injuries, and their use as an input channel can be limited by screen size or mobility constraints. In this work, we investigate the feasibility of a neck-mounted wearable interface for software interaction using only subtle head movements. Technologies and configurations for sensing the neck region, including e-textiles and flex sensors, are considered, and the type and placement of the sensors are evaluated. An end-to-end prototype system is developed that takes sensor readings from the bespoke neck hardware and wirelessly transmits them to a smartphone, which carries out the motion classification and interfaces with applications. The classification accuracy of common head movements using classical machine learning algorithms is evaluated; an accuracy of 91% is achieved with data collected from the prototype, for a library of five common head gestures and positions.
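The pipeline described above (sensor windows from neck hardware, features, classical ML classification of five gestures) can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the sensor count, window length, gesture labels, feature set, and the choice of a random forest classifier are all assumptions, and the sensor data here is synthetic.

```python
# Hypothetical sketch of classifying five head gestures from neck-sensor
# windows with a classical ML algorithm (random forest). Sensor count,
# window length, labels, and features are illustrative, not from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
GESTURES = ["nod", "shake", "tilt_left", "tilt_right", "neutral"]
N_SENSORS, WINDOW = 5, 50  # assumed: 5 flex sensors, 50 samples per window

def synth_window(label_idx):
    """Synthetic sensor window: each gesture flexes mainly one sensor."""
    t = np.linspace(0, 1, WINDOW)
    w = rng.normal(0.0, 0.1, (N_SENSORS, WINDOW))  # sensor noise
    w[label_idx] += np.sin(2 * np.pi * (label_idx + 1) * t)
    return w

def features(window):
    """Simple per-sensor summary features: mean, std, min, max."""
    return np.concatenate([window.mean(1), window.std(1),
                           window.min(1), window.max(1)])

# Build a labelled dataset of feature vectors and train/evaluate.
X = np.array([features(synth_window(i % 5)) for i in range(500)])
y = np.array([i % 5 for i in range(500)])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

On this clean synthetic data the classifier separates the gestures easily; real flex-sensor signals would be noisier, which is why sensor type and placement matter for the reported 91% accuracy.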