Across-Sensor Feature Learning for Energy-Efficient Activity Recognition on Mobile Devices

2019 
In this paper we propose an across-sensor representation-learning framework for improving the power-accuracy trade-off in multi-sensor human activity recognition (HAR). The goal of the study is to achieve performance comparable to that of multi-sensor HAR systems while using fewer sensors, or even a single one. This is achieved by learning relations between the sensors at training time and exploiting them at test time. These relations are learned by supervised deep models that use multi-sensor data during training only. Because multiple sensors are no longer needed at test time, they can be turned off and replaced with a single sensor coupled with the learned across-sensor relations, which compensate for the information lost from the disabled sensors. Using fewer sensors reduces the energy consumption of HAR systems deployed on a smartphone. Moreover, it enables HAR systems in settings where multi-sensor data can be collected only during training. This work presents preliminary results achieved with the proposed approach on the SHL dataset. The results show an improvement of up to 14% in the classification accuracy of single-sensor HAR.
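The idea of learning across-sensor relations at training time and applying them at test time can be illustrated with a minimal sketch. The paper uses supervised deep models on the SHL dataset; the toy example below substitutes a least-squares linear map on synthetic data, and all shapes and sensor names are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-window features from two sensors
# (hypothetical dimensions; the paper uses real SHL sensor data).
n_train, n_test = 200, 50
d_acc, d_gyro = 8, 6

# Assume gyroscope features are partially predictable from accelerometer ones.
A = rng.normal(size=(d_acc, d_gyro))
X_acc_train = rng.normal(size=(n_train, d_acc))
X_gyro_train = X_acc_train @ A + 0.1 * rng.normal(size=(n_train, d_gyro))

# Training time: both sensors are available, so learn the across-sensor
# relation (here a least-squares map instead of a deep model).
W, *_ = np.linalg.lstsq(X_acc_train, X_gyro_train, rcond=None)

# Test time: only the accelerometer is on; the gyroscope view is
# reconstructed from the learned relation instead of the real sensor.
X_acc_test = rng.normal(size=(n_test, d_acc))
X_gyro_hat = X_acc_test @ W

# A downstream activity classifier would consume the concatenation of the
# real single-sensor features and the reconstructed ones.
X_test_full = np.concatenate([X_acc_test, X_gyro_hat], axis=1)
print(X_test_full.shape)  # (50, 14)
```

The energy saving comes from the test-time step: the second sensor never has to be powered on, yet the classifier still receives a feature vector of the full multi-sensor width.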