A C-LSTM Neural Network for Human Activity Recognition Using Wearables

2018 
Recognizing human activities and the context in which they occur from sensor data is at the core of many research areas in pervasive computing and has extensive applications in solving real-life, human-centric problems. However, human activity recognition (HAR) is challenging due to the large variability in the motor movements employed for a given action. To enhance recognition accuracy and decrease reliance on engineered features for increasingly complex recognition problems, we introduce a new framework for wearable human activity recognition that combines convolutional and recurrent layers. The convolutional layers act as feature extractors and provide abstract representations of the input sensor data in feature maps; the recurrent layers model the temporal dynamics of the activation of those feature maps. Overall, the proposed network improves on conventional machine learning methods. Experiments on the OPPORTUNITY dataset show that, compared with a baseline LSTM, our algorithm recognizes human activities with an F1 score of 0.918, an increase of 2.4%.
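The abstract describes a stack of convolutional feature extractors followed by recurrent layers over the resulting feature maps. A minimal PyTorch sketch of that pattern is below; the layer sizes, kernel widths, and the 113-channel / 18-class dimensions (typical of the OPPORTUNITY gesture task) are illustrative assumptions, not the paper's reported hyperparameters.

```python
import torch
import torch.nn as nn


class ConvLSTMHAR(nn.Module):
    """Sketch of a convolutional + LSTM network for wearable HAR.

    Hyperparameters are assumed for illustration, not taken from the paper.
    """

    def __init__(self, n_channels=113, n_classes=18,
                 conv_filters=64, lstm_units=128):
        super().__init__()
        # Convolutional layers: feature extractors over the time axis,
        # producing abstract feature maps from the raw sensor channels.
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, conv_filters, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(conv_filters, conv_filters, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # Recurrent layers: model the temporal dynamics of the feature maps.
        self.lstm = nn.LSTM(conv_filters, lstm_units,
                            num_layers=2, batch_first=True)
        self.classifier = nn.Linear(lstm_units, n_classes)

    def forward(self, x):
        # x: (batch, time, channels) -- a sliding window of sensor readings
        h = self.features(x.transpose(1, 2))   # -> (batch, filters, time)
        out, _ = self.lstm(h.transpose(1, 2))  # -> (batch, time, units)
        return self.classifier(out[:, -1])     # class logits, last time step


model = ConvLSTMHAR()
logits = model(torch.randn(8, 24, 113))  # 8 windows of 24 time steps
print(logits.shape)  # torch.Size([8, 18])
```

In this arrangement the convolutions shorten the path the LSTM must model by summarizing local sensor patterns, while the recurrent layers capture how those patterns evolve across the window.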