Visualizing Worklog Based on Human Working Activity Recognition Using Unsupervised Activity Pattern Encoding

2020 
Wearable motion sensor-based complex activity recognition during working hours has recently been studied to evaluate and thereby improve worker productivity. When applying this technique in practical fields, one of the biggest challenges is the time-consuming modeling work, such as data labeling and hand-crafted feature extraction. One way to enable faster modeling is to reduce the time spent on these manual tasks by exploiting unlabeled motion datasets and the characteristics of complex activities. In this study, we propose a working activity recognition method that combines unsupervised encoding of motion patterns (denoted as "atomic activities"), the representation of working activities as combinations of atomic activities, and the integration of additional information such as sensor time. We evaluated our method on an actual dataset from the caregiving field and found that its recognition performance (70.3% macro F-measure) was equivalent to that of a conventional hand-crafted feature extraction method and comparable to that of previous methods using large labeled datasets. We also found that our method could visualize daily work processes with an accuracy of 71.2%. These results indicate that the proposed method has the potential to contribute to the rapid deployment of working activity recognition in actual working fields.
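To make the described pipeline concrete, the following is a minimal sketch of one way the three stages in the abstract could fit together: unsupervised encoding of short motion windows into atomic activities, a histogram-of-atomic-activities representation of each working-activity segment augmented with sensor time, and a supervised classifier trained on a small labeled set. The choice of k-means as the encoder, a random forest as the classifier, the window sizes, and the hour-of-day feature are all illustrative assumptions; the paper's actual components may differ.

```python
# Illustrative sketch only: the encoder (k-means), classifier (random forest),
# window sizes, and time feature are assumptions, not the paper's exact method.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

def sliding_windows(signal, win=128, step=64):
    """Split a (T, channels) motion signal into flattened fixed-length windows."""
    return np.stack([signal[i:i + win].ravel()
                     for i in range(0, len(signal) - win + 1, step)])

# 1) Learn "atomic activities" from an unlabeled motion stream (placeholder data).
unlabeled = np.random.randn(50_000, 3)            # e.g. 3-axis accelerometer
codebook = KMeans(n_clusters=20, random_state=0)  # 20 atomic activities (assumed)
codebook.fit(sliding_windows(unlabeled))

# 2) Represent a working-activity segment as a normalized histogram of
#    atomic-activity assignments, plus sensor time of day as extra information.
def encode_segment(segment, hour_of_day):
    atoms = codebook.predict(sliding_windows(segment))
    hist = np.bincount(atoms, minlength=codebook.n_clusters).astype(float)
    hist /= max(hist.sum(), 1.0)
    return np.append(hist, hour_of_day / 24.0)

# 3) Train a working-activity classifier on a small labeled set (placeholder labels).
segments = [np.random.randn(1_000, 3) for _ in range(40)]
hours = np.random.randint(0, 24, size=40)
labels = np.random.randint(0, 5, size=40)         # e.g. 5 caregiving task classes
X = np.stack([encode_segment(s, h) for s, h in zip(segments, hours)])
clf = RandomForestClassifier(random_state=0).fit(X, labels)
```

Because only stage 3 requires labels, most of the modeling effort in this kind of setup can run on unlabeled data, which is the point of the unsupervised activity pattern encoding described in the abstract.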