IDIoT: Towards Ubiquitous Identification of IoT Devices through Visual and Inertial Orientation Matching During Human Activity

2020 
As Internet-of-Things (IoT) devices become pervasive, opportunities for new, useful services open up. Leveraging existing devices in the environment to enhance the information extracted by personal devices can provide complementary sensing modalities. For instance, an elderly care facility might track the progress of its seniors as they exercise by combining basic fitness-bracelet data with vision-based motion analysis such as activity/exercise detection. In order for these devices to share and aggregate such information, there needs to be an ID association step, where each device's physical ID (i.e., the user and on-body location) is matched to its virtual ID (e.g., IP address). Existing approaches to assist this matching process often require intentional interaction, pre-calibration, or direct line-of-sight to the device, which become inconvenient as the number of devices increases or as devices are obscured by clothing and body parts. The problem gets worse when devices have multiple users (e.g., family members) or multiple devices are involved simultaneously. We present IDIoT, a calibration-free passive sensing approach that utilizes human-device motion to determine the (user, body location) of each device. IDIoT leverages human pose information of the user, captured by existing 2D cameras (such as on smart TVs), and combines it with the 3D inertial sensing present on most IoT devices via bone orientation estimation. This way, IDIoT can associate multiple devices even if they are under clothing or in pockets. We extensively characterize IDIoT through real-world experiments and a publicly available dataset with humans wearing 13 on-body devices. Compared to other state-of-the-art baselines, IDIoT achieves up to 2x improvement in device identification, with an average accuracy of up to 92.2%.
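The core matching step described above can be sketched as an assignment problem: each device's IMU-derived orientation series is compared against each bone's camera-derived orientation series, and the best one-to-one pairing is selected. The sketch below is illustrative only, assuming orientation series are already time-aligned unit direction vectors; the function name and cosine-similarity cost are assumptions, not the paper's exact method.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def match_devices_to_bones(device_dirs, bone_dirs):
    """Assign each IMU device to a body bone by orientation similarity.

    device_dirs: (D, T, 3) unit direction vectors per device over T frames
                 (e.g., gravity-referenced IMU orientations).
    bone_dirs:   (B, T, 3) unit bone directions estimated from camera pose.
    Returns a list of (device_index, bone_index) pairs.
    """
    # Mean cosine similarity between every device/bone pair of series.
    sim = np.einsum('dtk,btk->db', device_dirs, bone_dirs) / device_dirs.shape[1]
    # Hungarian algorithm minimizes total cost = 1 - similarity.
    rows, cols = linear_sum_assignment(1.0 - sim)
    return list(zip(rows.tolist(), cols.tolist()))
```

With 13 on-body devices, as in the paper's dataset, an optimal assignment solver such as the Hungarian algorithm is preferable to greedy matching, since a single wrong pairing cascades into further mismatches.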