Safety-critical human detection featuring time-of-flight environment perception

2017 
Industrie 4.0, Industrial IoT, and cyber-physical production systems in general introduce high levels of automation. Hence, at the shop floor level, for example, the interfaces between humans and machines are crucial. If a robot's environment perception is not robust and fail-safe, humans may not be detected properly, which can have critical consequences. However, given the resource constraints of safety controllers typically used in critical application domains, implementing complex computer vision algorithms on them is often challenging. Here we explore how indirect Time-of-Flight environment perception can be used to speed up complex computer vision algorithms. In particular, we investigate the use case of human detection in critical and resource-constrained application domains, such as factory automation and automotive. We demonstrate that preprocessing based on Time-of-Flight 3D data can reduce the computation time of commonly used computer vision algorithms, such as Viola-Jones, by up to 50%. Furthermore, we showcase a human detection demonstrator case study implemented on an AURIX processing system, which represents a state-of-the-art safety controller. By exploiting the Time-of-Flight technology's depth and amplitude data, safety-critical human detection becomes feasible on the AURIX platform. A comparable solution is hardly achievable with competing environment perception technologies such as structured light or stereo vision due to the safety controller's resource constraints.
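
The core idea, i.e. using depth information to shrink the search space handed to a classical detector, can be illustrated with a minimal sketch. The code below is not the paper's implementation; it assumes a registered pair of ToF amplitude and depth frames and an OpenCV Viola-Jones cascade, and the function name, depth range, and blob-size threshold are illustrative assumptions.

```python
# Minimal sketch, not the paper's implementation: depth-gated region-of-interest
# preprocessing in front of a Viola-Jones cascade. Assumes OpenCV 4.x and a
# registered pair of ToF amplitude (8-bit) and depth (float32, metres) frames.
import cv2
import numpy as np

def detect_humans(amplitude, depth, cascade, near_m=0.5, far_m=4.0):
    # 1. Keep only pixels whose depth lies in the safety-relevant range.
    mask = ((depth > near_m) & (depth < far_m)).astype(np.uint8) * 255

    # 2. Group the remaining pixels into blobs; each blob's bounding box
    #    becomes a candidate ROI, so the cascade never scans the full frame.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    detections = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h < 400:  # illustrative threshold: drop tiny noise blobs
            continue
        roi = amplitude[y:y + h, x:x + w]
        # 3. Run the Viola-Jones detector only inside the depth-selected ROI.
        for (rx, ry, rw, rh) in cascade.detectMultiScale(roi,
                                                         scaleFactor=1.1,
                                                         minNeighbors=3):
            detections.append((x + rx, y + ry, rw, rh))
    return detections

# Example usage with a stock OpenCV cascade (standing in for a person detector):
# cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_fullbody.xml")
# boxes = detect_humans(amplitude_frame, depth_frame, cascade)
```

In this sketch the cascade scans only depth-selected regions of interest instead of the full frame, which is the kind of workload reduction the abstract attributes to ToF-based preprocessing.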