Boundary determination of foot ulcer images by applying the associative hierarchical random field framework

2019 
Abstract: As traditional visual-examination-based methods provide neither reliable nor consistent wound assessment, several computer-based approaches for quantitative wound image analysis have been proposed in recent years. However, these methods require either some level of human interaction for proper image processing or that images be captured under controlled conditions. To become a practical wound-management tool for diabetic patients, a wound image algorithm must correctly locate and delineate the wound boundary in images acquired under less-constrained conditions, where the illumination and camera angle can vary within reasonable bounds. We present a wound boundary determination method that is robust to lighting and camera-orientation perturbations by applying the associative hierarchical random field (AHRF) framework, an improved conditional random field (CRF) model originally applied to multiscale analysis of natural images. To validate the robustness of the AHRF framework for wound boundary recognition, we tested the method on two image datasets: (1) foot and leg ulcer images from patients we have tracked for 2 years, of which 70% were captured with an image capture box to ensure consistent lighting and range, and the remaining 30% were captured with a handheld camera under varied conditions of lighting, incident angle, and range; and (2) moulage wound images captured under similarly varied conditions. Compared to other CRF-based machine learning strategies, our method provides the best global performance rates (specificity: >95%; sensitivity: >77%).
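The reported performance rates are standard pixel-wise segmentation metrics. As a minimal illustrative sketch (not the paper's evaluation code), specificity and sensitivity can be computed from a predicted binary wound mask against a ground-truth mask as follows; the function and variable names here are hypothetical:

```python
# Hedged sketch: pixel-wise sensitivity and specificity for a binary
# wound-segmentation mask versus ground truth. Masks are flat sequences
# of 0/1 values (wound = 1, background = 0).

def confusion_counts(pred, truth):
    """Return (TP, TN, FP, FN) over paired binary pixel labels."""
    tp = sum(1 for p, t in zip(pred, truth) if p and t)
    tn = sum(1 for p, t in zip(pred, truth) if not p and not t)
    fp = sum(1 for p, t in zip(pred, truth) if p and not t)
    fn = sum(1 for p, t in zip(pred, truth) if not p and t)
    return tp, tn, fp, fn

def sensitivity(tp, fn):
    """Fraction of true wound pixels correctly detected (recall)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of true background pixels correctly rejected."""
    return tn / (tn + fp)

# Toy usage on 4 pixels: one wound pixel detected, one false alarm.
tp, tn, fp, fn = confusion_counts([1, 1, 0, 0], [1, 0, 0, 0])
```

In this toy example, sensitivity is 1/(1+0) = 1.0 and specificity is 2/(2+1) ≈ 0.67; the abstract's thresholds correspond to requiring specificity above 0.95 and sensitivity above 0.77 over the evaluated images.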