Building damage classification in high-resolution satellite imagery with convolutional neural networks

2020 
After a natural disaster, the situation on the ground is often unclear, yet an accurate assessment is essential for relief organizations. Since the affected areas are usually difficult to reach, satellite images offer an excellent opportunity to view the affected regions holistically. One indicator for estimating the extent of a disaster is building damage. Convolutional neural networks can classify such damage automatically and thus help identify the most affected areas. This work mainly addresses which disaster types a model should be trained on to achieve the best possible results when classifying a new natural disaster without further training. Several disaster-specific as well as cross-disaster models are trained and tested on their performance in the event of a new tornado, hurricane, or fire. The variability of the results reveals that the success of automatic damage assessment depends on the test area. Another aspect that is critically examined is the feasibility of dividing the data into four damage classes; some experiments show that such a fine-grained division is not equally suitable for all damage types. At the same time, dividing damage into four classes instead of just two, as is often done, opens up new possibilities: the ordinal ordering of the classes is used to approach damage assessment as a regression problem. The results of various experiments show that this regression approach to multilevel damage assessment yields much better results than a standard classification approach. In particular, confusion between destroyed and undamaged buildings is considerably reduced. This is also important when applying a trained model to a new disaster, since misclassifying a destroyed building as undamaged is worse than misclassifying it as only slightly damaged.
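The core idea behind the regression formulation can be sketched briefly. The class names and their ordering below are assumptions for illustration (an xBD-style scale: no damage < minor < major < destroyed); the abstract only states that four ordered damage classes are used. Encoding the ordered classes as scalar targets makes a squared-error loss penalize confusing "destroyed" with "no damage" far more heavily than confusing it with "major damage", which a one-hot classification loss does not.

```python
# Hedged sketch: ordinal damage classes as regression targets.
# The class names/ordering are illustrative assumptions, not from the source.
DAMAGE_CLASSES = ["no-damage", "minor-damage", "major-damage", "destroyed"]

def class_to_target(label: str) -> float:
    """Encode an ordered damage class as a scalar regression target (0..3)."""
    return float(DAMAGE_CLASSES.index(label))

def target_to_class(score: float) -> str:
    """Round a continuous model output to the nearest class, clipped to range."""
    idx = min(max(int(round(score)), 0), len(DAMAGE_CLASSES) - 1)
    return DAMAGE_CLASSES[idx]

def squared_error(pred: float, label: str) -> float:
    """MSE-style loss term: errors grow with ordinal distance between classes."""
    return (pred - class_to_target(label)) ** 2
```

For example, predicting 0.0 for a destroyed building costs 9.0 under this loss, while predicting 2.0 costs only 1.0, reflecting the ordering of the damage scale.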