Visualizing the Learning Progress of Self-Driving Cars

2018 
Using Deep Learning to predict lateral and longitudinal vehicle control, i.e. steering, acceleration and braking, is becoming increasingly popular. However, it remains largely unknown why these models perform so well. Before they can become a commercially viable solution, it must first be understood why a certain behavior is triggered and what these networks learn from human-generated driving data, in order to ensure safety. One research direction is to visualize what the network sees by highlighting the regions of an image that influence the outcome of the model. In this vein, we propose a generic visualization method using Attention Heatmaps (AHs) to highlight what a given Convolutional Neural Network (CNN) learns over time. To do so, we rely on a novel occlusion technique that masks different regions of an input image and observes the effect on the predicted steering signal. We then gradually increase the amount of training data and study the effect on the resulting Attention Heatmaps, both in terms of visual focus and temporal behavior.
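The abstract describes occlusion-based attribution: masking image regions and measuring how much the predicted steering signal changes. Below is a minimal sketch of that general idea in PyTorch, assuming a model that maps an RGB image tensor of shape (1, 3, H, W) to a single steering value; the patch size, stride, and fill value are illustrative choices, not the paper's exact settings.

```python
# Sketch of an occlusion-based attention heatmap (illustrative, not the
# authors' exact method). Assumes `model(image)` returns one steering value.
import torch

def occlusion_heatmap(model, image, patch=16, stride=8, fill=0.0):
    """Slide a square occluder over `image` and record how much the
    predicted steering signal changes at each position."""
    model.eval()
    _, _, h, w = image.shape
    heat = torch.zeros(h, w)
    count = torch.zeros(h, w)
    with torch.no_grad():
        baseline = model(image).item()  # unoccluded steering prediction
        for top in range(0, h - patch + 1, stride):
            for left in range(0, w - patch + 1, stride):
                occluded = image.clone()
                occluded[:, :, top:top + patch, left:left + patch] = fill
                pred = model(occluded).item()
                # Attribute the change in prediction to all pixels under the patch.
                heat[top:top + patch, left:left + patch] += abs(pred - baseline)
                count[top:top + patch, left:left + patch] += 1
    return heat / count.clamp(min=1)  # average effect per pixel
```

Computing such a heatmap for snapshots of the same network trained on increasing amounts of data would allow the kind of comparison of visual focus over training described in the abstract.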