Towards improved accuracy of UAV-based wheat ears counting: A transfer learning method of the ground-based fully convolutional network

2022 
In order to achieve accurate UAV-based wheat ear counting, a transfer learning method of the ground-based fully convolutional network, i.e., EarDensityNet, was proposed in this study. The EarDensityNet, which integrated the filter pyramid block and dilated convolution, was designed to map wheat canopy images to ear density maps generated from dot annotations. The wheat ear count can be obtained by summing all the pixel values of the corresponding ear density map. Results showed strong correlations between the actual number of wheat ears and those estimated by the EarDensityNet, with a high coefficient of determination (R² = 0.9179) and low Root-Mean-Square Error (RMSE = 17.61 ears, NRMSE = 4.47%), outperforming the compared methods. The ground resolution of canopy images had a significant impact on the performance of the EarDensityNet. Transfer learning of the ground-based EarDensityNet could take full advantage of the rich details presented by ground-based images with high pixel resolution, thus effectively alleviating the degradation of counting performance caused by the decreased ground resolution. Accordingly, the fine-tuned EarDensityNet achieved more accurate UAV-based wheat ear counting (R² = 0.9570, RMSE = 801.34, and NRMSE = 22.06%) than one learned from scratch, demonstrating its superiority and applicability. The border effect from splitting digital images with high pixel resolution into sub-images did not pose a major problem for the EarDensityNet, demonstrating its great potential to be generalized from plot-wise to field-wise counting. Wheat ear counting was recommended after the flowering stage, since the textures of wheat ears were more distinct, making it easier for the EarDensityNet to learn complex feature representations.
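The abstract describes the counting pipeline only at a high level: a fully convolutional network with dilated convolutions regresses an ear density map from a canopy image, the count is the sum of the density map's pixel values, and a ground-trained model is fine-tuned on UAV imagery. The sketch below illustrates that workflow under stated assumptions; it is not the authors' released code, and the layer layout, checkpoint name, and training hyperparameters are hypothetical stand-ins.

```python
# Hypothetical sketch of density-map-based ear counting with transfer learning.
# The architecture below is a toy stand-in, not the published EarDensityNet.
import torch
import torch.nn as nn

class ToyEarDensityNet(nn.Module):
    """Maps an RGB canopy image to a single-channel ear density map."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
            # Dilated convolutions enlarge the receptive field without
            # extra down-sampling, as mentioned for the EarDensityNet.
            nn.Conv2d(64, 64, 3, padding=2, dilation=2), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=4, dilation=4), nn.ReLU(inplace=True),
            nn.Conv2d(64, 1, 1),  # 1x1 conv -> density map
        )

    def forward(self, x):
        return self.features(x)

def predicted_count(density_map: torch.Tensor) -> float:
    # The ear count is the sum of all pixel values of the density map.
    return density_map.sum().item()

def fine_tune_on_uav(model, uav_loader, epochs=10, lr=1e-5, device="cpu"):
    """Transfer learning step: start from ground-trained weights and fine-tune
    on UAV sub-images against dot-annotation-derived density maps (MSE loss)."""
    model.to(device).train()
    criterion = nn.MSELoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for images, target_density in uav_loader:
            images = images.to(device)
            target_density = target_density.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), target_density)
            loss.backward()
            optimizer.step()
    return model

if __name__ == "__main__":
    model = ToyEarDensityNet()
    # model.load_state_dict(torch.load("ground_based_weights.pth"))  # hypothetical checkpoint
    dummy_patch = torch.rand(1, 3, 256, 256)  # one sub-image split from a larger canopy image
    print("estimated ears in patch:", predicted_count(model(dummy_patch)))
```

In this formulation, splitting a high-resolution image into sub-images only requires summing the per-patch counts, which is why the border effect noted in the abstract matters mainly for ears cut by patch boundaries.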