Census Parcels Cropping System Classification from Multitemporal Remote Imagery: A Proposed Universal Methodology

2015 
A procedure named CROPCLASS was developed to semi-automate census parcel crop assessment in any agricultural area using multitemporal remote images. For each area, CROPCLASS consists of a) the definition of census parcels through vector files in all of the images; b) the extraction of average spectral band (SB) and key vegetation index (VI) values for each parcel and image; c) the assembly of the extracted information into a data matrix (MD); d) the classification of the MD using decision trees (DT) and the definition of Structured Query Language (SQL) crop predictive models, also based on preliminary land-use ground-truth work in a reduced number of parcels; and e) the implementation of the predictive models to classify the land uses of unidentified parcels. Software named CROPCLASS-2.0 was developed to perform the described procedure semi-automatically and in an economically feasible manner. The CROPCLASS methodology was validated using seven GeoEye-1 satellite images taken over the La Ventilla area (Southern Spain) from April to October 2010 at 3- to 4-week intervals. The studied region was visited every 3 weeks, identifying 12 crops and other land uses in 311 parcels. The DT training models for each cropping system were assessed at 95% to 100% overall accuracy (OA) for each crop within its corresponding cropping system. The DT training models used to directly identify the individual crops were assessed at 80.7% OA, with a user accuracy of approximately 80% or higher for most crops. Generally, the DT model accuracy was similar whether using the seven images taken at approximately one-month intervals, a set of three images taken during early spring, summer, and autumn, or a set of two images taken at approximately 2- to 3-month intervals. The classification of the unidentified parcels for the individual crops was achieved with an OA of 79.5%.
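Steps b) and c) of the procedure, extracting per-parcel average vegetation index values from each image date and assembling them into a data matrix, can be sketched as follows. This is a minimal illustration, not the CROPCLASS-2.0 implementation: parcel geometries, band extraction from imagery, and the DT/SQL modeling steps are omitted, and all function and variable names are hypothetical. NDVI is used as an example vegetation index.

```python
# Hypothetical sketch of CROPCLASS steps b)-c): per-parcel mean NDVI per
# image date, assembled into a data matrix (one row per parcel, one
# column per date). Real inputs would come from vector parcel files
# overlaid on multitemporal satellite imagery.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    total = nir + red
    return (nir - red) / total if total else 0.0

def parcel_mean_ndvi(pixels):
    """Average NDVI over the (NIR, Red) pixel pairs inside one parcel."""
    values = [ndvi(nir, red) for nir, red in pixels]
    return sum(values) / len(values)

def build_data_matrix(images):
    """One feature row per parcel: mean NDVI at each image date."""
    parcel_ids = sorted(images[0])
    return {
        pid: [parcel_mean_ndvi(image[pid]) for image in images]
        for pid in parcel_ids
    }

# Toy example: two image dates, two parcels, (NIR, Red) pixel pairs.
april = {"P1": [(0.6, 0.2), (0.8, 0.2)], "P2": [(0.3, 0.3)]}
july = {"P1": [(0.9, 0.1)], "P2": [(0.5, 0.5)]}
md = build_data_matrix([april, july])
# md["P1"] -> [0.55, 0.8]; md["P2"] -> [0.0, 0.0]
```

In the full procedure, each row of this matrix (augmented with the other spectral band and index averages) becomes one training or prediction sample for the decision-tree crop models.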