Article

Evaluation of Using Sentinel-1 and -2 Time-Series to Identify Winter Land Use in Agricultural Landscapes

1
Institute of Electronics and Telecommunications of Rennes IETR, UMR CNRS 6164, University of Rennes, 35000 Rennes, France
2
Littoral-Environnement-Télédétection-Géomatique LETG UMR 6554, University of Rennes, 35 000 Rennes, France
3
Internal Research Unit Forests & Societies, Centre de Coopération Internationale en Recherche Agronomique pour le Développement CIRAD, 34 398 Montpellier, France
4
L’unité mixte de recherche Biodiversité, AGroécologie et Aménagement du Paysage UMR BAGAP, Institut National De La Recherche Agronomique, INRA, 35 000 Rennes, France
*
Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(1), 37; https://doi.org/10.3390/rs11010037
Submission received: 20 November 2018 / Revised: 13 December 2018 / Accepted: 20 December 2018 / Published: 27 December 2018

Abstract

Monitoring vegetation cover during winter is a major environmental and scientific issue in agricultural areas. From an environmental viewpoint, the presence and type of vegetation cover in winter influences the transport of pollutants to water resources. From a methodological viewpoint, characterizing spatio-temporal dynamics of land cover and land use at the field scale is challenging due to the diversity of farming strategies and practices in winter. The objective of this study was to evaluate the respective advantages of Sentinel optical and SAR time-series to identify land use in winter. To this end, Sentinel-1 and -2 time-series were classified using Support Vector Machine and Random Forest algorithms in a 130 km² agricultural area. From the classification, the Sentinel-2 time-series identified winter land use more accurately (overall accuracy (OA) = 75%, Kappa index = 0.70) than that of Sentinel-1 (OA = 70%, Kappa = 0.66) but a combination of the Sentinel-1 and -2 time-series was the most accurate (OA = 81%, Kappa = 0.77). Our study outlines the effectiveness of Sentinel-1 and -2 for identifying land use in winter, which can help to change agricultural practices.

1. Introduction

Monitoring vegetation cover during winter is a major environmental and scientific issue in agricultural areas. From an environmental viewpoint, the presence and type of vegetation cover in winter influences the transport of pollutants to waterbodies by reducing the loss of nitrates, nutrients, pesticides or sediment from agricultural fields [1,2]. Lack of vegetation cover acts as an accelerator when soils are bare after a main crop (e.g., maize, rapeseed), while catch crops act as an obstacle to transport [3]. In this context, identifying and characterizing winter land use is a major component of water quality restoration and sustainable management in agricultural landscapes [4]. From a methodological viewpoint, characterizing spatio-temporal dynamics of land use and land cover (LULC) at the field scale is challenging due to the diversity of farming strategies and practices in winter. Identifying winter land use remains a major scientific challenge for the remote sensing community. While optical remotely sensed data are used mainly to determine annual LULC [5], they have several limitations for identifying land use in winter.
Time-series of medium-resolution remotely sensed data have been shown to be useful for classifying LULC over large areas [6,7] or for mapping and monitoring bare soils during winter in intensive agricultural regions [8,9]. However, due to their insufficient spatial resolution, these time-series can detect only patches of bare soil for winter land use at the field scale [10]. High and very high spatial resolution remotely sensed data are also widely used to discriminate the main crop rotations and land uses at the field scale [11,12]. Refs. [13,14] demonstrated the potential of very high resolution remotely sensed data to monitor land use during summer. Refs. [15,16] showed the ability of high and very high spatial resolution images (Landsat and IKONOS) to identify crop residues during winter with high accuracy (overall accuracy (OA) > 80%). However, soil surface and vegetation growth conditions vary daily, seasonally and among fields [17]. Capturing these variations, which is necessary to identify land use types, requires acquiring several remotely sensed images during winter.
Optical remotely sensed data are under-exploited for identifying winter land use, mainly because few cloud-free images are available in winter for monitoring intra-annual dynamics of crops. Synthetic-Aperture Radar (SAR) data provide a reliable solution to address the limitations of optical images because they are not sensitive to atmospheric conditions and can be acquired during the day or night [18]. Thus, time-series of SAR images can be acquired to study intra-annual changes in vegetation [19]. SAR images have been used extensively to map land use in winter, especially to identify bare soil and tillage practices [20,21]. SAR time-series can also be used to identify inter-crops and crop residues. Refs. [22,23] demonstrated that vegetation structure and phenology is directly related to the backscattering mechanisms that occur between the SAR signal and land surfaces.
However, while optical and SAR data should be complementary, few studies have evaluated the combined use of optical and SAR time-series to identify and characterize land use [5,24,25] and, to our knowledge, none has done so for land use in winter. The development of Sentinel-1 and -2 sensors, which acquire optical and SAR data with high spatial and temporal resolutions, provides interesting opportunities to monitor winter land use. Until now, few studies have evaluated the use of Sentinel-1 (SAR) and Sentinel-2 (optical) time-series for monitoring land use, either separately or combined [26,27]. Refs. [28,29] identified land use classes during summer using Sentinel-2 time-series, with OA > 91%. However, only three studies [28,29,30] have used Sentinel-2 data to monitor intra-annual land use changes, mainly because few cloud-free images are available during winter. Ref. [30] has shown the potential of Sentinel-2 time-series compared to Landsat-8 and SPOT-5 time-series for detecting changes in LULC, with better results obtained with Sentinel-2 than the other data. Refs. [28,29] demonstrated the ability of Sentinel-2 single-date images and time-series to map land use changes during the growing season, with OA > 90%. Several studies evaluated the use of Sentinel-1 data for identifying and monitoring land cover during summer, with OA > 80% [31,32]. Ref. [33] classified Sentinel-1 time-series to map winter vegetation in five quality classes (“bare soil” to “high quality”) using a deep-learning approach, with OA > 98%. Only two studies investigated the potential of the use of combined Sentinel-1 and -2 time-series to identify land cover. Refs. [34,35] demonstrated that the combined use of Sentinel-1 and -2 time-series increased the OA by 5–10 percentage points. Nevertheless, the evaluation of using Sentinel-1 and -2 time-series to monitor winter land use remains unexplored.
The aim of this study was to evaluate the ability to use Sentinel-1 and -2 time-series to identify winter land use. To this end, a time-series of nine Sentinel-1 and -2 images acquired during a single hydrological year in autumn, winter and early spring was processed, first separately and then combined. Optical and SAR parameters, such as backscattering coefficients and vegetation indices, were extracted first. These parameters were then used to perform classifications with Random Forest (RF) and Support Vector Machine (SVM) algorithms [36,37].

2. Study Site and Data

2.1. Study Site

The “Zone Atelier Armorique” study site, a long-term ecological research (LTER) site in the LTER-Europe and ILTER networks, is located south of the Bay of Mont-Saint-Michel (48°31’0” N, 1°31’30” W), France (Figure 1). It was established in 1993 to assess relationships between changes in farming practices, landscape dynamics and ecological processes related to biodiversity, water quality and climate [38]. This agricultural area covers ca. 130 km² and has a temperate climate with an annual mean temperature of 12 °C, minimum mean temperature for the coldest year of 8 °C, maximum mean annual temperature of 16 °C and mean annual precipitation of 650 mm. The site contains ca. 7000 agricultural fields surrounded by a hedgerow network; field size ranges from 0.1–65 ha, with a mean of 2.1 ha. The crop system is characterized by a single crop planted per field each year. The main annual crops are maize, wheat, rapeseed and barley. In winter, catch crops are grown to prevent nitrogen leaching, in the framework of the Nitrate Directive of the European Union [39] and some of the catch crops are fed to cattle. In certain areas, soils are not completely covered by vegetation. Hence, it is important to locate such areas to advise farmers how to implement the best management practices.

2.2. Field Data

Land Use Data

The winter land use types studied are winter crops, grasslands, catch crops, crop residues and bare soil (Figure 2). Winter crops (wheat, barley and rapeseed) cover 40% of the utilized agricultural area (UAA) (Table 1). Wheat and barley sown in October have similar plant structure and phenology. Barley is harvested at the end of June or beginning of July, whereas wheat is harvested mainly in the middle of July. Rapeseed is sown mainly in September and harvested at the beginning of July. At full development, rapeseed plants are twice as tall (ca. 1.8 m) as wheat and barley and their stems are intertwined, with no clear vertical structure [27]. Grasslands, which are considered to have a major influence in regulating water flows and nutrient cycling [40], cover ca. 30% of the UAA. They are mown or grazed, which explains their similar plant structure but different phenology. Catch crops, sown after the main crops (August–October), cover 25% of the UAA, are diverse (e.g., oat, phacelia, mustard) and show different plant structure and phenology. Although bare soils have been banned for several years to avoid soil erosion and water pollutant flows, they cover an average of 5% of the UAA. Residues of annual crops (cereal stubble or maize stalks) cover parts of fields.
We conducted ground surveys every ten days between November 2016 and February 2017 on 257 crop fields to identify land use in order to calibrate and validate the classification of remote sensing data (Figure 1). Samples were randomly distributed throughout the study site; two-thirds (171 fields) were used for training and the other one-third (86 fields) for validation. The size of the inventoried fields ranged from 0.1–65 ha. The objects taken into account in this study for the object-oriented approach correspond to the 257 fields identified in situ from November 2016 to May 2017. We had a vector layer of field boundaries on which we reported the ground surveys.

2.3. Satellite Imagery

A series of nine optical Sentinel-2 and nine SAR Sentinel-1 images were acquired from autumn to spring (August 2016 to May 2017) from the data hub of the European Space Agency and Centre National d’Etudes Spatiales [41]. Sentinel-1 images were acquired in Single Look Complex (SLC) mode (delivered with VH and VV polarization states) with an incidence angle of 45–47°. The associated range and azimuth spatial resolutions were 2.3 and 13.9 m, respectively. Sentinel-2 level 2A images were acquired (i.e., corrected for geometric and atmospheric effects) with a spatial resolution of 10–60 m and 13 spectral bands. Characteristics of the optical and SAR images are summarized in Table 2.

3. Materials and Methods

The full dataset consisted of parameters derived from the optical and SAR image time-series. The method developed to identify winter vegetation cover from this dataset had three steps: (i) pre-processing the SAR and optical image time-series, (ii) selecting the most consistent parameters and (iii) classifying the optical and SAR time-series.

3.1. Pre-Processing of Time-Series

3.1.1. Pre-processing of Sentinel-1 Images

Backscattering Coefficients

Sentinel-1 images were first radiometrically calibrated (Figure 3) using SNAP (Sentinel Application Platform) v5.0 software with the following equation [42]:
value(i) = |DNᵢ|² / Aᵢ²,
where DN is the digital number of each pixel (amplitude of backscattering signal) and A is the information necessary to convert SAR reflectivity into physical units provided in the Calibration Annotation Data Set in the image metadata. This equation transforms the DN of each pixel into a backscattering coefficient on a linear scale.
A refined Lee filter [43,44] was then applied with a window of 7 × 7 pixels to reduce speckle noise using SNAP v5.0 software. This window size was selected to decrease speckle noise while preserving a suitable spatial scale, which was necessary to ensure identification of winter land use. Sentinel-1 images were then geocoded using Shuttle Radar Topography Mission 3s data to correct topographic deformations. The accuracy of geometric correction was less than 10 m per pixel. A backscattering ratio was calculated by dividing σ⁰VH by σ⁰VV. This ratio highlights differing scattering mechanisms of each target. The backscattering coefficients σ⁰VH and σ⁰VV and the backscattering ratio σ⁰VH/σ⁰VV were then converted into decibels (dB) using the following equation:
σ⁰(dB) = 10 × log₁₀(σ⁰),
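The calibration and dB conversion above can be sketched in a few lines of Python. The paper performed these steps with SNAP; this minimal numpy version, with purely illustrative DN and calibration values, is only a sketch of the workflow, not the actual processing chain:

```python
import numpy as np

def calibrate(dn, a):
    """Sentinel-1 radiometric calibration: sigma0 = |DN|^2 / A^2,
    where A comes from the Calibration Annotation Data Set."""
    return (np.abs(dn) ** 2) / (a ** 2)

def to_db(sigma0):
    """Convert a linear backscattering coefficient to decibels."""
    return 10.0 * np.log10(sigma0)

# Toy values (illustrative, not real Sentinel-1 DNs or LUT entries)
dn_vh, dn_vv = np.array([120.0]), np.array([380.0])
a = np.array([474.0])  # hypothetical calibration constant from the metadata

sigma0_vh = calibrate(dn_vh, a)
sigma0_vv = calibrate(dn_vv, a)
ratio = sigma0_vh / sigma0_vv                    # VH/VV ratio (linear scale)
ratio_db = to_db(sigma0_vh) - to_db(sigma0_vv)   # same ratio expressed in dB
```

Note that dividing in linear scale and subtracting in dB are equivalent, which is why the ratio can be computed either way.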

Polarimetric Parameters

A 2 × 2 covariance matrix ( C 2 ) was first extracted from the scattering matrix S of each SLC image using PolSARpro v5.1.1 software [45]. The elements of the matrix, which are independent of the polarimetric absolute phase [46], were directly geocoded using SNAP v5.0 software. A refined Lee filter was then applied using a window of 7 × 7 pixels to reduce speckle noise.
The first (C11) and fourth (C22) elements of the matrix, that is, its diagonal elements, were compared to the backscattering coefficients. The SPAN, which corresponds to the total scattered power, and the Shannon Entropy (SE), which corresponds to the sum of two contributions related to the intensity (SEi) and the degree of polarization (SEp) [46], were then calculated from the matrix. SE measures the disorder encountered in polarimetric SAR images using the following equation:
SE = log(π²e²|C2|) = SEi + SEp,
A total of 13 polarimetric parameters were derived: the elements of the matrix, the SPAN, SE, SEi and SEp (Table 3).
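As an illustration of the SE computation, the following numpy sketch splits SE into its intensity and polarization contributions. The paper computed these parameters with PolSARpro/SNAP; the split SEi = 2·log(πe·Tr(C2)/2) and SEp = log(4|C2|/Tr(C2)²) is one standard form of the decomposition (assumed here, not quoted from the paper), and the toy matrix is purely illustrative:

```python
import numpy as np

def shannon_entropy(c2):
    """Shannon Entropy of a 2x2 polarimetric covariance matrix C2,
    split into intensity (SEi) and polarization (SEp) contributions."""
    det = np.real(np.linalg.det(c2))
    tr = np.real(np.trace(c2))
    se = np.log(np.pi**2 * np.e**2 * det)         # SE = log(pi^2 e^2 |C2|)
    se_i = 2.0 * np.log(np.pi * np.e * tr / 2.0)  # intensity contribution
    se_p = np.log(4.0 * det / tr**2)              # polarization contribution
    return se, se_i, se_p

# Toy Hermitian, positive-definite covariance matrix
c2 = np.array([[0.5, 0.1 + 0.05j],
               [0.1 - 0.05j, 0.2]])
se, se_i, se_p = shannon_entropy(c2)
assert np.isclose(se, se_i + se_p)  # the two contributions sum to SE
```

Since 4|C2| ≤ Tr(C2)² for any positive-definite C2, the polarization term SEp is never positive; it vanishes only for fully unpolarized scattering.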

3.1.2. Pre-processing of Sentinel-2 Images

Sentinel-2 images were pre-processed using the CNES Kalideos processing chain [47,48]. They were corrected for atmospheric disturbances, orthorectified and georeferenced based on the Universal Transverse Mercator (UTM) reference system (zone 30N). The Level-2A Sentinel-2 time-series, derived from the Sentinel-2 Level-1 time-series by correcting atmospheric and geometric effects, was assessed through visual interpretation of reference target reflectance (water, buildings, etc.).

Calculation of Vegetation Indices and Biophysical Parameters

Based on the literature, two vegetation indices, one water index and three biophysical parameters were computed using SNAP v5.0 software. The two vegetation indices were derived from the near-infrared and red bands: the Normalized Difference Vegetation Index (NDVI; [49]) and the Soil Adjusted Vegetation Index (SAVI; [50]), which have demonstrated their relevance for studying land use [51,52]. The Normalized Difference Water Index (NDWI; [53]) was calculated from the near- and middle-infrared bands and is often used to estimate water content in vegetation [54]. Biophysical parameters (Leaf Area Index, fraction of photosynthetically active radiation and fractional vegetation cover) were calculated using the PROSAIL radiative transfer model [55,56] implemented in SNAP v5.0 software. These parameters describe the state of the vegetation cover and provide information on the density of green vegetation [57].
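The three indices have closed-form definitions, sketched below with the standard formulas (NDVI, SAVI with the usual soil-adjustment factor L = 0.5, and Gao's NDWI); the reflectance values and the band mapping in the comment are illustrative assumptions, not taken from the paper:

```python
def ndvi(nir, red):
    # Normalized Difference Vegetation Index
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    # Soil Adjusted Vegetation Index; L = 0.5 is the common default
    return (1 + L) * (nir - red) / (nir + red + L)

def ndwi(nir, swir):
    # Normalized Difference Water Index (Gao), sensitive to vegetation water
    return (nir - swir) / (nir + swir)

# Illustrative reflectances (for Sentinel-2, e.g. red = B4, NIR = B8, SWIR = B11)
red, nir, swir = 0.05, 0.45, 0.20
print(round(ndvi(nir, red), 3))  # 0.8
```

Because SAVI adds the constant L to the denominator, it damps the index over sparsely vegetated, soil-dominated pixels, which is precisely why it is attractive for winter fields.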
In total, the number of parameters to be processed was 297: 162 for Sentinel-2 data (18 parameters × 9 dates) and 135 for Sentinel-1 (15 parameters × 9 dates).

3.2. Processing Sentinel-1 and -2 Time-Series

The method for processing Sentinel-1 and -2 time-series we developed had three steps (Figure 4): (i) extracting features to yield one parameter value per field, (ii) reducing the optical and SAR parameter dataset using a data-mining algorithm and (iii) classifying vegetation cover using RF and SVM algorithms.

3.2.1. Feature Extraction

The parameters derived from the Sentinel time-series (Table 3) were used in two classification approaches: an object-based and a pixel-based approach. Hedgerows, which are considered noisy features in land use mapping, were removed from the images by applying a 5 m negative buffer around the 257 field boundaries observed on the study site. Then, two feature extraction sequences were performed: (i) for the object-based approach, the mean and median values of optical and SAR parameters were calculated at the field scale; (ii) for the pixel-based approach, 100 pixels were randomly selected from the 257 ground surveys for each winter land use class (5 classes × 100 pixels), representing a total of 500 pixels.
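The two extraction sequences can be illustrated with a toy numpy sketch. The paper worked from a vector layer of field boundaries; here a raster of field IDs stands in for it, and all values are synthetic, so this is only an assumed illustration of the two strategies:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layers: one parameter raster and a field-ID raster of the same shape
param = rng.random((50, 50))
field_id = rng.integers(1, 6, size=(50, 50))   # 5 hypothetical fields

# (i) Object-based: mean and median of the parameter per field
object_features = {
    fid: (param[field_id == fid].mean(), np.median(param[field_id == fid]))
    for fid in np.unique(field_id)
}

# (ii) Pixel-based: randomly draw 100 pixels per unit (here, per field,
# standing in for the per-class sampling used in the paper)
pixel_samples = {
    fid: rng.choice(param[field_id == fid], size=100, replace=False)
    for fid in np.unique(field_id)
}
```

In practice the object-based features would be computed after the 5 m negative buffer, so that hedgerow pixels never enter the per-field statistics.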

3.2.2. Reduction of the Parameter Dataset

A two-step approach was used to remove noise and inconsistent data from the parameter datasets used for the classification. First, a correlation matrix was computed and a threshold of ±0.95 was applied to find the most correlated parameters (268 out of 297 parameters). The choice of parameters to remove was then based on the current state of the art, retaining the most relevant parameters for the study of winter land use [27,51,52]. At this stage, the dataset consisted of 144 parameters (90 optical and 54 radar parameters). In a second step, the relevance of the parameters was analysed using the RF importance function to further reduce the number of parameters to be classified. A measure of variable importance was provided for each candidate predictor using this heuristic method based on the Gini Index [36,58,59]. The break in the histogram was used as the threshold for selecting the most relevant parameters, that is, 15 optical and 15 SAR parameters (Figure 5). These 30 parameters were used for the winter land use classifications.
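The first, correlation-based step can be sketched as follows. This is an assumed greedy implementation (the paper does not specify how ties between correlated parameters were broken beyond expert choice), on toy data where one column is deliberately made a near-duplicate of another:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((200, 12))                            # 200 samples x 12 toy parameters
X[:, 5] = 0.99 * X[:, 2] + 0.01 * rng.random(200)    # near-duplicate of column 2

corr = np.corrcoef(X, rowvar=False)
threshold = 0.95
keep = np.ones(X.shape[1], dtype=bool)
for i in range(X.shape[1]):
    for j in range(i + 1, X.shape[1]):
        # When two kept parameters are correlated beyond +/-0.95, drop the later one
        if keep[i] and keep[j] and abs(corr[i, j]) > threshold:
            keep[j] = False

X_reduced = X[:, keep]  # here, 11 of the 12 columns survive
```

In the study itself, the parameter to drop from each correlated pair was chosen from the literature rather than by position; the ranking step that follows (RF Gini importance) then reduces the surviving set to the final 30 parameters.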

3.2.3. Land Use Classification

The RF and SVM algorithms, which are supervised classification methods, were used to classify land use during winter 2016–2017. These algorithms were chosen for their consistently strong performance and the accuracy with which they classify LULC [60,61]. The RF algorithm is an ensemble algorithm that uses a set of classification and regression trees to make a prediction [36]. The package randomForest developed by [62] and implemented in R (v.3.3.2) was used to perform winter land use classifications. Two Random Forest parameters, namely the number of trees (ntree), each of which is built from samples randomly selected from the training dataset [28], and the number of variables used for splitting tree nodes (mtry), were tuned using the tune function implemented in the randomForest package. For this study, the ntree parameter was set to 1000, as several articles have demonstrated that the error rate remains stable beyond 1000 classification trees [63]. The SVM algorithm is based on statistical learning theory and aims to determine the location of decision boundaries that produce an optimal separation of classes [37]. In a two-class pattern-recognition problem in which classes can be separated linearly, the SVM selects the linear decision boundary that creates the greatest margin between the two classes. The margin is the sum of distances to the hyperplane from the closest points of the two classes [37]. Thus, it initially extracts the best linear boundary between two classes of the training set; however, it is not restricted to linear discrimination, since one of its main advantages is its extension to nonlinear discrimination via the kernel trick [19]. The package e1071 (v 1.7-0) developed by [64] and implemented in R (v.3.3.2) was used to perform SVM classifications. A set of four SVM parameters (gamma, cost, degree, nu) was tuned using the tune function integrated in the e1071 package.
Several tests were then carried out to determine the optimal kernel for winter land use classification; at the end of this selection process, the polynomial kernel was selected. Results obtained with the RF and SVM algorithms were compared to evaluate their suitability for classifying vegetation cover into land use classes. Classification performance was estimated using a cross-validation test. The classification was applied to a varying subset of the 257 fields (500 pixels): two-thirds (171 fields or 332 pixels) were used for training and one-third (86 fields or 168 pixels) for validation [19]. This process was repeated by changing the training/validation subsamples. Classification accuracy was assessed using OA and the Kappa index, which expresses the proportional decrease in error generated by the classification compared to the error of a completely random classification [65]. Finally, land use was mapped using the algorithm with the highest OA. The classification process was tested for the two approaches, object-based and pixel-based. Sentinel-1 and Sentinel-2 parameters were classified separately and then combined in the same dataset, that is, the optical and SAR data were fused at the lowest processing level (pixel level), which refers to the merging of measured physical parameters [66].
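The two accuracy measures have simple closed forms given a confusion matrix: OA is the observed agreement (trace over total), and Kappa rescales it by the agreement expected by chance. A small sketch, with a purely illustrative 3-class matrix:

```python
import numpy as np

def accuracy_metrics(cm):
    """Overall accuracy and Kappa index from a confusion matrix
    (rows = reference classes, columns = predicted classes)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    oa = np.trace(cm) / n                                  # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2    # chance agreement
    kappa = (oa - pe) / (1.0 - pe)
    return oa, kappa

# Toy 3-class confusion matrix (illustrative counts)
cm = [[50, 5, 5],
      [4, 40, 6],
      [6, 4, 30]]
oa, kappa = accuracy_metrics(cm)
print(round(oa, 2), round(kappa, 2))  # → 0.8 0.7
```

Kappa is always below OA whenever chance agreement is positive, which is why the paper's Kappa values (e.g., 0.77) sit slightly under the corresponding OAs (81%).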

4. Results and Discussion

4.1. Importance of Optical and SAR Parameters for Identifying Winter Land Use

The correlation analysis computed with a threshold of ±0.95 was applied to find the most correlated parameters (268 out of 297 parameters), which resulted in the removal of 153 parameters. The correlation matrix is not displayed here because the number of parameters (297) would make it unreadable.
Figure 5 shows the parameter contribution to land use classifications, ranked by importance for optical and Synthetic-Aperture Radar image time-series. The break in the histogram (after band 5 in May for Sentinel-2; after polarized Shannon Entropy in April for Sentinel-1) was used as the threshold for selecting the most relevant parameters, that is, 15 optical and 15 SAR parameters.
The most important SAR parameter was the ratio VH/VV in May, which highlights crop growth, especially of winter crops [27]. Similarly, the backscattering coefficients were also important. Results show the importance of parameters derived from the May and April images due to their sensitivity to variations in double-bounce and volume-scattering mechanisms [67]. The backscattering coefficient VH calculated from the November image was also important due to its sensitivity to direct contributions from the ground and the canopy [68]. These parameters highlight the difference between bare soils and crops.
Concerning optical parameters, the two most important parameters were band 2 in December 2016 and SAVI in May 2017, with a Gini index above 70. Notably, the best NDVI parameter ranked only 14th, which is not in accordance with the current literature [52]. Results also show the importance of parameters derived from the May image, with 11 of the 15 most important parameters derived from this image. This is because this period is the main phase of highly dynamic plant growth: vegetation peaks in May (the end of spring), when grasslands are easily detected (Figure 5).
The spectral distributions of two of the most pertinent parameters, that is, SAVI in May 2017 and VH/VV ratio in May 2017, were computed for the five classes of winter land use (Figure 6). SAVI was selected over Band 2 based on the current literature, which demonstrated the potential of this vegetation index to discriminate and characterize crop dynamics [28,54,69].
Figure 6a shows the potential of SAVI to discriminate winter crops and grasslands from catch crops and crop residues in May, when the phenological stages and land use changes are most pronounced. Conversely, results highlight the difficulty of discriminating bare soils from the other classes, due to high intra-class variance. To a lesser extent, Figure 6b shows the potential of the VH/VV ratio to further separate winter crops and grasslands from catch crops and crop residues. However, like SAVI, the VH/VV ratio is not sufficient to separate bare soils from other classes due to very high intra-class variance. These results are consistent with the existing literature, which demonstrated the importance of using backscattering coefficients alone (σ⁰VH or σ⁰VV) or in combination (VH/VV) to identify land use [70].

4.2. Winter Land Use Classification

Accuracy of the winter land use classification obtained from optical and SAR parameters varied significantly depending on (i) the classification approach (i.e., pixel-based or object-based), (ii) the classification algorithm (i.e., RF or SVM) and (iii) the time-series dataset (i.e., Sentinel-1, Sentinel-2 or a combination of both).
Results of the pixel-based and object-based approaches to identifying winter land use had similar accuracy for the Sentinel-2 dataset and the combined Sentinel-1 and -2 dataset. The object-based approach had a slightly higher OA (68–83%, Kappa = 0.64–0.77) than the pixel-based approach (55–81%, Kappa = 0.51–0.77) (Figure 7). Although this is consistent with other studies of LULC [71], few studies have presented advantages of using the object-based approach instead of the pixel-based approach to identify and characterize LULC [72,73]. In contrast, results obtained with the Sentinel-1 data showed the superiority of the object-based approach, due to the heterogeneity of SAR values within fields in winter.
Concerning the classification techniques, the RF algorithm had a higher OA than the SVM algorithm (median OA = 81% and 79%, respectively). The RF algorithm also had less variation in OA than SVM (72–83% (Kappa = 0.67–0.77) and 68–80% (Kappa = 0.64–0.76), respectively). The potential of SVM and RF algorithms for remote sensing studies has been widely demonstrated [59,60,61]. Our results show that the RF algorithm is slightly more accurate than the SVM algorithm, which is consistent with results of other studies [59,61,74].
This study evaluated the respective advantages of Sentinel optical and SAR time-series to identify winter land use. Classification was better using a combination of Sentinel-1 and -2 parameters (median OA = 81%, Kappa = 0.77) (Table 4), with OA ranging from 75–82% (Kappa = 0.68–0.77). Conversely, classifications based on either Sentinel-1 or Sentinel-2 parameters alone had OAs of 68–78% and 74–80%, respectively (Figure 7). While the results highlight the utility of Sentinel-1 and Sentinel-2 individually, they also emphasize that classification using the Sentinel-2 dataset always outperformed that using the Sentinel-1 dataset. The classification results highlight the advantages of using the combined Sentinel-1 and -2 datasets, with OA ranging from 68–83% (Kappa = 0.64–0.77). Therefore, our study confirms the effectiveness of Sentinel-1 and -2 time-series for identifying land use, as previous studies have demonstrated [26,27] and also shows the potential of using the combined Sentinel-1 and -2 datasets for this purpose. Additionally, the originality of this study is the identification of land use in winter.
The best classification, with an OA of 81% and Kappa of 0.77, used an object-based approach and the 30 parameters derived from the combination of Sentinel-1 and -2 data. Misclassification errors were observed between bare soils (under- and over-estimation rates of 64% and 94%, respectively) and the other classes (Table 5). This is consistent with the difficulty of discriminating bare soils from the other classes noted above (Figure 6).
The spatial distribution of winter land uses mapped at the 1:100,000 scale from the best classification (Figure 8) shows that bare soils and crop residues covered less than 5% of the UAA, while a high percentage was covered with grasslands (30%) or winter crops (35%). In general, catch crops and winter crops were located on the largest fields, while bare soils and grasslands were located on the smallest fields.
The distribution of membership probabilities associated with this classification indicates that accuracy decreased at the edges of the study site (Figure 9). Fields smaller than 1 ha had the lowest membership probability (0.47), while those larger than 10 ha had the highest (0.84), indicating that classification accuracy increased with field size. These results agree with the confusion matrix (Table 5), in which misclassification was greatest for bare soils.

5. Conclusions

This study evaluated the respective advantages of using Sentinel optical and SAR time-series to identify winter land use, using SVM and RF algorithms with pixel-based and object-based approaches. Our study used high spatial and temporal resolutions of Sentinel data to identify land use types in winter and to our knowledge, this is the first time such a study has been undertaken.
Results show that winter land use can be identified accurately using combined Sentinel-1 and -2 time-series with an object-based approach and an RF algorithm. Analysis of the Sentinel-1 and -2 parameters used to identify winter land use led to recommendations for extracting features when mapping winter land use. Results reveal the advantage of using backscattering coefficients alone or combined with vegetation indices such as SAVI.
Our results also demonstrated limits of this approach to identifying winter land use in small fields, due to the spatial resolution of Sentinel sensors. Thus, future research could evaluate the use of very high spatial resolution optical and SAR images, such as ALOS-2 or TerraSAR-X data, to improve the accuracy of classifying land use types during winter. Better understanding of optical and SAR signal behaviours of different agricultural practices and environmental conditions would help to identify and monitor winter land use. This has important implications for developing sustainable agriculture that decreases the risk of transferring pollutants to the environment.

Author Contributions

Conceptualization, J.D., L.H.-M. and E.P.; methodology, J.D., E.P., L.H.-M., J.B. and S.C.; software, J.D.; validation, J.D., J.B. (Julie Betbeder), J.B. (Jacques Baudry), L.H.-M. and E.P.; formal analysis, J.D.; investigation, J.D., E.P., L.H.-M., J.B. (Julie Betbeder), S.C. and J.B. (Jacques Baudry); resources, J.D., J.B. (Julie Betbeder) and J.B. (Jacques Baudry); data curation, J.D.; writing—original draft preparation, J.D.; writing—review and editing, J.D., L.H.-M., E.P., J.B. (Julie Betbeder) and J.B. (Jacques Baudry); visualization, J.D.; supervision, E.P. and L.H.-M.; project administration, L.H.-M. and E.P.; funding acquisition: L.H.-M. and E.P.

Funding

This research was funded by the MESR (Ministry of Higher Education and Research of France), PhD grant 2016.

Acknowledgments

This study was supported by the Kalideos project, funded by the CNES [75] and the Zone Atelier Armorique project. We would like to thank Napo N’Bohn and Marianne Balaresque for their help in the field.

References

1. Withers, P.J.; Neal, C.; Jarvie, H.P.; Doody, D.G. Agriculture and Eutrophication: Where Do We Go from Here? Sustainability 2014, 6, 5853–5875.
2. Galloway, J.N.; Townsend, A.R.; Erisman, J.W.; Bekunda, M.; Cai, Z.; Freney, J.R.; Martinelli, L.A.; Seitzinger, S.P.; Sutton, M.A. Transformation of the Nitrogen Cycle: Recent Trends, Questions, and Potential Solutions. Science 2008, 320, 889–892.
3. Dabney, S.M. Cover Crop Impacts on Watershed Hydrology. J. Soil Water Conserv. 1998, 53, 207–213.
4. Corgne, S. Hiérarchisation des Facteurs Structurant les Dynamiques Pluriannuelles des Sols Nus Hivernaux. Application au Bassin Versant du Yar (Bretagne). Norois. Environ. Aménage. Soc. 2004, 193, 17–29.
5. Fieuzal, R.; Baup, F.; Marais-Sicre, C. Monitoring Wheat and Rapeseed by Using Synchronous Optical and Radar Satellite Data—From Temporal Signatures to Crop Parameters Estimation. Adv. Remote Sens. 2013, 2013.
6. Zhang, X.; Friedl, M.A.; Schaaf, C.B.; Strahler, A.H.; Hodges, J.C.F.; Gao, F.; Reed, B.C.; Huete, A. Monitoring Vegetation Phenology Using MODIS. Remote Sens. Environ. 2003, 84, 471–475.
7. Clark, M.L.; Aide, T.M.; Grau, H.R.; Riner, G. A Scalable Approach to Mapping Annual Land Cover at 250 m Using MODIS Time Series Data: A Case Study in the Dry Chaco Ecoregion of South America. Remote Sens. Environ. 2010, 114, 2816–2832.
8. Lecerf, R.; Hubert-Moy, L.; Corpetti, T.; Baret, F.; Latif, B.A.; Nicolas, H. Estimating Biophysical Variables at 250 m with Reconstructed EOS/MODIS Time Series to Monitor Fragmented Landscapes. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, Boston, MA, USA, 7–11 July 2008; Volume 2.
9. Hubert-Moy, L.; Lecerf, R.; Corpetti, T.; Dubreuil, V. Monitoring Winter Vegetation Cover Using Multitemporal MODIS Data. In Proceedings of the IEEE International Conference on Geoscience and Remote Sensing Symposium (IGARSS’05), Seoul, South Korea, 29 July 2005; Volume 3, pp. 2113–2116.
10. Lecerf, R.; Corpetti, T.; Hubert-Moy, L.; Dubreuil, V. Monitoring Land Use and Land Cover Changes in Oceanic and Fragmented Landscapes with Reconstructed MODIS Time Series. In Proceedings of the International Workshop on the Analysis of Multi-Temporal Remote Sensing Images, Biloxi, MS, USA, 16–18 May 2005; pp. 195–199.
11. Xu, D. Compare NDVI Extracted from Landsat 8 Imagery with That from Landsat 7 Imagery. Am. J. Remote Sens. 2014, 2, 10.
12. Pan, Z.; Huang, J.; Zhou, Q.; Wang, L.; Cheng, Y.; Zhang, H.; Blackburn, G.A.; Yan, J.; Liu, J. Mapping Crop Phenology Using NDVI Time-Series Derived from HJ-1 A/B Data. Int. J. Appl. Earth Observ. Geoinf. 2015, 34, 188–197.
13. El Hajj, M.; Bégué, A.; Guillaume, S.; Martiné, J.-F. Integrating SPOT-5 Time Series, Crop Growth Modeling and Expert Knowledge for Monitoring Agricultural Practices—The Case of Sugarcane Harvest on Reunion Island. Remote Sens. Environ. 2009, 113, 2052–2061.
14. Murakami, T.; Ogawa, S.; Ishitsuka, N.; Kumagai, K.; Saito, G. Crop Discrimination with Multitemporal SPOT/HRV Data in the Saga Plains, Japan. Int. J. Remote Sens. 2001, 22, 1335–1348.
15. Bannari, A.; Pacheco, A.; Staenz, K.; McNairn, H.; Omari, K. Estimating and Mapping Crop Residues Cover on Agricultural Lands Using Hyperspectral and IKONOS Data. Remote Sens. Environ. 2006, 104, 447–459.
16. Pacheco, A.; McNairn, H. Evaluating Multispectral Remote Sensing and Spectral Unmixing Analysis for Crop Residue Mapping. Remote Sens. Environ. 2010, 114, 2219–2228.
17. McNairn, H.; Brisco, B. The Application of C-Band Polarimetric SAR for Agriculture: A Review. Can. J. Remote Sens. 2004, 30, 525–542.
18. Smith, L.C. Satellite Remote Sensing of River Inundation Area, Stage, and Discharge: A Review. Hydrol. Process. 1997, 11, 1427–1439.
19. Betbeder, J.; Rapinel, S.; Corgne, S.; Pottier, E.; Hubert-Moy, L. TerraSAR-X Dual-Pol Time-Series for Mapping of Wetland Vegetation. ISPRS J. Photogramm. Remote Sens. 2015, 107, 90–98.
20. McNairn, H.; Duguay, C.; Boisvert, J.; Huffman, E.; Brisco, B. Defining the Sensitivity of Multi-Frequency and Multi-Polarized Radar Backscatter to Post-Harvest Crop Residue. Can. J. Remote Sens. 2001, 27, 247–263.
21. Baghdadi, N.; Zribi, M.; Loumagne, C.; Ansart, P.; Anguela, T.P. Analysis of TerraSAR-X Data and Their Sensitivity to Soil Surface Parameters over Bare Agricultural Fields. Remote Sens. Environ. 2008, 112, 4370–4379.
22. Jiao, X.; McNairn, H.; Shang, J.; Liu, J. The Sensitivity of Multi-Frequency (X, C and L-Band) Radar Backscatter Signatures to Bio-Physical Variables (LAI) over Corn and Soybean Fields. Int. Arch. Photogramm. Remote Sens. 2010, 38, 318–321.
23. Betbeder, J.; Rapinel, S.; Corpetti, T.; Pottier, E.; Corgne, S.; Hubert-Moy, L. Multitemporal Classification of TerraSAR-X Data for Wetland Vegetation Mapping. J. Appl. Remote Sens. 2014, 8, 83648.
24. Hadria, R.; Duchemin, B.; Baup, F.; Le Toan, T.; Bouvet, A.; Dedieu, G.; Le Page, M. Combined Use of Optical and Radar Satellite Data for the Detection of Tillage and Irrigation Operations: Case Study in Central Morocco. Agric. Water Manag. 2009, 96, 1120–1127.
25. Laurin, G.V.; Liesenberg, V.; Chen, Q.; Guerriero, L.; Del Frate, F.; Bartolini, A.; Coomes, D.; Wilebore, B.; Lindsell, J.; Valentini, R. Optical and SAR Sensor Synergies for Forest and Land Cover Mapping in a Tropical Site in West Africa. Int. J. Appl. Earth Observ. Geoinf. 2013, 21, 7–16.
26. Inglada, J.; Vincent, A.; Arias, M.; Marais-Sicre, C. Improved Early Crop Type Identification by Joint Use of High Temporal Resolution SAR and Optical Image Time Series. Remote Sens. 2016, 8, 362.
27. Veloso, A.; Mermoz, S.; Bouvet, A.; Le Toan, T.; Planells, M.; Dejoux, J.-F.; Ceschia, E. Understanding the Temporal Behavior of Crops Using Sentinel-1 and Sentinel-2-like Data for Agricultural Applications. Remote Sens. Environ. 2017, 199, 415–426.
28. Belgiu, M.; Csillik, O. Sentinel-2 Cropland Mapping Using Pixel-Based and Object-Based Time-Weighted Dynamic Time Warping Analysis. Remote Sens. Environ. 2017, 204, 509–523.
29. Vuolo, F.; Neuwirth, M.; Immitzer, M.; Atzberger, C.; Ng, W.-T. How Much Does Multi-Temporal Sentinel-2 Data Improve Crop Type Classification? Int. J. Appl. Earth Observ. Geoinf. 2018, 72, 122–130.
30. Radoux, J.; Chomé, G.; Jacques, D.C.; Waldner, F.; Bellemans, N.; Matton, N.; Lamarche, C.; d’Andrimont, R.; Defourny, P. Sentinel-2’s Potential for Sub-Pixel Landscape Feature Detection. Remote Sens. 2016, 8, 488.
31. Abdikan, S.; Sanli, F.B.; Ustuner, M.; Calò, F. Land Cover Mapping Using Sentinel-1 SAR Data. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 757.
32. Dimov, D.; Löw, F.; Ibrakhimov, M.; Stulina, G.; Conrad, C. SAR and Optical Time Series for Crop Classification. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017; pp. 811–814.
33. Minh, D.H.T.; Ienco, D.; Gaetano, R.; Lalande, N.; Ndikumana, E.; Osman, F.; Maurel, P. Deep Recurrent Neural Networks for Winter Vegetation Quality Mapping via Multitemporal SAR Sentinel-1. IEEE Geosci. Remote Sens. Lett. 2018, 15, 464–468.
34. Van Tricht, K.; Gobin, A.; Gilliams, S.; Piccard, I. Synergistic Use of Radar Sentinel-1 and Optical Sentinel-2 Imagery for Crop Mapping: A Case Study for Belgium. Remote Sens. 2018, 10, 1642.
35. Steinhausen, M.J.; Wagner, P.D.; Narasimhan, B.; Waske, B. Combining Sentinel-1 and Sentinel-2 Data for Improved Land Use and Land Cover Mapping of Monsoon Regions. Int. J. Appl. Earth Observ. Geoinf. 2018, 73, 595–604.
36. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
37. Cortes, C.; Vapnik, V. Support-Vector Networks. Mach. Learn. 1995, 20, 273–297.
38. ZA Armorique. Available online: https://osur.univ-rennes1.fr/za-armorique/ (accessed on 9 November 2018).
39. Nitrates—Water Pollution Environment European Commission. Available online: https://bit.ly/1U3YPLX (accessed on 9 November 2018).
40. Lobell, D.B.; Field, C.B. Global Scale Climate–Crop Yield Relationships and the Impacts of Recent Warming. Environ. Res. Lett. 2007, 2, 14002.
41. PEPS—Plateforme d’Exploitation des Produits Sentinel (CNES). Available online: https://peps.cnes.fr/rocket/#/home (accessed on 9 November 2018).
42. Miranda, N.; Meadows, P.J. Radiometric Calibration of S-1 Level-1 Products Generated by the S-1 IPF. Available online: https://bit.ly/2ActiEv (accessed on 21 December 2018).
43. Lee, J.-S. Speckle Analysis and Smoothing of Synthetic Aperture Radar Images. Comput. Graph. Image Process. 1981, 17, 24–32.
44. Xing, X.; Chen, Q.; Yang, S.; Liu, X. Feature-Based Nonlocal Polarimetric SAR Filtering. Remote Sens. 2017, 9, 1043.
45. Pottier, E.; Ferro-Famil, L. PolSARPro V5.0: An ESA Educational Toolbox Used for Self-Education in the Field of POLSAR and POL-INSAR Data Analysis. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, Munich, Germany, 22–27 July 2012; pp. 7377–7380.
46. Lee, J.-S.; Pottier, E. Polarimetric Radar Imaging: From Basics to Applications; CRC Press: Boca Raton, FL, USA, 2009.
47. Hagolle, O.; Dedieu, G.; Mougenot, B.; Debaecker, V.; Duchemin, B.; Meygret, A. Correction of Aerosol Effects on Multi-Temporal Images Acquired with Constant Viewing Angles: Application to Formosat-2 Images. Remote Sens. Environ. 2008, 112, 1689–1701.
48. Hagolle, O.; Huc, M.; Pascual, D.V.; Dedieu, G. A Multi-Temporal Method for Cloud Detection, Applied to FORMOSAT-2, VENµS, LANDSAT and SENTINEL-2 Images. Remote Sens. Environ. 2010, 114, 1747–1755.
49. Rouse, J.W. Monitoring Vegetation Systems in the Great Plains with ERTS; NASA: Washington, DC, USA, 1974.
50. Huete, A.R. A Soil-Adjusted Vegetation Index (SAVI). Remote Sens. Environ. 1988, 25, 295–309.
51. Symeonakis, E.; Calvo-Cases, A.; Arnau-Rosalen, E. Land Use Change and Land Degradation in Southeastern Mediterranean Spain. Environ. Manag. 2007, 40, 80–94.
52. Yengoh, G.T.; Dent, D.; Olsson, L.; Tengberg, A.E.; Tucker, C.J. The Use of the Normalized Difference Vegetation Index (NDVI) to Assess Land Degradation at Multiple Scales: A Review of the Current Status, Future Trends, and Practical Considerations; Lund University Center for Sustainability Studies (LUCSUS), and The Scientific and Technical Advisory Panel of the Global Environment Facility (STAP/GEF): Lund, Sweden, 2014; Volume 47.
53. Gao, B. NDWI—A Normalized Difference Water Index for Remote Sensing of Vegetation Liquid Water from Space. Remote Sens. Environ. 1996, 58, 257–266.
54. Clay, D.E.; Kim, K.-I.; Chang, J.; Clay, S.A.; Dalsted, K. Characterizing Water and Nitrogen Stress in Corn Using Remote Sensing. Agron. J. 2006, 98, 579–587.
55. Weiss, M.; Baret, F.; Smith, G.J.; Jonckheere, I.; Coppin, P. Review of Methods for in Situ Leaf Area Index (LAI) Determination: Part II. Estimation of LAI, Errors and Sampling. Agric. For. Meteorol. 2004, 121, 37–53.
56. Jacquemoud, S.; Verhoef, W.; Baret, F.; Bacour, C.; Zarco-Tejada, P.J.; Asner, G.P.; François, C.; Ustin, S.L. PROSPECT+SAIL Models: A Review of Use for Vegetation Characterization. Remote Sens. Environ. 2009, 113 (Suppl. S1), S56–S66.
57. Dusseux, P.; Corpetti, T.; Hubert-Moy, L.; Corgne, S. Combined Use of Multi-Temporal Optical and Radar Satellite Images for Grassland Monitoring. Remote Sens. 2014, 6, 6163–6182.
58. Kostelich, E.J.; Schreiber, T. Noise Reduction in Chaotic Time-Series Data: A Survey of Common Methods. Phys. Rev. E 1993, 48, 1752.
59. Pal, M. Random Forest Classifier for Remote Sensing Classification. Int. J. Remote Sens. 2005, 26, 217–222.
60. Belgiu, M.; Drăguţ, L. Random Forest in Remote Sensing: A Review of Applications and Future Directions. ISPRS J. Photogramm. Remote Sens. 2016, 114, 24–31.
61. Mountrakis, G.; Im, J.; Ogole, C. Support Vector Machines in Remote Sensing: A Review. ISPRS J. Photogramm. Remote Sens. 2011, 66, 247–259.
62. Liaw, A.; Wiener, M. Classification and Regression by randomForest. R News 2002, 2, 18–22.
63. Lawrence, R.L.; Wood, S.D.; Sheley, R.L. Mapping Invasive Plants Using Hyperspectral Imagery and Breiman Cutler Classifications (RandomForest). Remote Sens. Environ. 2006, 100, 356–362.
64. Dimitriadou, E.; Hornik, K.; Leisch, F.; Meyer, D.; Weingessel, A. Misc Functions of the Department of Statistics (e1071), TU Wien. R Packag. 2008, 1, 5–24.
65. Congalton, R.G. A Review of Assessing the Accuracy of Classifications of Remotely Sensed Data. Remote Sens. Environ. 1991, 37, 35–46.
66. Pohl, C.; Van Genderen, J.L. Review Article Multisensor Image Fusion in Remote Sensing: Concepts, Methods and Applications. Int. J. Remote Sens. 1998, 19, 823–854.
67. Wiseman, G.; McNairn, H.; Homayouni, S.; Shang, J. RADARSAT-2 Polarimetric SAR Response to Crop Biomass for Agricultural Production Monitoring. IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens. 2014, 7, 4461–4471.
68. Brown, S.C.; Quegan, S.; Morrison, K.; Bennett, J.C.; Cookmartin, G. High-Resolution Measurements of Scattering in Wheat Canopies—Implications for Crop Parameter Retrieval. IEEE Trans. Geosci. Remote Sens. 2003, 41, 1602–1610.
69. Bargiel, D. A New Method for Crop Classification Combining Time Series of Radar Images and Crop Phenology Information. Remote Sens. Environ. 2017, 198, 369–383.
70. Beck, P.S.; Atzberger, C.; Høgda, K.A.; Johansen, B.; Skidmore, A.K. Improved Monitoring of Vegetation Dynamics at Very High Latitudes: A New Method Using MODIS NDVI. Remote Sens. Environ. 2006, 100, 321–334.
71. Duro, D.C.; Franklin, S.E.; Dubé, M.G. A Comparison of Pixel-Based and Object-Based Image Analysis with Selected Machine Learning Algorithms for the Classification of Agricultural Landscapes Using SPOT-5 HRG Imagery. Remote Sens. Environ. 2012, 118, 259–272.
72. Myint, S.W.; Gober, P.; Brazel, A.; Grossman-Clarke, S.; Weng, Q. Per-Pixel vs. Object-Based Classification of Urban Land Cover Extraction Using High Spatial Resolution Imagery. Remote Sens. Environ. 2011, 115, 1145–1161.
73. Yan, G.; Mas, J.-F.; Maathuis, B.H.P.; Xiangmin, Z.; Van Dijk, P.M. Comparison of Pixel-Based and Object-Oriented Image Classification Approaches—A Case Study in a Coal Fire Area, Wuda, Inner Mongolia, China. Int. J. Remote Sens. 2006, 27, 4039–4055.
74. Gislason, P.O.; Benediktsson, J.A.; Sveinsson, J.R. Random Forests for Land Cover Classification. Pattern Recognit. Lett. 2006, 27, 294–300.
75. Kalideos. Available online: https://www.kalideos.fr/drupal/fr (accessed on 19 November 2018).
Figure 1. Location of the “Zone Atelier Armorique” study site and ground surveys (RGB composite image from Sentinel-1 data, 2016, ©Copernicus data 2016). SE: Shannon Entropy.
Figure 2. Main land use types encountered in winter in the study site: (a) crop residues (maize stalks), (b) bare soils, (c) winter crops (winter barley), (d) grasslands and (e) catch crops (mustard).
Figure 3. Pre-processing workflow for Sentinel-1 Synthetic-Aperture Radar (SAR) and Sentinel-2 optical images. SLC: Single Look Complex.
Figure 4. Image processing workflow applied to Sentinel-1 parameters, Sentinel-2 parameters and the combination of Sentinel-1 and Sentinel-2 parameters. SVM: Support Vector Machine, RF: Random Forest, OA: overall accuracy.
Figure 5. Parameters that contributed the most to land use classifications, ranked by importance for (a) optical and (b) Synthetic-Aperture Radar image time-series from August 2016 to May 2017. See Table 3 for the definition of abbreviations.
Figure 6. Distribution of (a) the Soil Adjusted Vegetation Index (SAVI) and (b) the ratio VH/VV parameters for winter land use classes calculated from May 2017 images.
Figure 7. Overall accuracy of (a) object-based and (b) pixel-based image classifications of winter land use, for the best Sentinel-1, Sentinel-2 and combined Sentinel-1 and -2 parameters using the Random Forest (RF) and Support Vector Machine (SVM) algorithms.
Figure 8. Distribution of winter land use obtained using a parameter dataset derived from a combination of Sentinel-1 and -2 time-series. Classification was performed using a Random Forest algorithm.
Figure 9. Map of the distribution of winter land use membership probabilities obtained using the Random Forest algorithm.
Table 1. Land use classification.
Winter land use types | Winter land use subtypes | Main crops
Winter crops | None | Winter wheat; Rapeseed; Winter barley
Grasslands | Mown grasslands; Grazed grasslands | None
Catch crops | Catch crops fed to cattle | Oat; Oat and vetch; Fodder cabbage; Fodder radish; Temporary grassland (ryegrass and clover)
Catch crops | Catch crops not used | Phacelia; Phacelia and mustard; Phacelia and oat; Mustard; Meslin (wheat/rye mixture)
Crop residues | Cereal stubble | Maize stalks
Bare soils | None | None
Table 2. Characteristics of the Sentinel-1 Synthetic-Aperture Radar (SAR) and Sentinel-2 optical images. NIR: near infrared, SWIR: shortwave infrared.
Sentinel-2 (optical)
- Spatial resolution: 10–60 m
- Spectral bands (central wavelength): Band 1 (Coastal), 0.443 µm; Band 2 (Blue), 0.490 µm; Band 3 (Green), 0.560 µm; Band 4 (Red), 0.665 µm; Band 5 (Red Edge), 0.705 µm; Band 6 (Red Edge), 0.740 µm; Band 7 (Red Edge), 0.783 µm; Band 8 (NIR), 0.842 µm; Band 8A (NIR), 0.865 µm; Band 9 (Water), 0.940 µm; Band 10 (SWIR), 1.375 µm; Band 11 (SWIR), 1.610 µm; Band 12 (SWIR), 2.190 µm
- Coverage: 290 km
- Dates (D-M-Y): 23-Aug-2016, 31-Oct-2016, 30-Nov-2016, 30-Dec-2016, 19-Jan-2017, 18-Feb-2017, 30-Mar-2017, 9-Apr-2017, 9-May-2017

Sentinel-1 (SAR)
- Ground resolution: 2.3 m
- Azimuth resolution: 13.9 m
- Polarization: dual (VV-VH)
- Mode: Interferometric Wide, Single Look Complex
- Incidence angle: 45–47° (right descending)
- Coverage: 250 km
- Dates (D-M-Y): 25-Aug-2016, 5-Nov-2016, 29-Nov-2016, 29-Dec-2016, 16-Jan-2017, 21-Feb-2017, 29-Mar-2017, 10-Apr-2017, 10-May-2017
Table 3. Parameters derived from Sentinel-1 Synthetic-Aperture Radar (SAR) and Sentinel-2 optical image time-series.
Sentinel-2 optical parameters:
- Band 2 (blue), Band 3 (green), Band 4 (red)
- Band 5, Band 6, Band 7 (vegetation red edge)
- Band 8 (near infrared, NIR), Band 8a (narrow NIR)
- Band 9 (water vapor)
- Band 10 (shortwave infrared, SWIR, cirrus), Band 11 (SWIR), Band 12 (SWIR)
- Normalized Difference Vegetation Index, NDVI = (Band 8 − Band 4) / (Band 8 + Band 4)
- Normalized Difference Water Index, NDWI = (Band 8 − Band 12) / (Band 8 + Band 12)
- Soil Adjusted Vegetation Index, SAVI = ((Band 8 − Band 4) / (Band 8 + Band 4 + L)) × (1 + L)
- Leaf Area Index (LAI)
- Fraction of photosynthetically active radiation (FAPAR)
- Fractional vegetation cover (FCOVER)

Sentinel-1 SAR parameters:
- Matrix element C11 in decibels (C11 db)
- Matrix element C12, imaginary part (C12 img)
- Matrix element C12, real part (C12 rel)
- Matrix element C22 (C22) and in decibels (C22 db)
- Shannon entropy (SE)
- Shannon entropy intensity (SEi)
- Shannon entropy intensity, normalized (SEino)
- Shannon entropy, normalized (SEno)
- Shannon entropy polarization (SEp)
- Shannon entropy polarization, normalized (SEpno)
- Total scattered power (SPAN)
- Backscattering coefficient VH (σ0VH)
- Backscattering coefficient VV (σ0VV)
- Ratio VH/VV
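Assuming the standard definitions, the index formulas of Table 3 can be applied per pixel as follows. The reflectance values below are made-up examples, and L = 0.5 is a commonly used SAVI soil-adjustment factor; the study does not state its value of L here.

```python
# Sketch of the Sentinel-2 index formulas from Table 3 applied to
# per-pixel reflectances; band values are illustrative, not real data.
import numpy as np

band4 = np.array([0.05, 0.10])   # red
band8 = np.array([0.40, 0.30])   # near infrared
band12 = np.array([0.15, 0.20])  # shortwave infrared
L = 0.5  # SAVI soil-adjustment factor (common default, assumed here)

ndvi = (band8 - band4) / (band8 + band4)
ndwi = (band8 - band12) / (band8 + band12)
savi = (band8 - band4) / (band8 + band4 + L) * (1 + L)

print(ndvi, ndwi, savi)
```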
Table 4. Median accuracy of winter land use classifications obtained for the best Sentinel-1, Sentinel-2 and combined Sentinel-1 and -2 parameters using the Random Forest (RF) and Support Vector Machine (SVM) algorithms. OA: overall accuracy, Kappa: Kappa index.
Algorithm | Dataset | Object-based OA | Object-based Kappa | Pixel-based OA | Pixel-based Kappa
RF | Sentinel-1 | 72% | 0.67 | 58% | 0.52
RF | Sentinel-2 | 78% | 0.75 | 72% | 0.67
RF | Sentinel-1 & -2 | 81% | 0.77 | 79% | 0.76
SVM | Sentinel-1 | 73% | 0.67 | 59% | 0.53
SVM | Sentinel-2 | 79% | 0.76 | 65% | 0.54
SVM | Sentinel-1 & -2 | 78% | 0.75 | 64% | 0.54
Table 5. Confusion matrix of the best winter land use classification obtained using a parameter dataset derived from a combination of Sentinel-1 and -2 time-series. Overall accuracy = 81%, Kappa index = 0.77.
 | Catch crops | Winter crops | Grasslands | Crop residues | Bare soils | Commission errors
Catch crops | 310 | 42 | 19 | 0 | 83 | 68.3%
Winter crops | 1 | 410 | 20 | 0 | 23 | 90.3%
Grasslands | 20 | 56 | 310 | 0 | 68 | 68.3%
Crop residues | 0 | 3 | 0 | 389 | 62 | 85.7%
Bare soils | 16 | 7 | 5 | 0 | 426 | 93.8%
Omission errors | 89.3% | 79.2% | 87.6% | 100% | 64.4% | 81%
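As a consistency check, the overall accuracy and Kappa index reported with Table 5 can be recomputed from the confusion matrix counts, assuming the standard definition of Cohen's Kappa (rows: classified, columns: reference).

```python
# Overall accuracy and Cohen's Kappa recomputed from the Table 5
# confusion matrix counts (rows: classified, columns: reference).
import numpy as np

cm = np.array([
    [310,  42,  19,   0,  83],  # catch crops
    [  1, 410,  20,   0,  23],  # winter crops
    [ 20,  56, 310,   0,  68],  # grasslands
    [  0,   3,   0, 389,  62],  # crop residues
    [ 16,   7,   5,   0, 426],  # bare soils
], dtype=float)

total = cm.sum()
po = np.trace(cm) / total  # observed agreement = overall accuracy

# Agreement expected by chance, from row and column marginals.
pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total**2
kappa = (po - pe) / (1 - pe)

print(round(po, 2), round(kappa, 2))  # 0.81 0.77
```

The result matches the values given in the caption (OA = 81%, Kappa = 0.77).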

Denize, J.; Hubert-Moy, L.; Betbeder, J.; Corgne, S.; Baudry, J.; Pottier, E. Evaluation of Using Sentinel-1 and -2 Time-Series to Identify Winter Land Use in Agricultural Landscapes. Remote Sens. 2019, 11, 37. https://doi.org/10.3390/rs11010037
