Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
Session: 3.1: Validation and Accuracy
Time: Thursday, 16/Mar/2017, 9:00am - 10:40am

Session Chair: Thomas R. Loveland, U.S. Geological Survey
Session Chair: Sophie Bontemps, Université catholique de Louvain
Mtg. Room: Big Hall, Bldg 14

Presentations
9:00am - 9:20am

Validation of global annual land cover map series and land cover change: experience from the Land Cover component of the ESA Climate Change Initiative

Sophie Bontemps1, Frédéric Achard2, Céline Lamarche1, Philippe Mayaux3, Olivier Arino4, Pierre Defourny1

1UCLouvain-Geomatics (Belgium), Belgium; 2Joint Research Center, Italy; 3European Commission, Belgium; 4European Space Agency, Italy

In the framework of the ESA Climate Change Initiative (CCI), the Land Cover (LC) team delivered a new generation of satellite-derived 300 m global annual LC products spanning the period from 1992 to 2015. These maps, consistent in space and time, were obtained from AVHRR (1992 - 1999), SPOT-Vegetation (1999 - 2012), MERIS (2003 - 2012) and PROBA-V (2012 - 2015) time series. The typology was defined using the UN Land Cover Classification System (LCCS) and comprises 22 classes compatible with the GLC2000, GlobCover 2005 and 2009 products.

A critical step in the acceptance of these products by users is providing confidence in the products' quality and in their uncertainties through validation against independent data. Building on the GlobCover experience, a validation strategy was designed to globally validate the map series and change at four points in time: 2000, 2005, 2010 and 2015.

In order to optimize data and resource availability, 2600 Primary Sample Units (PSUs), each defined as a 20 × 20 km box, were selected based on a stratified random sampling. Within each PSU, five Secondary Sample Units (SSUs) were defined around the center of the 20 × 20 km box. This cluster sampling strategy increased the number of sample sites and thus lowered the standard error of the accuracy estimates.
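
As a rough illustration of such a two-stage cluster design, the Python sketch below draws PSUs from hypothetical strata and attaches five SSUs to each. The stratum names, the allocation, and the centre-plus-quadrants SSU layout are all assumptions for illustration, not the actual CCI design.

```python
import random

random.seed(1)

# Hypothetical sampling frame: candidate 20 x 20 km boxes grouped by stratum;
# stratum names and counts are invented for illustration.
frame = {
    "tropical_homogeneous": [f"T{i}" for i in range(500)],
    "tropical_mosaic":      [f"M{i}" for i in range(300)],
    "boreal_homogeneous":   [f"B{i}" for i in range(400)],
}
allocation = {"tropical_homogeneous": 12, "tropical_mosaic": 10, "boreal_homogeneous": 8}

# Stage 1: stratified random selection of PSUs.
psus = [psu for stratum, units in frame.items()
        for psu in random.sample(units, allocation[stratum])]

# Stage 2: five SSUs per PSU. The abstract places them around the box centre;
# the exact layout is not given, so a centre-plus-quadrants pattern is assumed.
offsets_km = [(0, 0), (-5, -5), (-5, 5), (5, -5), (5, 5)]
ssus = [(psu, dx, dy) for psu in psus for dx, dy in offsets_km]

print(len(psus), "PSUs,", len(ssus), "SSUs")  # 30 PSUs, 150 SSUs
```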

An international network of land cover specialists with regional expertise was in charge of interpreting high spatial resolution images over the SSUs to build the reference database that eventually allowed the accuracy of the CCI global LC map series to be assessed.

Over each SSU, the visual interpretation of very high resolution imagery close to 2010 allowed labeling each object derived by segmentation according to the CCI LC legend. Change between 2010, 2005 and 2000 was then systematically evaluated on Landsat TM or ETM+ scenes acquired from the Global Land Surveys (GLS) for the respective years. Annual NDVI profiles derived from the SPOT-Vegetation time series facilitated image interpretation by providing seasonal variations of vegetation greenness. This reference validation database was then complemented for 2015 thanks to the large FAO data set obtained using the Collect Earth tool.
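
For context, an NDVI profile is the normalized difference of near-infrared and red reflectance tracked over the year. A minimal sketch with invented monthly values for a single pixel:

```python
import numpy as np

# Invented monthly red and near-infrared reflectances for one pixel over a year.
red = np.array([0.08, 0.07, 0.06, 0.05, 0.04, 0.04, 0.04, 0.05, 0.06, 0.07, 0.08, 0.08])
nir = np.array([0.20, 0.22, 0.28, 0.35, 0.42, 0.45, 0.44, 0.40, 0.33, 0.27, 0.22, 0.20])

ndvi = (nir - red) / (nir + red)  # standard NDVI definition
print(np.round(ndvi, 2))          # annual greenness profile used to aid interpretation
```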

Overall, user and producer accuracies of the CCI LC map series are then derived. In addition, various quality indices, related to specific uses in climate models (carbon content, net primary productivity, methane emissions, etc.), will be constructed by taking into account the semantic distance between LC classes.
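
As a hedged sketch (not the project's own code), user and producer accuracies follow directly from an error matrix. Here rows are map labels, columns are reference labels, and the counts are made up:

```python
import numpy as np

# Hypothetical 3-class error matrix: rows = map classes, columns = reference classes.
cm = np.array([
    [85,  5, 10],
    [ 8, 90,  2],
    [12,  6, 82],
], dtype=float)

overall = np.trace(cm) / cm.sum()
user_acc = np.diag(cm) / cm.sum(axis=1)      # complement of commission error
producer_acc = np.diag(cm) / cm.sum(axis=0)  # complement of omission error

print(f"overall accuracy = {overall:.3f}")
print("user's accuracy     =", np.round(user_acc, 3))
print("producer's accuracy =", np.round(producer_acc, 3))
```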

Such a validation scheme was made possible by a tailored, user-friendly validation web interface that integrated a large amount of very high spatial resolution imagery, pixel-based NDVI profiles from various sensors, and interactive GIS tools to facilitate LC and change interpretation. This efficient web platform has already been selected by several other international initiatives to collect reference data sets, as its unique object-based approach provides a reference database that can be used to validate land cover maps at different resolutions.


9:20am - 9:40am

Comparative validation of Copernicus Pan-European High Resolution Layer (HRL) on Tree Cover Density (TCD) and University of Maryland (UMd) Global Forest Change (GFC) over Europe

Christophe Sannier1, Heinz Gallaun2, Javier Gallego3, Ronald McRoberts4, Hans Dufourmont5, Alexandre Pennec1

1Systèmes d'Information à Référence Spatiale (SIRS), France; 2Joanneum Research, Austria; 3Joint Research Centre, Italy; 4US Forest Service, USA; 5European Environment Agency, Denmark

The validation of datasets such as the Copernicus Pan-European High Resolution Layer on Tree Cover Density (HRL TCD) and the UMD Global Forest Change (GFC) tree percent layer requires considerable effort to provide validation results at the Pan-European level over nearly 6 million km². A stratified systematic sampling approach was developed based on the LUCAS sampling frame. A two-stage stratified sample of 17,296 1 ha square Primary Sampling Units (PSUs) was selected over EEA39, stratified by countries or groups of countries whose area was greater than 90,000 km² and by a series of omission and commission strata. In each PSU, a grid of 5 × 5 Secondary Sample Units (SSUs) with a 20 m step was applied. These points were photo-interpreted on orthophotos with a resolution better than 2.5 m.

The UMD GFC data was processed to provide a 2012 tree percent layer comparable to the Copernicus HRL TCD by including tree cover losses and gains over the selected period. An appropriate interpolation procedure was then applied to both the UMD GFC and Copernicus HRL TCD layers to provide a precise match with the PSU validation data and to account for potential geographic differences between validation and map data as well as SSU sampling errors.

Initial results based on the binary conversion of the HRL TCD data, applying 10, 15 and 30% thresholds, indicate a level of omission errors in line with the required maximum of 15%, while commission errors exceed the 10% target set in the product specifications. However, disaggregated results were also computed at the country / group-of-countries level as well as for biogeographical regions, and showed considerable geographical variability. The best results were obtained in countries or biogeographical regions with high tree cover (e.g. the Continental and Boreal regions) and the worst in those with low tree cover (e.g. Anatolian, Arctic). There is less variability between production lots, and the analysis of the scatterplots shows a strong relationship between validation and map data, with the best results in the countries and biogeographical regions mentioned previously. However, there seems to be a general trend to slightly underestimate TCD. Results for the UMD GFC dataset are currently in progress and will be made available in time for the conference. Results provided at HRL TCD production lot, biogeographical region and country/group-of-countries level should provide a sound basis for targeting further improvements to the products.
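
A minimal sketch of this binary-conversion step, with random arrays standing in for the HRL TCD map and the photo-interpreted reference (both in percent tree cover). The data are synthetic, so only the mechanics, not the numbers, are meaningful:

```python
import numpy as np

rng = np.random.default_rng(0)
tcd_map = rng.integers(0, 101, size=10_000)    # stand-in for HRL TCD pixels (0-100 %)
reference = rng.integers(0, 101, size=10_000)  # stand-in for SSU reference values

for threshold in (10, 15, 30):
    map_tree = tcd_map >= threshold
    ref_tree = reference >= threshold
    omission = np.mean(~map_tree[ref_tree])    # reference tree cover missed by the map
    commission = np.mean(~ref_tree[map_tree])  # mapped tree cover not confirmed by reference
    print(f"{threshold:>2}% threshold: omission={omission:.3f}, commission={commission:.3f}")
```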

The stratification procedure, based on a combination of the HRL TCD layer and CORINE Land Cover (CLC), was effective for commission errors, but less so for omission errors. Therefore, the stratification should be simplified to include only a commission stratum (tree cover mask 1-100) and an omission stratum (the rest of the area), or an alternative stratification to CLC should be applied to better target omission. Finally, the approach developed could be rolled out at the global scale for the complete validation of global datasets.


9:40am - 10:00am

Copernicus Global Land Hot Spot Monitoring Service – Accuracy Assessment and Area Estimation Approach

Heinz Gallaun1, Gabriel Jaffrain2, Zoltan Szantoi3, Andreas Brink3, Adrien Moiret2, Stefan Kleeschulte4, Mathias Schardt1, Conrad Bielski5, Cedric Lardeux6, Alex Petre7

1JOANNEUM RESEARCH, Austria; 2IGN FI, France; 3Joint Research Centre (JRC), European Commission; 4space4environment (s4e), Luxembourg; 5EOXPLORE, Germany; 6ONF International, France; 7GISBOX, Romania

The main objective of the Copernicus Global Land Hot Spot Monitoring Service is to provide detailed land information on specific areas of interest, including protected areas and hot spots for biodiversity and land degradation. For such areas of interest, land cover and land cover change products, mainly derived from medium (Landsat, Sentinel-2) and high resolution satellite data, are made available to global users. The service directly supports field projects and policies developed by the European Union (EU) in the framework of the EU's international policy interests. It is coordinated by the Joint Research Centre of the European Commission, answers ad-hoc requests, and focuses mainly on the sustainable management of natural resources.

Comprehensive, independent validation and accuracy assessment of all thematic map products is performed before providing the land cover and land cover change products to the global users. The following rigorous methodology is applied:

Spatial, temporal and logical consistency is assessed by determining positional accuracy, assessing the validity of the data with respect to time, and checking the logical consistency of the data, e.g. topology, attribution and logical relationships.

A qualitative, systematic accuracy assessment is performed wall-to-wall by systematic visual examination of the land cover and land cover change maps within a geographic information system; accuracies are documented in terms of the types of errors found.

For quantitative accuracy assessment, a stratified random sampling approach based on inclusion probabilities is implemented. A web-based interpretation tool built on PostgreSQL provides high resolution time series imagery, e.g. from Sentinel-2, derived temporal trajectories of reflectance, and ancillary information in addition to the very high resolution imagery. In order to quantify the uncertainty of the derived accuracy measures, confidence intervals are derived by analytic formulas as well as by bootstrapping.
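
As an illustration of the bootstrapping step only (the inclusion-probability weighting described above is omitted for brevity), a percentile bootstrap confidence interval for overall accuracy on a synthetic sample might look like this:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic validation sample: 1 if map and reference labels agree, else 0.
agreement = (rng.random(500) < 0.85).astype(float)

# Percentile bootstrap: resample the sample with replacement many times.
boot = np.array([
    rng.choice(agreement, size=agreement.size, replace=True).mean()
    for _ in range(10_000)
])
low, high = np.percentile(boot, [2.5, 97.5])
print(f"overall accuracy = {agreement.mean():.3f}, 95% CI = [{low:.3f}, {high:.3f}]")
```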

Area estimation is performed according to the requirements for international reporting. In general, errors of omission and errors of commission are not equal, and such bias shows up in the course of the quantitative accuracy assessment. As the implemented approach applies probability sampling and the error matrix is based on inclusion probabilities, area estimates are derived directly from the error matrix. The area estimates are complemented by confidence intervals.
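
A sketch of this kind of estimator, in the spirit of the standard stratified estimators for error-adjusted areas (e.g. Olofsson et al.). The error-matrix counts and mapped areas below are invented, and the actual service may weight samples differently:

```python
import numpy as np

# Hypothetical error matrix of sample counts: rows = map strata, cols = reference classes.
n = np.array([
    [66,  0,   5,   4],
    [ 0, 55,   8,  12],
    [ 1,  0, 153,  11],
    [ 2,  1,   9, 313],
], dtype=float)
area = np.array([22353, 1122543, 610228, 3446654], dtype=float)  # mapped stratum areas (ha)

W = area / area.sum()                  # stratum weights
n_i = n.sum(axis=1, keepdims=True)     # samples per stratum
p = W[:, None] * n / n_i               # estimated area proportion of each cell

area_est = p.sum(axis=0) * area.sum()  # error-adjusted area per reference class

# Standard error of each class proportion under stratified random sampling.
p_within = n / n_i
se = np.sqrt(((W[:, None] ** 2) * p_within * (1 - p_within) / (n_i - 1)).sum(axis=0))
ci95 = 1.96 * se * area.sum()

for j, (a, c) in enumerate(zip(area_est, ci95)):
    print(f"class {j}: {a:,.0f} ha ± {c:,.0f} ha (95% CI)")
```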

The approach and methodology will be discussed in detail on the basis of systematic accuracy assessment and area estimation results for sites in Africa, which are the focus of the first year of implementation of the Copernicus Global Land Hot Spot Monitoring Service.


10:00am - 10:20am

Forest degradation assessment: accuracy assessment of forest degradation products using HR data

Naila Yasmin, Remi D’annunzio, Inge Jonckheere

Forestry Department, FAO of the UN, Rome, Italy

REDD+, Reducing Emissions from Deforestation and forest Degradation, is a mechanism developed by Parties to the United Nations Framework Convention on Climate Change (UNFCCC). It creates a financial value for the carbon stored in forests by offering incentives for developing countries to reduce emissions from forested lands and invest in low-carbon paths to sustainable development; developing countries would receive results-based payments for results-based actions. Forest monitoring has always been challenging to perform with minimal error. So far, many countries have started to report on deforestation at the national scale, but assessing degradation in-country remains a major research question. Remote sensing options are currently being explored in order to map degradation.

The aim of the ForMosa project was to develop a sound methodology for forest degradation monitoring for REDD+ using medium resolution satellite data sources such as Landsat-8, Sentinel-2 and SPOT 5, in addition to RapidEye imagery. The project was carried out by three partners: Planet and Wageningen University (service providers), and the Forestry Department of FAO, which carried out the accuracy assessment of the project products. Initially, three pilot study sites were selected: Kafa Tura in Ethiopia, Madre de Dios in Peru and Bac Kan in Vietnam. The initial product was developed at 10 m resolution with five classes representing levels of forest degradation of increasing intensity.

The Forestry Department of FAO used in-house open source tools for the accuracy assessment. The process consists of four steps: (i) map data, (ii) sample design, (iii) response design and (iv) analysis. A stratified random sampling approach was used to assess the product using high-resolution Google Earth and Bing Maps imagery. In this paper, the methodology of the work and its results will be presented.


10:20am - 10:40am

A New Open Reference Global Dataset for Land Cover Mapping at a 100m Resolution

Myroslava Lesiv1, Steffen Fritz1, Linda See1, Nandika Tsendbazar2, Martin Herold2, Martina Duerauer1, Ruben Van De Kerchove3, Marcel Buchhorn3, Inian Moorthy1, Bruno Smets3

1International Institute for Applied Systems Analysis, Austria; 2Wageningen University and Research, the Netherlands; 3VITO, Belgium

Classification techniques depend on the quality and quantity of reference data, which should represent different land cover types and varying landscapes and be of high overall quality. In general, there is currently a lack of reference data for large-scale land cover mapping. In particular, reference data are needed for a new dynamic 100 m land cover product that will be added to the Copernicus Global Land Service portfolio in the near future. These reference data must correspond to the 100 m Proba-V data spatially and temporally. The main objectives of this study are therefore as follows: to develop algorithms and tools for collecting high quality reference data; to collect reference data that correspond to Proba-V data spatially and temporally; to develop a strategy for validating the new dynamic land cover product; and to collect validation data.

To aid in the collection of reference data for the development of the new dynamic land cover layer, the Geo-Wiki engagement platform (http://www.geo-wiki.org/) was used to develop tools and to provide a user-friendly interface for collecting high quality reference data. Experts trained by staff at IIASA (International Institute for Applied Systems Analysis) interpreted various land cover types at a network of reference locations via the Geo-Wiki platform. At each reference location, experts were presented with a 100 m × 100 m area, subdivided into 10 m × 10 m cells, superimposed on high-resolution Google Earth or Microsoft Bing imagery. Using visual cues acquired from multiple training sessions, the experts classified each cell into land cover classes including trees, shrubs, water, arable land, burnt areas, etc. This information is then translated into different legends using the UN LCCS (United Nations Land Cover Classification System) as a basis. The distribution of sample sites is systematic, with the same distance between sample sites; however, land cover data are not collected at every sample site, as the frequency depends on the heterogeneity of land cover types by region. Data quality is controlled by ensuring that multiple classifications are collected at the same sample site by different experts. In addition, control points that have been validated by IIASA staff serve as an additional quality control measure.
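
As a toy example of the translation step, the fractions interpreted over the 10 m cells of one location can be reduced to a single legend class. The labels and the 50% dominance rule here are invented stand-ins for the LCCS-based legends actually used:

```python
from collections import Counter

# One reference location: 100 cells of 10 m x 10 m, each labelled by an expert.
cells = ["trees"] * 55 + ["shrubs"] * 20 + ["arable land"] * 15 + ["water"] * 10

fractions = {label: count / len(cells) for label, count in Counter(cells).items()}

def to_legend(frac):
    """Map fractional cover to one class: the dominant label if it covers at
    least half the cells, otherwise a mosaic class (made-up rule)."""
    label, share = max(frac.items(), key=lambda kv: kv[1])
    return label if share >= 0.5 else "mosaic"

print(fractions)             # {'trees': 0.55, 'shrubs': 0.2, 'arable land': 0.15, 'water': 0.1}
print(to_legend(fractions))  # trees
```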

As a result, we have collected reference data at approximately 20,000 sites across Africa, which was chosen due to its high density of complex landscapes and areas covered by vegetation mosaics. Current efforts are underway to expand the reference data collection to a global scale.

For validation of the new global land cover product, a separate Geo-Wiki branch was developed, accessed by local experts who did not participate in the training data collection. A stratified random sampling procedure is used due to its flexibility and statistical rigor. For the stratification, a global stratification based on climate zones and human density by Olofsson et al. (2012) was chosen owing to its independence from existing land cover maps and thus its suitability for future map evolutions.

The reference datasets will be freely available for use by the scientific community.



 