
FEDRA Project Database



Application and validation of generic algorithms for Hyperspectral data cubes (HYPERWAVE)

Research project S0/00/051 (Research action S0)

Persons:

  • Prof. dr. COPPIN Pol - Katholieke Universiteit Leuven (K.U.Leuven)
    Belgian funded partner
    Duration: 1/12/2003-31/12/2005
  • Dr. RUDDICK Kevin - Institut Royal des Sciences Naturelles de Belgique (IRSNB)
    Belgian funded partner
    Duration: 1/12/2003-31/12/2005
  • Prof. dr. SCHEUNDERS Paul - Universiteit Antwerpen (UA)
    Belgian funded partner
    Duration: 1/12/2003-31/12/2005
  • Mr. DEBRUYN Walter - Vlaamse Instelling voor Technologisch Onderzoek (VITO)
    Belgian funded partner
    Duration: 1/12/2003-31/12/2005

Description:

Context and objectives

The objective of this project is to apply and validate the algorithms developed in the frame of the Hypercrunch project. In that earlier STEREO 1 project (SR/00/05), state-of-the-art algorithms for analysing hyperspectral datasets were developed and validated through stress detection in orchards; in this context, hyperspectral data cubes were acquired for both stressed and non-stressed orchard plots. One of the goals of Hypercrunch was to develop generic data reduction algorithms, i.e. algorithms that are application-independent to the extent possible, so that the generic part can be implemented and automated in operational data processing chains such as the APEX chain. Considerable effort has gone into developing a prototype toolbox for applying the algorithms. To validate and fully test the toolbox, it must be confronted with as many different data cubes, covering as many diverse remote sensing applications, as feasible. Data cubes and expertise from previous projects within the STEREO envelope and from different projects performed at VITO can be fully re-used for this purpose, without additional cost.
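The Hypercrunch algorithms themselves are not reproduced in this summary; as a minimal sketch of what an application-independent data reduction step on a hyperspectral cube looks like, the following uses simple variance-based band selection as a stand-in (the project's actual algorithms, described below, are wavelet-based):

```python
# Illustrative stand-in only: reduce a hyperspectral data cube by keeping the
# most variable bands. This mimics the *shape* of a generic, application-
# independent reduction step, not the Hypercrunch algorithms themselves.

def reduce_cube(cube, keep):
    """cube: list of pixels, each a list of per-band reflectances.
    Returns (reduced cube, indices of the `keep` most variable bands)."""
    n_bands, n_pix = len(cube[0]), len(cube)
    variances = []
    for b in range(n_bands):
        vals = [pix[b] for pix in cube]
        mean = sum(vals) / n_pix
        variances.append((sum((v - mean) ** 2 for v in vals) / n_pix, b))
    selected = sorted(b for _, b in sorted(variances, reverse=True)[:keep])
    return [[pix[b] for b in selected] for pix in cube], selected

# Tiny synthetic "cube": 3 pixels, 3 bands; band 1 carries the variation.
cube = [[0.1, 0.5, 0.11], [0.1, 0.9, 0.12], [0.1, 0.1, 0.10]]
reduced, bands = reduce_cube(cube, 1)
print(bands)  # band 1 has the highest variance
```

Because the selection criterion looks only at the statistics of the cube, not at any application, the same step can be reused across the diverse datasets mentioned above.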


Part of the generic algorithms developed for Hypercrunch is based on the wavelet transform. Using this transform, several feature extraction techniques are being developed: the use of the wavelet coefficients themselves, and of first- and second-order statistical features derived from the distribution of the wavelet coefficients. Feature reduction techniques are being developed using suboptimal floating search techniques, and Fisher's Linear Discriminant Analysis is applied for classification. For the aquatic application, a technique for the retrieval of oceanic constituents from ocean color is developed based on simulated annealing.
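A hedged sketch of the wavelet-feature idea: a single-level Haar transform stands in for the project's wavelet transform, the mean and variance of the detail coefficients stand in for the first- and second-order statistical features, and a simple per-feature Fisher discriminant direction illustrates the classification step (the project's full LDA and floating-search reduction are not reproduced):

```python
# Haar wavelet features + a 1-D Fisher discriminant, as a simplified stand-in
# for the wavelet-based feature extraction and LDA classification described
# in the text.
import math

def haar_level1(spectrum):
    """One level of the Haar wavelet transform: (approximation, detail)."""
    approx = [(spectrum[i] + spectrum[i + 1]) / math.sqrt(2)
              for i in range(0, len(spectrum) - 1, 2)]
    detail = [(spectrum[i] - spectrum[i + 1]) / math.sqrt(2)
              for i in range(0, len(spectrum) - 1, 2)]
    return approx, detail

def wavelet_features(spectrum):
    """First/second-order statistics (mean, variance) of detail coefficients."""
    _, detail = haar_level1(spectrum)
    mean = sum(detail) / len(detail)
    var = sum((d - mean) ** 2 for d in detail) / len(detail)
    return [mean, var]

def fisher_direction(class0, class1):
    """Per-feature Fisher discriminant: (m1 - m0) / (v0 + v1)."""
    def stats(samples, j):
        vals = [s[j] for s in samples]
        m = sum(vals) / len(vals)
        return m, sum((x - m) ** 2 for x in vals) / len(vals)
    w = []
    for j in range(len(class0[0])):
        m0, v0 = stats(class0, j)
        m1, v1 = stats(class1, j)
        w.append((m1 - m0) / (v0 + v1 + 1e-12))
    return w
```

A real pipeline would compute such features per pixel of the data cube, prune them with floating search, and classify with the full multivariate LDA.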


For the dune vegetation mapping application, a 4-step classification framework is proposed that works directly on the posterior class probabilities and combines a coupled binary-classifier procedure, spatial smoothing and unmixing. A large-scale classification experiment was set up, for which a large amount of ground reference data and airborne hyperspectral data covering the entire Belgian coastline were collected. A total of 23 classes was distinguished within the vegetation groups salty vegetation, marram dune, moss dune, grassland, scrub and woodland, plus some mixed-vegetation classes and 4 non-vegetation classes.
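The spatial smoothing step can be illustrated with a minimal sketch: a 3x3 majority filter applied to hard class labels. Note this is a simplified stand-in, since the framework described above operates on posterior class probabilities rather than on hard labels:

```python
# Simplified stand-in for the framework's spatial smoothing step: a 3x3
# majority filter on a 2-D map of integer class labels. The actual framework
# smooths posterior class probabilities, which this sketch does not reproduce.
from collections import Counter

def majority_filter(labels):
    """labels: 2-D list of integer class labels. Returns a smoothed copy."""
    rows, cols = len(labels), len(labels[0])
    out = [row[:] for row in labels]
    for i in range(rows):
        for j in range(cols):
            window = [labels[i + di][j + dj]
                      for di in (-1, 0, 1) for dj in (-1, 0, 1)
                      if 0 <= i + di < rows and 0 <= j + dj < cols]
            out[i][j] = Counter(window).most_common(1)[0][0]
    return out

noisy = [[1, 1, 1],
         [1, 2, 1],
         [1, 1, 1]]
print(majority_filter(noisy))  # the isolated "2" pixel is smoothed away to 1
```

Smoothing of this kind removes isolated, implausible pixels from a vegetation map while leaving homogeneous patches intact.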
For the aquatic application, a model is fitted to the reflectance spectra, using simulated annealing to minimise the mean square error of the reflectance over all spectra.
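The optimisation step can be sketched as follows. Simulated annealing minimises the mean square error between a model and a measured reflectance spectrum; the two-parameter linear model below is a placeholder, since the project's actual bio-optical model is not given in this summary:

```python
# Hedged sketch: simulated annealing fitting model parameters to a reflectance
# spectrum by minimising the mean square error. The linear model is a
# placeholder for the project's real ocean-color model.
import math
import random

def mse(params, wavelengths, spectrum):
    a, b = params
    return sum((a + b * w - r) ** 2
               for w, r in zip(wavelengths, spectrum)) / len(spectrum)

def anneal(wavelengths, spectrum, steps=5000, t0=0.01, seed=0):
    rng = random.Random(seed)
    cur = [0.0, 0.0]
    cur_err = mse(cur, wavelengths, spectrum)
    best, best_err = list(cur), cur_err
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9          # linear cooling schedule
        cand = [p + rng.gauss(0, 0.05) for p in cur]
        err = mse(cand, wavelengths, spectrum)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if err < cur_err or rng.random() < math.exp(-(err - cur_err) / t):
            cur, cur_err = cand, err
            if err < best_err:
                best, best_err = list(cand), err
    return best, best_err

wavelengths = [0.4, 0.5, 0.6, 0.7]                 # micrometres (synthetic)
spectrum = [0.1 + 0.5 * w for w in wavelengths]    # synthetic "measurement"
fit, err = anneal(wavelengths, spectrum)
```

Unlike gradient methods, annealing occasionally accepts worse fits early on, which helps it escape local minima of the error surface before the temperature decreases and the search becomes greedy.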

Products and services

Dune vegetation maps for the Belgian coastline are produced in digital and paper form.

© 2020 SPP Politique scientifique (Belgian Science Policy Office)