Chapter 12
Radiometry

Remote sensing is not just a matter of taking pictures, but also – mostly – a matter of measuring physical values. In order to properly deal with physical magnitudes, the numerical values provided by the sensors have to be calibrated. After that, several indices with physical meaning can be computed.

12.1 Radiometric Indices

12.1.1 Introduction

With multispectral sensors, several indices can be computed by combining several spectral bands to show features that are not obvious using only one band. Indices can highlight, for example, vegetation, soil, water or built-up areas.

A vegetation index is a quantitative measure of biomass or vegetative vigor, usually formed from combinations of several spectral bands whose values are added, divided, or multiplied in order to yield a single value that indicates the amount or vigor of vegetation.

Numerous indices are available in OTB; they are listed in Tables 12.1 to 12.4 with their references.




NDVI Normalized Difference Vegetation Index [119]
RVI Ratio Vegetation Index [103]
PVI Perpendicular Vegetation Index [116, 144]
SAVI Soil Adjusted Vegetation Index [64]
TSAVI Transformed Soil Adjusted Vegetation Index [98]
MSAVI Modified Soil Adjusted Vegetation Index [112]
MSAVI2 Modified Soil Adjusted Vegetation Index [112]
GEMI Global Environment Monitoring Index [108]
WDVI Weighted Difference Vegetation Index [26, 27]
AVI Angular Vegetation Index [110]
ARVI Atmospherically Resistant Vegetation Index [78]
TSARVI Transformed Soil Adjusted Vegetation Index [78]
EVI Enhanced Vegetation Index [65, 75]
IPVI Infrared Percentage Vegetation Index [31]
TNDVI Transformed NDVI [35]



Table 12.1: Vegetation indices




IR Redness Index [45]
IC Color Index [45]
IB Brilliance Index [98]
IB2 Brilliance Index [98]



Table 12.2: Soil indices




SRWI Simple Ratio Water Index [150]
NDWI Normalized Difference Water Index [20]
NDWI2 Normalized Difference Water Index [94]
MNDWI Modified Normalized Difference Water Index [146]
NDPI Normalized Difference Pond Index [82]
NDTI Normalized Difference Turbidity Index [82]
SA Spectral Angle



Table 12.3: Water indices




NDBI Normalized Difference Built Up Index [97]
ISU Index Surfaces Built [1]



Table 12.4: Built-up indices

The use of the different indices is very similar, so only a few examples are given in the next sections.

12.1.2 NDVI

NDVI was one of the most successful of many attempts to simply and quickly identify vegetated areas and their condition, and it remains the most well-known and used index to detect live green plant canopies in multispectral remote sensing data. Once the feasibility to detect vegetation had been demonstrated, users tended to also use the NDVI to quantify the photosynthetic capacity of plant canopies. This, however, can be a rather more complex undertaking if not done properly.

The source code for this example can be found in the file
Examples/Radiometry/NDVIRAndNIRVegetationIndexImageFilter.cxx.

The following example illustrates the use of the otb::RAndNIRIndexImageFilter with the Normalized Difference Vegetation Index (NDVI). NDVI is computed from the radiances reflected from the surface and transmitted through the atmosphere in the NIR channel, noted L_NIR, and in the red channel, noted L_r:

\[ NDVI = \frac{L_{NIR} - L_{r}}{L_{NIR} + L_{r}} \]
(12.1)
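Written as a small function, the per-pixel computation of equation (12.1) is simply the following (an illustrative sketch of what the otb::Functor::NDVI functor computes; the actual implementation, for example its handling of a null denominator, may differ):

double ndvi(double r, double nir)
{
  // Equation (12.1); a zero denominator is mapped to 0 in this sketch.
  const double denom = nir + r;
  if (denom == 0.0)
    {
    return 0.0;
    }
  return (nir - r) / denom;
}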

The following classes provide similar functionality:

With the otb::RAndNIRIndexImageFilter class, the filter inputs are two single-channel images: one image represents the red channel, the other the NIR channel.

Let’s look at the minimal code required to use this algorithm. First, the following header defining the otb::RAndNIRIndexImageFilter class must be included.

#include "otbRAndNIRIndexImageFilter.h"

The image types are now defined using the pixel types and the dimension. Input and output images are defined as otb::Image.

  const unsigned int Dimension = 2; 
  typedef double                                 InputPixelType; 
  typedef float                                  OutputPixelType; 
  typedef otb::Image<InputPixelType, Dimension>  InputRImageType; 
  typedef otb::Image<InputPixelType, Dimension>  InputNIRImageType; 
  typedef otb::Image<OutputPixelType, Dimension> OutputImageType;

The NDVI (Normalized Difference Vegetation Index) functor is instantiated using the image pixel types as template parameters. It is implemented as a functor class which will be passed as a parameter to an otb::RAndNIRIndexImageFilter.

  typedef otb::Functor::NDVI<InputPixelType, 
      InputPixelType, 
      OutputPixelType>   FunctorType;

The otb::RAndNIRIndexImageFilter type is instantiated using the image types and the NDVI functor as template parameters.

  typedef otb::RAndNIRIndexImageFilter<InputRImageType, 
      InputNIRImageType, 
      OutputImageType, 
      FunctorType> 
  RAndNIRIndexImageFilterType;
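The readers, the writer and the filter instance used below do not appear in the excerpts above. A minimal sketch of their declaration, assuming the usual otb::ImageFileReader and otb::ImageFileWriter classes are used, is:

  typedef otb::ImageFileReader<InputRImageType>   RReaderType; 
  typedef otb::ImageFileReader<InputNIRImageType> NIRReaderType; 
  typedef otb::ImageFileWriter<OutputImageType>   WriterType; 
 
  RReaderType::Pointer   readerR   = RReaderType::New(); 
  NIRReaderType::Pointer readerNIR = NIRReaderType::New(); 
  WriterType::Pointer    writer    = WriterType::New(); 
  RAndNIRIndexImageFilterType::Pointer filter = RAndNIRIndexImageFilterType::New();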

Now the input images are set and a name is given to the output image.

  readerR->SetFileName(argv[1]); 
  readerNIR->SetFileName(argv[2]); 
  writer->SetFileName(argv[3]);

We set the processing pipeline: filter inputs are linked to the reader output and the filter output is linked to the writer input.

  filter->SetInputR(readerR->GetOutput()); 
  filter->SetInputNIR(readerNIR->GetOutput()); 
 
  writer->SetInput(filter->GetOutput());

Invocation of the Update() method on the writer triggers the execution of the pipeline. It is recommended to place Update() calls in a try/catch block in case errors occur and exceptions are thrown.

  try 
    { 
    writer->Update(); 
    } 
  catch (itk::ExceptionObject& excep) 
    { 
    std::cerr << "Exception caught !" << std::endl; 
    std::cerr << excep << std::endl; 
    }

Let’s now run this example using as input the images NDVI_3.hdr and NDVI_4.hdr (images kindly provided free of charge by SISA and CNES) from the directory Examples/Data.



Figure 12.1: NDVI input images (red channel and NIR channel) on the left; on the right, the result of the algorithm.


12.1.3 ARVI

The source code for this example can be found in the file
Examples/Radiometry/ARVIMultiChannelRAndBAndNIRVegetationIndexImageFilter.cxx.

The following example illustrates the use of the otb::MultiChannelRAndBAndNIRIndexImageFilter with the Atmospherically Resistant Vegetation Index (ARVI) otb::Functor::ARVI. ARVI is an improved version of the NDVI that is more robust to atmospheric effects. In addition to the red and NIR channels (used in the NDVI), the ARVI takes advantage of the presence of the blue channel to perform a self-correction of the atmospheric effect on the red channel. For this, it uses the difference in the radiance between the blue and the red channels to correct the radiance in the red channel. Let’s define ρ*NIR, ρ*r and ρ*b the normalized radiances (that is to say the radiances normalized to reflectance units) of the NIR, red and blue channels respectively. ρ*rb is defined as

\[ \rho^{*}_{rb} = \rho^{*}_{r} - \gamma (\rho^{*}_{b} - \rho^{*}_{r}) \]
(12.2)

The ARVI expression is

\[ ARVI = \frac{\rho^{*}_{NIR} - \rho^{*}_{rb}}{\rho^{*}_{NIR} + \rho^{*}_{rb}} \]
(12.3)

This formula can be simplified as:

\[ ARVI = \frac{L_{NIR} - L_{rb}}{L_{NIR} + L_{rb}} \]
(12.4)

For more details, refer to the work of Kaufman and Tanré [78].
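As with the NDVI above, the per-pixel computation of equations (12.2) and (12.3) can be sketched as a small function (illustrative only; the actual otb::Functor::ARVI implementation may differ in details such as the handling of a null denominator):

double arvi(double r, double b, double nir, double gamma)
{
  const double rb    = r - gamma * (b - r);  // equation (12.2)
  const double denom = nir + rb;
  if (denom == 0.0)
    {
    return 0.0;
    }
  return (nir - rb) / denom;                 // equation (12.3)
}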

The following classes provide similar functionality:

With the otb::MultiChannelRAndBAndNIRIndexImageFilter class, the input has to be a multi-channel image and the user has to specify the channel indices of the red, blue and NIR channels.

Let’s look at the minimal code required to use this algorithm. First, the following header defining the otb::MultiChannelRAndBAndNIRIndexImageFilter class must be included.

#include "otbMultiChannelRAndBAndNIRIndexImageFilter.h"

The image types are now defined using the pixel types and the dimension. The input image is defined as an otb::VectorImage, the output as an otb::Image.

  const unsigned int Dimension = 2; 
  typedef double                                      InputPixelType; 
  typedef float                                       OutputPixelType; 
  typedef otb::VectorImage<InputPixelType, Dimension> InputImageType; 
  typedef otb::Image<OutputPixelType, Dimension>      OutputImageType;

The ARVI (Atmospherically Resistant Vegetation Index) functor is instantiated using the image pixel types as template parameters. Note that other functors operating on the red, blue and NIR channels, such as EVI and TSARVI, can be used in the same way.

  typedef  otb::Functor::ARVI<InputPixelType, 
      InputPixelType, 
      InputPixelType, 
      OutputPixelType>        FunctorType;

The otb::MultiChannelRAndBAndNIRIndexImageFilter type is defined using the image types and the ARVI functor as template parameters. We then instantiate the filter itself.

  typedef otb::MultiChannelRAndBAndNIRIndexImageFilter 
  <InputImageType, 
      OutputImageType, 
      FunctorType> 
  MultiChannelRAndBAndNIRIndexImageFilterType; 
 
  MultiChannelRAndBAndNIRIndexImageFilterType::Pointer 
    filter = MultiChannelRAndBAndNIRIndexImageFilterType::New();

Now the input image is set and a name is given to the output image.

  reader->SetFileName(argv[1]); 
  writer->SetFileName(argv[2]);

The indices of the three bands used (red, blue and NIR) are set.

  filter->SetRedIndex(::atoi(argv[5])); 
  filter->SetBlueIndex(::atoi(argv[6])); 
  filter->SetNIRIndex(::atoi(argv[7]));

The γ parameter is set. The otb::MultiChannelRAndBAndNIRIndexImageFilter class sets the default value of γ to 0.5. This parameter is used to reduce the atmospheric effect on a global scale.

  filter->GetFunctor().SetGamma(::atof(argv[8]));

The filter input is linked to the reader output and the filter output is linked to the writer input.

  filter->SetInput(reader->GetOutput()); 
 
  writer->SetInput(filter->GetOutput());

The invocation of the Update() method on the writer triggers the execution of the pipeline. It is recommended to place Update() calls in a try/catch block in case errors occur and exceptions are thrown.

  try 
    { 
    writer->Update(); 
    } 
  catch (itk::ExceptionObject& excep) 
    { 
    std::cerr << "Exception caught !" << std::endl; 
    std::cerr << excep << std::endl; 
    }

Let’s now run this example using as input the image IndexVegetation.hd (image kindly provided free of charge by SISA and CNES) from the directory Examples/Data, with γ = 0.6.



Figure 12.2: ARVI result (right) computed from the input image (left).


12.1.4 AVI

The source code for this example can be found in the file
Examples/Radiometry/AVIMultiChannelRAndGAndNIRVegetationIndexImageFilter.cxx.

The following example illustrates the use of the otb::MultiChannelRAndGAndNIRIndexImageFilter with the Angular Vegetation Index (AVI). The equation for the Angular Vegetation Index involves the green, red and near infra-red bands. λ1, λ2 and λ3 are the mid-band wavelengths of the green, red and NIR bands, and tan^{-1} is the arctangent function.

The AVI expression is

\[ A_{1} = \frac{\lambda_{3} - \lambda_{2}}{\lambda_{2}} \]
(12.5)

\[ A_{2} = \frac{\lambda_{2} - \lambda_{1}}{\lambda_{2}} \]
(12.6)

\[ AVI = \tan^{-1}\left(\frac{A_{1}}{NIR - R}\right) + \tan^{-1}\left(\frac{A_{2}}{G - R}\right) \]
(12.7)

For more details, refer to the work of Plummer [110].
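For illustration, the per-pixel computation of equations (12.5) to (12.7) can be sketched as follows (this is not the actual otb::Functor::AVI code; in particular, the guards against a null denominator are an assumption of this sketch):

#include <cmath>

double avi(double g, double r, double nir,
           double lambdaG, double lambdaR, double lambdaNIR)
{
  const double a1 = (lambdaNIR - lambdaR) / lambdaR;  // equation (12.5)
  const double a2 = (lambdaR - lambdaG) / lambdaR;    // equation (12.6)
  double result = 0.0;
  if (nir != r)
    {
    result += std::atan(a1 / (nir - r));              // first term of (12.7)
    }
  if (g != r)
    {
    result += std::atan(a2 / (g - r));                // second term of (12.7)
    }
  return result;
}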

With the otb::MultiChannelRAndGAndNIRIndexImageFilter class, the input has to be a multi-channel image and the user has to specify the channel indices of the red, green and NIR channels.

Let’s look at the minimal code required to use this algorithm. First, the following header defining the otb::MultiChannelRAndGAndNIRIndexImageFilter class must be included.

#include "otbMultiChannelRAndGAndNIRIndexImageFilter.h"

The image types are now defined using the pixel types and the dimension. The input image is defined as an otb::VectorImage, the output as an otb::Image.

  const unsigned int Dimension = 2; 
  typedef double                                      InputPixelType; 
  typedef float                                       OutputPixelType; 
  typedef otb::VectorImage<InputPixelType, Dimension> InputImageType; 
  typedef otb::Image<OutputPixelType, Dimension>      OutputImageType;

The AVI (Angular Vegetation Index) is instantiated using the image pixel types as template parameters.

  typedef  otb::Functor::AVI<InputPixelType, InputPixelType, 
      InputPixelType,  OutputPixelType> FunctorType;

The otb::MultiChannelRAndGAndNIRIndexImageFilter type is defined using the image types and the AVI functor as template parameters. We then instantiate the filter itself.

  typedef otb::MultiChannelRAndGAndNIRIndexImageFilter 
  <InputImageType, OutputImageType, FunctorType> 
  MultiChannelRAndGAndNIRIndexImageFilterType; 
 
  MultiChannelRAndGAndNIRIndexImageFilterType::Pointer 
    filter = MultiChannelRAndGAndNIRIndexImageFilterType::New();

Now the input image is set and a name is given to the output image.

  reader->SetFileName(argv[1]); 
  writer->SetFileName(argv[2]);

The indices of the three bands used (red, green and NIR) are set.

  filter->SetRedIndex(::atoi(argv[5])); 
  filter->SetGreenIndex(::atoi(argv[6])); 
  filter->SetNIRIndex(::atoi(argv[7]));

The λ parameters for the R, G and NIR bands are set. The otb::MultiChannelRAndGAndNIRIndexImageFilter class sets their default values to 660, 560 and 830, respectively.

  filter->GetFunctor().SetLambdaR(::atof(argv[8])); 
  filter->GetFunctor().SetLambdaG(::atof(argv[9])); 
  filter->GetFunctor().SetLambdaNir(::atof(argv[10]));

The filter input is linked to the reader output and the filter output is linked to the writer input.

  filter->SetInput(reader->GetOutput()); 
 
  writer->SetInput(filter->GetOutput());

The invocation of the Update() method on the writer triggers the execution of the pipeline. It is recommended to place Update() calls in a try/catch block in case errors occur and exceptions are thrown.

  try 
    { 
    writer->Update(); 
    } 
  catch (itk::ExceptionObject& excep) 
    { 
    std::cerr << "Exception caught !" << std::endl; 
    std::cerr << excep << std::endl; 
    }

Let’s now run this example using as input the image verySmallFSATSW.tif provided in the directory Examples/Data.



Figure 12.3: AVI result (right) computed from the input image (left).


12.2 Atmospheric Corrections

The source code for this example can be found in the file
Examples/Radiometry/AtmosphericCorrectionSequencement.cxx.

The following example illustrates the application of atmospheric corrections to an optical multispectral image similar to Pleiades. These corrections are made in four steps:

- conversion of the digital numbers to luminance (radiance);
- conversion of the luminance to TOA (top-of-atmosphere) reflectance;
- conversion of the TOA reflectance to surface reflectance (the atmospheric correction itself);
- correction of the adjacency effects, taking the neighborhood contribution into account.

The manipulation of each class used for the different steps and the link with the 6S radiometry library will be explained.

Let’s look at the minimal code required to use this algorithm. First, the required headers must be included: one for each of the four corrections, that is the digital number to luminance conversion, the luminance to reflectance conversion, the reflectance to surface reflectance (atmospheric) correction, and the neighborhood (adjacency effect) correction.

#include "otbImageToLuminanceImageFilter.h" 
#include "otbLuminanceToReflectanceImageFilter.h" 
#include "otbReflectanceToSurfaceReflectanceImageFilter.h" 
#include "otbSurfaceAdjacencyEffect6SCorrectionSchemeFilter.h"

This chain uses the 6S radiative transfer code to compute radiometric parameters. To manipulate 6S data, three classes are needed: the first one stores the metadata, the second one calls the 6S code and generates the information, which is then stored in the last one.

#include "otbAtmosphericCorrectionParameters.h" 
#include "otbAtmosphericCorrectionParametersTo6SAtmosphericRadiativeTerms.h" 
#include "otbAtmosphericRadiativeTerms.h"

Image types are now defined using the pixel type and the dimension. The input image is defined as an otb::VectorImage, and so is the output image: to simplify, the input and output image types are the same.

  const unsigned int Dimension = 2; 
  typedef double                                 PixelType; 
  typedef otb::VectorImage<PixelType, Dimension> ImageType;

The GenerateOutputInformation() reader method is called to get the number of components per pixel of the image. It is recommended to place GenerateOutputInformation() calls in a try/catch block in case errors occur and exceptions are thrown.

  reader->SetFileName(argv[1]); 
  try 
    { 
    reader->GenerateOutputInformation(); 
    } 
  catch (itk::ExceptionObject& excep) 
    { 
    std::cerr << "Exception caught !" << std::endl; 
    std::cerr << excep << std::endl; 
    }

The otb::ImageToLuminanceImageFilter type is defined and instantiated. This class applies a functor to each component of each pixel (X^k), whose formula is:

\[ L^{k}_{TOA} = \frac{X^{k}}{\alpha_{k}} + \beta_{k} \]
(12.8)

Where:

- X^k is the pixel value (digital number) of channel k;
- α_k and β_k are the absolute calibration coefficients (gain and bias) of channel k;
- L^k_TOA is the resulting TOA (top-of-atmosphere) luminance of channel k.
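Expressed as code, the per-component conversion of equation (12.8) reduces to the following (an illustrative sketch, not the actual OTB functor):

double toLuminance(double x, double alpha, double beta)
{
  // Equation (12.8): digital number to TOA luminance for one channel.
  return x / alpha + beta;
}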

  typedef otb::ImageToLuminanceImageFilter<ImageType, ImageType> 
  ImageToLuminanceImageFilterType; 
 
  ImageToLuminanceImageFilterType::Pointer filterImageToLuminance 
    = ImageToLuminanceImageFilterType::New();

Here, α and β are read from an ASCII file given as input, stored in a vector and passed to the class.

  filterImageToLuminance->SetAlpha(alpha); 
  filterImageToLuminance->SetBeta(beta);
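The reading of the coefficients is not shown in the excerpt above. A hypothetical helper, assuming one value per band in a whitespace-separated ASCII file and assuming the vector type expected by the filter is itk::VariableLengthVector<double>, could look like this:

#include <fstream> 
#include "itkVariableLengthVector.h" 
 
itk::VariableLengthVector<double> ReadCoefficients(const char* filename, 
                                                   unsigned int nbBands) 
{ 
  // One calibration coefficient per band, in band order. 
  itk::VariableLengthVector<double> coefficients(nbBands); 
  std::ifstream file(filename); 
  for (unsigned int i = 0; i < nbBands; ++i) 
    { 
    file >> coefficients[i]; 
    } 
  return coefficients; 
}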

The otb::LuminanceToReflectanceImageFilter type is defined and instantiated. This class applies a functor to each component of each pixel of the luminance filter output (L^k_TOA):

\[ \rho^{k}_{TOA} = \frac{\pi \cdot L^{k}_{TOA}}{E^{k}_{S} \cdot \cos(\theta_{S}) \cdot d/d_{0}} \]
(12.9)

Where:

- ρ^k_TOA is the TOA reflectance of channel k;
- L^k_TOA is the TOA luminance of channel k;
- E^k_S is the solar illumination of channel k outside the atmosphere;
- θ_S is the zenithal solar angle;
- d/d_0 is the ratio between the Earth-Sun distance at the acquisition date and the mean Earth-Sun distance.
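As written in equation (12.9), the per-component conversion can be sketched as follows (illustrative only, not the actual OTB functor):

#include <cmath>

double toReflectance(double lTOA, double eS, double thetaS, double dRatio)
{
  // Equation (12.9): thetaS in radians, dRatio is d/d0.
  const double pi = 3.14159265358979;
  return pi * lTOA / (eS * std::cos(thetaS) * dRatio);
}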

  typedef otb::LuminanceToReflectanceImageFilter<ImageType, ImageType> 
  LuminanceToReflectanceImageFilterType; 
  LuminanceToReflectanceImageFilterType::Pointer filterLuminanceToReflectance 
    = LuminanceToReflectanceImageFilterType::New();

The solar illumination is read from an ASCII file given as input, stored in a vector and passed to the class. The day, the month and the zenithal solar angle are inputs and can be given directly to the class.

  filterLuminanceToReflectance->SetZenithalSolarAngle( 
    static_cast<double>(atof(argv[6]))); 
  filterLuminanceToReflectance->SetDay(atoi(argv[7])); 
  filterLuminanceToReflectance->SetMonth(atoi(argv[8])); 
  filterLuminanceToReflectance->SetSolarIllumination(solarIllumination);

At this step of the chain, radiometric information is needed. This information will be computed from different parameters stored in an otb::AtmosphericCorrectionParameters class instance. This container will be given to an otb::AtmosphericCorrectionParametersTo6SAtmosphericRadiativeTerms class instance, which will call a 6S routine that computes the needed radiometric information and stores it in an otb::AtmosphericRadiativeTerms class instance. For this, the otb::AtmosphericCorrectionParametersTo6SAtmosphericRadiativeTerms, otb::AtmosphericCorrectionParameters and otb::AtmosphericRadiativeTerms types are defined and instantiated.

  typedef otb::AtmosphericCorrectionParametersTo6SAtmosphericRadiativeTerms 
  AtmosphericCorrectionParametersTo6SRadiativeTermsType; 
 
  typedef otb::AtmosphericCorrectionParameters 
  AtmosphericCorrectionParametersType; 
 
  typedef otb::AtmosphericRadiativeTerms 
  AtmosphericRadiativeTermsType;

The otb::AtmosphericCorrectionParameters class needs several parameters:

- the solar zenithal and azimutal angles;
- the viewing zenithal and azimutal angles;
- the acquisition month and day;
- the atmospheric pressure;
- the water vapor amount;
- the ozone amount;
- the aerosol model;
- the aerosol optical thickness.

  dataAtmosphericCorrectionParameters->SetSolarZenithalAngle( 
    static_cast<double>(atof(argv[6]))); 
 
  dataAtmosphericCorrectionParameters->SetSolarAzimutalAngle( 
    static_cast<double>(atof(argv[9]))); 
 
  dataAtmosphericCorrectionParameters->SetViewingZenithalAngle( 
    static_cast<double>(atof(argv[10]))); 
 
  dataAtmosphericCorrectionParameters->SetViewingAzimutalAngle( 
    static_cast<double>(atof(argv[11]))); 
 
  dataAtmosphericCorrectionParameters->SetMonth(atoi(argv[8])); 
 
  dataAtmosphericCorrectionParameters->SetDay(atoi(argv[7])); 
 
  dataAtmosphericCorrectionParameters->SetAtmosphericPressure( 
    static_cast<double>(atof(argv[12]))); 
 
  dataAtmosphericCorrectionParameters->SetWaterVaporAmount( 
    static_cast<double>(atof(argv[13]))); 
 
  dataAtmosphericCorrectionParameters->SetOzoneAmount( 
    static_cast<double>(atof(argv[14]))); 
 
  AerosolModelType aerosolModel = 
    static_cast<AerosolModelType>(::atoi(argv[15])); 
 
  dataAtmosphericCorrectionParameters->SetAerosolModel(aerosolModel); 
 
  dataAtmosphericCorrectionParameters->SetAerosolOptical( 
    static_cast<double>(atof(argv[16])));

Once those parameters are loaded and stored in the AtmosphericCorrectionParameters class instance, it is passed as input to an instance of AtmosphericCorrectionParametersTo6SAtmosphericRadiativeTerms, which will compute the needed radiometric information.

  AtmosphericCorrectionParametersTo6SRadiativeTermsType::Pointer 
    filterAtmosphericCorrectionParametersTo6SRadiativeTerms = 
    AtmosphericCorrectionParametersTo6SRadiativeTermsType::New(); 
 
  filterAtmosphericCorrectionParametersTo6SRadiativeTerms->SetInput( 
    dataAtmosphericCorrectionParameters); 
 
  filterAtmosphericCorrectionParametersTo6SRadiativeTerms->Update();

The output of this class will be an instance of the AtmosphericRadiativeTerms class. This class contains, for each channel of the image:

- the intrinsic atmospheric reflectance;
- the spherical albedo of the atmosphere;
- the total gaseous transmission;
- the downward and upward transmittances of the atmosphere.

These terms are the ones used in equations (12.10) and (12.11) below.

Atmospheric corrections can now start. First, an instance of otb::ReflectanceToSurfaceReflectanceImageFilter is created.

  typedef otb::ReflectanceToSurfaceReflectanceImageFilter<ImageType, 
      ImageType> 
  ReflectanceToSurfaceReflectanceImageFilterType; 
 
  ReflectanceToSurfaceReflectanceImageFilterType::Pointer 
    filterReflectanceToSurfaceReflectanceImageFilter 
    = ReflectanceToSurfaceReflectanceImageFilterType::New();

The aim of the atmospheric correction is to invert the surface reflectance (for each pixel of the input image) from the TOA reflectance and from simulations of the atmospheric radiative functions corresponding to the geometrical conditions of the observation and to the atmospheric components. The process is applied to each pixel of the image, band by band, with the following formula:

\[ \rho^{unif}_{S} = \frac{A}{1 + S \cdot A} \]
(12.10)

Where:

- ρ^unif_S is the ground (surface) reflectance under the assumption of a homogeneous environment;
- S is the spherical albedo of the atmosphere;
- A is the term given by equation (12.11) below.

\[ A = \frac{\rho_{TOA} - \rho_{atm}}{T(\mu_{S}) \cdot T(\mu_{V}) \cdot t^{allgas}_{g}} \]
(12.11)
(12.11)

With:

- ρ_TOA the measured TOA reflectance;
- ρ_atm the intrinsic atmospheric reflectance;
- T(μ_S) the downward transmittance of the atmosphere;
- T(μ_V) the upward transmittance of the atmosphere;
- t^allgas_g the total gaseous transmission.
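For illustration, the per-band inversion of equations (12.10) and (12.11) can be sketched as a small function (this is not the actual OTB implementation):

double surfaceReflectance(double rhoTOA,    // measured TOA reflectance
                          double rhoAtm,    // intrinsic atmospheric reflectance
                          double tDown,     // downward transmittance T(mu_S)
                          double tUp,       // upward transmittance T(mu_V)
                          double tgAllGas,  // total gaseous transmission
                          double S)         // spherical albedo of the atmosphere
{
  const double A = (rhoTOA - rhoAtm) / (tDown * tUp * tgAllGas);  // equation (12.11)
  return A / (1.0 + S * A);                                       // equation (12.10)
}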

All those parameters are contained in the AtmosphericCorrectionParametersTo6SRadiativeTerms output.

  filterReflectanceToSurfaceReflectanceImageFilter-> 
  SetAtmosphericRadiativeTerms( 
    filterAtmosphericCorrectionParametersTo6SRadiativeTerms->GetOutput());

The next (and last) step is the neighborhood correction. For this, the SurfaceAdjacencyEffect6SCorrectionSchemeFilter class is used. The previous surface reflectance inversion is performed under the assumption of a homogeneous ground environment; the following step corrects the adjacency effect on the radiometry of the pixels. The method is based on the decomposition of the observed signal into the sum of the contribution of the target pixel itself and the contribution of the neighboring pixels, weighted by their distance to the target pixel. A simplified relation may be:

\[ \rho_{S} = \frac{\rho^{unif}_{S} \cdot T(\mu_{V}) - \langle\rho_{S}\rangle \cdot t_{d}(\mu_{V})}{\exp(-\delta/\mu_{V})} \]
(12.12)

With:

- ρ^unif_S the surface reflectance computed at the previous step, under the homogeneous environment assumption;
- <ρ_S> the contribution of the neighborhood to the observed signal;
- T(μ_V) the total upward transmittance of the atmosphere;
- t_d(μ_V) the upward diffuse transmittance;
- exp(−δ/μ_V) the upward direct transmittance, where δ is the atmospheric optical depth and μ_V the cosine of the viewing zenith angle.
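A sketch of the per-pixel correction of equation (12.12) is given below (illustrative only; the actual filter computes the neighborhood contribution <ρ_S> with a distance-weighted window):

#include <cmath>

double adjacencyCorrectedReflectance(double rhoUnif,  // reflectance assuming a homogeneous environment
                                     double rhoEnv,   // neighborhood contribution <rho_S>
                                     double tUp,      // total upward transmittance T(mu_V)
                                     double tdUp,     // upward diffuse transmittance t_d(mu_V)
                                     double delta,    // atmospheric optical depth
                                     double muV)      // cosine of the viewing zenith angle
{
  // Equation (12.12): exp(-delta/muV) is the upward direct transmittance.
  return (rhoUnif * tUp - rhoEnv * tdUp) / std::exp(-delta / muV);
}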

The neighborhood consideration window size is given by the window radius. An instance of otb::SurfaceAdjacencyEffect6SCorrectionSchemeFilter is created.

  typedef otb::SurfaceAdjacencyEffect6SCorrectionSchemeFilter<ImageType, 
      ImageType> 
  SurfaceAdjacencyEffect6SCorrectionSchemeFilterType; 
  SurfaceAdjacencyEffect6SCorrectionSchemeFilterType::Pointer 
    filterSurfaceAdjacencyEffect6SCorrectionSchemeFilter 
    = SurfaceAdjacencyEffect6SCorrectionSchemeFilterType::New();

The filter needs four inputs:

- the atmospheric radiative terms computed at the previous step;
- the zenithal viewing angle;
- the radius of the neighborhood window;
- the pixel spacing in kilometers.
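The corresponding calls are not shown in the excerpts above. A hedged sketch, in which the setter names are assumptions to be checked against the actual class interface and windowRadius and pixelSpacingInKilometers are hypothetical variables read from the command line, could be:

  filterSurfaceAdjacencyEffect6SCorrectionSchemeFilter->SetAtmosphericRadiativeTerms( 
    filterAtmosphericCorrectionParametersTo6SRadiativeTerms->GetOutput()); 
  // Zenithal viewing angle, already used above (argv[10]). 
  filterSurfaceAdjacencyEffect6SCorrectionSchemeFilter->SetZenithalViewingAngle( 
    static_cast<double>(atof(argv[10]))); 
  // Hypothetical variables holding the window radius and the pixel spacing. 
  filterSurfaceAdjacencyEffect6SCorrectionSchemeFilter->SetWindowRadius(windowRadius); 
  filterSurfaceAdjacencyEffect6SCorrectionSchemeFilter->SetPixelSpacingInKilometers( 
    pixelSpacingInKilometers);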

At this step, each filter of the chain has been instantiated and has its input parameters set. A name can be given to the output image and the filters can be linked together to create the final processing chain.

  writer->SetFileName(argv[2]); 
 
  filterImageToLuminance->SetInput(reader->GetOutput()); 
  filterLuminanceToReflectance->SetInput(filterImageToLuminance->GetOutput()); 
  filterReflectanceToSurfaceReflectanceImageFilter->SetInput( 
    filterLuminanceToReflectance->GetOutput()); 
  filterSurfaceAdjacencyEffect6SCorrectionSchemeFilter->SetInput( 
    filterReflectanceToSurfaceReflectanceImageFilter->GetOutput()); 
 
  writer->SetInput( 
    filterSurfaceAdjacencyEffect6SCorrectionSchemeFilter->GetOutput());

The invocation of the Update() method on the writer triggers the execution of the pipeline. It is recommended to place this call in a try/catch block in case errors occur and exceptions are thrown.

  try 
    { 
    writer->Update(); 
    } 
  catch (itk::ExceptionObject& excep) 
    { 
    std::cerr << "Exception caught !" << std::endl; 
    std::cerr << excep << std::endl; 
    }