Radiometry

Remote sensing is not just a matter of taking pictures, but also – mostly – a matter of measuring physical values. In order to properly deal with physical magnitudes, the numerical values provided by the sensors have to be calibrated. After that, several indices with physical meaning can be computed.

With multispectral sensors, several indices can be computed by combining several spectral bands, revealing features that are not obvious in any single band.

A vegetation index is a quantitative measure of biomass or vegetative vigor, usually formed from a combination of several spectral bands whose values are added, divided, or multiplied in order to yield a single value indicating the amount or vigor of vegetation.

Numerous indices are available in OTB; they are listed in tables 12.1 to 12.4 with their references.

Index | Name | Reference
NDVI | Normalized Difference Vegetation Index | [120]
RVI | Ratio Vegetation Index | [104]
PVI | Perpendicular Vegetation Index | [117, 145]
SAVI | Soil Adjusted Vegetation Index | [64]
TSAVI | Transformed Soil Adjusted Vegetation Index | [9, 8]
MSAVI | Modified Soil Adjusted Vegetation Index | [113]
MSAVI2 | Modified Soil Adjusted Vegetation Index | [113]
GEMI | Global Environment Monitoring Index | [109]
WDVI | Weighted Difference Vegetation Index | [26, 27]
AVI | Angular Vegetation Index | [111]
ARVI | Atmospherically Resistant Vegetation Index | [79]
TSARVI | Transformed Soil Adjusted Vegetation Index | [79]
EVI | Enhanced Vegetation Index | [65, 76]
IPVI | Infrared Percentage Vegetation Index | [31]
TNDVI | Transformed NDVI | [35]
SRWI | Simple Ratio Water Index | [151]
NDWI | Normalized Difference Water Index | [20]
NDWI2 | Normalized Difference Water Index | [95]
MNDWI | Modified Normalized Difference Water Index | [147]
NDPI | Normalized Difference Pond Index | [83]
NDTI | Normalized Difference Turbidity Index | [83]
SA | Spectral Angle |

The use of the different indices is very similar; only a few examples are given in the following sections.

NDVI was one of the most successful of many attempts to simply and quickly identify vegetated areas and their condition.

The source code for this example can be found in the file

The following example illustrates the computation of the NDVI from the near infra-red channel, noted L_{NIR}, and the red channel, noted L_{r}, whose radiances are reflected from the surface and transmitted through the atmosphere:

NDVI = (L_{NIR} - L_{r}) / (L_{NIR} + L_{r})    (12.1)

The other vegetation index functors can be used in the same way: otb::Functor::RVI, otb::Functor::PVI, otb::Functor::SAVI, otb::Functor::TSAVI, otb::Functor::MSAVI, otb::Functor::GEMI, otb::Functor::WDVI, otb::Functor::IPVI and otb::Functor::TNDVI.

With the

Let’s look at the minimal code required to use this algorithm. First, the following header defining the

The image types are now defined using the pixel types and the dimension. Input and output images are defined as

The NDVI (Normalized Difference Vegetation Index) is instantiated using the image pixel types as template parameters. It is implemented as a functor class which will be passed as a parameter to an

The

Now the input images are set and a name is given to the output image.

We set the processing pipeline: filter inputs are linked to the reader output and the filter output is linked to the writer input.

Invocation of the

Let’s now run this example using as input the images

The source code for this example can be found in the file

The following example illustrates the computation of the ARVI. Let ρ_{NIR}^{*}, ρ_{r}^{*} and ρ_{b}^{*} be the normalized radiances (that is to say the radiance normalized to reflectance units) of the NIR, red and blue channels respectively. ρ_{rb}^{*} is defined
as

ρ_{rb}^{*} = ρ_{r}^{*} - γ (ρ_{b}^{*} - ρ_{r}^{*})    (12.2)

The ARVI expression is

ARVI = (ρ_{NIR}^{*} - ρ_{rb}^{*}) / (ρ_{NIR}^{*} + ρ_{rb}^{*})    (12.3)

This formula can be simplified with γ = 1, which gives ρ_{rb}^{*} = 2ρ_{r}^{*} - ρ_{b}^{*}:

ARVI = (ρ_{NIR}^{*} - 2ρ_{r}^{*} + ρ_{b}^{*}) / (ρ_{NIR}^{*} + 2ρ_{r}^{*} - ρ_{b}^{*})    (12.4)

For more details, refer to the work of Kaufman and Tanré [79].

With the

Let’s look at the minimal code required to use this algorithm. First, the following header defining the

The image types are now defined using pixel types and dimension. The input image is defined as an

The ARVI (Atmospherically Resistant Vegetation Index) is instantiated using the image pixel types as template parameters. Note that other functors operating on the red, blue and NIR channels, such as EVI and TSARVI, can be used in the same way.

The

Now the input image is set and a name is given to the output image.

The three bands used by the index (red, blue and NIR) are declared.

The

The filter input is linked to the reader output and the filter output is linked to the writer input.

The invocation of the

Let’s now run this example using as input the image

The source code for this example can be found in the file

The following example illustrates the use of the otb::MultiChannelRAndGAndNIRVegetationIndexImageFilter with the Angular Vegetation Index (AVI). The equation for the Angular Vegetation Index involves the green, red and near infra-red bands. λ_{1}, λ_{2} and λ_{3} are the mid-band wavelengths for the green, red and NIR bands, and tan^{-1} is the arctangent function.

The AVI expression is

| (12.5) |

| (12.6) |

| (12.7) |

For more details, refer to the work of Plummer [111].

With the

Let’s look at the minimal code required to use this algorithm. First, the following header defining the

The image types are now defined using pixel types and dimension. The input image is defined as an

The AVI (Angular Vegetation Index) is instantiated using the image pixel types as template parameters.

The

Now the input image is set and a name is given to the output image.

The three bands used by the index (red, green and NIR) are declared.

The

The filter input is linked to the reader output and the filter output is linked to the writer input.

The invocation of the

Let’s now run this example using as input the image

The source code for this example can be found in the file

The following example illustrates the application of atmospheric corrections to an optical multispectral image similar to Pleiades. These corrections are made in four steps:

- digital number to luminance correction;
- luminance to reflectance image conversion;
- atmospheric correction for TOA (top of atmosphere) to TOC (top of canopy) reflectance estimation;
- correction of the adjacency effects taking into account the neighborhood contribution.

The manipulation of each class used for the different steps and the link with the 6S radiometry library will be explained. In particular, the API modifications made in version 4.2 will be detailed. There were several reasons behind these modifications:

- fix design issues in the framework that were causing trouble when setting the atmospheric parameters;
- allow the computation of the radiative terms by libraries other than 6S (such as the SMAC method);
- allow the users of the OpticalCalibration application to set and override each correction parameter.

Let’s look at the minimal code required to use this algorithm. First, the following header defining the

In version 4.2, the class

This chain uses the 6S radiative transfer code to compute the radiative terms (for instance upward and downward transmittances). The needed inputs are separated into two categories:

- The atmospheric correction parameters: physical parameters of the atmosphere when the image was taken (for instance: atmospheric pressure, water vapour amount, aerosol data, ...). They are stored in the class otb::AtmosphericCorrectionParameters.
- The acquisition correction parameters: sensor-related information about the way the image was taken, usually available with the image metadata (for instance: solar angles, spectral sensitivity, ...). They are stored in the class otb::ImageMetadataCorrectionParameters.

The class

Image types are now defined using pixel types and dimension. The input image is defined as an

The

The first step of the chain converts each measured digital number (X^{k}) into luminance, using the formula:

L_{TOA}^{k} = X^{k} / α_{k} + β_{k}    (12.8)

Where:

- L_{TOA}^{k} is the incident luminance (in W.m^{-2}.sr^{-1}.μm^{-1});
- X^{k} is the measured digital number (i.e. the input image pixel component);
- α_{k} is the absolute calibration gain for the channel k;
- β_{k} is the absolute calibration bias for the channel k.

Here,

The next step converts the luminance into TOA reflectance (ρ_{TOA}^{k}):

ρ_{TOA}^{k} = π · L_{TOA}^{k} · (d/d_{0})^{2} / (E_{S}^{k} · cos(θ_{S}))    (12.9)

Where:

- ρ_{TOA}^{k} is the reflectance measured by the sensor;
- θ_{S} is the zenithal solar angle in degrees;
- E_{S}^{k} is the solar illumination out of the atmosphere measured at a distance d_{0} from the Earth;
- d/d_{0} is the ratio between the Earth-Sun distance at the acquisition date and the mean Earth-Sun distance. The ratio can be directly given to the class or computed using a 6S routine. In the latter case, the user has to provide the month and the day of the acquisition.

The solar illumination is read from an ASCII file given as input, stored in a vector, and given to the class. Day, month and zenithal solar angle are inputs and can be directly given to the class.

At this step of the chain, radiative information is needed to compute the contribution of the atmosphere (such as atmosphere transmittance and reflectance). This information will be computed from the different correction parameters stored in the otb::AtmosphericCorrectionParameters and otb::ImageMetadataCorrectionParameters containers.

The otb::ImageMetadataCorrectionParameters class stores the acquisition correction parameters:

- The zenithal and azimuthal solar angles that describe the solar incidence configuration (in degrees);
- The zenithal and azimuthal viewing angles that describe the viewing direction (in degrees);
- The month and the day of the acquisition;
- The filter function, that is, the values of the filter function for one spectral band, from λ_{inf} to λ_{sup} in steps of 2.5 nm. One filter function per channel is required. This last parameter is read from text files; the other ones are directly given to the class.

When this container is not set in the ReflectanceToSurfaceReflectance filter, it is automatically filled using the image metadata. The following lines show that it is also possible to set the values manually.

The otb::AtmosphericCorrectionParameters class stores the atmospheric correction parameters:

- The atmospheric pressure;
- The water vapor amount, that is, the total water vapor content over vertical atmospheric column;
- The ozone amount, that is, the stratospheric ozone layer content;
- The aerosol model, that is, the kind of particles (no aerosol, continental, maritime, urban, desertic);
- The aerosol optical thickness at 550 nm, that is, the radiative impact of aerosol for the reference wavelength 550 nm.

Once those parameters are loaded, they are used by the 6S library to compute the needed radiometric information. The RadiometryCorrectionParametersToAtmosphericRadiativeTerms class provides a static function to perform this step.

The output is stored inside an instance of the AtmosphericRadiativeTerms class, which contains:

- The intrinsic atmospheric reflectance, which accounts for the molecular scattering and the aerosol scattering attenuated by water vapor absorption;
- The spherical albedo of the atmosphere;
- The total gaseous transmission (for all species);
- The total transmittance of the atmosphere from sun to ground (downward transmittance) and from ground to space sensor (upward transmittance).

Atmospheric corrections can now start. First, an instance of

The aim of the atmospheric correction is to invert the surface reflectance (for each pixel of the input image) from the TOA reflectance and from simulations of the atmospheric radiative functions corresponding to the geometrical conditions of the observation and to the atmospheric components. The process is applied to each pixel of the image, band by band, with the following formula:

ρ_{S}^{unif} = A / (1 + S · A)    (12.10)

Where,

A = (ρ_{TOA} - ρ_{atm}) / (T(μ_{S}) · T(μ_{V}) · t_{g}^{allgas})    (12.11)

With:

- ρ_{TOA} is the reflectance at the top of the atmosphere;
- ρ_{S}^{unif} is the ground reflectance under assumption of a lambertian surface and a uniform environment;
- ρ_{atm} is the intrinsic atmospheric reflectance;
- S is the spherical albedo of the atmosphere;
- t_{g}^{allgas} is the total gaseous transmission (for all species);
- T(μ_{S}) is the downward transmittance;
- T(μ_{V}) is the upward transmittance.

All those parameters are contained in the AtmosphericRadiativeTerms container.

Next (and last) step is the neighborhood correction. For this, the SurfaceAdjacencyEffectCorrectionSchemeFilter class is used. The previous surface reflectance inversion is performed under the assumption of a homogeneous ground environment. The following step allows correcting the adjacency effect on the radiometry of pixels. The method is based on the decomposition of the observed signal into the sum of the contribution of the target pixel itself and the contributions of neighboring pixels, weighted by their distance to the target pixel. A simplified relation may be:

ρ_{S}^{unif} · T(μ_{V}) = ρ_{S} · exp(-δ/μ_{V}) + ⟨ρ_{S}⟩ · t_{d}(μ_{V})    (12.12)

With :

- ρ_{S}^{unif} is the ground reflectance under assumption of a homogeneous environment;
- T(μ_{V}) is the upward transmittance;
- t_{d}(μ_{V}) is the upward diffuse transmittance;
- exp(-δ/μ_{V}) is the upward direct transmittance;
- ρ_{S} is the target pixel reflectance;
- ⟨ρ_{S}⟩ is the environment contribution to the pixel target reflectance in the total observed signal, computed as a distance-weighted sum of the surface reflectances over the neighborhood:

⟨ρ_{S}⟩ = Σ_{i} Σ_{j} f(r(i,j)) · ρ_{S}^{unif}(i,j)    (12.13)

where r(i,j) is the distance between the neighboring pixel (i,j) and the target pixel, and f is a weighting function decreasing with this distance.

The neighborhood consideration window size is given by the window radius.

An instance of

Four inputs are needed to compute the neighborhood contribution:

- The radiative terms (stored in the AtmosphericRadiativeTerms container);
- The zenithal viewing angle;
- The neighborhood window radius;
- The pixel spacing in kilometers.

At this step, each filter of the chain is instantiated and has its input parameters set. A name can be given to the output image, each filter can be linked to the next one, and the final processing chain is created.

The invocation of the