As presented in chapter 28, OTB has two main mechanisms to handle large data efficiently: streaming allows images to be processed piece-wise, and multi-threading allows several pieces of one streaming block to be processed concurrently. Using these concepts, one can easily write pixel-wise or neighborhood-based filters and insert them into a pipeline which will scale with respect to the input image size.
Yet, sometimes we need to compute global features on the whole image. One example is determining the mean and variance of the input image in order to produce a centered and reduced (zero-mean, unit-variance) image. The operation of centering and reducing each pixel is fully compliant with streaming and threading, but one first has to estimate the mean and variance of the image. This first step requires walking the whole image once, and the traditional streaming and multi-threading based filter architecture is of no help here.
This is because there is a fundamental difference between these two operations: one supports streaming, and the other needs to drive the streaming itself. In fact, we would like to stream the whole image piece by piece through some filter that will collect and keep mean and variance cumulants, and then synthesize these cumulants to compute the final mean and variance once the full image has been streamed. Each stream would also benefit from parallel processing. This is exactly what persistent filters are for.
There are two main objects in the persistent filters framework. The first is the otb::PersistentImageFilter, the second is the otb::PersistentFilterStreamingDecorator.
The otb::PersistentImageFilter class is a regular itk::ImageToImageFilter, with two additional pure virtual methods: Synthetize() and Reset().
Imagine that the GenerateData() or ThreadedGenerateData() method progressively computes some global feature of the whole image, using some member of the class to store intermediate results. Synthetize() is an additional method designed to be called once the whole image has been processed, in order to compute the final results from the intermediate results. The Reset() method is designed to reset the intermediate result members, so as to start a fresh processing.
Any sub-class of otb::PersistentImageFilter can be used as a regular itk::ImageToImageFilter (provided that both Synthetize() and Reset() have been implemented), but the real interest of these filters is to be used with the streaming decorator class presented in the next section.
The otb::PersistentFilterStreamingDecorator is a class designed to be templated with sub-classes of otb::PersistentImageFilter. It provides the mechanism to stream the whole image through the templated filter, using a third class called otb::StreamingImageVirtualWriter. When the Update() method is called on an otb::PersistentFilterStreamingDecorator, a pipeline plugging the templated sub-class of otb::PersistentImageFilter to an instance of otb::StreamingImageVirtualWriter is created. The latter is then updated, and acts like a regular otb::ImageFileWriter except that it does not actually write anything to disk: streaming pieces are requested and immediately discarded. The otb::PersistentFilterStreamingDecorator also calls the Reset() method at the beginning and the Synthetize() method at the end of the streaming process. Therefore, it packages the whole mechanism needed to use an otb::PersistentImageFilter: resetting the intermediate results, streaming the image piece-wise through the filter, and synthesizing the final results.
Some methods allow tuning the behavior of the otb::StreamingImageVirtualWriter, changing the image splitting method (tiles or strips) or the size of the streams with respect to some target amount of available memory. Please see the class documentation for details. The instance of the otb::StreamingImageVirtualWriter can be retrieved from the otb::PersistentFilterStreamingDecorator through the GetStreamer() method.
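For instance, a sketch of such tuning could look like the following. GetStreamer() is documented above, but the streaming-mode setters shown here are assumptions borrowed from the otb::ImageFileWriter interface and should be checked against the otb::StreamingImageVirtualWriter documentation:

    // decoratedFilter is an instance of otb::PersistentFilterStreamingDecorator.
    // The setter names below are assumptions; please check the
    // otb::StreamingImageVirtualWriter class documentation.
    decoratedFilter->GetStreamer()->SetAutomaticStrippedStreaming(availableRAMInMB);
    // or, to force a fixed number of strips:
    // decoratedFilter->GetStreamer()->SetNumberOfDivisionsStrippedStreaming(10);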
Though the internal filter of the otb::PersistentFilterStreamingDecorator can be accessed through the GetFilter() method, the class is often derived to package the streaming-decorated filter and wrap the parameter setters and getters.
This is an end-to-end example computing the mean over a full image, using a streaming- and threading-enabled filter. Please note that only specific details are explained here. For more general information on how to write a filter, please refer to section 29, page 1177.
The first step is to write a persistent mean image filter. We need to include the appropriate header:
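Assuming the filter is written directly on top of the otb::PersistentImageFilter base class, the include is simply:

    #include "otbPersistentImageFilter.h"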
Then, we declare the class prototype as follows:
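A minimal sketch of such a prototype is given below; the class name PersistentMeanImageFilter, the GetMean() accessor and the internal organization are illustrative only:

    // Sketch of a persistent mean filter built on otb::PersistentImageFilter.
    // The same image type is used for input and output (see the remark below).
    template <class TInputImage>
    class PersistentMeanImageFilter
      : public otb::PersistentImageFilter<TInputImage, TInputImage>
    {
    public:
      // Standard class typedefs
      typedef PersistentMeanImageFilter                            Self;
      typedef otb::PersistentImageFilter<TInputImage, TInputImage> Superclass;
      typedef itk::SmartPointer<Self>                              Pointer;
      typedef itk::SmartPointer<const Self>                        ConstPointer;

      // Object factory and run-time type information
      itkNewMacro(Self);
      itkTypeMacro(PersistentMeanImageFilter, PersistentImageFilter);

      // Accessor for the final result, valid once Synthetize() has been called
      double GetMean() const
      {
        return m_Mean;
      }

      // Reset(), Synthetize(), ThreadedGenerateData() and the data members
      // are shown in the following snippets.
    };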
Since the output image will only be used for streaming purposes, we do not need to declare different input and output template types.
In the private section of the class, we declare a member used to store temporary results and a member used to store the final result.
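For instance (member names are illustrative):

    private:
      // Intermediate results: one partial sum and one pixel count per thread
      std::vector<double>        m_PartialSums;
      std::vector<unsigned long> m_PartialCounts;
      // Final result, computed by Synthetize()
      double                     m_Mean;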
Next, we will write the Reset() method implementation in the protected section of the class. Proper allocation of the temporary results container with respect to the number of threads is handled here.
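A possible implementation, assuming the per-thread containers declared above, is:

    protected:
      virtual void Reset()
      {
        // One accumulator per thread, so that ThreadedGenerateData() can
        // update them concurrently without locking
        unsigned int numberOfThreads = this->GetNumberOfThreads();
        m_PartialSums.assign(numberOfThreads, 0.0);
        m_PartialCounts.assign(numberOfThreads, 0);
        m_Mean = 0.0;
      }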
Now, we need to write the ThreadedGenerateData() method (also in the protected section), where temporary results will be computed for each piece of stream.
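A sketch of this method, accumulating a partial sum and pixel count for the piece of image assigned to each thread, could look as follows (it requires itkImageRegionConstIterator.h):

    virtual void ThreadedGenerateData(
      const typename TInputImage::RegionType& outputRegionForThread,
      itk::ThreadIdType threadId)
    {
      // Walk the streaming piece assigned to this thread and accumulate
      // the intermediate results in the per-thread members.
      // The output image is only used to drive the streaming; it is not
      // written here.
      itk::ImageRegionConstIterator<TInputImage> it(this->GetInput(),
                                                    outputRegionForThread);
      for (it.GoToBegin(); !it.IsAtEnd(); ++it)
        {
        m_PartialSums[threadId]   += static_cast<double>(it.Get());
        m_PartialCounts[threadId] += 1;
        }
    }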
Last, we need to define the Synthetize() method (still in the protected section), which will yield the final results:
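A matching sketch of Synthetize(), merging the per-thread intermediate results, could be:

    virtual void Synthetize()
    {
      // Merge the per-thread intermediate results into the final mean
      double        sum   = 0.0;
      unsigned long count = 0;
      for (unsigned int i = 0; i < m_PartialSums.size(); ++i)
        {
        sum   += m_PartialSums[i];
        count += m_PartialCounts[i];
        }
      m_Mean = (count > 0) ? sum / static_cast<double>(count) : 0.0;
    }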
Now, to use the filter, one only has to decorate it with the otb::PersistentFilterStreamingDecorator. The first step is to include the appropriate header:
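The decorator class is declared in the following header:

    #include "otbPersistentFilterStreamingDecorator.h"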
Then, we decorate the filter with some typedefs:
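For instance, with the hypothetical PersistentMeanImageFilter sketched above:

    typedef otb::Image<float, 2>                 ImageType;
    typedef PersistentMeanImageFilter<ImageType> PersistentMeanFilterType;
    typedef otb::PersistentFilterStreamingDecorator<PersistentMeanFilterType>
                                                 StreamingMeanFilterType;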
Now, the decorated filter can be used like any standard filter:
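A possible usage sketch, reading the input through an otb::ImageFileReader and retrieving the result through the internal filter, is given below (GetMean() is the hypothetical accessor defined earlier):

    typedef otb::ImageFileReader<ImageType> ReaderType;

    ReaderType::Pointer              reader = ReaderType::New();
    StreamingMeanFilterType::Pointer filter = StreamingMeanFilterType::New();

    reader->SetFileName("input.tif");

    // The input is set on the internal filter, accessed through GetFilter()
    filter->GetFilter()->SetInput(reader->GetOutput());

    // Update() streams the whole image: Reset() is called first, then each
    // piece is processed, and Synthetize() is called last
    filter->Update();

    double mean = filter->GetFilter()->GetMean();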
It is often convenient to avoid the few typedefs of the previous section by deriving a new class from the decorated filter:
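A sketch of such a derived class, again based on the hypothetical PersistentMeanImageFilter, could be:

    template <class TInputImage>
    class StreamingMeanImageFilter
      : public otb::PersistentFilterStreamingDecorator<
                 PersistentMeanImageFilter<TInputImage> >
    {
    public:
      typedef StreamingMeanImageFilter Self;
      typedef otb::PersistentFilterStreamingDecorator<
                PersistentMeanImageFilter<TInputImage> > Superclass;
      typedef itk::SmartPointer<Self>                    Pointer;
      typedef itk::SmartPointer<const Self>              ConstPointer;

      itkNewMacro(Self);
      itkTypeMacro(StreamingMeanImageFilter, PersistentFilterStreamingDecorator);

      // Wrapped setter and getter, forwarding to the internal filter so that
      // users do not have to call GetFilter() themselves
      void SetInput(TInputImage* input)
      {
        this->GetFilter()->SetInput(input);
      }

      double GetMean()
      {
        return this->GetFilter()->GetMean();
      }
    };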
This also allows redefining setters and getters for the parameters, avoiding calls to the GetFilter() method to set them.