SoFirstOrderStatisticsProcessing Class Reference
[Texture Filters]

ImageViz SoFirstOrderStatisticsProcessing image filter

#include <ImageViz/Engines/ImageFiltering/TextureFilters/SoFirstOrderStatisticsProcessing.h>

Inheritance diagram for SoFirstOrderStatisticsProcessing:
SoFirstOrderStatisticsProcessing → SoImageVizEngine → SoEngine → SoFieldContainer → SoBase → SoRefCounter, SoTypedObject


Public Types

enum  MeasureType {
  MEAN = 0,
  VARIANCE = 1,
  SKEWNESS = 2,
  KURTOSIS = 3,
  CONTRAST = 4,
  VARIATION = 5,
  ENERGY = 6,
  ENTROPY = 7
}

Public Member Functions

 SoFirstOrderStatisticsProcessing ()

Public Attributes

SoSFEnum measureType
SoSFImageDataAdapter inImage
SoSFInt32 kernelSize
SoImageVizEngineOutput< SoSFImageDataAdapter, SoImageDataAdapter* > outImage

Detailed Description

ImageViz SoFirstOrderStatisticsProcessing image filter

The SoFirstOrderStatisticsProcessing image filter computes first order statistics.

The SoFirstOrderStatisticsProcessing filter deals with first order statistics. It creates a result image in which each pixel value is a function only of the value of the corresponding pixel in the input image and of its neighbourhood of a chosen size. To calculate these values, we compute local histograms (the number of pixels per grey value in the neighbourhood). In the following equations, $p(n)$ is the probability of a pixel having the value $n$, i.e. the number of such pixels divided by the total number of pixels in the neighbourhood.
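For illustration only (this sketch is not part of the ImageViz API), the local probability histogram $p(n)$ of one pixel could be computed as follows, assuming an 8-bit grayscale image stored row by row; neighbours falling outside the image are simply skipped here:

  #include <vector>

  // Probability histogram p(n) of the kernelSize x kernelSize neighbourhood
  // centred on pixel (x, y): p[n] = number of pixels of value n / total pixels.
  std::vector<double> localHistogram( const unsigned char* image,
                                      int width, int height,
                                      int x, int y, int kernelSize )
  {
    std::vector<double> p( 256, 0.0 );
    const int half = kernelSize / 2;
    int count = 0;
    for ( int dy = -half; dy <= half; ++dy )
      for ( int dx = -half; dx <= half; ++dx )
      {
        const int xx = x + dx, yy = y + dy;
        if ( xx < 0 || yy < 0 || xx >= width || yy >= height )
          continue;                              // neighbour outside the image
        p[ image[ yy * width + xx ] ] += 1.0;
        ++count;
      }
    for ( double& v : p )
      v /= count;                                // normalize to probabilities
    return p;
  }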

Reduced image

For the ENTROPY and ENERGY types, the number of grey levels must be reduced in order to compute more significant statistics. Two parameters control this reduction: the number of classes (4 or 8), and whether or not the repartition is equal. For an equal repartition, we use the quantiles:

\[q_t=inf\Big\{q,\sum_{r\le q}p(r)\ge t\Big\}\]

So, for 4 classes, we use:

\[q_\frac{1}{4},q_\frac{1}{2},q_\frac{3}{4}\]

and for 8 classes:

\[q_\frac{1}{8},q_\frac{1}{4},q_\frac{3}{8},q_\frac{1}{2},q_\frac{5}{8},q_\frac{3}{4},q_\frac{7}{8}\]
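As an illustration (again not part of the ImageViz API), the quantile thresholds for the equal repartition could be computed from the local histogram as follows; for 8 classes the same function is simply called with t = 1/8, 2/8, ..., 7/8:

  #include <cstddef>
  #include <vector>

  // q_t = smallest grey level q such that the cumulated probability
  // sum_{r <= q} p(r) reaches t.
  int quantile( const std::vector<double>& p, double t )
  {
    double cumulated = 0.0;
    for ( std::size_t q = 0; q < p.size(); ++q )
    {
      cumulated += p[q];
      if ( cumulated >= t )
        return (int)q;
    }
    return (int)p.size() - 1;
  }

  // Thresholds for an equal repartition into 4 classes: q(1/4), q(1/2), q(3/4).
  std::vector<int> equalRepartitionThresholds4( const std::vector<double>& p )
  {
    std::vector<int> thresholds;
    thresholds.push_back( quantile( p, 0.25 ) );
    thresholds.push_back( quantile( p, 0.50 ) );
    thresholds.push_back( quantile( p, 0.75 ) );
    return thresholds;
  }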

In the other case (no equal repartition required), we use the mean and standard deviation, so the delimiters for 4 classes are:

\[(m-\sigma),(m),(m+\sigma)\]

For 8 classes, we first calculate the mean $m$, which splits the values into two subsets; on these subsets we calculate the means $m_1$ and $m_2$ and the standard deviations $\sigma_1$ and $\sigma_2$. In that case we have:

\[(m_1-\sigma_1),(m_1),(m_1+\sigma_1),(m),(m_2-\sigma_2),(m_2),(m_2+\sigma_2)\]

These values delimit the classes, and the value given to the pixels belonging to a class is the midpoint of the corresponding interval.
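The sketch below (also not part of the ImageViz API, and only one plausible reading of the description above) computes the eight-class delimiters for the non-equal repartition; assigning to each pixel the midpoint of the interval of its class is then straightforward:

  #include <cmath>
  #include <vector>

  // Mean and standard deviation of the grey levels in [lo, hi], weighted by p(n).
  void meanAndSigma( const std::vector<double>& p, int lo, int hi,
                     double& mean, double& sigma )
  {
    double weight = 0.0;
    mean = 0.0;
    for ( int n = lo; n <= hi; ++n ) { weight += p[n]; mean += n * p[n]; }
    if ( weight <= 0.0 ) { mean = lo; sigma = 0.0; return; }
    mean /= weight;
    double var = 0.0;
    for ( int n = lo; n <= hi; ++n ) var += ( n - mean ) * ( n - mean ) * p[n];
    sigma = std::sqrt( var / weight );
  }

  // Delimiters for 8 classes without equal repartition:
  // (m1 - s1), m1, (m1 + s1), m, (m2 - s2), m2, (m2 + s2).
  std::vector<double> delimiters8( const std::vector<double>& p )
  {
    double m, s, m1, s1, m2, s2;
    meanAndSigma( p, 0, (int)p.size() - 1, m, s );            // global mean m
    meanAndSigma( p, 0, (int)m, m1, s1 );                     // subset below m
    meanAndSigma( p, (int)m + 1, (int)p.size() - 1, m2, s2 ); // subset above m
    double d[7] = { m1 - s1, m1, m1 + s1, m, m2 - s2, m2, m2 + s2 };
    return std::vector<double>( d, d + 7 );
  }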

FILE FORMAT/DEFAULT

  FirstOrderStatisticsProcessing {
    inImage NULL
    kernelSize 3
    measureType MEAN
  }
Library references: contrast energy entropy kurtosis mean skewness variance variation

Deprecated:

Deprecated since Open Inventor 9800
Replaced by SoLocalStatisticsProcessing

Member Enumeration Documentation

Enumerator:
MEAN 

The MEAN type gives each pixel the mean value of its neighbourhood:

\[mean=m=\sum_{n}n\cdot p(n)\]

VARIANCE 

The VARIANCE type gives each pixel the variance of its neighbourhood:

\[variance=\sigma^2=\sum_{n}(n-m)^2\cdot p(n)\]

SKEWNESS 

The SKEWNESS type gives each pixel the skewness of its neighbourhood:

\[skewness=\frac{\sum\limits_{n}(n-m)^3\cdot p(n)}{\sigma^3}\]

KURTOSIS 

The KURTOSIS type gives each pixel the kurtosis of its neighbourhood:

\[kurtosis=\frac{\sum\limits_{n}(n-m)^4\cdot p(n)}{\sigma^4}-3\]

CONTRAST 

The CONTRAST type gives each pixel the contrast of its neighbourhood:

\[contrast=\frac{max(n)-min(n)}{max(n)+min(n)}\]

VARIATION 

The VARIATION type gives each pixel the variation of its neighbourhood:

\[variation=\frac{m}{\sigma}\]

ENERGY 

The ENERGY type gives each pixel the energy of its neighbourhood in the reduced image:

\[energy=\sum_{n}p(n)^2\]

ENTROPY 

The ENTROPY type gives each pixel the entropy of its neighbourhood in the reduced image:

\[entropy=-\sum_{n}p(n)\cdot\log p(n)\]
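The eight measures above can be evaluated from a probability histogram $p(n)$ as in the following sketch (not part of the ImageViz API); for ENERGY and ENTROPY the histogram of the reduced image described earlier would be used. The integer type argument corresponds to the MeasureType values (0 = MEAN ... 7 = ENTROPY):

  #include <cmath>
  #include <cstddef>
  #include <vector>

  double firstOrderMeasure( const std::vector<double>& p, int type )
  {
    double m = 0.0;
    for ( std::size_t n = 0; n < p.size(); ++n )
      m += n * p[n];                                          // mean

    double var = 0.0, skew = 0.0, kurt = 0.0;
    for ( std::size_t n = 0; n < p.size(); ++n )
    {
      const double d = (double)n - m;
      var  += d * d * p[n];
      skew += d * d * d * p[n];
      kurt += d * d * d * d * p[n];
    }
    const double sigma = std::sqrt( var );

    int minN = -1, maxN = 0;                                  // grey level range present
    for ( std::size_t n = 0; n < p.size(); ++n )
      if ( p[n] > 0.0 ) { if ( minN < 0 ) minN = (int)n; maxN = (int)n; }

    double energy = 0.0, entropy = 0.0;
    for ( std::size_t n = 0; n < p.size(); ++n )
    {
      energy += p[n] * p[n];
      if ( p[n] > 0.0 )
        entropy -= p[n] * std::log( p[n] );
    }

    switch ( type )
    {
      case 0: return m;                                       // MEAN
      case 1: return var;                                     // VARIANCE
      case 2: return skew / ( sigma * sigma * sigma );        // SKEWNESS
      case 3: return kurt / ( var * var ) - 3.0;              // KURTOSIS
      case 4: return double( maxN - minN ) / ( maxN + minN ); // CONTRAST
      case 5: return m / sigma;                               // VARIATION (as defined above)
      case 6: return energy;                                  // ENERGY (reduced image)
      case 7: return entropy;                                 // ENTROPY (reduced image)
    }
    return 0.0;
  }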

See also: SoCrossCorrelationProcessing2d


Constructor & Destructor Documentation

SoFirstOrderStatisticsProcessing::SoFirstOrderStatisticsProcessing (  ) 

Constructor.


Member Data Documentation

SoSFImageDataAdapter SoFirstOrderStatisticsProcessing::inImage

The input image.

Default value is NULL. Supported types include: grayscale, binary, label and color images.

SoSFInt32 SoFirstOrderStatisticsProcessing::kernelSize

The size of the kernel.

Default value is 3.

SoSFEnum SoFirstOrderStatisticsProcessing::measureType

Select the first order measure to compute.

Use enum MeasureType. Default is MEAN.

SoImageVizEngineOutput< SoSFImageDataAdapter, SoImageDataAdapter* > SoFirstOrderStatisticsProcessing::outImage

The output image.

Default value is NULL. Supported types include: grayscale, binary, label and color images.
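A minimal usage sketch, assuming Open Inventor and ImageViz have already been initialized and that an input SoImageDataAdapter is available (for example from an image reader). The field names and the outImage output follow this page; the SoImageDataAdapter include path and the simplified lifetime handling are assumptions:

  #include <ImageViz/Engines/ImageFiltering/TextureFilters/SoFirstOrderStatisticsProcessing.h>
  #include <ImageViz/Nodes/Images/SoImageDataAdapter.h>   // assumed include path

  void computeLocalVariance( SoImageDataAdapter* input )
  {
    SoFirstOrderStatisticsProcessing* stats = new SoFirstOrderStatisticsProcessing;
    stats->ref();
    stats->inImage.setValue( input );                                  // input image
    stats->kernelSize.setValue( 5 );                                   // 5x5 neighbourhood
    stats->measureType = SoFirstOrderStatisticsProcessing::VARIANCE;   // local variance
    SoImageDataAdapter* result = stats->outImage.getValue();           // evaluate the engine
    // ... use "result" here ...
    stats->unref();
  }

Since this engine is deprecated, the same pattern should apply to its replacement, SoLocalStatisticsProcessing.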


The documentation for this class was generated from the following file:

ImageViz/Engines/ImageFiltering/TextureFilters/SoFirstOrderStatisticsProcessing.h

Open Inventor by FEI reference manual, generated on 19 Aug 2019
Copyright © FEI S.A.S. All rights reserved.
http://www.openinventor.com/