double VarianceOfReal_bn (sensv_bn* sens, const node_bn* Fnode)

Measures how much a finding at one node (called the "findings node") is expected to reduce the variance of another node (called the "query node").

The query node is set by the particular sensv_bn created (see NewSensvToFinding_bn). The findings node is passed as Fnode.

VarianceOfReal_bn can only be used with query nodes that are discretized continuous nodes, or that have a real numeric value associated with each state. It measures the expected squared change in the expected real value of the query node, due to a finding at the findings node. This turns out to be the same as the expected decrease in the variance of the real value of the query node, due to a finding at the findings node. The findings node does not have to be continuous or have real numeric values attached to its states.

The maximum possible decrease in variance of the query node occurs when the variance goes to zero, i.e., all uncertainty is removed, which happens when a finding is obtained for the query node itself. So to find the variance of a node, measure the variance reduction between the node and itself (i.e., pass the query node as Fnode).

To create a sensv_bn that can measure variance reduction, pass REAL_SENSV + VARIANCE_SENSV for what_find when calling NewSensvToFinding_bn. For its Fnodes argument, pass a list of all the nodes that might later be passed as Fnode to this function.
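Putting these steps together, a usage outline might look like the following. This is a sketch, not a complete program: the net and node variables (net, f1, f2, query) are hypothetical, and all error checking and net construction are omitted.

```c
/* Hypothetical nodes: 'query' is the query node; f1 and f2 are
   candidate findings nodes.  Error checking omitted. */
nodelist_bn* fnodes = NewNodeList2_bn (3, net);
SetNthNode_bn (fnodes, 0, f1);
SetNthNode_bn (fnodes, 1, f2);
SetNthNode_bn (fnodes, 2, query);   /* include the query node itself,
                                       so its own variance can be measured */

sensv_bn* sens = NewSensvToFinding_bn (query, fnodes,
                                       REAL_SENSV + VARIANCE_SENSV);

double vr  = VarianceOfReal_bn (sens, f1);     /* expected variance reduction */
double var = VarianceOfReal_bn (sens, query);  /* variance of the query node  */

DeleteSensvToFinding_bn (sens);
DeleteNodeList_bn (fnodes);
```

Including the query node in the Fnodes list is what allows the self-measurement described above.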

This function is available as "Network -> Sensitivity to Finding" in Netica Application. For more information on it, contact Norsys for the "Sensitivity" document.

Version:

Versions 2.03 and later have this function.

See also:

NewSensvToFinding_bn  Create the sensv_bn required to measure variance reduction due to finding
MutualInfo_bn  Use a different measure of sensitivity: mutual info (entropy reduction)