Synergy, suppression and immorality: forward differences of the entropy function.
Published on arXiv: 1501.04368
* This author is not affiliated with PMI.
Conditional mutual information is important in the selection and interpretation of graphical models. Its empirical version is well known as a generalised likelihood ratio test, and it may be represented as a difference of entropies. We consider the forward difference expansion of the entropy function defined on all subsets of the variables under study. The elements of this expansion are invariant to permutation of their suffixes and relate higher order mutual informations to lower order ones. The third order difference is expressible as an apparently asymmetric difference between a marginal and a conditional mutual information. Its role in the decomposition of explained information provides a technical definition for synergy between three random variables. Positive values occur when two variables provide alternative explanations for a third; negative values, termed synergies, occur when the explained information is greater than the sum of its parts. Synergies tend to be infrequent; they connect the seemingly unrelated concepts of suppressor variables in regression, on the one hand, and unshielded colliders in Bayes networks (immoralities), on the other. We give novel characterizations of these phenomena that generalise to categorical variables and to higher dimensions. We propose an algorithm for systematically computing low order differences from a given graph. Examples from small-scale real-life studies indicate the potential of these techniques for empirical statistical analysis.
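The third order difference described above can be illustrated numerically. The sketch below, in Python, computes entropies from empirical counts and forms the difference I(X;Y) − I(X;Y|Z); taking the marginal-minus-conditional orientation and the negative sign for synergy is an assumption about the paper's sign convention, which varies across the interaction-information literature. The XOR collider (Z = X ⊕ Y with X, Y independent fair coins) is a standard example of an unshielded collider and yields a synergy of −1 bit.

```python
from collections import Counter
from itertools import product
from math import log2

def entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of a list of tuples."""
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in Counter(samples).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return (entropy(list(zip(xs))) + entropy(list(zip(ys)))
            - entropy(list(zip(xs, ys))))

def conditional_mi(xs, ys, zs):
    """I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)."""
    return (entropy(list(zip(xs, zs))) + entropy(list(zip(ys, zs)))
            - entropy(list(zip(xs, ys, zs))) - entropy(list(zip(zs))))

# XOR collider: X, Y independent fair coins, Z = X XOR Y.
# All four (x, y) pairs are equally likely, so enumerating them
# gives the exact joint distribution.
xs, ys = zip(*product([0, 1], repeat=2))
zs = tuple(x ^ y for x, y in zip(xs, ys))

# Third order difference: marginal minus conditional mutual information.
third_diff = mutual_information(xs, ys) - conditional_mi(xs, ys, zs)
print(third_diff)  # -1.0: a synergy of one bit
```

Here I(X;Y) = 0 since X and Y are marginally independent, while I(X;Y|Z) = 1 bit because conditioning on the collider Z makes them perfectly dependent, so the difference is negative.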
Published on January 18, 2015