Venn diagram of information theoretic measures for three variables x, y, and z. Each circle represents an individual entropy: H(x) is the lower left circle, H(y) the lower right, and H(z) is the upper circle. The intersection of any two circles represents the mutual information of the two associated variables (e.g. I(x;y) is yellow and gray). The union of any two circles is the joint entropy of the two associated variables (e.g. H(x,y) is everything but green). The joint entropy H(x,y,z) of all three variables is the union of all three circles. It is partitioned into 7 pieces: red, blue, and green are the conditional entropies H(x|y,z), H(y|x,z), and H(z|x,y) respectively; yellow, magenta, and cyan are the conditional mutual informations I(x;y|z), I(x;z|y), and I(y;z|x) respectively; and gray is the interaction information I(x;y;z). The interaction information is the only one of these quantities that may be negative.
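The fact that the gray region can be negative can be checked numerically. The sketch below, using the convention I(x;y;z) = I(x;y) - I(x;y|z) (sign conventions for interaction information differ across the literature), computes the measures for the classic XOR example, where x and y are independent uniform bits and z = x XOR y; there the interaction information comes out to -1 bit. The helper names (`H`, `marginal`) are illustrative, not from the original figure.

```python
from itertools import product
from math import log2

def H(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(joint, idx):
    """Marginalize a joint pmf {(x, y, z): p} onto the coordinates in idx."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

# XOR example: x, y independent uniform bits, z = x XOR y.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

Hx   = H(marginal(joint, (0,)))
Hy   = H(marginal(joint, (1,)))
Hz   = H(marginal(joint, (2,)))
Hxy  = H(marginal(joint, (0, 1)))
Hxz  = H(marginal(joint, (0, 2)))
Hyz  = H(marginal(joint, (1, 2)))
Hxyz = H(joint)

# Mutual informations expressed via entropies.
Ixy   = Hx + Hy - Hxy              # I(x;y): yellow + gray
Ixy_z = Hxz + Hyz - Hxyz - Hz      # I(x;y|z): yellow only

# Interaction information: the gray region.
interaction = Ixy - Ixy_z

print(interaction)  # -1.0 for the XOR joint distribution
```

Since x and y are independent, I(x;y) = 0, yet knowing z makes them fully dependent, I(x;y|z) = 1 bit, so the gray region is -1 bit; every other piece of the partition is nonnegative.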