Kullback-Leibler for Bayesian networks


In [1]:
import os

%matplotlib inline

from pylab import *
import matplotlib.pyplot as plt

import pyAgrum and pyAgrum.lib.notebook (for … notebooks :-) )

In [2]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb

Create a first BN: bn

In [3]:
bn=gum.loadBN("res/asia.bif")
# randomly re-generate parameters for every Conditional Probability Table
bn.generateCPTs()
bn
Out[3]:
(graph of bn, the asia network: visit_to_Asia->tuberculosis, tuberculosis->tuberculos_or_cancer, smoking->lung_cancer, lung_cancer->tuberculos_or_cancer, smoking->bronchitis, tuberculos_or_cancer->positive_XraY, tuberculos_or_cancer->dyspnoea, bronchitis->dyspnoea)

Create a second BN: bn2

In [4]:
bn2=gum.loadBN("res/asia.bif")
bn2.generateCPTs()
bn2
Out[4]:
(graph of bn2: same asia structure as bn, with different CPT parameters)

bn vs bn2: different parameters

In [5]:
gnb.flow.row(bn.cpt(3),bn2.cpt(3),
              captions=["a CPT in bn","same CPT in bn2 (with different parameters)"])
a CPT in bn: P(positive_XraY | tuberculos_or_cancer)

  tuberculos_or_cancer | positive_XraY=0 | positive_XraY=1
  ---------------------+-----------------+----------------
                     0 |          0.5800 |          0.4200
                     1 |          0.5283 |          0.4717

same CPT in bn2 (with different parameters)

  tuberculos_or_cancer | positive_XraY=0 | positive_XraY=1
  ---------------------+-----------------+----------------
                     0 |          0.0661 |          0.9339
                     1 |          0.7964 |          0.2036

Exact and (Gibbs) approximated KL-divergence

In order to compute a KL divergence, we only have to make sure that the two distributions are defined on the same domain (same variables, same modalities, etc.).
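A minimal sketch of such a check: compare the variable names of the two networks (ideally one would also compare each variable's domain).

# minimal sketch: both networks should contain exactly the same variable names
names1 = {bn.variable(n).name() for n in bn.nodes()}
names2 = {bn2.variable(n).name() for n in bn2.nodes()}
print(names1 == names2)  # True here, since both BNs were loaded from res/asia.bif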

Exact KL

In [6]:
g1=gum.ExactBNdistance(bn,bn2)
print(g1.compute())
{'klPQ': 3.619246109133107, 'errorPQ': 0, 'klQP': 3.713941855819993, 'errorQP': 0, 'hellinger': 0.9741606546160653, 'bhattacharya': 0.6433946044213296, 'jensen-shannon': 0.5777579725222465}
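For reference, the returned measures correspond to the usual definitions below, where the sums run over the joint states x of the networks (aGrUM's exact conventions, e.g. the logarithm base or the normalization of the Hellinger distance, should be checked in the reference documentation); errorPQ and errorQP presumably count the states for which the corresponding ratio is ill-defined (both 0 here):

$$\mathrm{KL}(P\|Q)=\sum_x P(x)\log\frac{P(x)}{Q(x)},\qquad \mathrm{KL}(Q\|P)=\sum_x Q(x)\log\frac{Q(x)}{P(x)}$$

$$H(P,Q)=\sqrt{\sum_x\bigl(\sqrt{P(x)}-\sqrt{Q(x)}\bigr)^2},\qquad B(P,Q)=-\log\sum_x\sqrt{P(x)\,Q(x)}$$

$$\mathrm{JS}(P,Q)=\tfrac{1}{2}\,\mathrm{KL}(P\|M)+\tfrac{1}{2}\,\mathrm{KL}(Q\|M),\qquad M=\tfrac{1}{2}(P+Q)$$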

If the models are not defined on the same domain:

In [7]:
bn_different_domain=gum.loadBN("res/alarm.dsl")

# g=gum.ExactBNdistance(bn,bn_different_domain) # a KL-divergence between asia and alarm ... :(
#
# would cause
#---------------------------------------------------------------------------
#OperationNotAllowed                       Traceback (most recent call last)
#
#OperationNotAllowed: this operation is not allowed : KL : the 2 BNs are not compatible (not the same vars : visit_to_Asia?)
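The incompatibility can also be handled programmatically; a minimal sketch, assuming the aGrUM exception is exposed on the Python side as gum.OperationNotAllowed (as the traceback above suggests):

try:
    gum.ExactBNdistance(bn, bn_different_domain)
except gum.OperationNotAllowed as e:
    # the two BNs do not share the same variables (asia vs alarm)
    print("incompatible BNs:", e)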

Gibbs-approximated KL

In [8]:
g=gum.GibbsBNdistance(bn,bn2)
g.setVerbosity(True)
g.setMaxTime(120)
g.setBurnIn(5000)
g.setEpsilon(1e-7)
g.setPeriodSize(500)
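The configured stopping criteria can be read back from the approximation scheme; a small illustrative snippet, assuming the getters mirroring the setters used above are available:

# inspect the parameters of the approximation scheme
print("max time    :", g.maxTime())
print("burn-in     :", g.burnIn())
print("epsilon     :", g.epsilon())
print("period size :", g.periodSize())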
In [9]:
print(g.compute())
print("Computed in {0} s".format(g.currentTime()))
{'klPQ': 3.6315320432282965, 'errorPQ': 0, 'klQP': 2.881333436440626, 'errorQP': 0, 'hellinger': 0.9398788392658488, 'bhattacharya': 0.657374744502834, 'jensen-shannon': 0.5427610638334074}
Computed in 0.379881552 s
In [10]:
print("--")

print(g.messageApproximationScheme())
print("--")

print("Computation time : {0}".format(g.currentTime()))
print("Number of iterations : {0}".format(g.nbrIterations()))
--
stopped with epsilon=1e-07
--
Computation time : 0.379881552
Number of iterations : 65500
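To put the approximation in perspective, a small illustrative snippet comparing it with the exact values computed above with g1 (note that calling compute() again re-runs the Gibbs sampling, so the approximated figures will differ slightly from the ones printed earlier):

exact = g1.compute()    # exact values (ExactBNdistance)
approx = g.compute()    # re-runs the Gibbs sampling
for k in ["klPQ", "klQP", "hellinger", "bhattacharya", "jensen-shannon"]:
    print("{0:15s} exact={1:.4f}  gibbs={2:.4f}".format(k, exact[k], approx[k]))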
In [11]:
p=plot(g.history(), 'g')
(figure: plot of g.history(), the successive values of the stopping criterion during the sampling)

Animation of Gibbs KL

Since it may be difficult to know what happens during an approximation algorithm, pyAgrum allows you to follow the iterations using an animated matplotlib figure.

In [12]:
g=gum.GibbsBNdistance(bn,bn2)
g.setMaxTime(60)
g.setBurnIn(500)
g.setEpsilon(1e-7)
g.setPeriodSize(5000)
In [13]:
gnb.animApproximationScheme(g) # logarithmic scale for Y
g.compute()
Out[13]:
{'klPQ': 3.6236349916804294,
 'errorPQ': 0,
 'klQP': 2.622139244395869,
 'errorQP': 0,
 'hellinger': 0.9258179847783811,
 'bhattacharya': 0.6507499948949181,
 'jensen-shannon': 0.5294204920245149}
(figure: animated view of the stopping criterion during the sampling, logarithmic scale on Y)