Credal Networks
In [1]:
import os
%matplotlib inline
from pylab import *
import matplotlib.pyplot as plt
In [2]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb
gnb.configuration()
| Library | Version |
|---|---|
| OS | posix [darwin] |
| Python | 3.12.4 (main, Jun 6 2024, 18:26:44) [Clang 15.0.0 (clang-1500.3.9.4)] |
| IPython | 8.26.0 |
| Matplotlib | 3.9.1 |
| Numpy | 2.0.1 |
| pyDot | 3.0.1 |
| pyAgrum | 1.15.0.9 |

Sat Jul 27 17:28:32 2024 CEST
Credal Net from BN
In [3]:
bn=gum.fastBN("A->B[3]->C<-D<-A->E->F")
bn_min=gum.BayesNet(bn)
bn_max=gum.BayesNet(bn)
for n in bn.nodes():
    # shrink (bn_min) / enlarge (bn_max) each CPT by a margin that keeps all values in [0,1]
    x=0.4*min(bn.cpt(n).min(),1-bn.cpt(n).max())
    bn_min.cpt(n).translate(-x)
    bn_max.cpt(n).translate(x)
cn=gum.CredalNet(bn_min,bn_max)
cn.intervalToCredal()
gnb.flow.row(bn,bn.cpt("B"),cn,bn_min.cpt("B"),bn_max.cpt("B"),captions=["Bayes Net","CPT","Credal Net","CPTmin","CPTmax"])
Bayes Net and Credal Net graphs (not rendered here); CPT, CPTmin and CPTmax of \(B\) given \(A\):

CPT
| A | B=0 | B=1 | B=2 |
|---|---|---|---|
| 0 | 0.6914 | 0.0076 | 0.3010 |
| 1 | 0.3896 | 0.2499 | 0.3606 |

CPTmin
| A | B=0 | B=1 | B=2 |
|---|---|---|---|
| 0 | 0.6884 | 0.0046 | 0.2979 |
| 1 | 0.3865 | 0.2468 | 0.3575 |

CPTmax
| A | B=0 | B=1 | B=2 |
|---|---|---|---|
| 0 | 0.6945 | 0.0107 | 0.3040 |
| 1 | 0.3926 | 0.2529 | 0.3636 |
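As a quick sanity check (a minimal sketch, assuming the bn, bn_min and bn_max objects from the cell above), we can verify that the translated CPTs bracket the original ones elementwise before they are turned into a credal net:

import numpy as np
for n in bn.nodes():
    # translate(-x) can only lower the values, translate(+x) only raise them
    assert np.all(bn_min.cpt(n).toarray() <= bn.cpt(n).toarray())
    assert np.all(bn.cpt(n).toarray() <= bn_max.cpt(n).toarray())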
LBP on credal networks (L2U) can only be used for binary credal networks (here, \(B\) is not binary). We therefore apply the classical binarization, keeping in mind that it introduces an approximation into the inference.
In [4]:
cn2=gum.CredalNet(bn_min,bn_max)
cn2.intervalToCredal()
cn2.approximatedBinarization()
cn2.computeBinaryCPTMinMax()
gnb.flow.row(cn,cn2,captions=["Credal net","Binarized credal net"])
Here, \(B\) becomes:
- \(B\)-b\(i\): the \(i\)-th bit of \(B\)
- instrumental \(B\)-v\(k\): the indicator variable for each modality \(k\) of \(B\)
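A quick way to see this encoding (a sketch, assuming the cn2 object built in the previous cell) is to list the variable names of the binarized net:

bn2 = cn2.current_bn()   # the binarized Bayes net underlying cn2
print(sorted(bn2.variable(n).name() for n in bn2.nodes()))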
In [5]:
ie_mc=gum.CNMonteCarloSampling(cn)
ie2_lbp=gum.CNLoopyPropagation(cn2)
ie2_mc=gum.CNMonteCarloSampling(cn2)
In [6]:
gnb.sideBySide(gnb.getInference(cn,engine=ie_mc),
gnb.getInference(cn2,engine=ie2_mc),
gnb.getInference(cn2,engine=ie2_lbp))
In [7]:
gnb.sideBySide(ie_mc.CN(),ie_mc.marginalMin("F"),ie_mc.marginalMax("F"),
ie_mc.CN(),ie2_lbp.marginalMin("F"),ie2_lbp.marginalMax("F"),
ncols=3)
print(cn)
A:Range([0,1])
<> : [[0.198433 , 0.801567] , [0.463014 , 0.536986]]
B:Range([0,2])
<A:0> : [[0.688353 , 0.00763708 , 0.30401] , [0.688353 , 0.0106922 , 0.300955] , [0.691408 , 0.0106922 , 0.2979] , [0.694463 , 0.00763631 , 0.2979] , [0.691408 , 0.0045819 , 0.30401] , [0.694463 , 0.0045819 , 0.300955]]
<A:1> : [[0.386515 , 0.249879 , 0.363606] , [0.386515 , 0.252934 , 0.360551] , [0.389571 , 0.252934 , 0.357496] , [0.392627 , 0.249877 , 0.357496] , [0.389572 , 0.246822 , 0.363606] , [0.392627 , 0.246822 , 0.360551]]
C:Range([0,1])
<B:0|D:0> : [[0.386325 , 0.613675] , [0.484117 , 0.515883]]
<B:1|D:0> : [[0.384309 , 0.615691] , [0.4821 , 0.5179]]
<B:2|D:0> : [[0.402582 , 0.597418] , [0.500373 , 0.499627]]
<B:0|D:1> : [[0.0733427 , 0.926657] , [0.171134 , 0.828866]]
<B:1|D:1> : [[0.319932 , 0.680068] , [0.417722 , 0.582278]]
<B:2|D:1> : [[0.474088 , 0.525912] , [0.57188 , 0.42812]]
D:Range([0,1])
<A:0> : [[0.176926 , 0.823074] , [0.351076 , 0.648924]]
<A:1> : [[0.130612 , 0.869388] , [0.304762 , 0.695238]]
E:Range([0,1])
<A:0> : [[0.484153 , 0.515847] , [0.778923 , 0.221077]]
<A:1> : [[0.369248 , 0.630752] , [0.664018 , 0.335982]]
F:Range([0,1])
<E:0> : [[0.519612 , 0.480388] , [0.79412 , 0.20588]]
<E:1> : [[0.387978 , 0.612022] , [0.662486 , 0.337514]]
Credal Net from BIF files
In [8]:
cn=gum.CredalNet("res/cn/2Umin.bif","res/cn/2Umax.bif")
cn.intervalToCredal()
In [9]:
gnb.showCN(cn,"2")
In [10]:
ie=gum.CNMonteCarloSampling(cn)
ie.insertEvidenceFile("res/cn/L2U.evi")
In [11]:
ie.setRepetitiveInd(False)
ie.setMaxTime(1)
ie.setMaxIter(1000)
ie.makeInference()
In [12]:
cn
In [13]:
gnb.showInference(cn,targets={"A","H","L","D"},engine=ie,evs={"L":[0,1],"G":[1,0]})
Comparing inference in credal networks
In [14]:
import pyAgrum as gum
def showDiffInference(model,mc,lbp):
    # scatter, for every node, the MC bounds (x-axis) against the LBP bounds (y-axis)
    for i in model.current_bn().nodes():
        a,b=mc.marginalMin(i)[:]
        c,d=mc.marginalMax(i)[:]
        e,f=lbp.marginalMin(i)[:]
        g,h=lbp.marginalMax(i)[:]
        plt.scatter([a,b,c,d],[e,f,g,h])
cn=gum.CredalNet("res/cn/2Umin.bif","res/cn/2Umax.bif")
cn.intervalToCredal()
The two inference engines give roughly the same results
In [15]:
ie_mc=gum.CNMonteCarloSampling(cn)
ie_mc.makeInference()
cn.computeBinaryCPTMinMax()
ie_lbp=gum.CNLoopyPropagation(cn)
ie_lbp.makeInference()
showDiffInference(cn,ie_mc,ie_lbp)
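To complement the scatter plot with a single number, here is a minimal sketch; the maxGap helper is introduced here for illustration only (it is not part of pyAgrum) and computes the largest absolute difference between the bounds returned by the two engines:

import numpy as np

def maxGap(model, mc, lbp):
    # largest |MC - LBP| over all lower and upper marginal bounds, all nodes
    return max(
        max(np.abs(np.array(mc.marginalMin(i)[:]) - np.array(lbp.marginalMin(i)[:])).max(),
            np.abs(np.array(mc.marginalMax(i)[:]) - np.array(lbp.marginalMax(i)[:])).max())
        for i in model.current_bn().nodes())

print(maxGap(cn, ie_mc, ie_lbp))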
but not when evidence is inserted
In [16]:
ie_mc=gum.CNMonteCarloSampling(cn)
ie_mc.insertEvidenceFile("res/cn/L2U.evi")
ie_mc.makeInference()
ie_lbp=gum.CNLoopyPropagation(cn)
ie_lbp.insertEvidenceFile("res/cn/L2U.evi")
ie_lbp.makeInference()
showDiffInference(cn,ie_mc,ie_lbp)
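With the evidence inserted, the same illustrative maxGap check defined above can be reused to quantify how much further apart the two engines now are:

print(maxGap(cn, ie_mc, ie_lbp))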
Dynamic Credal Net
In [17]:
cn=gum.CredalNet("res/cn/bn_c_8.bif","res/cn/den_c_8.bif")
cn.bnToCredal(0.8,False)
In [18]:
ie=gum.CNMonteCarloSampling(cn)
ie.insertModalsFile("res/cn/modalities.modal")
ie.setRepetitiveInd(True)
ie.setMaxTime(5)
ie.setMaxIter(1000)
ie.makeInference()
In [19]:
print(ie.dynamicExpMax("temp"))
(14.203404648293022, 11.817699847864338, 12.190483075680442, 11.99476087981647, 11.975306510688327, 11.964926735931694, 11.965025343870192, 11.965014533608835, 11.965015735850175)
In [20]:
fig=figure()
ax=fig.add_subplot(111)
ax.fill_between(range(9),ie.dynamicExpMax("temp"),ie.dynamicExpMin("temp"))
plt.show()
In [21]:
ie=gum.CNMonteCarloSampling(cn)
ie.insertModalsFile("res/cn/modalities.modal")
ie.setRepetitiveInd(False)
ie.setMaxTime(5)
ie.setMaxIter(1000)
ie.makeInference()
print(ie.messageApproximationScheme())
stopped with epsilon=0
In [22]:
fig=figure()
ax=fig.add_subplot(111)
ax.fill_between(range(9),ie.dynamicExpMax("temp"),ie.dynamicExpMin("temp"))
plt.show()
In [23]:
ie=gum.CNMonteCarloSampling(cn)
ie.insertModalsFile("res/cn/modalities.modal")
ie.setRepetitiveInd(False)
ie.setMaxTime(5)
ie.setMaxIter(5000)
gnb.animApproximationScheme(ie)
ie.makeInference()
In [24]:
fig=figure()
ax=fig.add_subplot(111)
ax.fill_between(range(9),ie.dynamicExpMax("temp"),ie.dynamicExpMin("temp"));
plt.show()