Credal Networks
In [1]:
import os
%matplotlib inline
from pylab import *
import matplotlib.pyplot as plt
In [2]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb
gnb.configuration()
| Library | Version |
|---|---|
| OS | posix [darwin] |
| Python | 3.13.0 (main, Oct 7 2024, 05:02:14) [Clang 16.0.0 (clang-1600.0.26.3)] |
| IPython | 8.28.0 |
| Matplotlib | 3.9.2 |
| Numpy | 2.1.2 |
| pyDot | 3.0.2 |
| pyAgrum | 1.16.0 |

Thu Oct 17 15:40:29 2024 CEST
Credal Net from BN
In [3]:
bn=gum.fastBN("A->B[3]->C<-D<-A->E->F")
# two copies of the BN that will carry the lower and upper CPTs
bn_min=gum.BayesNet(bn)
bn_max=gum.BayesNet(bn)
for n in bn.nodes():
    # shift each CPT down/up by 40% of the largest admissible perturbation
    x=0.4*min(bn.cpt(n).min(),1-bn.cpt(n).max())
    bn_min.cpt(n).translate(-x)
    bn_max.cpt(n).translate(x)

# the credal network is defined by these interval-valued CPTs
cn=gum.CredalNet(bn_min,bn_max)
cn.intervalToCredal()

gnb.flow.row(bn,bn.cpt("B"),cn,bn_min.cpt("B"),bn_max.cpt("B"),captions=["Bayes Net","CPT","Credal Net","CPTmin","CPTmax"])
[Output: the Bayes net and credal net graphs, together with the CPT of B and its lower/upper versions]

CPT \(P(B\mid A)\):

| A | B=0 | B=1 | B=2 |
|---|---|---|---|
| 0 | 0.2975 | 0.3630 | 0.3395 |
| 1 | 0.2928 | 0.6050 | 0.1023 |

CPTmin:

| A | B=0 | B=1 | B=2 |
|---|---|---|---|
| 0 | 0.2566 | 0.3221 | 0.2986 |
| 1 | 0.2519 | 0.5641 | 0.0614 |

CPTmax:

| A | B=0 | B=1 | B=2 |
|---|---|---|---|
| 0 | 0.3384 | 0.4039 | 0.3804 |
| 1 | 0.3337 | 0.6459 | 0.1432 |
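Since each CPT is shifted down and up by the same amount \(x\), the original CPT always lies inside the [CPTmin, CPTmax] interval. A minimal check of this, assuming the cell above has just been run (the slice notation cpt[:] returns the values as a numpy array):

In [ ]:
import numpy as np
# check that every original CPT is bracketed by its translated lower/upper versions
for n in bn.nodes():
    p=bn.cpt(n)[:]
    lo=bn_min.cpt(n)[:]
    hi=bn_max.cpt(n)[:]
    assert np.all(lo<=p) and np.all(p<=hi)
print("all CPTs lie inside their [CPTmin,CPTmax] intervals")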
Loopy belief propagation on credal networks (L2U) can only be used on binary credal networks (here, \(B\) is not binary). We therefore apply the classical binarization, keeping in mind that this introduces an additional approximation in the inference.
In [4]:
cn2=gum.CredalNet(bn_min,bn_max)
cn2.intervalToCredal()
# binarize the credal net so that L2U (loopy propagation) can be applied
cn2.approximatedBinarization()
# precompute the binary min/max CPTs required by CNLoopyPropagation
cn2.computeBinaryCPTMinMax()
gnb.flow.row(cn,cn2,captions=["Credal net","Binarized credal net"])
Here, \(B\) becomes:

- \(B\)-b\(i\): the \(i\)-th bit of \(B\),
- instrumental \(B\)-v\(k\): the indicator variable for each modality \(k\) of \(B\).
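The binarized network can be inspected to see these new variables. A short sketch (assuming current_bn() returns the binarized structure; the exact variable names follow pyAgrum's own naming convention):

In [ ]:
# list the variables of the binarized credal net: the 3-valued B is replaced
# by bit variables and by instrumental indicator variables
bn2=cn2.current_bn()
print(sorted(bn2.variable(n).name() for n in bn2.nodes()))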
In [5]:
ie_mc=gum.CNMonteCarloSampling(cn)    # Monte Carlo sampling on the original credal net
ie2_lbp=gum.CNLoopyPropagation(cn2)   # L2U loopy propagation on the binarized credal net
ie2_mc=gum.CNMonteCarloSampling(cn2)  # Monte Carlo sampling on the binarized credal net
In [6]:
gnb.sideBySide(gnb.getInference(cn,engine=ie_mc),
               gnb.getInference(cn2,engine=ie2_mc),
               gnb.getInference(cn2,engine=ie2_lbp))
In [7]:
gnb.sideBySide(ie_mc.CN(),ie_mc.marginalMin("F"),ie_mc.marginalMax("F"),
               ie2_lbp.CN(),ie2_lbp.marginalMin("F"),ie2_lbp.marginalMax("F"),
               ncols=3)
print(cn)
A:Range([0,1])
<> : [[0.464623 , 0.535377] , [0.770552 , 0.229448]]
B:Range([0,2])
<A:0> : [[0.256574 , 0.363016 , 0.38041] , [0.256574 , 0.403918 , 0.339508] , [0.29748 , 0.403918 , 0.298602] , [0.338384 , 0.363014 , 0.298602] , [0.29748 , 0.32211 , 0.38041] , [0.338384 , 0.32211 , 0.339506]]
<A:1> : [[0.25187 , 0.604965 , 0.143165] , [0.25187 , 0.645868 , 0.102261] , [0.292775 , 0.645868 , 0.0613563] , [0.333679 , 0.604965 , 0.0613563] , [0.292774 , 0.564061 , 0.143165] , [0.333679 , 0.564061 , 0.10226]]
C:Range([0,1])
<B:0|D:0> : [[0.371595 , 0.628405] , [0.381014 , 0.618986]]
<B:1|D:0> : [[0.682366 , 0.317634] , [0.691786 , 0.308214]]
<B:2|D:0> : [[0.488501 , 0.511499] , [0.497921 , 0.502079]]
<B:0|D:1> : [[0.00706436 , 0.992936] , [0.0164856 , 0.983514]]
<B:1|D:1> : [[0.515797 , 0.484203] , [0.525215 , 0.474785]]
<B:2|D:1> : [[0.564607 , 0.435393] , [0.574028 , 0.425972]]
D:Range([0,1])
<A:0> : [[0.610309 , 0.389691] , [0.83299 , 0.16701]]
<A:1> : [[0.25433 , 0.74567] , [0.477009 , 0.522991]]
E:Range([0,1])
<A:0> : [[0.502496 , 0.497504] , [0.502747 , 0.497253]]
<A:1> : [[0.999556 , 0.000443853] , [0.999809 , 0.000190767]]
F:Range([0,1])
<E:0> : [[0.31159 , 0.68841] , [0.582919 , 0.417081]]
<E:1> : [[0.525177 , 0.474823] , [0.796504 , 0.203496]]
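The same bounds can be read numerically rather than graphically: marginalMin and marginalMax return the lower and upper probability vectors. For instance, for \(F\), with the engines created above:

In [ ]:
# lower/upper bounds on P(F) from Monte Carlo sampling and from L2U on the binarized net
print("MC  min/max:",ie_mc.marginalMin("F")[:],ie_mc.marginalMax("F")[:])
print("L2U min/max:",ie2_lbp.marginalMin("F")[:],ie2_lbp.marginalMax("F")[:])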
Credal Net from BIF files
In [8]:
cn=gum.CredalNet("res/cn/2Umin.bif","res/cn/2Umax.bif")
cn.intervalToCredal()
In [9]:
gnb.showCN(cn,"2")
In [10]:
ie=gum.CNMonteCarloSampling(cn)
ie.insertEvidenceFile("res/cn/L2U.evi")
In [11]:
ie.setRepetitiveInd(False)
ie.setMaxTime(1)
ie.setMaxIter(1000)
ie.makeInference()
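Since the sampling engine is an anytime scheme bounded here by time and number of iterations, it can be useful to check why it stopped. messageApproximationScheme() (also used later in this notebook) reports the stopping criterion:

In [ ]:
# why did the sampling stop? (epsilon reached, max iterations, or max time)
print(ie.messageApproximationScheme())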
In [12]:
cn
In [13]:
gnb.showInference(cn,targets={"A","H","L","D"},engine=ie,evs={"L":[0,1],"G":[1,0]})
Comparing inference in credal networks
In [14]:
import pyAgrum as gum
def showDiffInference(model,mc,lbp):
    # scatter the bounds computed by the Monte Carlo engine (x-axis)
    # against those computed by loopy propagation (y-axis), for every node
    for i in model.current_bn().nodes():
        a,b=mc.marginalMin(i)[:]
        c,d=mc.marginalMax(i)[:]
        e,f=lbp.marginalMin(i)[:]
        g,h=lbp.marginalMax(i)[:]
        plt.scatter([a,b,c,d],[e,f,g,h])
cn=gum.CredalNet("res/cn/2Umin.bif","res/cn/2Umax.bif")
cn.intervalToCredal()
The two inference engines give nearly the same results
In [15]:
ie_mc=gum.CNMonteCarloSampling(cn)
ie_mc.makeInference()
cn.computeBinaryCPTMinMax()
ie_lbp=gum.CNLoopyPropagation(cn)
ie_lbp.makeInference()
showDiffInference(cn,ie_mc,ie_lbp)
but not once evidence is inserted
In [16]:
ie_mc=gum.CNMonteCarloSampling(cn)
ie_mc.insertEvidenceFile("res/cn/L2U.evi")
ie_mc.makeInference()
ie_lbp=gum.CNLoopyPropagation(cn)
ie_lbp.insertEvidenceFile("res/cn/L2U.evi")
ie_lbp.makeInference()
showDiffInference(cn,ie_mc,ie_lbp)
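The scatter plot can be complemented by a numerical summary. The sketch below computes the largest absolute gap between the bounds returned by the two engines over all nodes:

In [ ]:
import numpy as np
# largest absolute difference between the bounds computed by the two engines
gap=0.0
for i in cn.current_bn().nodes():
    gap=max(gap,np.max(np.abs(ie_mc.marginalMin(i)[:]-ie_lbp.marginalMin(i)[:])))
    gap=max(gap,np.max(np.abs(ie_mc.marginalMax(i)[:]-ie_lbp.marginalMax(i)[:])))
print("largest absolute difference:",gap)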
Dynamic Credal Net
In [17]:
cn=gum.CredalNet("res/cn/bn_c_8.bif","res/cn/den_c_8.bif")
cn.bnToCredal(0.8,False)
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
/var/folders/r1/pj4vdx_n4_d_xpsb04kzf97r0000gp/T/ipykernel_53905/814548779.py in ?()
1 cn=gum.CredalNet("res/cn/bn_c_8.bif","res/cn/den_c_8.bif")
----> 2 cn.bnToCredal(0.8,False)
~/.virtualenvs/devAgrum/lib/python3.13/site-packages/pyAgrum/pyAgrum.py in ?(self, beta, oneNet, keepZeroes)
25125 keepZeroes : bool
25126 used as a flag as whether or not - respectively True or False - we keep zeroes as zeroes. Default is False, i.e. zeroes are not kept
25127
25128 """
> 25129 return _pyAgrum.CredalNet_bnToCredal(self, beta, oneNet, keepZeroes)
TypeError: in method 'CredalNet_bnToCredal', argument 4 of type 'bool'
Additional information:
Wrong number or type of arguments for overloaded function 'CredalNet_bnToCredal'.
Possible C/C++ prototypes are:
gum::credal::CredalNet< double >::bnToCredal(double const,bool const,bool const)
gum::credal::CredalNet< double >::bnToCredal(double const,bool const)
In [ ]:
ie=gum.CNMonteCarloSampling(cn)
ie.insertModalsFile("res/cn/modalities.modal")
ie.setRepetitiveInd(True)
ie.setMaxTime(5)
ie.setMaxIter(1000)
ie.makeInference()
In [ ]:
print(ie.dynamicExpMax("temp"))
In [ ]:
fig=figure()
ax=fig.add_subplot(111)
ax.fill_between(range(9),ie.dynamicExpMax("temp"),ie.dynamicExpMin("temp"))
plt.show()
In [ ]:
ie=gum.CNMonteCarloSampling(cn)
ie.insertModalsFile("res/cn/modalities.modal")
ie.setRepetitiveInd(False)
ie.setMaxTime(5)
ie.setMaxIter(1000)
ie.makeInference()
print(ie.messageApproximationScheme())
In [ ]:
fig=figure()
ax=fig.add_subplot(111)
ax.fill_between(range(9),ie.dynamicExpMax("temp"),ie.dynamicExpMin("temp"))
plt.show()
In [ ]:
ie=gum.CNMonteCarloSampling(cn)
ie.insertModalsFile("res/cn/modalities.modal")
ie.setRepetitiveInd(False)
ie.setMaxTime(5)
ie.setMaxIter(5000)
gnb.animApproximationScheme(ie)
ie.makeInference()
In [ ]:
fig=figure()
ax=fig.add_subplot(111)
ax.fill_between(range(9),ie.dynamicExpMax("temp"),ie.dynamicExpMin("temp"));
plt.show()
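For readability, the same envelope can be redrawn with axis labels and a legend (a sketch: the x-axis is the time-slice index used above, the y-axis the bounds on the expectation of "temp"):

In [ ]:
# same envelope as above, with labels for time slices and expectation bounds
fig=figure()
ax=fig.add_subplot(111)
ax.fill_between(range(9),ie.dynamicExpMax("temp"),ie.dynamicExpMin("temp"),alpha=0.4,label='[min,max] expectation of "temp"')
ax.set_xlabel("time slice")
ax.set_ylabel("expected value")
ax.legend()
plt.show()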