Some other features in Bayesian inference

Lazy Propagation uses a secondary structure, called the "junction tree", to perform inference.

In [1]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb

bn=gum.loadBN("res/alarm.dsl")
gnb.showJunctionTreeMap(bn);
(figure: junction tree map of the alarm network)
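
The junction tree used by the inference engine can also be retrieved programmatically. A minimal sketch, assuming LazyPropagation's junctionTree() accessor and the clique-graph API (size, nodes, clique):

import pyAgrum as gum

bn = gum.loadBN("res/alarm.dsl")
ie = gum.LazyPropagation(bn)
jt = ie.junctionTree()            # the secondary structure built by the engine
print(jt.size(), "cliques")       # number of cliques in the junction tree
for c in jt.nodes():
    print(c, jt.clique(c))        # each clique is a set of node ids of the BN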

But this junction tree can also be transformed to answer different kinds of probabilistic queries.

In [2]:
bn=gum.fastBN("A->B->C->D;A->E->D;F->B;C->H")
ie=gum.LazyPropagation(bn)
bn
Out[2]:
(figure: the DAG of the network A->B->C->D, A->E->D, F->B, C->H)
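
For this small network, the junction tree that the queries below operate on can be displayed in the same way (a sketch, reusing the showJunctionTree helper from pyAgrum.lib.notebook):

import pyAgrum as gum
import pyAgrum.lib.notebook as gnb

bn = gum.fastBN("A->B->C->D;A->E->D;F->B;C->H")
gnb.showJunctionTree(bn)   # the junction tree built from this BN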

Evidence impact

evidenceImpact allows the user to analyze the impact of any set of observed variables on any other variable of the network.

In [3]:
ie.evidenceImpact("B",["A","H"])
Out[3]:
  A   H  |  B=0     B=1
---------+-----------------
  0   0  |  0.4631  0.5369
  0   1  |  0.5761  0.4239
  1   0  |  0.3879  0.6121
  1   1  |  0.4996  0.5004
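
Each row of this table is simply the posterior of B for one hard instantiation of A and H; one row can be cross-checked with a standard query (a sketch, reusing the bn defined above):

ie2 = gum.LazyPropagation(bn)
ie2.setEvidence({"A": 1, "H": 0})   # one hard instantiation of the conditioning set
ie2.makeInference()
print(ie2.posterior("B"))           # should match the row A=1, H=0 above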

evidenceImpact is also able to restrict the given set of conditioning variables to the minimal subset that effectively conditions the analyzed variable, using d-separation.

In [4]:
ie.evidenceImpact("E",["A","F","B","D"]) # {A,D,B} d-separates E and F
Out[4]:
  A   B   D  |  E=0     E=1
-------------+-----------------
  0   0   0  |  0.1907  0.8093
  0   0   1  |  0.3157  0.6843
  0   1   0  |  0.1025  0.8975
  0   1   1  |  0.4230  0.5770
  1   0   0  |  0.2897  0.7103
  1   0   1  |  0.4440  0.5560
  1   1   0  |  0.1651  0.8349
  1   1   1  |  0.5592  0.4408
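
Because {A,B,D} d-separates E from F, dropping F from the conditioning set should give exactly the same potential; a quick sketch of such a check (assuming Potential equality comparison):

p_full    = ie.evidenceImpact("E", ["A", "F", "B", "D"])
p_reduced = ie.evidenceImpact("E", ["A", "B", "D"])   # F left out explicitly
print(p_full == p_reduced)                            # expected: True, F was discarded automatically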
In [5]:
ie.evidenceImpact("E",["A","B","C","D","F"]) # {A,C,D} d-separates E and {B,F}
Out[5]:
  C   A   D  |  E=0     E=1
-------------+-----------------
  0   0   0  |  0.3251  0.6749
  0   0   1  |  0.0133  0.9867
  0   1   0  |  0.4546  0.5454
  0   1   1  |  0.0229  0.9771
  1   0   0  |  0.0633  0.9367
  1   0   1  |  0.4591  0.5409
  1   1   0  |  0.1047  0.8953
  1   1   1  |  0.5950  0.4050

Evidence joint impact

evidenceJointImpact computes, in the same way, the impact of a set of observed variables on the joint distribution of several target variables.
In [6]:
ie.evidenceJointImpact(["A","F"],["B","C","D","E","H"]) # {B,E} d-separates [A,F] and [C,D,H]
Out[6]:
  E   B   F  |  A=0     A=1
-------------+-----------------
  0   0   0  |  0.0977  0.3931
  0   0   1  |  0.0170  0.4922
  0   1   0  |  0.0173  0.5420
  0   1   1  |  0.0696  0.3711
  1   0   0  |  0.1561  0.3627
  1   0   1  |  0.0272  0.4541
  1   1   0  |  0.0282  0.5096
  1   1   1  |  0.1133  0.3489
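
Each row here is the joint posterior of (A,F) for one instantiation of (B,E); one row can be cross-checked with LazyPropagation's joint-target API (a sketch, assuming addJointTarget/jointPosterior are available as in recent pyAgrum versions):

ie3 = gum.LazyPropagation(bn)
ie3.addJointTarget({"A", "F"})          # ask for the joint posterior over A and F
ie3.setEvidence({"B": 0, "E": 1})       # one instantiation of the conditioning set
ie3.makeInference()
print(ie3.jointPosterior({"A", "F"}))   # should match the row E=1, B=0 above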