Some other features in Bayesian inference


Lazy Propagation uses a secondary structure called the “junction tree” to perform inference.

In [1]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb

bn=gum.loadBN("res/alarm.dsl")
gnb.showJunctionTreeMap(bn);
[figure: junction tree map of the alarm network]

But this junction tree can be transformed to build different probabilistic queries.

In [2]:
bn=gum.fastBN("A->B->C->D;A->E->D;F->B;C->H")
ie=gum.LazyPropagation(bn)
bn
Out[2]:
[figure: the DAG A->B->C->D, A->E->D, F->B, C->H]

Evidence impact

Evidence impact allows the user to analyze the effect of any set of evidence variables on another variable: evidenceImpact(target, evidence) computes P(target | evidence) for every configuration of the evidence.

In [3]:
ie.evidenceImpact("B",["A","H"])
Out[3]:
  A | H ||  B=0   |  B=1
 ---|---||--------|--------
  0 | 0 || 0.4421 | 0.5579
  0 | 1 || 0.3993 | 0.6007
  1 | 0 || 0.5034 | 0.4966
  1 | 1 || 0.4595 | 0.5405

Evidence impact automatically reduces the given evidence set to the minimal subset of variables that effectively conditions the analyzed variable (using d-separation).

In [4]:
ie.evidenceImpact("E",["A","F","B","D"]) # {A,D,B} d-separates E and F
Out[4]:
  A | B | D ||  E=0   |  E=1
 ---|---|---||--------|--------
  0 | 0 | 0 || 0.2314 | 0.7686
  0 | 0 | 1 || 0.0748 | 0.9252
  0 | 1 | 0 || 0.2347 | 0.7653
  0 | 1 | 1 || 0.0641 | 0.9359
  1 | 0 | 0 || 0.9366 | 0.0634
  1 | 0 | 1 || 0.7985 | 0.2015
  1 | 1 | 0 || 0.9377 | 0.0623
  1 | 1 | 1 || 0.7705 | 0.2295
In [5]:
ie.evidenceImpact("E",["A","B","C","D","F"]) # {A,C,D} d-separates E and {B,F}
Out[5]:
  C | A | D ||  E=0   |  E=1
 ---|---|---||--------|--------
  0 | 0 | 0 || 0.2197 | 0.7803
  0 | 0 | 1 || 0.1093 | 0.8907
  0 | 1 | 0 || 0.9325 | 0.0675
  0 | 1 | 1 || 0.8575 | 0.1425
  1 | 0 | 0 || 0.2432 | 0.7568
  1 | 0 | 1 || 0.0338 | 0.9662
  1 | 1 | 0 || 0.9403 | 0.0597
  1 | 1 | 1 || 0.6318 | 0.3682

Evidence joint impact

evidenceJointImpact generalizes the previous feature: it computes the impact of evidence on the joint distribution of a set of target variables (again minimizing the evidence set using d-separation).

In [6]:
ie.evidenceJointImpact(["A","F"],["B","C","D","E","H"]) # {B,E} d-separates [A,F] and [C,D,H]
Out[6]:
  E | B | F ||  A=0   |  A=1
 ---|---|---||--------|--------
  0 | 0 | 0 || 0.1432 | 0.5830
  0 | 0 | 1 || 0.0954 | 0.1784
  0 | 1 | 0 || 0.2604 | 0.5634
  0 | 1 | 1 || 0.0258 | 0.1505
  1 | 0 | 0 || 0.5634 | 0.0468
  1 | 0 | 1 || 0.3756 | 0.0143
  1 | 1 | 0 || 0.8659 | 0.0382
  1 | 1 | 1 || 0.0857 | 0.0102