Some other features in Bayesian inference

Lazy Propagation uses a secondary structure, called the “junction tree”, to perform inference.

In [1]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb

bn=gum.loadBN("res/alarm.dsl")
gnb.showJunctionTreeMap(bn);
(junction tree map of the alarm network)
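
The junction tree can also be inspected programmatically. A minimal sketch, assuming the junctionTree() accessor of LazyPropagation and the usual graph methods of the returned clique graph:

In [ ]:
import pyAgrum as gum

bn = gum.loadBN("res/alarm.dsl")
ie = gum.LazyPropagation(bn)
jt = ie.junctionTree()       # the secondary structure used by Lazy Propagation
print(jt.size())             # number of cliques
print(jt.sizeEdges())        # number of separators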

This junction tree can also be transformed in order to answer different kinds of probabilistic queries.

In [2]:
bn=gum.fastBN("A->B->C->D;A->E->D;F->B;C->H")
ie=gum.LazyPropagation(bn)
bn
Out[2]:
(rendering of the BN: A->B, A->E, B->C, C->D, C->H, E->D, F->B)
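
Since the graphical rendering is not reproduced here, the structure can also be listed by name, for instance with a small sketch such as:

In [ ]:
# list the arcs of the generated BN by name
for tail, head in bn.arcs():
    print(f"{bn.variable(tail).name()} -> {bn.variable(head).name()}")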

Evidence impact

evidenceImpact allows the user to analyze the impact of a set of variables on another variable: it computes the posterior of the target for every configuration of the conditioning variables.

In [3]:
ie.evidenceImpact("B",["A","H"])
Out[3]:
  A  H  |   B=0     B=1
 -------+----------------
  0  0  |  0.3782  0.6218
  0  1  |  0.3257  0.6743
  1  0  |  0.3924  0.6076
  1  1  |  0.3390  0.6610
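
Each row of this table is simply the posterior of B under the corresponding hard evidence on A and H. A quick cross-check of the first row (a sketch using the standard LazyPropagation evidence API):

In [ ]:
ie2 = gum.LazyPropagation(bn)
ie2.setEvidence({"A": 0, "H": 0})   # configuration of the first row above
ie2.makeInference()
print(ie2.posterior("B"))           # expected to be close to 0.3782 / 0.6218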

evidenceImpact is also able to find the minimal subset of variables that effectively conditions the analyzed variable: variables that are d-separated from the target given the others are removed from the conditioning set.

In [4]:
ie.evidenceImpact("E",["A","F","B","D"]) # {A,D,B} d-separates E and F
Out[4]:
  A  B  D  |   E=0     E=1
 ----------+----------------
  0  0  0  |  0.5926  0.4074
  0  0  1  |  0.7711  0.2289
  0  1  0  |  0.5753  0.4247
  0  1  1  |  0.7764  0.2236
  1  0  0  |  0.3576  0.6424
  1  0  1  |  0.5632  0.4368
  1  1  0  |  0.3413  0.6587
  1  1  1  |  0.5706  0.4294
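
The dropped variable can be checked on the returned object: assuming it is a pyAgrum Potential, its var_names should only contain E, A, B and D (F has been removed by d-separation). A sketch:

In [ ]:
p = ie.evidenceImpact("E", ["A", "F", "B", "D"])
print(p.var_names)   # F should not appear among the dimensions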
In [5]:
ie.evidenceImpact("E",["A","B","C","D","F"]) # {A,C,D} d-separates E and {B,F}
Out[5]:
  C  A  D  |   E=0     E=1
 ----------+----------------
  0  0  0  |  0.4082  0.5918
  0  0  1  |  0.8117  0.1883
  0  1  0  |  0.2088  0.7912
  0  1  1  |  0.6225  0.3775
  1  0  0  |  0.5969  0.4031
  1  0  1  |  0.7698  0.2302
  1  1  0  |  0.3616  0.6384
  1  1  1  |  0.5613  0.4387

Evidence Joint Impact

evidenceJointImpact generalizes evidenceImpact to a set of target variables: it computes their joint posterior for every configuration of the conditioning variables, again restricted to the minimal conditioning set.

In [6]:
ie.evidenceJointImpact(["A","F"],["B","C","D","E","H"]) # {B,E} d-separates [A,F] and [C,D,H]
Out[6]:
  E  B  F  |   A=0     A=1
 ----------+----------------
  0  0  0  |  0.7516  0.0473
  0  0  1  |  0.1495  0.0515
  0  1  0  |  0.2069  0.0400
  0  1  1  |  0.6994  0.0536
  1  0  0  |  0.6482  0.1067
  1  0  1  |  0.1290  0.1161
  1  1  0  |  0.1798  0.0909
  1  1  1  |  0.6076  0.1217
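
Each block of this table is the joint posterior P(A, F | B, E). One configuration can be cross-checked with a joint target; a sketch, assuming the addJointTarget/jointPosterior API of LazyPropagation accepts variable names:

In [ ]:
ie3 = gum.LazyPropagation(bn)
ie3.addJointTarget({"A", "F"})
ie3.setEvidence({"B": 0, "E": 0})
ie3.makeInference()
print(ie3.jointPosterior({"A", "F"}))   # expected to match the (B=0, E=0) block above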