Some other features in Bayesian inference
Lazy Propagation uses a secondary structure, the junction tree, to perform inference.
In [1]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb
bn=gum.loadBN("res/alarm.dsl")
gnb.showJunctionTreeMap(bn);
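The construction behind this structure can be sketched in a few lines: moralize the DAG, triangulate it with a greedy elimination ordering, and collect the maximal cliques, which become the nodes of the junction tree. Below is a minimal, illustrative sketch, demonstrated on the small network built in the next cell; the min-fill heuristic and all helper names are choices of this example, not pyAgrum's internals or API:

```python
from itertools import combinations

def junction_cliques(arcs):
    """Sketch: moralize a DAG, triangulate it greedily (min-fill),
    and return the maximal cliques -- the nodes of a junction tree."""
    nodes, parents = set(), {}
    for u, v in arcs:
        nodes |= {u, v}
        parents.setdefault(v, set()).add(u)
    adj = {n: set() for n in nodes}
    for u, v in arcs:                          # undirected skeleton
        adj[u].add(v); adj[v].add(u)
    for ps in parents.values():                # marry co-parents (moralization)
        for a, b in combinations(sorted(ps), 2):
            adj[a].add(b); adj[b].add(a)
    cliques, remaining = [], set(nodes)
    while remaining:
        def fill(n):                           # edges added if n is eliminated next
            nb = sorted(adj[n] & remaining)
            return sum(1 for a, b in combinations(nb, 2) if b not in adj[a])
        n = min(sorted(remaining), key=fill)
        clique = {n} | (adj[n] & remaining)
        for a, b in combinations(sorted(clique - {n}), 2):  # fill-in edges
            adj[a].add(b); adj[b].add(a)
        remaining.discard(n)
        cliques = [c for c in cliques if not c < clique]    # keep maximal cliques only
        if not any(clique <= c for c in cliques):
            cliques.append(clique)
    return cliques

cliques = junction_cliques([("A","B"), ("B","C"), ("C","D"),
                            ("A","E"), ("E","D"), ("F","B"), ("C","H")])
print(cliques)
```

By construction, every family (a node and its parents) of the original BN ends up inside at least one clique, which is what lets each CPT be attached to a clique during propagation.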
But this junction tree can also be exploited to build different kinds of probabilistic queries.
In [2]:
bn=gum.fastBN("A->B->C->D;A->E->D;F->B;C->H")
ie=gum.LazyPropagation(bn)
bn
Out[2]: (rendering of the network's DAG)
Evidence impact
Evidence impact allows the user to analyze, for a target variable, the posterior obtained for every configuration of a set of observed variables.
In [3]:
ie.evidenceImpact("B",["A","H"])
Out[3]:
| A | H | E=0? no — B=0 | B=1 |
|---|---|---|---|
| 0 | 0 | 0.6858 | 0.3142 |
| 0 | 1 | 0.5835 | 0.4165 |
| 1 | 0 | 0.3822 | 0.6178 |
| 1 | 1 | 0.2842 | 0.7158 |
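Conceptually, `evidenceImpact(target, evs)` runs one inference per configuration of the evidence variables and stores the posterior of the target each time. On a toy two-variable net this can be reproduced by brute-force enumeration (the CPT numbers and helper names below are made up for this sketch, not the notebook's random BN):

```python
from itertools import product

# Toy net A -> B with made-up binary CPTs.
p_a = {0: 0.7, 1: 0.3}                       # P(A)
p_b = {0: {0: 0.8, 1: 0.2},                  # P(B | A=0)
       1: {0: 0.3, 1: 0.7}}                  # P(B | A=1)

def evidence_impact(target, evidence):
    """P(target | evidence) for every evidence configuration, by
    enumerating the joint -- the table evidenceImpact tabulates."""
    table = {}
    for ev in product((0, 1), repeat=len(evidence)):
        fixed = dict(zip(evidence, ev))
        weights = {0: 0.0, 1: 0.0}
        for a, b in product((0, 1), repeat=2):
            assign = {"A": a, "B": b}
            if all(assign[v] == s for v, s in fixed.items()):
                weights[assign[target]] += p_a[a] * p_b[a][b]
        z = sum(weights.values())
        table[ev] = {s: w / z for s, w in weights.items()}
    return table

impact = evidence_impact("A", ["B"])   # P(A | B): one row per value of B
```

Here `impact[(1,)]` is the posterior of `A` given `B=1`, obtained by Bayes' rule, exactly the kind of row the table above shows for each evidence configuration.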
Evidence impact automatically reduces the conditioning set to the minimal subset of variables that effectively conditions the analyzed variable, using d-separation.
In [4]:
ie.evidenceImpact("E",["A","F","B","D"]) # {A,D,B} d-separates E and F
Out[4]:
| A | B | D | E=0 | E=1 |
|---|---|---|---|---|
| 0 | 0 | 0 | 0.9122 | 0.0878 |
| 0 | 0 | 1 | 0.9423 | 0.0577 |
| 0 | 1 | 0 | 0.9224 | 0.0776 |
| 0 | 1 | 1 | 0.9383 | 0.0617 |
| 1 | 0 | 0 | 0.4980 | 0.5020 |
| 1 | 0 | 1 | 0.6095 | 0.3905 |
| 1 | 1 | 0 | 0.5316 | 0.4684 |
| 1 | 1 | 1 | 0.5922 | 0.4078 |
In [5]:
ie.evidenceImpact("E",["A","B","C","D","F"]) # {A,C,D} d-separates E and {B,F}
Out[5]:
| A | C | D | E=0 | E=1 |
|---|---|---|---|---|
| 0 | 0 | 0 | 0.9243 | 0.0757 |
| 0 | 0 | 1 | 0.9363 | 0.0637 |
| 0 | 1 | 0 | 0.5384 | 0.4616 |
| 0 | 1 | 1 | 0.5838 | 0.4162 |
| 1 | 0 | 0 | 0.8915 | 0.1085 |
| 1 | 0 | 1 | 0.9442 | 0.0558 |
| 1 | 1 | 0 | 0.4397 | 0.5603 |
| 1 | 1 | 1 | 0.6179 | 0.3821 |
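The minimal conditioning sets mentioned in the comments come from d-separation in the DAG. A self-contained sketch of the standard active-trail reachability test (Koller & Friedman, Algorithm 3.1), applied to this notebook's network; the function and variable names are this example's choices, not pyAgrum's API:

```python
from collections import defaultdict

ARCS = [("A","B"), ("B","C"), ("C","D"), ("A","E"), ("E","D"), ("F","B"), ("C","H")]

def d_separated(arcs, x, y, z):
    """True iff x and y are d-separated given z (active-trail reachability)."""
    z = set(z)
    parents, children = defaultdict(set), defaultdict(set)
    for u, v in arcs:
        children[u].add(v)
        parents[v].add(u)
    # Ancestors of z (z included): colliders in this set are activated.
    anc, stack = set(), list(z)
    while stack:
        n = stack.pop()
        if n not in anc:
            anc.add(n)
            stack.extend(parents[n])
    # Traverse (node, direction) pairs; "up" = the trail enters n from a child.
    visited, stack = set(), [(x, "up")]
    while stack:
        n, d = stack.pop()
        if (n, d) in visited:
            continue
        visited.add((n, d))
        if n == y and n not in z:
            return False                        # an active trail reaches y
        if d == "up" and n not in z:
            stack += [(p, "up") for p in parents[n]]
            stack += [(c, "down") for c in children[n]]
        elif d == "down":
            if n not in z:                      # chain/fork through n stays active
                stack += [(c, "down") for c in children[n]]
            if n in anc:                        # v-structure activated by z
                stack += [(p, "up") for p in parents[n]]
    return True

d_separated(ARCS, "E", "F", {"A", "B", "D"})    # True: matches the comment above
```

Given `{A,B,D}`, every trail between `E` and `F` is blocked, so `F` is dropped from the conditioning set; without `B` (or with only `D`) an active trail through the collider `D` or the observed collider `B` remains.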
Evidence Joint Impact
evidenceJointImpact generalizes evidenceImpact to a joint target: it tabulates the joint posterior of several variables for each configuration of the conditioning variables, again reduced to a minimal d-separating set.
In [6]:
ie.evidenceJointImpact(["A","F"],["B","C","D","E","H"]) # {B,E} d-separates [A,F] and [C,D,H]
Out[6]:
| B | E | A=0,F=0 | A=0,F=1 | A=1,F=0 | A=1,F=1 |
|---|---|---|---|---|---|
| 0 | 0 | 0.0429 | 0.0126 | 0.8323 | 0.1122 |
| 0 | 1 | 0.1569 | 0.0418 | 0.5085 | 0.2928 |
| 1 | 0 | 0.0197 | 0.0605 | 0.3815 | 0.5383 |
| 1 | 1 | 0.0376 | 0.1051 | 0.1220 | 0.7353 |
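The same brute-force idea extends to a joint target. A sketch on a toy collider `A -> B <- F`, tabulating P(A, F | B) for each value of B; the CPT numbers and names are made up for this example, not taken from the notebook's BN:

```python
from itertools import product

# Toy collider A -> B <- F with made-up binary CPTs.
p_a = {0: 0.6, 1: 0.4}                     # P(A)
p_f = {0: 0.5, 1: 0.5}                     # P(F)
p_b1 = {(0, 0): 0.1, (0, 1): 0.6,          # P(B=1 | A, F)
        (1, 0): 0.7, (1, 1): 0.9}

def joint(a, b, f):
    pb = p_b1[(a, f)] if b == 1 else 1 - p_b1[(a, f)]
    return p_a[a] * p_f[f] * pb

def evidence_joint_impact():
    """Joint posterior P(A, F | B=b) for each b -- the kind of table
    evidenceJointImpact produces, here by brute-force enumeration."""
    table = {}
    for b in (0, 1):
        w = {(a, f): joint(a, b, f) for a, f in product((0, 1), repeat=2)}
        z = sum(w.values())
        table[b] = {af: v / z for af, v in w.items()}
    return table

table = evidence_joint_impact()   # table[1][(1, 1)] = P(A=1, F=1 | B=1)
```

Note that the joint posterior is not a product of the two marginal posteriors: observing the common child `B` makes `A` and `F` dependent (explaining away), which is exactly why a joint query is needed.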