Some other features in Bayesian inference
Lazy Propagation uses a secondary structure, the junction tree, to perform inference.
In [1]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb
bn=gum.loadBN("res/alarm.dsl")
gnb.showJunctionTreeMap(bn);
But this junction tree can also be transformed to answer different kinds of probabilistic queries.
In [2]:
bn=gum.fastBN("A->B->C->D;A->E->D;F->B;C->H")
ie=gum.LazyPropagation(bn)
bn
Out[2]:
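The junction tree that LazyPropagation relies on is built by moralizing the DAG and then eliminating nodes one by one, each elimination producing a clique. The following pure-Python sketch illustrates the idea on the fastBN network above; the helper names `moralize` and `elimination_cliques` are ours (not pyAgrum API), and the elimination order is an arbitrary illustrative choice.

```python
def moralize(dag):
    """Undirected moral graph: keep all edges, marry co-parents."""
    adj = {n: set() for n in dag}
    for parent, children in dag.items():
        for child in children:
            adj[parent].add(child)
            adj[child].add(parent)
    for node in dag:
        parents = [p for p in dag if node in dag[p]]
        for a in parents:
            for b in parents:
                if a != b:
                    adj[a].add(b)
    return adj

def elimination_cliques(adj, order):
    """Eliminate nodes in `order`; each elimination yields a candidate clique."""
    adj = {n: set(nb) for n, nb in adj.items()}
    cliques = []
    for node in order:
        clique = {node} | adj[node]
        if not any(clique <= c for c in cliques):  # drop subsumed cliques
            cliques.append(clique)
        for a in adj[node]:                        # fill-in: connect neighbours
            for b in adj[node]:
                if a != b:
                    adj[a].add(b)
        for a in adj[node]:
            adj[a].discard(node)
        del adj[node]
    return cliques

# the DAG of fastBN("A->B->C->D;A->E->D;F->B;C->H")
dag = {"A": {"B", "E"}, "B": {"C"}, "C": {"D", "H"},
       "D": set(), "E": {"D"}, "F": {"B"}, "H": set()}

cliques = elimination_cliques(moralize(dag), ["H", "F", "D", "E", "C", "B", "A"])
print(cliques)
```

With this order the elimination yields five cliques, and every family (a node together with its parents) is contained in at least one clique, which is the property junction-tree message passing relies on.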
Evidence impact
Evidence impact allows the user to analyze the effect of any set of observed variables on any other variable.
In [3]:
ie.evidenceImpact("B",["A","H"])
Out[3]:
| A | H | B=0 | B=1 |
|---|---|---|---|
| 0 | 0 | 0.6872 | 0.3128 |
| 0 | 1 | 0.7283 | 0.2717 |
| 1 | 0 | 0.8549 | 0.1451 |
| 1 | 1 | 0.8779 | 0.1221 |
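What `evidenceImpact` tabulates is simply the posterior of the target for every configuration of the evidence variables. A minimal sketch of that semantics by brute-force enumeration, on a toy chain A→B→C with hand-picked CPT numbers (purely illustrative, unrelated to the fastBN network above):

```python
from itertools import product

# hypothetical CPTs for the chain A -> B -> C
pA = {0: 0.7, 1: 0.3}
pB = {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.7}  # pB[(a, b)] = P(B=b | A=a)
pC = {(0, 0): 0.6, (0, 1): 0.4, (1, 0): 0.1, (1, 1): 0.9}  # pC[(b, c)] = P(C=c | B=b)

def evidence_impact_B(a, c):
    """P(B | A=a, C=c) by enumerating the joint P(a, b, c) and normalizing over b."""
    joint = [pA[a] * pB[(a, b)] * pC[(b, c)] for b in (0, 1)]
    z = sum(joint)
    return [p / z for p in joint]

# one row per evidence configuration, as evidenceImpact would tabulate
for a, c in product((0, 1), repeat=2):
    print(a, c, evidence_impact_B(a, c))
```

Note that the prior `pA[a]` cancels in the normalization; it is kept only to make the joint-distribution factorization explicit.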
Evidence impact is able to find the minimal set of variables that effectively conditions the analyzed variable.
In [4]:
ie.evidenceImpact("E",["A","F","B","D"]) # {A,D,B} d-separates E and F
Out[4]:
| A | B | D | E=0 | E=1 |
|---|---|---|---|---|
| 0 | 0 | 0 | 0.9819 | 0.0181 |
| 0 | 0 | 1 | 0.9734 | 0.0266 |
| 0 | 1 | 0 | 0.9821 | 0.0179 |
| 0 | 1 | 1 | 0.9774 | 0.0226 |
| 1 | 0 | 0 | 0.2244 | 0.7756 |
| 1 | 0 | 1 | 0.1632 | 0.8368 |
| 1 | 1 | 0 | 0.2268 | 0.7732 |
| 1 | 1 | 1 | 0.1877 | 0.8123 |
In [5]:
ie.evidenceImpact("E",["A","B","C","D","F"]) # {A,C,D} d-separates E and {B,F}
Out[5]:
| C | A | D | E=0 | E=1 |
|---|---|---|---|---|
| 0 | 0 | 0 | 0.9834 | 0.0166 |
| 0 | 0 | 1 | 0.9792 | 0.0208 |
| 0 | 1 | 0 | 0.2406 | 0.7594 |
| 0 | 1 | 1 | 0.2005 | 0.7995 |
| 1 | 0 | 0 | 0.9818 | 0.0182 |
| 1 | 0 | 1 | 0.9060 | 0.0940 |
| 1 | 1 | 0 | 0.2233 | 0.7767 |
| 1 | 1 | 1 | 0.0489 | 0.9511 |
Evidence Joint Impact

`evidenceJointImpact` generalizes `evidenceImpact` to a joint posterior over several target variables.
In [6]:
ie.evidenceJointImpact(["A","F"],["B","C","D","E","H"]) # {B,E} d-separates [A,F] and [C,D,H]
Out[6]:
| B | E | A | F=0 | F=1 |
|---|---|---|---|---|
| 0 | 0 | 0 | 0.1756 | 0.0200 |
| 0 | 0 | 1 | 0.7134 | 0.0910 |
| 0 | 1 | 0 | 0.3009 | 0.0256 |
| 0 | 1 | 1 | 0.6546 | 0.0189 |
| 1 | 0 | 0 | 0.0081 | 0.1730 |
| 1 | 0 | 1 | 0.0329 | 0.7861 |
| 1 | 1 | 0 | 0.0323 | 0.5160 |
| 1 | 1 | 1 | 0.0704 | 0.3813 |
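The d-separation claims in the comments of the three cells above can be checked by hand, without pyAgrum, using the classic moralized-ancestral-graph criterion: X ⟂ Y | Z iff Z separates X from Y in the moral graph of the ancestral subgraph of X ∪ Y ∪ Z. A minimal sketch (the function names are ours, not pyAgrum API):

```python
from itertools import combinations

def ancestors(dag, nodes):
    """All ancestors of `nodes` in the DAG, including the nodes themselves."""
    seen, stack = set(nodes), list(nodes)
    while stack:
        n = stack.pop()
        for parent, children in dag.items():
            if n in children and parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

def d_separated(dag, X, Y, Z):
    """X ⟂ Y | Z iff Z separates X from Y in the moralized ancestral graph."""
    keep = ancestors(dag, set(X) | set(Y) | set(Z))
    adj = {n: set() for n in keep}
    for parent, children in dag.items():     # undirect the retained edges
        for child in children:
            if parent in keep and child in keep:
                adj[parent].add(child)
                adj[child].add(parent)
    for child in keep:                       # marry co-parents
        parents = [p for p in keep if child in dag.get(p, ())]
        for a, b in combinations(parents, 2):
            adj[a].add(b)
            adj[b].add(a)
    # breadth-first search from X with Z removed; d-separated iff Y is unreachable
    seen = set(Z)
    frontier = [x for x in X if x not in Z]
    seen.update(frontier)
    while frontier:
        n = frontier.pop()
        if n in Y:
            return False
        for m in adj[n] - seen:
            seen.add(m)
            frontier.append(m)
    return True

# the DAG of fastBN("A->B->C->D;A->E->D;F->B;C->H")
dag = {"A": {"B", "E"}, "B": {"C"}, "C": {"D", "H"},
       "D": set(), "E": {"D"}, "F": {"B"}, "H": set()}

print(d_separated(dag, {"E"}, {"F"}, {"A", "B", "D"}))            # True
print(d_separated(dag, {"E"}, {"B", "F"}, {"A", "C", "D"}))       # True
print(d_separated(dag, {"A", "F"}, {"C", "D", "H"}, {"B", "E"}))  # True
print(d_separated(dag, {"E"}, {"F"}, {"A", "D"}))                 # False: D opens the collider E->D<-C
```

The last call shows why the sets matter: observing D alone activates the collider at D, so E and F are no longer independent given {A, D}.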