Some other features in Bayesian inference
Lazy Propagation uses a secondary structure called the "junction tree" to perform inference.
In [1]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb
bn=gum.loadBN("res/alarm.dsl")
gnb.showJunctionTreeMap(bn);
This junction tree can, however, be transformed in order to answer different kinds of probabilistic queries.
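As a rough illustration of where a junction tree comes from, the sketch below (plain Python, not pyAgrum's actual implementation) moralizes the DAG of the example network, triangulates it with an arbitrary elimination order, and collects the maximal elimination cliques, which are the junction-tree nodes; connecting them along maximal separators would complete the tree.

```python
# A minimal sketch of junction-tree construction: moralize, then
# triangulate via an (arbitrary, hand-picked) elimination order.
from itertools import combinations

# the DAG of the example: A->B->C->D; A->E->D; F->B; C->H
parents = {"A": [], "B": ["A", "F"], "C": ["B"], "D": ["C", "E"],
           "E": ["A"], "F": [], "H": ["C"]}

# 1) moralize: drop directions and marry co-parents
adj = {v: set() for v in parents}
for v, ps in parents.items():
    for p in ps:
        adj[v].add(p); adj[p].add(v)
    for p, q in combinations(ps, 2):
        adj[p].add(q); adj[q].add(p)

# 2) eliminate variables one by one; each elimination creates a clique
#    (the node plus its current neighbours) and adds fill-in edges
cliques = []
order = ["H", "F", "A", "B", "C", "D", "E"]   # an arbitrary order
g = {v: set(n) for v, n in adj.items()}
for v in order:
    cliques.append(frozenset({v} | g[v]))
    for p, q in combinations(g[v], 2):        # fill-in edges
        g[p].add(q); g[q].add(p)
    for n in g[v]:
        g[n].discard(v)
    del g[v]

# 3) keep only maximal cliques: these are the junction-tree nodes
maximal = [c for c in cliques if not any(c < d for d in cliques if d != c)]
print(sorted(sorted(c) for c in maximal))
# → [['A','B','E'], ['A','B','F'], ['B','C','E'], ['C','D','E'], ['C','H']]
```

A different elimination order would give a different (possibly larger) set of cliques; pyAgrum chooses its order heuristically.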
In [2]:
bn=gum.fastBN("A->B->C->D;A->E->D;F->B;C->H")
ie=gum.LazyPropagation(bn)
bn
Out[2]:
Evidence impact
Evidence impact allows the user to analyze the effect of any set of (potential) evidence variables on any other variable: `ie.evidenceImpact(target, evs)` computes the conditional distribution of `target` for every configuration of `evs`, without having to set evidence values by hand.
In [3]:
ie.evidenceImpact("B",["A","H"])
Out[3]:
| A | H | B=0 | B=1 |
|---|---|--------|--------|
| 0 | 0 | 0.1241 | 0.8759 |
| 0 | 1 | 0.1326 | 0.8674 |
| 1 | 0 | 0.5310 | 0.4690 |
| 1 | 1 | 0.5499 | 0.4501 |
Evidence impact is able to restrict the computation to the minimal subset of the proposed evidence variables that effectively conditions the analyzed variable, using d-separation.
In [4]:
ie.evidenceImpact("E",["A","F","B","D"]) # {A,D,B} d-separates E and F
Out[4]:
| A | B | D | E=0 | E=1 |
|---|---|---|--------|--------|
| 0 | 0 | 0 | 0.2855 | 0.7145 |
| 0 | 0 | 1 | 0.2627 | 0.7373 |
| 0 | 1 | 0 | 0.2831 | 0.7169 |
| 0 | 1 | 1 | 0.2661 | 0.7339 |
| 1 | 0 | 0 | 0.7251 | 0.2749 |
| 1 | 0 | 1 | 0.7017 | 0.2983 |
| 1 | 1 | 0 | 0.7228 | 0.2772 |
| 1 | 1 | 1 | 0.7054 | 0.2946 |
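The d-separation test behind this pruning can be sketched in plain Python (this is the standard reachability algorithm, not pyAgrum's internal code): a trail is blocked at an evidence node unless it is a collider whose node, or one of its descendants, is observed.

```python
def d_separated(parents, x, y, zs):
    """True iff x and y are d-separated given the evidence set zs,
    using the classic active-trail reachability algorithm."""
    zs = set(zs)
    children = {v: set() for v in parents}
    for v, ps in parents.items():
        for p in ps:
            children[p].add(v)
    # zs plus all ancestors of zs (needed for the collider test)
    anc, stack = set(zs), list(zs)
    while stack:
        for p in parents[stack.pop()]:
            if p not in anc:
                anc.add(p); stack.append(p)
    visited, frontier, reachable = set(), [(x, "child")], set()
    while frontier:
        v, direction = frontier.pop()
        if (v, direction) in visited:
            continue
        visited.add((v, direction))
        if v not in zs:
            reachable.add(v)
        if direction == "child" and v not in zs:
            # trail arrives from a child: may continue to parents and children
            frontier += [(p, "child") for p in parents[v]]
            frontier += [(c, "parent") for c in children[v]]
        elif direction == "parent":
            if v not in zs:          # chain/fork: continue downwards
                frontier += [(c, "parent") for c in children[v]]
            if v in anc:             # collider made active by evidence
                frontier += [(p, "child") for p in parents[v]]
    return y not in reachable

# the DAG of the example: A->B->C->D; A->E->D; F->B; C->H
parents = {"A": [], "B": ["A", "F"], "C": ["B"], "D": ["C", "E"],
           "E": ["A"], "F": [], "H": ["C"]}
print(d_separated(parents, "E", "F", {"A", "B", "D"}))  # → True
print(d_separated(parents, "E", "F", {"A", "D"}))       # → False
```

The second call is `False` because observing the collider `D` activates the trail `E->D<-C<-B<-F` when `B` is not observed.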
In [5]:
ie.evidenceImpact("E",["A","B","C","D","F"]) # {A,C,D} d-separates E and {B,F}
Out[5]:
| A | C | D | E=0 | E=1 |
|---|---|---|--------|--------|
| 0 | 0 | 0 | 0.2758 | 0.7242 |
| 0 | 0 | 1 | 0.2766 | 0.7234 |
| 0 | 1 | 0 | 0.2959 | 0.7041 |
| 0 | 1 | 1 | 0.2464 | 0.7536 |
| 1 | 0 | 0 | 0.7155 | 0.2845 |
| 1 | 0 | 1 | 0.7163 | 0.2837 |
| 1 | 1 | 0 | 0.7352 | 0.2648 |
| 1 | 1 | 1 | 0.6835 | 0.3165 |
Evidence Joint Impact
`evidenceJointImpact` generalizes this analysis to a joint target: it computes the conditional joint distribution of a set of variables given the (potential) evidence.
In [6]:
ie.evidenceJointImpact(["A","F"],["B","C","D","E","H"]) # {B,E} d-separates [A,F] and [C,D,H]
Out[6]:
| B | E | A=0,F=0 | A=0,F=1 | A=1,F=0 | A=1,F=1 |
|---|---|--------|--------|--------|--------|
| 0 | 0 | 0.0068 | 0.0127 | 0.0147 | 0.9658 |
| 0 | 1 | 0.0096 | 0.1643 | 0.1395 | 0.6866 |
| 1 | 0 | 0.0399 | 0.0113 | 0.0866 | 0.8622 |
| 1 | 1 | 0.0344 | 0.0895 | 0.5020 | 0.3740 |
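What `evidenceJointImpact` computes can be illustrated by brute-force enumeration on a toy version of the same structure. The CPT numbers below are made up (since `fastBN` draws random CPTs, they will not match the output above); only the computation pattern matters: sum the full joint over the unobserved variables, then renormalize.

```python
# Toy illustration: P(A, F | B=1, E=0) by enumerating the full joint.
from itertools import product

def bern(p):                      # [P(X=0), P(X=1)]
    return [1.0 - p, p]

# same structure as the fastBN above, with arbitrary illustrative CPTs
cpt = {
    "A": lambda s: bern(0.3),
    "F": lambda s: bern(0.6),
    "B": lambda s: bern(0.1 + 0.4 * s["A"] + 0.3 * s["F"]),
    "C": lambda s: bern(0.2 + 0.6 * s["B"]),
    "E": lambda s: bern(0.7 - 0.4 * s["A"]),
    "D": lambda s: bern(0.1 + 0.3 * s["C"] + 0.5 * s["E"]),
    "H": lambda s: bern(0.5 + 0.2 * s["C"]),
}
names = list(cpt)

def joint(s):                     # P(s) = product of the CPT entries
    p = 1.0
    for v in names:
        p *= cpt[v](s)[s[v]]
    return p

# accumulate P(A, F, B=1, E=0), then renormalize into P(A, F | B=1, E=0)
num = {}
for bits in product([0, 1], repeat=len(names)):
    s = dict(zip(names, bits))
    if s["B"] == 1 and s["E"] == 0:
        key = (s["A"], s["F"])
        num[key] = num.get(key, 0.0) + joint(s)
z = sum(num.values())
cond = {k: v / z for k, v in num.items()}
print({k: round(v, 4) for k, v in sorted(cond.items())})
```

Lazy Propagation obtains the same result without ever materializing the full joint, by passing messages on the junction tree.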