Some other features in Bayesian inference
Lazy Propagation uses a secondary structure, the junction tree, to perform inference.
In [1]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb
bn=gum.loadBN("res/alarm.dsl")
gnb.showJunctionTree(bn);
This junction tree can also be exploited to answer different kinds of probabilistic queries.
In [2]:
bn=gum.fastBN("A->B->C->D;A->E->D;F->B;C->H")
ie=gum.LazyPropagation(bn)
bn
Out[2]:
Evidence impact
Evidence impact allows the user to analyze the effect of any set of evidence variables on a target variable:
In [3]:
ie.evidenceImpact("B",["A","H"])
Out[3]:
P(B | A, H): one row per configuration of A and H, columns are the two states of B.

| B=0    | B=1    |
|--------|--------|
| 0.5371 | 0.4629 |
| 0.6266 | 0.3734 |
| 0.6283 | 0.3717 |
| 0.7097 | 0.2903 |
Evidence impact is able to find the minimal subset of variables that effectively conditions the analyzed variable: d-separated variables are automatically discarded.
In [4]:
ie.evidenceImpact("E",["A","F","B","D"]) # {A,D,B} d-separates E and F
Out[4]:
P(E | A, B, D): the variable F has been discarded because {A,B,D} d-separates E and F. One row per configuration of A, B and D, columns are the two states of E.

| E=0    | E=1    |
|--------|--------|
| 0.4296 | 0.5704 |
| 0.1308 | 0.8692 |
| 0.3500 | 0.6500 |
| 0.1389 | 0.8611 |
| 0.5953 | 0.4047 |
| 0.2272 | 0.7728 |
| 0.5126 | 0.4874 |
| 0.2397 | 0.7603 |
In [5]:
ie.evidenceImpact("E",["A","B","C","D","F"]) # {A,C,D} d-separates E and {B,F}
Out[5]:
P(E | A, C, D): both B and F have been discarded because {A,C,D} d-separates E from {B,F}. One row per configuration of A, C and D, columns are the two states of E.

| E=0    | E=1    |
|--------|--------|
| 0.3086 | 0.6914 |
| 0.1482 | 0.8518 |
| 0.4658 | 0.5342 |
| 0.2537 | 0.7463 |
| 0.4462 | 0.5538 |
| 0.1298 | 0.8702 |
| 0.6115 | 0.3885 |
| 0.2256 | 0.7744 |
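The d-separation claims in the code comments above can be checked independently of pyAgrum. Below is a small, self-contained sketch of the classic criterion (restrict to the ancestral graph, moralize, remove the conditioning set, test reachability), hard-coded for the arcs of the example network; the function name `d_separated` is ours, not a pyAgrum API:

```python
# Arcs of the example network fastBN("A->B->C->D;A->E->D;F->B;C->H")
ARCS = [("A", "B"), ("B", "C"), ("C", "D"), ("A", "E"),
        ("E", "D"), ("F", "B"), ("C", "H")]

def d_separated(xs, ys, zs, arcs=ARCS):
    """True iff the node sets xs and ys are d-separated given zs,
    using the 'moralized ancestral graph' criterion."""
    xs, ys, zs = set(xs), set(ys), set(zs)
    parents = {}
    for u, v in arcs:
        parents.setdefault(v, set()).add(u)
        parents.setdefault(u, set())
    # 1. restrict to the ancestral set of xs | ys | zs
    anc, stack = set(), list(xs | ys | zs)
    while stack:
        n = stack.pop()
        if n not in anc:
            anc.add(n)
            stack.extend(parents[n])
    # 2. moralize: link co-parents, then drop arc directions
    adj = {n: set() for n in anc}
    for v in anc:
        ps = parents[v] & anc
        for p in ps:
            adj[v].add(p)
            adj[p].add(v)
        for p in ps:
            for q in ps:
                if p != q:
                    adj[p].add(q)
    # 3. remove zs and test reachability from xs to ys
    seen, stack = set(xs - zs), list(xs - zs)
    while stack:
        n = stack.pop()
        for m in adj[n]:
            if m not in zs and m not in seen:
                seen.add(m)
                stack.append(m)
    return not (seen & ys)

print(d_separated({"E"}, {"F"}, {"A", "B", "D"}))          # True
print(d_separated({"E"}, {"B", "F"}, {"A", "C", "D"}))     # True
print(d_separated({"A", "F"}, {"C", "D", "H"}, {"B", "E"}))  # True
print(d_separated({"E"}, {"B", "F"}, {"A", "D"}))          # False: B is needed
```

The last call shows why B cannot be dropped from the conditioning set of `evidenceImpact("E",["A","F","B","D"])`: given D, the collider E->D<-C is open and B reaches E through C.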
Evidence Joint Impact
evidenceJointImpact generalizes evidenceImpact to a set of target variables: it computes the joint conditional distribution of the targets given the (minimal) evidence set.
In [6]:
ie.evidenceJointImpact(["A","F"],["B","C","D","E","H"]) # {B,E} d-separates [A,F] and [C,D,H]
Out[6]:
P(A, F | B, E): the variables C, D and H have been discarded because {B,E} d-separates {A,F} from {C,D,H}. Each row corresponds to one configuration of B and E and gives the joint distribution over the four states of (A, F); each row sums to 1.

| 0.3295 | 0.4631 | 0.1522 | 0.0552 |
| 0.5416 | 0.2587 | 0.0335 | 0.1662 |
| 0.4411 | 0.3174 | 0.2037 | 0.0378 |
| 0.6833 | 0.1671 | 0.0423 | 0.1073 |