# Inference

Inference is the process of computing new probabilistic information from a Bayesian network and some evidence. aGrUM/pyAgrum mainly focuses on the computation of (joint) posteriors of some variables of the Bayesian network given soft or hard evidence, expressed as likelihoods on some variables. Inference is a hard task (NP-complete). aGrUM/pyAgrum implements exact inference as well as approximate inference, which may converge slowly and not exactly, but which can nevertheless be useful in many applications.

# Exact Inference

## Lazy Propagation

Lazy Propagation is the main exact inference method for classical Bayesian networks in aGrUM/pyAgrum.

class pyAgrum.LazyPropagation(*args)

Class used for Lazy Propagation

Available constructors:
LazyPropagation(bn) -> LazyPropagation
Parameters: bn (pyAgrum.BayesNet) – a Bayesian network
BN(LazyPropagation self)
Returns: a constant reference to the IBayesNet referenced by this class (pyAgrum.IBayesNet)
Raises: gum.UndefinedElement – If no Bayes net has been assigned to the inference.
H(LazyPropagation self, int X)

H(LazyPropagation self, str nodeName) -> double

Parameters: X (int) – a node Id, nodeName (str) – a node name
Returns: the computed Shannon entropy of the node given the observation (double)
I(LazyPropagation self, int X, int Y)
Parameters: X (int) – a node Id, Y (int) – another node Id
Returns: the computed mutual information between X and Y (double)
VI(LazyPropagation self, int X, int Y)
Parameters: X (int) – a node Id, Y (int) – another node Id
Returns: the variation of information between X and Y (double)
addAllTargets(LazyPropagation self)

Add all the nodes as targets.

addEvidence(LazyPropagation self, int id, int val)

addEvidence(LazyPropagation self, str nodeName, int val) addEvidence(LazyPropagation self, int id, str val) addEvidence(LazyPropagation self, str nodeName, str val) addEvidence(LazyPropagation self, int id, Vector vals) addEvidence(LazyPropagation self, str nodeName, Vector vals)

Adds new evidence (soft or hard) on a node.

Parameters: id (int) – a node Id, nodeName (str) – a node name, val (int) – a node value, val (str) – the label of the node value, vals (list) – a list of values
Raises: gum.InvalidArgument – If the node already has evidence, gum.InvalidArgument – If val is not a value for the node, gum.InvalidArgument – If the size of vals differs from the domain size of the node, gum.FatalError – If vals is a vector of 0s, gum.UndefinedElement – If the node does not belong to the Bayesian network
addJointTarget(LazyPropagation self, PyObject * list)

Add a list of nodes as a new joint target. As a collateral effect, every node is added as a marginal target.

Parameters: list – a list of names of nodes
Raises: gum.UndefinedElement – If some node(s) do not belong to the Bayesian network
addTarget(LazyPropagation self, int target)

Add a marginal target to the list of targets.

Parameters: target (int) – a node Id, nodeName (str) – a node name
Raises: gum.UndefinedElement – If target is not a NodeId in the Bayes net
chgEvidence(LazyPropagation self, int id, int val)

chgEvidence(LazyPropagation self, str nodeName, int val) chgEvidence(LazyPropagation self, int id, str val) chgEvidence(LazyPropagation self, str nodeName, str val) chgEvidence(LazyPropagation self, int id, Vector vals) chgEvidence(LazyPropagation self, str nodeName, Vector vals)

Change the value of already existing evidence on a node (soft or hard).

Parameters: id (int) – a node Id, nodeName (str) – a node name, val (int) – a node value, val (str) – the label of the node value, vals (list) – a list of values
Raises: gum.InvalidArgument – If the node does not already have evidence, gum.InvalidArgument – If val is not a value for the node, gum.InvalidArgument – If the size of vals differs from the domain size of the node, gum.FatalError – If vals is a vector of 0s, gum.UndefinedElement – If the node does not belong to the Bayesian network
eraseAllEvidence(LazyPropagation self)

Removes all the evidence entered into the network.

eraseAllJointTargets(LazyPropagation self)

Clear all previously defined joint targets.

eraseAllMarginalTargets(LazyPropagation self)

Clear all the previously defined marginal targets.

eraseAllTargets(LazyPropagation self)

Clear all previously defined targets (marginal and joint targets).

As a result, no posterior can be computed (since we can only compute the posteriors of the marginal or joint targets that have been added by the user).

eraseEvidence(LazyPropagation self, int id)

eraseEvidence(LazyPropagation self, str nodeName)

Remove the evidence, if any, corresponding to the node Id or name.

Parameters: id (int) – a node Id, nodeName (str) – a node name
Raises: gum.IndexError – If the node does not belong to the Bayesian network
eraseJointTarget(LazyPropagation self, PyObject * list)

Remove, if existing, the joint target.

Parameters: list – a list of names or Ids of nodes
Raises: gum.IndexError – If one of the nodes does not belong to the Bayesian network, gum.UndefinedElement – If a node Id is not in the Bayesian network
eraseTarget(LazyPropagation self, int target)

eraseTarget(LazyPropagation self, str nodeName)

Remove, if existing, the marginal target.

Parameters: target (int) – a node Id, nodeName (str) – a node name
Raises: gum.IndexError – If the node does not belong to the Bayesian network, gum.UndefinedElement – If a node Id is not in the Bayesian network
evidenceImpact(LazyPropagation self, int target, PyObject * evs)

evidenceImpact(LazyPropagation self, str target, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(target|evs) (for all instantiations of target and evs).

Parameters: target (set) – a set of target ids or names, evs (set) – a set of node ids or names.

Warning

If some evs are d-separated from the target, they are not included in the Potential.

Returns: a Potential for P(targets|evs) (pyAgrum.Potential)
evidenceJointImpact(LazyPropagation self, PyObject * targets, PyObject * evs)

evidenceJointImpact(LazyPropagation self, Vector_string targets, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(joint targets|evs) (for all instantiations of targets and evs).

Parameters: targets – a set of node Ids or names, evs (set) – a set of node ids or names
Returns: a Potential for P(targets|evs) (pyAgrum.Potential)
Raises: gum.Exception – If some evidence entered into the Bayes net is incompatible (its joint probability is 0)
evidenceProbability(LazyPropagation self)
Returns: the probability of evidence double
hardEvidenceNodes(LazyPropagation self)
Returns: the set of nodes with hard evidence set
hasEvidence(LazyPropagation self, int id)

hasEvidence(LazyPropagation self, str nodeName) -> bool

Parameters: id (int) – a node Id, nodeName (str) – a node name
Returns: True if some node(s) (or the one given in parameter) have received evidence (bool)
Raises: gum.IndexError – If the node does not belong to the Bayesian network
hasHardEvidence(LazyPropagation self, str nodeName)
Parameters: id (int) – a node Id, nodeName (str) – a node name
Returns: True if the node has received hard evidence (bool)
Raises: gum.IndexError – If the node does not belong to the Bayesian network
hasSoftEvidence(LazyPropagation self, int id)

hasSoftEvidence(LazyPropagation self, str nodeName) -> bool

Parameters: id (int) – a node Id, nodeName (str) – a node name
Returns: True if the node has received soft evidence (bool)
Raises: gum.IndexError – If the node does not belong to the Bayesian network
isJointTarget(LazyPropagation self, PyObject * list)
Parameters: list – a list of node ids or names
Returns: True if the set is a joint target (bool)
Raises: gum.IndexError – If a node does not belong to the Bayesian network, gum.UndefinedElement – If a node Id is not in the Bayesian network
isTarget(LazyPropagation self, int variable)

isTarget(LazyPropagation self, str nodeName) -> bool

Parameters: variable (int) – a node Id, nodeName (str) – a node name
Returns: True if variable is a (marginal) target (bool)
Raises: gum.IndexError – If the node does not belong to the Bayesian network, gum.UndefinedElement – If the node Id is not in the Bayesian network
joinTree(LazyPropagation self)
Returns: the current join tree used pyAgrum.CliqueGraph
jointMutualInformation(LazyPropagation self, PyObject * targets)
jointPosterior(LazyPropagation self, PyObject * list)

Compute the joint posterior of a set of nodes.

Parameters: list – the list of nodes whose posterior joint probability is wanted

Warning

The order of the variables in the list given here, or when the joint target was declared, cannot be assumed to be the order used by the Potential.

Returns: a reference to the posterior joint probability of the set of nodes (pyAgrum.Potential)
Raises: gum.UndefinedElement – If an element of nodes is not in targets
jointTargets(LazyPropagation self)
Returns: the list of target sets list
junctionTree(LazyPropagation self)
Returns: the current junction tree pyAgrum.CliqueGraph
makeInference(LazyPropagation self)

Perform the heavy computations needed to compute the targets’ posteriors

In a Junction tree propagation scheme, for instance, the heavy computations are those of the messages sent in the JT. This is precisely what makeInference should compute. Later, the computations of the posteriors can be done ‘lightly’ by multiplying and projecting those messages.

nbrEvidence(LazyPropagation self)
Returns: the number of evidence entered into the Bayesian network int
nbrHardEvidence(LazyPropagation self)
Returns: the number of hard evidence entered into the Bayesian network int
nbrJointTargets(LazyPropagation self)
Returns: the number of joint targets int
nbrSoftEvidence(LazyPropagation self)
Returns: the number of soft evidence entered into the Bayesian network int
nbrTargets(LazyPropagation self)
Returns: the number of marginal targets int
posterior(LazyPropagation self, int var)

posterior(LazyPropagation self, str nodeName) -> Potential

Computes and returns the posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability, nodeName (str) – the node name of the node for which we need a posterior probability
Returns: a reference to the posterior probability of the node (pyAgrum.Potential)
Raises: gum.UndefinedElement – If an element of nodes is not in targets
setEvidence(evidces)

Erase all the evidence and apply addEvidence(key, value) for each pair in evidces.

Parameters: evidces (dict) – a dict of evidence
Raises: gum.InvalidArgument – If one value is not a value for the node, gum.InvalidArgument – If the size of a value differs from the domain size of the node, gum.FatalError – If one value is a vector of 0s, gum.UndefinedElement – If one node does not belong to the Bayesian network
setFindBarrenNodesType(LazyPropagation self, pyAgrum.FindBarrenNodesType type)

Sets how barren nodes are determined.

Barren nodes are unnecessary for probability inference, so they can safely be discarded in this case (type = FIND_BARREN_NODES). This speeds up inference. However, there are some cases in which we do not want to remove barren nodes, typically when we want to answer queries such as Most Probable Explanations (MPE).

0 = FIND_NO_BARREN_NODES
1 = FIND_BARREN_NODES

Parameters: type (int) – the finder type
Raises: gum.InvalidArgument – If type is not implemented
setRelevantPotentialsFinderType(LazyPropagation self, pyAgrum.RelevantPotentialsFinderType type)

Sets how the relevant potentials to combine are determined.

When a clique sends a message to a separator, it first constitutes the set of the potentials it contains and of the potentials contained in the messages it received. If RelevantPotentialsFinderType = FIND_ALL, all these potentials are combined and projected to produce the message sent to the separator. If RelevantPotentialsFinderType = DSEP_BAYESBALL_NODES, then only the set of potentials d-connected to the variables of the separator are kept for combination and projection.

0 = FIND_ALL
1 = DSEP_BAYESBALL_NODES
2 = DSEP_BAYESBALL_POTENTIALS
3 = DSEP_KOLLER_FRIEDMAN_2009

Parameters: type (int) – the finder type
Raises: gum.InvalidArgument – If type is not implemented
setTargets(targets)

Remove all the targets and add the ones in parameter.

Parameters: targets (set) – a set of targets
Raises: gum.UndefinedElement – If one target is not in the Bayes net
setTriangulation(LazyPropagation self, Triangulation new_triangulation)
softEvidenceNodes(LazyPropagation self)
Returns: the set of nodes with soft evidence set
targets(LazyPropagation self)
Returns: the list of marginal targets list
updateEvidence(evidces)

Apply chgEvidence(key, value) for each pair in evidces (or addEvidence(key, value) if the evidence does not already exist).

Parameters: evidces (dict) – a dict of evidence
Raises: gum.InvalidArgument – If one value is not a value for the node, gum.InvalidArgument – If the size of a value differs from the domain size of the node, gum.FatalError – If one value is a vector of 0s, gum.UndefinedElement – If one node does not belong to the Bayesian network

## Shafer-Shenoy Inference

class pyAgrum.ShaferShenoyInference(*args)

Class used for Shafer-Shenoy inferences.

Available constructors:
ShaferShenoyInference(bn) -> ShaferShenoyInference
Parameters: bn (pyAgrum.BayesNet) – a Bayesian network
BN(ShaferShenoyInference self)
Returns: a constant reference to the IBayesNet referenced by this class (pyAgrum.IBayesNet)
Raises: gum.UndefinedElement – If no Bayes net has been assigned to the inference.
H(ShaferShenoyInference self, int X)

H(ShaferShenoyInference self, str nodeName) -> double

Parameters: X (int) – a node Id, nodeName (str) – a node name
Returns: the computed Shannon entropy of the node given the observation (double)
I(ShaferShenoyInference self, int X, int Y)
Parameters: X (int) – a node Id, Y (int) – another node Id
Returns: the computed mutual information between X and Y (double)
VI(ShaferShenoyInference self, int X, int Y)
Parameters: X (int) – a node Id, Y (int) – another node Id
Returns: the variation of information between X and Y (double)
addAllTargets(ShaferShenoyInference self)

Add all the nodes as targets.

addEvidence(ShaferShenoyInference self, int id, int val)

addEvidence(ShaferShenoyInference self, str nodeName, int val) addEvidence(ShaferShenoyInference self, int id, str val) addEvidence(ShaferShenoyInference self, str nodeName, str val) addEvidence(ShaferShenoyInference self, int id, Vector vals) addEvidence(ShaferShenoyInference self, str nodeName, Vector vals)

Adds new evidence (soft or hard) on a node.

Parameters: id (int) – a node Id, nodeName (str) – a node name, val (int) – a node value, val (str) – the label of the node value, vals (list) – a list of values
Raises: gum.InvalidArgument – If the node already has evidence, gum.InvalidArgument – If val is not a value for the node, gum.InvalidArgument – If the size of vals differs from the domain size of the node, gum.FatalError – If vals is a vector of 0s, gum.UndefinedElement – If the node does not belong to the Bayesian network
addJointTarget(ShaferShenoyInference self, PyObject * list)

Add a list of nodes as a new joint target. As a collateral effect, every node is added as a marginal target.

Parameters: list – a list of names of nodes
Raises: gum.UndefinedElement – If some node(s) do not belong to the Bayesian network
addTarget(ShaferShenoyInference self, int target)

Add a marginal target to the list of targets.

Parameters: target (int) – a node Id, nodeName (str) – a node name
Raises: gum.UndefinedElement – If target is not a NodeId in the Bayes net
chgEvidence(ShaferShenoyInference self, int id, int val)

chgEvidence(ShaferShenoyInference self, str nodeName, int val) chgEvidence(ShaferShenoyInference self, int id, str val) chgEvidence(ShaferShenoyInference self, str nodeName, str val) chgEvidence(ShaferShenoyInference self, int id, Vector vals) chgEvidence(ShaferShenoyInference self, str nodeName, Vector vals)

Change the value of already existing evidence on a node (soft or hard).

Parameters: id (int) – a node Id, nodeName (str) – a node name, val (int) – a node value, val (str) – the label of the node value, vals (list) – a list of values
Raises: gum.InvalidArgument – If the node does not already have evidence, gum.InvalidArgument – If val is not a value for the node, gum.InvalidArgument – If the size of vals differs from the domain size of the node, gum.FatalError – If vals is a vector of 0s, gum.UndefinedElement – If the node does not belong to the Bayesian network
eraseAllEvidence(ShaferShenoyInference self)

Removes all the evidence entered into the network.

eraseAllJointTargets(ShaferShenoyInference self)

Clear all previously defined joint targets.

eraseAllMarginalTargets(ShaferShenoyInference self)

Clear all the previously defined marginal targets.

eraseAllTargets(ShaferShenoyInference self)

Clear all previously defined targets (marginal and joint targets).

As a result, no posterior can be computed (since we can only compute the posteriors of the marginal or joint targets that have been added by the user).

eraseEvidence(ShaferShenoyInference self, int id)

eraseEvidence(ShaferShenoyInference self, str nodeName)

Remove the evidence, if any, corresponding to the node Id or name.

Parameters: id (int) – a node Id, nodeName (str) – a node name
Raises: gum.IndexError – If the node does not belong to the Bayesian network
eraseJointTarget(ShaferShenoyInference self, PyObject * list)

Remove, if existing, the joint target.

Parameters: list – a list of names or Ids of nodes
Raises: gum.IndexError – If one of the nodes does not belong to the Bayesian network, gum.UndefinedElement – If a node Id is not in the Bayesian network
eraseTarget(ShaferShenoyInference self, int target)

eraseTarget(ShaferShenoyInference self, str nodeName)

Remove, if existing, the marginal target.

Parameters: target (int) – a node Id, nodeName (str) – a node name
Raises: gum.IndexError – If the node does not belong to the Bayesian network, gum.UndefinedElement – If a node Id is not in the Bayesian network
evidenceImpact(ShaferShenoyInference self, int target, PyObject * evs)

evidenceImpact(ShaferShenoyInference self, str target, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(target|evs) (for all instantiations of target and evs).

Parameters: target (set) – a set of target ids or names, evs (set) – a set of node ids or names.

Warning

If some evs are d-separated from the target, they are not included in the Potential.

Returns: a Potential for P(targets|evs) (pyAgrum.Potential)
evidenceJointImpact(ShaferShenoyInference self, PyObject * targets, PyObject * evs)

evidenceJointImpact(ShaferShenoyInference self, Vector_string targets, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(joint targets|evs) (for all instantiations of targets and evs).

Parameters: targets – a set of node Ids or names, evs (set) – a set of node ids or names
Returns: a Potential for P(targets|evs) (pyAgrum.Potential)
Raises: gum.Exception – If some evidence entered into the Bayes net is incompatible (its joint probability is 0)
evidenceProbability(ShaferShenoyInference self)
Returns: the probability of evidence double
hardEvidenceNodes(ShaferShenoyInference self)
Returns: the set of nodes with hard evidence set
hasEvidence(ShaferShenoyInference self, int id)

hasEvidence(ShaferShenoyInference self, str nodeName) -> bool

Parameters: id (int) – a node Id, nodeName (str) – a node name
Returns: True if some node(s) (or the one given in parameter) have received evidence (bool)
Raises: gum.IndexError – If the node does not belong to the Bayesian network
hasHardEvidence(ShaferShenoyInference self, str nodeName)
Parameters: id (int) – a node Id, nodeName (str) – a node name
Returns: True if the node has received hard evidence (bool)
Raises: gum.IndexError – If the node does not belong to the Bayesian network
hasSoftEvidence(ShaferShenoyInference self, int id)

hasSoftEvidence(ShaferShenoyInference self, str nodeName) -> bool

Parameters: id (int) – a node Id, nodeName (str) – a node name
Returns: True if the node has received soft evidence (bool)
Raises: gum.IndexError – If the node does not belong to the Bayesian network
isJointTarget(ShaferShenoyInference self, PyObject * list)
Parameters: list – a list of node ids or names
Returns: True if the set is a joint target (bool)
Raises: gum.IndexError – If a node does not belong to the Bayesian network, gum.UndefinedElement – If a node Id is not in the Bayesian network
isTarget(ShaferShenoyInference self, int variable)

isTarget(ShaferShenoyInference self, str nodeName) -> bool

Parameters: variable (int) – a node Id, nodeName (str) – a node name
Returns: True if variable is a (marginal) target (bool)
Raises: gum.IndexError – If the node does not belong to the Bayesian network, gum.UndefinedElement – If the node Id is not in the Bayesian network
joinTree(ShaferShenoyInference self)
Returns: the current join tree used pyAgrum.CliqueGraph
jointMutualInformation(ShaferShenoyInference self, PyObject * targets)
jointPosterior(ShaferShenoyInference self, PyObject * list)

Compute the joint posterior of a set of nodes.

Parameters: list – the list of nodes whose posterior joint probability is wanted

Warning

The order of the variables in the list given here, or when the joint target was declared, cannot be assumed to be the order used by the Potential.

Returns: a reference to the posterior joint probability of the set of nodes (pyAgrum.Potential)
Raises: gum.UndefinedElement – If an element of nodes is not in targets
jointTargets(ShaferShenoyInference self)
Returns: the list of target sets list
junctionTree(ShaferShenoyInference self)
Returns: the current junction tree pyAgrum.CliqueGraph
makeInference(ShaferShenoyInference self)

Perform the heavy computations needed to compute the targets’ posteriors

In a Junction tree propagation scheme, for instance, the heavy computations are those of the messages sent in the JT. This is precisely what makeInference should compute. Later, the computations of the posteriors can be done ‘lightly’ by multiplying and projecting those messages.

nbrEvidence(ShaferShenoyInference self)
Returns: the number of evidence entered into the Bayesian network int
nbrHardEvidence(ShaferShenoyInference self)
Returns: the number of hard evidence entered into the Bayesian network int
nbrJointTargets(ShaferShenoyInference self)
Returns: the number of joint targets int
nbrSoftEvidence(ShaferShenoyInference self)
Returns: the number of soft evidence entered into the Bayesian network int
nbrTargets(ShaferShenoyInference self)
Returns: the number of marginal targets int
posterior(ShaferShenoyInference self, int var)

posterior(ShaferShenoyInference self, str nodeName) -> Potential

Computes and returns the posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability, nodeName (str) – the node name of the node for which we need a posterior probability
Returns: a reference to the posterior probability of the node (pyAgrum.Potential)
Raises: gum.UndefinedElement – If an element of nodes is not in targets
setEvidence(evidces)

Erase all the evidence and apply addEvidence(key, value) for each pair in evidces.

Parameters: evidces (dict) – a dict of evidence
Raises: gum.InvalidArgument – If one value is not a value for the node, gum.InvalidArgument – If the size of a value differs from the domain size of the node, gum.FatalError – If one value is a vector of 0s, gum.UndefinedElement – If one node does not belong to the Bayesian network
setFindBarrenNodesType(ShaferShenoyInference self, pyAgrum.FindBarrenNodesType type)

Sets how barren nodes are determined.

Barren nodes are unnecessary for probability inference, so they can safely be discarded in this case (type = FIND_BARREN_NODES). This speeds up inference. However, there are some cases in which we do not want to remove barren nodes, typically when we want to answer queries such as Most Probable Explanations (MPE).

0 = FIND_NO_BARREN_NODES
1 = FIND_BARREN_NODES

Parameters: type (int) – the finder type
Raises: gum.InvalidArgument – If type is not implemented
setTargets(targets)

Remove all the targets and add the ones in parameter.

Parameters: targets (set) – a set of targets
Raises: gum.UndefinedElement – If one target is not in the Bayes net
setTriangulation(ShaferShenoyInference self, Triangulation new_triangulation)
softEvidenceNodes(ShaferShenoyInference self)
Returns: the set of nodes with soft evidence set
targets(ShaferShenoyInference self)
Returns: the list of marginal targets list
updateEvidence(evidces)

Apply chgEvidence(key, value) for each pair in evidces (or addEvidence(key, value) if the evidence does not already exist).

Parameters: evidces (dict) – a dict of evidence
Raises: gum.InvalidArgument – If one value is not a value for the node, gum.InvalidArgument – If the size of a value differs from the domain size of the node, gum.FatalError – If one value is a vector of 0s, gum.UndefinedElement – If one node does not belong to the Bayesian network

## Variable Elimination

class pyAgrum.VariableElimination(*args)

Class used for Variable Elimination inference algorithm.

Available constructors:
VariableElimination(bn) -> VariableElimination
Parameters: bn (pyAgrum.BayesNet) – a Bayesian network
BN(VariableElimination self)
Returns: a constant reference to the IBayesNet referenced by this class (pyAgrum.IBayesNet)
Raises: gum.UndefinedElement – If no Bayes net has been assigned to the inference.
H(VariableElimination self, int X)

H(VariableElimination self, str nodeName) -> double

Parameters: X (int) – a node Id, nodeName (str) – a node name
Returns: the computed Shannon entropy of the node given the observation (double)
addAllTargets(VariableElimination self)

Add all the nodes as targets.

addEvidence(VariableElimination self, int id, int val)

addEvidence(VariableElimination self, str nodeName, int val) addEvidence(VariableElimination self, int id, str val) addEvidence(VariableElimination self, str nodeName, str val) addEvidence(VariableElimination self, int id, Vector vals) addEvidence(VariableElimination self, str nodeName, Vector vals)

Adds new evidence (soft or hard) on a node.

Parameters: id (int) – a node Id, nodeName (str) – a node name, val (int) – a node value, val (str) – the label of the node value, vals (list) – a list of values
Raises: gum.InvalidArgument – If the node already has evidence, gum.InvalidArgument – If val is not a value for the node, gum.InvalidArgument – If the size of vals differs from the domain size of the node, gum.FatalError – If vals is a vector of 0s, gum.UndefinedElement – If the node does not belong to the Bayesian network
addJointTarget(VariableElimination self, PyObject * list)

Add a list of nodes as a new joint target. As a collateral effect, every node is added as a marginal target.

Parameters: list – a list of names of nodes
Raises: gum.UndefinedElement – If some node(s) do not belong to the Bayesian network
addTarget(VariableElimination self, int target)

Add a marginal target to the list of targets.

Parameters: target (int) – a node Id, nodeName (str) – a node name
Raises: gum.UndefinedElement – If target is not a NodeId in the Bayes net
chgEvidence(VariableElimination self, int id, int val)

chgEvidence(VariableElimination self, str nodeName, int val) chgEvidence(VariableElimination self, int id, str val) chgEvidence(VariableElimination self, str nodeName, str val) chgEvidence(VariableElimination self, int id, Vector vals) chgEvidence(VariableElimination self, str nodeName, Vector vals)

Change the value of already existing evidence on a node (soft or hard).

Parameters: id (int) – a node Id, nodeName (str) – a node name, val (int) – a node value, val (str) – the label of the node value, vals (list) – a list of values
Raises: gum.InvalidArgument – If the node does not already have evidence, gum.InvalidArgument – If val is not a value for the node, gum.InvalidArgument – If the size of vals differs from the domain size of the node, gum.FatalError – If vals is a vector of 0s, gum.UndefinedElement – If the node does not belong to the Bayesian network
eraseAllEvidence(VariableElimination self)

Removes all the evidence entered into the network.

eraseAllTargets(VariableElimination self)

Clear all previously defined targets (marginal and joint targets).

As a result, no posterior can be computed (since we can only compute the posteriors of the marginal or joint targets that have been added by the user).

eraseEvidence(VariableElimination self, int id)

eraseEvidence(VariableElimination self, str nodeName)

Remove the evidence, if any, corresponding to the node Id or name.

Parameters: id (int) – a node Id, nodeName (str) – a node name
Raises: gum.IndexError – If the node does not belong to the Bayesian network
eraseJointTarget(VariableElimination self, PyObject * list)

Remove, if existing, the joint target.

Parameters: list – a list of names or Ids of nodes
Raises: gum.IndexError – If one of the nodes does not belong to the Bayesian network, gum.UndefinedElement – If a node Id is not in the Bayesian network
eraseTarget(VariableElimination self, int target)

eraseTarget(VariableElimination self, str nodeName)

Remove, if existing, the marginal target.

Parameters: target (int) – a node Id, nodeName (str) – a node name
Raises: gum.IndexError – If the node does not belong to the Bayesian network, gum.UndefinedElement – If a node Id is not in the Bayesian network
evidenceImpact(VariableElimination self, int target, PyObject * evs)

evidenceImpact(VariableElimination self, str target, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(target|evs) (for all instantiations of target and evs).

Parameters: target (set) – a set of target ids or names, evs (set) – a set of node ids or names.

Warning

If some evs are d-separated from the target, they are not included in the Potential.

Returns: a Potential for P(targets|evs) (pyAgrum.Potential)
evidenceJointImpact(VariableElimination self, PyObject * targets, PyObject * evs)

Create a pyAgrum.Potential for P(joint targets|evs) (for all instantiations of targets and evs).

Parameters: targets – a set of node Ids or names, evs (set) – a set of node ids or names
Returns: a Potential for P(targets|evs) (pyAgrum.Potential)
Raises: gum.Exception – If some evidence entered into the Bayes net is incompatible (its joint probability is 0)
hardEvidenceNodes(VariableElimination self)
Returns: the set of nodes with hard evidence set
hasEvidence(VariableElimination self, int id)

hasEvidence(VariableElimination self, str nodeName) -> bool

Parameters: id (int) – a node Id, nodeName (str) – a node name
Returns: True if some node(s) (or the one given in parameter) have received evidence (bool)
Raises: gum.IndexError – If the node does not belong to the Bayesian network
hasHardEvidence(VariableElimination self, str nodeName)
Parameters: id (int) – a node Id, nodeName (str) – a node name
Returns: True if the node has received hard evidence (bool)
Raises: gum.IndexError – If the node does not belong to the Bayesian network
hasSoftEvidence(VariableElimination self, int id)

hasSoftEvidence(VariableElimination self, str nodeName) -> bool

Parameters: id (int) – a node Id, nodeName (str) – a node name
Returns: True if the node has received soft evidence (bool)
Raises: gum.IndexError – If the node does not belong to the Bayesian network
isJointTarget(VariableElimination self, PyObject * list)
Parameters: list – a list of node ids or names
Returns: True if the set is a joint target (bool)
Raises: gum.IndexError – If a node does not belong to the Bayesian network, gum.UndefinedElement – If a node Id is not in the Bayesian network
isTarget(VariableElimination self, int variable)

isTarget(VariableElimination self, str nodeName) -> bool

Parameters: variable (int) – a node Id, nodeName (str) – a node name
Returns: True if variable is a (marginal) target (bool)
Raises: gum.IndexError – If the node does not belong to the Bayesian network, gum.UndefinedElement – If the node Id is not in the Bayesian network
jointMutualInformation(VariableElimination self, PyObject * targets)
jointPosterior(VariableElimination self, PyObject * list)

Compute the joint posterior of a set of nodes.

Parameters: list – the list of nodes whose posterior joint probability is wanted

Warning

The order of the variables in the list given here, or when the joint target was declared, cannot be assumed to be the order used by the Potential.

Returns: a reference to the posterior joint probability of the set of nodes (pyAgrum.Potential)
Raises: gum.UndefinedElement – If an element of nodes is not in targets
jointTargets(VariableElimination self)
Returns: the list of target sets list
junctionTree(VariableElimination self, int id)
Returns: the current junction tree pyAgrum.CliqueGraph
makeInference(VariableElimination self)

Perform the heavy computations needed to compute the targets’ posteriors

In a Junction tree propagation scheme, for instance, the heavy computations are those of the messages sent in the JT. This is precisely what makeInference should compute. Later, the computations of the posteriors can be done ‘lightly’ by multiplying and projecting those messages.

nbrEvidence(VariableElimination self)
Returns: the number of evidence entered into the Bayesian network int
nbrHardEvidence(VariableElimination self)
Returns: the number of hard evidence entered into the Bayesian network int
nbrSoftEvidence(VariableElimination self)
Returns: the number of soft evidence entered into the Bayesian network int
nbrTargets(VariableElimination self)
Returns: the number of marginal targets int
posterior(VariableElimination self, int var)

posterior(VariableElimination self, str nodeName) -> Potential

Computes and returns the posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
setEvidence(evidces)

Erase all evidence, then apply addEvidence(key, value) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
setFindBarrenNodesType(VariableElimination self, pyAgrum.FindBarrenNodesType type)

sets how we determine barren nodes

Barren nodes are unnecessary for probability inference, so they can be safely discarded in this case (type = FIND_BARREN_NODES). This speeds up inference. However, there are some cases in which we do not want to remove barren nodes, typically when we want to answer queries such as Most Probable Explanations (MPE).

0 = FIND_NO_BARREN_NODES 1 = FIND_BARREN_NODES

Parameters: type (int) – the finder type gum.InvalidArgument – If type is not implemented
setRelevantPotentialsFinderType(VariableElimination self, pyAgrum.RelevantPotentialsFinderType type)

sets how we determine the relevant potentials to combine

When a clique sends a message to a separator, it first constitutes the set of the potentials it contains and of the potentials contained in the messages it received. If RelevantPotentialsFinderType = FIND_ALL, all these potentials are combined and projected to produce the message sent to the separator. If RelevantPotentialsFinderType = DSEP_BAYESBALL_NODES, then only the set of potentials d-connected to the variables of the separator are kept for combination and projection.

0 = FIND_ALL 1 = DSEP_BAYESBALL_NODES 2 = DSEP_BAYESBALL_POTENTIALS 3 = DSEP_KOLLER_FRIEDMAN_2009

Parameters: type (int) – the finder type gum.InvalidArgument – If type is not implemented
setTargets(targets)

Remove all the targets and add the ones in parameter.

Parameters: targets (set) – a set of targets gum.UndefinedElement – If one target is not in the Bayes net
setTriangulation(VariableElimination self, Triangulation new_triangulation)
softEvidenceNodes(VariableElimination self)
Returns: the set of nodes with soft evidence set
targets(VariableElimination self)
Returns: the list of marginal targets list
updateEvidence(evidces)

Apply chgEvidence(key, value) for every pair in evidces (or addEvidence(key, value) when the node has no evidence yet).

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network

# Approximated Inference¶

## Loopy Belief Propagation¶

class pyAgrum.LoopyBeliefPropagation(bn: pyAgrum.IBayesNet)

Class used for inferences using loopy belief propagation algorithm.

Available constructors:
LoopyBeliefPropagation(bn) -> LoopyBeliefPropagation
Parameters: bn (pyAgrum.BayesNet) – a Bayesian network
BN(LoopyBeliefPropagation self)
Returns: A constant reference to the IBayesNet referenced by this class. pyAgrum.IBayesNet gum.UndefinedElement – If no Bayes net has been assigned to the inference.
H(LoopyBeliefPropagation self, int X)

H(LoopyBeliefPropagation self, str nodeName) -> double

Parameters: X (int) – a node Id nodeName (str) – a node name the computed Shannon entropy of the node given the observations double
addAllTargets(LoopyBeliefPropagation self)

Add all the nodes as targets.

addEvidence(LoopyBeliefPropagation self, int id, int val)

addEvidence(LoopyBeliefPropagation self, str nodeName, int val) addEvidence(LoopyBeliefPropagation self, int id, str val) addEvidence(LoopyBeliefPropagation self, str nodeName, str val) addEvidence(LoopyBeliefPropagation self, int id, Vector vals) addEvidence(LoopyBeliefPropagation self, str nodeName, Vector vals)

Adds a new evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node already has evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
addTarget(LoopyBeliefPropagation self, int target)

Add a marginal target to the list of targets.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.UndefinedElement – If target is not a NodeId in the Bayes net
chgEvidence(LoopyBeliefPropagation self, int id, int val)

chgEvidence(LoopyBeliefPropagation self, str nodeName, int val) chgEvidence(LoopyBeliefPropagation self, int id, str val) chgEvidence(LoopyBeliefPropagation self, str nodeName, str val) chgEvidence(LoopyBeliefPropagation self, int id, Vector vals) chgEvidence(LoopyBeliefPropagation self, str nodeName, Vector vals)

Change the value of an already existing evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node does not already have evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
currentTime(LoopyBeliefPropagation self)
Returns: the current running time in seconds double
epsilon(LoopyBeliefPropagation self)
Returns: the value of epsilon double
eraseAllEvidence(LoopyBeliefPropagation self)

Removes all the evidence entered into the network.

eraseAllTargets(LoopyBeliefPropagation self)

Clear all previously defined targets (marginal and joint targets).

As a result, no posterior can be computed (since we can only compute the posteriors of the marginal or joint targets that have been added by the user).

eraseEvidence(LoopyBeliefPropagation self, int id)

eraseEvidence(LoopyBeliefPropagation self, str nodeName)

Remove the evidence, if any, corresponding to the node Id or name.

Parameters: id (int) – a node Id nodeName (str) – a node name gum.IndexError – If the node does not belong to the Bayesian network
eraseTarget(LoopyBeliefPropagation self, int target)

eraseTarget(LoopyBeliefPropagation self, str nodeName)

Remove, if existing, the marginal target.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.IndexError – If one of the nodes does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
evidenceImpact(LoopyBeliefPropagation self, int target, PyObject * evs)

evidenceImpact(LoopyBeliefPropagation self, str target, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(target|evs) (for all instantiations of target and evs)

Parameters: target (set) – a set of targets ids or names. evs (set) – a set of nodes ids or names.

Warning

if some evs are d-separated, they are not included in the Potential.

Returns: a Potential for P(targets|evs) pyAgrum.Potential
hardEvidenceNodes(LoopyBeliefPropagation self)
Returns: the set of nodes with hard evidence set
hasEvidence(LoopyBeliefPropagation self, int id)

hasEvidence(LoopyBeliefPropagation self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if some node (or the one given in parameter) has received evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasHardEvidence(LoopyBeliefPropagation self, str nodeName)
Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received hard evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasSoftEvidence(LoopyBeliefPropagation self, int id)

hasSoftEvidence(LoopyBeliefPropagation self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received soft evidence bool gum.IndexError – If the node does not belong to the Bayesian network
history(LoopyBeliefPropagation self)
Returns: the scheme history tuple gum.OperationNotAllowed – If the scheme has not been performed or if verbosity is set to false
isTarget(LoopyBeliefPropagation self, int variable)

isTarget(LoopyBeliefPropagation self, str nodeName) -> bool

Parameters: variable (int) – a node Id nodeName (str) – a node name True if variable is a (marginal) target bool gum.IndexError – If the node does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
makeInference(LoopyBeliefPropagation self)

Perform the heavy computations needed to compute the targets’ posteriors

In a Junction tree propagation scheme, for instance, the heavy computations are those of the messages sent in the JT. This is precisely what makeInference should compute. Later, the computations of the posteriors can be done ‘lightly’ by multiplying and projecting those messages.

maxIter(LoopyBeliefPropagation self)
Returns: the criterion on number of iterations int
maxTime(LoopyBeliefPropagation self)
Returns: the timeout(in seconds) double
messageApproximationScheme(LoopyBeliefPropagation self)
Returns: the approximation scheme message str
minEpsilonRate(LoopyBeliefPropagation self)
Returns: the value of the minimal epsilon rate double
nbrEvidence(LoopyBeliefPropagation self)
Returns: the number of evidence entered into the Bayesian network int
nbrHardEvidence(LoopyBeliefPropagation self)
Returns: the number of hard evidence entered into the Bayesian network int
nbrIterations(LoopyBeliefPropagation self)
Returns: the number of iterations int
nbrSoftEvidence(LoopyBeliefPropagation self)
Returns: the number of soft evidence entered into the Bayesian network int
nbrTargets(LoopyBeliefPropagation self)
Returns: the number of marginal targets int
periodSize(LoopyBeliefPropagation self)
Returns: the number of samples between two stopping tests int gum.OutOfLowerBound – If p<1
posterior(LoopyBeliefPropagation self, int var)

posterior(LoopyBeliefPropagation self, str nodeName) -> Potential

Computes and returns the posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
setEpsilon(LoopyBeliefPropagation self, double eps)
Parameters: eps (double) – the epsilon we want to use gum.OutOfLowerBound – If eps<0
setEvidence(evidces)

Erase all evidence, then apply addEvidence(key, value) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
setMaxIter(LoopyBeliefPropagation self, int max)
Parameters: max (int) – the maximum number of iterations gum.OutOfLowerBound – If max <= 1
setMaxTime(LoopyBeliefPropagation self, double timeout)
Parameters: timeout (double) – stopping criterion on timeout (in seconds) gum.OutOfLowerBound – If timeout<=0.0
setMinEpsilonRate(LoopyBeliefPropagation self, double rate)
Parameters: rate (double) – the minimal epsilon rate
setPeriodSize(LoopyBeliefPropagation self, int p)
Parameters: p (int) – number of samples between two stopping tests gum.OutOfLowerBound – If p<1
setTargets(targets)

Remove all the targets and add the ones in parameter.

Parameters: targets (set) – a set of targets gum.UndefinedElement – If one target is not in the Bayes net
setVerbosity(LoopyBeliefPropagation self, bool v)
Parameters: v (bool) – verbosity
softEvidenceNodes(LoopyBeliefPropagation self)
Returns: the set of nodes with soft evidence set
targets(LoopyBeliefPropagation self)
Returns: the list of marginal targets list
updateEvidence(evidces)

Apply chgEvidence(key, value) for every pair in evidces (or addEvidence(key, value) when the node has no evidence yet).

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
verbosity(LoopyBeliefPropagation self)
Returns: True if the verbosity is enabled bool

## Sampling¶

### Gibbs Sampling¶

class pyAgrum.GibbsSampling(bn: pyAgrum.IBayesNet)

Class for Gibbs sampling inference in Bayesian networks.

Available constructors:
GibbsSampling(bn) -> GibbsSampling
Parameters: bn (pyAgrum.BayesNet) – a Bayesian network
BN(GibbsSampling self)
Returns: A constant reference to the IBayesNet referenced by this class. pyAgrum.IBayesNet gum.UndefinedElement – If no Bayes net has been assigned to the inference.
H(GibbsSampling self, int X)

H(GibbsSampling self, str nodeName) -> double

Parameters: X (int) – a node Id nodeName (str) – a node name the computed Shannon entropy of the node given the observations double
addAllTargets(GibbsSampling self)

Add all the nodes as targets.

addEvidence(GibbsSampling self, int id, int val)

addEvidence(GibbsSampling self, str nodeName, int val) addEvidence(GibbsSampling self, int id, str val) addEvidence(GibbsSampling self, str nodeName, str val) addEvidence(GibbsSampling self, int id, Vector vals) addEvidence(GibbsSampling self, str nodeName, Vector vals)

Adds a new evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node already has evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
addTarget(GibbsSampling self, int target)

Add a marginal target to the list of targets.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.UndefinedElement – If target is not a NodeId in the Bayes net
burnIn(GibbsSampling self)
Returns: the size of the burn-in, in number of iterations int
chgEvidence(GibbsSampling self, int id, int val)

chgEvidence(GibbsSampling self, str nodeName, int val) chgEvidence(GibbsSampling self, int id, str val) chgEvidence(GibbsSampling self, str nodeName, str val) chgEvidence(GibbsSampling self, int id, Vector vals) chgEvidence(GibbsSampling self, str nodeName, Vector vals)

Change the value of an already existing evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node does not already have evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
currentPosterior(GibbsSampling self, int id)

currentPosterior(GibbsSampling self, str name) -> Potential

Computes and returns the current posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the current posterior probability of the node pyAgrum.Potential UndefinedElement – If an element of nodes is not in targets
currentTime(GibbsSampling self)
Returns: the current running time in seconds double
epsilon(GibbsSampling self)
Returns: the value of epsilon double
eraseAllEvidence(GibbsSampling self)

Removes all the evidence entered into the network.

eraseAllTargets(GibbsSampling self)

Clear all previously defined targets (marginal and joint targets).

As a result, no posterior can be computed (since we can only compute the posteriors of the marginal or joint targets that have been added by the user).

eraseEvidence(GibbsSampling self, int id)

eraseEvidence(GibbsSampling self, str nodeName)

Remove the evidence, if any, corresponding to the node Id or name.

Parameters: id (int) – a node Id nodeName (str) – a node name gum.IndexError – If the node does not belong to the Bayesian network
eraseTarget(GibbsSampling self, int target)

eraseTarget(GibbsSampling self, str nodeName)

Remove, if existing, the marginal target.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.IndexError – If one of the nodes does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
evidenceImpact(GibbsSampling self, int target, PyObject * evs)

evidenceImpact(GibbsSampling self, str target, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(target|evs) (for all instantiations of target and evs)

Parameters: target (set) – a set of targets ids or names. evs (set) – a set of nodes ids or names.

Warning

if some evs are d-separated, they are not included in the Potential.

Returns: a Potential for P(targets|evs) pyAgrum.Potential
hardEvidenceNodes(GibbsSampling self)
Returns: the set of nodes with hard evidence set
hasEvidence(GibbsSampling self, int id)

hasEvidence(GibbsSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if some node (or the one given in parameter) has received evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasHardEvidence(GibbsSampling self, str nodeName)
Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received hard evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasSoftEvidence(GibbsSampling self, int id)

hasSoftEvidence(GibbsSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received soft evidence bool gum.IndexError – If the node does not belong to the Bayesian network
history(GibbsSampling self)
Returns: the scheme history tuple gum.OperationNotAllowed – If the scheme has not been performed or if verbosity is set to false
isDrawnAtRandom(GibbsSampling self)
Returns: True if variables are drawn at random bool
isTarget(GibbsSampling self, int variable)

isTarget(GibbsSampling self, str nodeName) -> bool

Parameters: variable (int) – a node Id nodeName (str) – a node name True if variable is a (marginal) target bool gum.IndexError – If the node does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
makeInference(GibbsSampling self)

Perform the heavy computations needed to compute the targets’ posteriors

In a Junction tree propagation scheme, for instance, the heavy computations are those of the messages sent in the JT. This is precisely what makeInference should compute. Later, the computations of the posteriors can be done ‘lightly’ by multiplying and projecting those messages.

maxIter(GibbsSampling self)
Returns: the criterion on number of iterations int
maxTime(GibbsSampling self)
Returns: the timeout(in seconds) double
messageApproximationScheme(GibbsSampling self)
Returns: the approximation scheme message str
minEpsilonRate(GibbsSampling self)
Returns: the value of the minimal epsilon rate double
nbrDrawnVar(GibbsSampling self)
Returns: the number of variables drawn at each iteration int
nbrEvidence(GibbsSampling self)
Returns: the number of evidence entered into the Bayesian network int
nbrHardEvidence(GibbsSampling self)
Returns: the number of hard evidence entered into the Bayesian network int
nbrIterations(GibbsSampling self)
Returns: the number of iterations int
nbrSoftEvidence(GibbsSampling self)
Returns: the number of soft evidence entered into the Bayesian network int
nbrTargets(GibbsSampling self)
Returns: the number of marginal targets int
periodSize(GibbsSampling self)
Returns: the number of samples between two stopping tests int gum.OutOfLowerBound – If p<1
posterior(GibbsSampling self, int var)

posterior(GibbsSampling self, str nodeName) -> Potential

Computes and returns the posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
setBurnIn(GibbsSampling self, int b)
Parameters: b (int) – the size of the burn-in, in number of iterations
setDrawnAtRandom(GibbsSampling self, bool _atRandom)
Parameters: _atRandom (bool) – indicates if variables should be drawn at random
setEpsilon(GibbsSampling self, double eps)
Parameters: eps (double) – the epsilon we want to use gum.OutOfLowerBound – If eps<0
setEvidence(evidces)

Erase all evidence, then apply addEvidence(key, value) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
setMaxIter(GibbsSampling self, int max)
Parameters: max (int) – the maximum number of iterations gum.OutOfLowerBound – If max <= 1
setMaxTime(GibbsSampling self, double timeout)
Parameters: timeout (double) – stopping criterion on timeout (in seconds) gum.OutOfLowerBound – If timeout<=0.0
setMinEpsilonRate(GibbsSampling self, double rate)
Parameters: rate (double) – the minimal epsilon rate
setNbrDrawnVar(GibbsSampling self, int _nbr)
Parameters: _nbr (int) – the number of variables to be drawn at each iteration
setPeriodSize(GibbsSampling self, int p)
Parameters: p (int) – number of samples between two stopping tests gum.OutOfLowerBound – If p<1
setTargets(targets)

Remove all the targets and add the ones in parameter.

Parameters: targets (set) – a set of targets gum.UndefinedElement – If one target is not in the Bayes net
setVerbosity(GibbsSampling self, bool v)
Parameters: v (bool) – verbosity
softEvidenceNodes(GibbsSampling self)
Returns: the set of nodes with soft evidence set
targets(GibbsSampling self)
Returns: the list of marginal targets list
updateEvidence(evidces)

Apply chgEvidence(key, value) for every pair in evidces (or addEvidence(key, value) when the node has no evidence yet).

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
verbosity(GibbsSampling self)
Returns: True if the verbosity is enabled bool

### Monte Carlo Sampling¶

class pyAgrum.MonteCarloSampling(bn: pyAgrum.IBayesNet)

Class used for Monte Carlo sampling inference algorithm.

Available constructors:
MonteCarloSampling(bn) -> MonteCarloSampling
Parameters: bn (pyAgrum.BayesNet) – a Bayesian network
BN(MonteCarloSampling self)
Returns: A constant reference to the IBayesNet referenced by this class. pyAgrum.IBayesNet gum.UndefinedElement – If no Bayes net has been assigned to the inference.
H(MonteCarloSampling self, int X)

H(MonteCarloSampling self, str nodeName) -> double

Parameters: X (int) – a node Id nodeName (str) – a node name the computed Shannon entropy of the node given the observations double
addAllTargets(MonteCarloSampling self)

Add all the nodes as targets.

addEvidence(MonteCarloSampling self, int id, int val)

addEvidence(MonteCarloSampling self, str nodeName, int val) addEvidence(MonteCarloSampling self, int id, str val) addEvidence(MonteCarloSampling self, str nodeName, str val) addEvidence(MonteCarloSampling self, int id, Vector vals) addEvidence(MonteCarloSampling self, str nodeName, Vector vals)

Adds a new evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node already has evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
addTarget(MonteCarloSampling self, int target)

Add a marginal target to the list of targets.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.UndefinedElement – If target is not a NodeId in the Bayes net
chgEvidence(MonteCarloSampling self, int id, int val)

chgEvidence(MonteCarloSampling self, str nodeName, int val) chgEvidence(MonteCarloSampling self, int id, str val) chgEvidence(MonteCarloSampling self, str nodeName, str val) chgEvidence(MonteCarloSampling self, int id, Vector vals) chgEvidence(MonteCarloSampling self, str nodeName, Vector vals)

Change the value of an already existing evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node does not already have evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
currentPosterior(MonteCarloSampling self, int id)

currentPosterior(MonteCarloSampling self, str name) -> Potential

Computes and returns the current posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the current posterior probability of the node pyAgrum.Potential UndefinedElement – If an element of nodes is not in targets
currentTime(MonteCarloSampling self)
Returns: the current running time in seconds double
epsilon(MonteCarloSampling self)
Returns: the value of epsilon double
eraseAllEvidence(MonteCarloSampling self)

Removes all the evidence entered into the network.

eraseAllTargets(MonteCarloSampling self)

Clear all previously defined targets (marginal and joint targets).

As a result, no posterior can be computed (since we can only compute the posteriors of the marginal or joint targets that have been added by the user).

eraseEvidence(MonteCarloSampling self, int id)

eraseEvidence(MonteCarloSampling self, str nodeName)

Remove the evidence, if any, corresponding to the node Id or name.

Parameters: id (int) – a node Id nodeName (str) – a node name gum.IndexError – If the node does not belong to the Bayesian network
eraseTarget(MonteCarloSampling self, int target)

eraseTarget(MonteCarloSampling self, str nodeName)

Remove, if existing, the marginal target.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.IndexError – If one of the nodes does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
evidenceImpact(MonteCarloSampling self, int target, PyObject * evs)

evidenceImpact(MonteCarloSampling self, str target, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(target|evs) (for all instantiations of target and evs)

Parameters: target (set) – a set of targets ids or names. evs (set) – a set of nodes ids or names.

Warning

if some evs are d-separated, they are not included in the Potential.

Returns: a Potential for P(targets|evs) pyAgrum.Potential
hardEvidenceNodes(MonteCarloSampling self)
Returns: the set of nodes with hard evidence set
hasEvidence(MonteCarloSampling self, int id)

hasEvidence(MonteCarloSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if some node (or the one given in parameter) has received evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasHardEvidence(MonteCarloSampling self, str nodeName)
Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received hard evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasSoftEvidence(MonteCarloSampling self, int id)

hasSoftEvidence(MonteCarloSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received soft evidence bool gum.IndexError – If the node does not belong to the Bayesian network
history(MonteCarloSampling self)
Returns: the scheme history tuple gum.OperationNotAllowed – If the scheme has not been performed or if verbosity is set to false
isTarget(MonteCarloSampling self, int variable)

isTarget(MonteCarloSampling self, str nodeName) -> bool

Parameters: variable (int) – a node Id nodeName (str) – a node name True if variable is a (marginal) target bool gum.IndexError – If the node does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
makeInference(MonteCarloSampling self)

Perform the heavy computations needed to compute the targets’ posteriors

In a Junction tree propagation scheme, for instance, the heavy computations are those of the messages sent in the JT. This is precisely what makeInference should compute. Later, the computations of the posteriors can be done ‘lightly’ by multiplying and projecting those messages.

maxIter(MonteCarloSampling self)
Returns: the criterion on number of iterations int
maxTime(MonteCarloSampling self)
Returns: the timeout(in seconds) double
messageApproximationScheme(MonteCarloSampling self)
Returns: the approximation scheme message str
minEpsilonRate(MonteCarloSampling self)
Returns: the value of the minimal epsilon rate double
nbrEvidence(MonteCarloSampling self)
Returns: the number of evidence entered into the Bayesian network int
nbrHardEvidence(MonteCarloSampling self)
Returns: the number of hard evidence entered into the Bayesian network int
nbrIterations(MonteCarloSampling self)
Returns: the number of iterations int
nbrSoftEvidence(MonteCarloSampling self)
Returns: the number of soft evidence entered into the Bayesian network int
nbrTargets(MonteCarloSampling self)
Returns: the number of marginal targets int
periodSize(MonteCarloSampling self)
Returns: the number of samples between two stopping tests int gum.OutOfLowerBound – If p<1
posterior(MonteCarloSampling self, int var)

posterior(MonteCarloSampling self, str nodeName) -> Potential

Computes and returns the posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
setEpsilon(MonteCarloSampling self, double eps)
Parameters: eps (double) – the epsilon we want to use gum.OutOfLowerBound – If eps<0
setEvidence(evidces)

Erase all evidence and apply addEvidence(key,value) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
setMaxIter(MonteCarloSampling self, int max)
Parameters: max (int) – the maximum number of iterations gum.OutOfLowerBound – If max <= 1
setMaxTime(MonteCarloSampling self, double timeout)
Parameters: timeout (double) – stopping criterion on timeout (in seconds) gum.OutOfLowerBound – If timeout<=0.0
setMinEpsilonRate(MonteCarloSampling self, double rate)
Parameters: rate (double) – the minimal epsilon rate
setPeriodSize(MonteCarloSampling self, int p)
Parameters: p (int) – number of samples between two stopping tests gum.OutOfLowerBound – If p<1
setTargets(targets)

Remove all the targets and add the ones in parameter.

Parameters: targets (set) – a set of targets gum.UndefinedElement – If one target is not in the Bayes net
setVerbosity(MonteCarloSampling self, bool v)
Parameters: v (bool) – verbosity
softEvidenceNodes(MonteCarloSampling self)
Returns: the set of nodes with soft evidence set
targets(MonteCarloSampling self)
Returns: the list of marginal targets list
updateEvidence(evidces)

Apply chgEvidence(key,value) for every pair in evidces (or addEvidence if the node has no evidence yet).

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
verbosity(MonteCarloSampling self)
Returns: True if the verbosity is enabled bool

### Weighted Sampling¶

class pyAgrum.WeightedSampling(bn: pyAgrum.IBayesNet)

Class used for Weighted sampling inference algorithm.

Available constructors:
WeightedSampling(bn) -> WeightedSampling
Parameters: bn (pyAgrum.BayesNet) – a Bayesian network
BN(WeightedSampling self)
Returns: A constant reference over the IBayesNet referenced by this class. pyAgrum.IBayesNet gum.UndefinedElement – If no Bayes net has been assigned to the inference.
H(WeightedSampling self, int X)

H(WeightedSampling self, str nodeName) -> double

Parameters: X (int) – a node Id nodeName (str) – a node name the computed Shannon’s entropy of a node given the observation double
addAllTargets(WeightedSampling self)

Add all the nodes as targets.

addEvidence(WeightedSampling self, int id, int val)

addEvidence(WeightedSampling self, str nodeName, int val) addEvidence(WeightedSampling self, int id, str val) addEvidence(WeightedSampling self, str nodeName, str val) addEvidence(WeightedSampling self, int id, Vector vals) addEvidence(WeightedSampling self, str nodeName, Vector vals)

Adds a new evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node already has evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
addTarget(WeightedSampling self, int target)

Add a marginal target to the list of targets.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.UndefinedElement – If target is not a NodeId in the Bayes net
chgEvidence(WeightedSampling self, int id, int val)

chgEvidence(WeightedSampling self, str nodeName, int val) chgEvidence(WeightedSampling self, int id, str val) chgEvidence(WeightedSampling self, str nodeName, str val) chgEvidence(WeightedSampling self, int id, Vector vals) chgEvidence(WeightedSampling self, str nodeName, Vector vals)

Change the value of an already existing evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node does not already have evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
currentPosterior(WeightedSampling self, int id)

currentPosterior(WeightedSampling self, str name) -> Potential

Computes and returns the current posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the current posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
currentTime(WeightedSampling self)
Returns: the current running time in seconds double
epsilon(WeightedSampling self)
Returns: the value of epsilon double
eraseAllEvidence(WeightedSampling self)

Removes all the evidence entered into the network.

eraseAllTargets(WeightedSampling self)

Clear all previously defined targets (marginal and joint targets).

As a result, no posterior can be computed (since we can only compute the posteriors of the marginal or joint targets that have been added by the user).

eraseEvidence(WeightedSampling self, int id)

eraseEvidence(WeightedSampling self, str nodeName)

Remove the evidence, if any, corresponding to the node Id or name.

Parameters: id (int) – a node Id nodeName (str) – a node name gum.IndexError – If the node does not belong to the Bayesian network
eraseTarget(WeightedSampling self, int target)

eraseTarget(WeightedSampling self, str nodeName)

Remove, if existing, the marginal target.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.IndexError – If one of the nodes does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
evidenceImpact(WeightedSampling self, int target, PyObject * evs)

evidenceImpact(WeightedSampling self, str target, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(target|evs) (for all instantiations of target and evs)

Parameters: target (set) – a set of targets ids or names. evs (set) – a set of nodes ids or names.

Warning

if some evs are d-separated, they are not included in the Potential.

Returns: a Potential for P(targets|evs) pyAgrum.Potential
hardEvidenceNodes(WeightedSampling self)
Returns: the set of nodes with hard evidence set
hasEvidence(WeightedSampling self, int id)

hasEvidence(WeightedSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if some node(s) (or the one given in parameter) have received evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasHardEvidence(WeightedSampling self, str nodeName)
Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received hard evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasSoftEvidence(WeightedSampling self, int id)

hasSoftEvidence(WeightedSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received soft evidence bool gum.IndexError – If the node does not belong to the Bayesian network
history(WeightedSampling self)
Returns: the scheme history tuple gum.OperationNotAllowed – If the scheme has not been performed or if verbosity is set to False
isTarget(WeightedSampling self, int variable)

isTarget(WeightedSampling self, str nodeName) -> bool

Parameters: variable (int) – a node Id nodeName (str) – a node name True if variable is a (marginal) target bool gum.IndexError – If the node does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
makeInference(WeightedSampling self)

Perform the heavy computations needed to compute the targets’ posteriors

In a Junction tree propagation scheme, for instance, the heavy computations are those of the messages sent in the JT. This is precisely what makeInference should compute. Later, the computations of the posteriors can be done ‘lightly’ by multiplying and projecting those messages.

maxIter(WeightedSampling self)
Returns: the criterion on number of iterations int
maxTime(WeightedSampling self)
Returns: the timeout(in seconds) double
messageApproximationScheme(WeightedSampling self)
Returns: the approximation scheme message str
minEpsilonRate(WeightedSampling self)
Returns: the value of the minimal epsilon rate double
nbrEvidence(WeightedSampling self)
Returns: the number of evidence entered into the Bayesian network int
nbrHardEvidence(WeightedSampling self)
Returns: the number of hard evidence entered into the Bayesian network int
nbrIterations(WeightedSampling self)
Returns: the number of iterations int
nbrSoftEvidence(WeightedSampling self)
Returns: the number of soft evidence entered into the Bayesian network int
nbrTargets(WeightedSampling self)
Returns: the number of marginal targets int
periodSize(WeightedSampling self)
Returns: the number of samples between two stopping tests int gum.OutOfLowerBound – If p<1
posterior(WeightedSampling self, int var)

posterior(WeightedSampling self, str nodeName) -> Potential

Computes and returns the posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
setEpsilon(WeightedSampling self, double eps)
Parameters: eps (double) – the epsilon we want to use gum.OutOfLowerBound – If eps<0
setEvidence(evidces)

Erase all evidence and apply addEvidence(key,value) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
setMaxIter(WeightedSampling self, int max)
Parameters: max (int) – the maximum number of iterations gum.OutOfLowerBound – If max <= 1
setMaxTime(WeightedSampling self, double timeout)
Parameters: timeout (double) – stopping criterion on timeout (in seconds) gum.OutOfLowerBound – If timeout<=0.0
setMinEpsilonRate(WeightedSampling self, double rate)
Parameters: rate (double) – the minimal epsilon rate
setPeriodSize(WeightedSampling self, int p)
Parameters: p (int) – number of samples between two stopping tests gum.OutOfLowerBound – If p<1
setTargets(targets)

Remove all the targets and add the ones in parameter.

Parameters: targets (set) – a set of targets gum.UndefinedElement – If one target is not in the Bayes net
setVerbosity(WeightedSampling self, bool v)
Parameters: v (bool) – verbosity
softEvidenceNodes(WeightedSampling self)
Returns: the set of nodes with soft evidence set
targets(WeightedSampling self)
Returns: the list of marginal targets list
updateEvidence(evidces)

Apply chgEvidence(key,value) for every pair in evidces (or addEvidence if the node has no evidence yet).

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
verbosity(WeightedSampling self)
Returns: True if the verbosity is enabled bool

### Importance Sampling¶

class pyAgrum.ImportanceSampling(bn: pyAgrum.IBayesNet)

Class used for inferences using the Importance Sampling algorithm.

Available constructors:
ImportanceSampling(bn) -> ImportanceSampling
Parameters: bn (pyAgrum.BayesNet) – a Bayesian network
BN(ImportanceSampling self)
Returns: A constant reference over the IBayesNet referenced by this class. pyAgrum.IBayesNet gum.UndefinedElement – If no Bayes net has been assigned to the inference.
H(ImportanceSampling self, int X)

H(ImportanceSampling self, str nodeName) -> double

Parameters: X (int) – a node Id nodeName (str) – a node name the computed Shannon’s entropy of a node given the observation double
addAllTargets(ImportanceSampling self)

Add all the nodes as targets.

addEvidence(ImportanceSampling self, int id, int val)

addEvidence(ImportanceSampling self, str nodeName, int val) addEvidence(ImportanceSampling self, int id, str val) addEvidence(ImportanceSampling self, str nodeName, str val) addEvidence(ImportanceSampling self, int id, Vector vals) addEvidence(ImportanceSampling self, str nodeName, Vector vals)

Adds a new evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node already has evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
addTarget(ImportanceSampling self, int target)

Add a marginal target to the list of targets.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.UndefinedElement – If target is not a NodeId in the Bayes net
chgEvidence(ImportanceSampling self, int id, int val)

chgEvidence(ImportanceSampling self, str nodeName, int val) chgEvidence(ImportanceSampling self, int id, str val) chgEvidence(ImportanceSampling self, str nodeName, str val) chgEvidence(ImportanceSampling self, int id, Vector vals) chgEvidence(ImportanceSampling self, str nodeName, Vector vals)

Change the value of an already existing evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node does not already have evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
currentPosterior(ImportanceSampling self, int id)

currentPosterior(ImportanceSampling self, str name) -> Potential

Computes and returns the current posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the current posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
currentTime(ImportanceSampling self)
Returns: the current running time in seconds double
epsilon(ImportanceSampling self)
Returns: the value of epsilon double
eraseAllEvidence(ImportanceSampling self)

Removes all the evidence entered into the network.

eraseAllTargets(ImportanceSampling self)

Clear all previously defined targets (marginal and joint targets).

As a result, no posterior can be computed (since we can only compute the posteriors of the marginal or joint targets that have been added by the user).

eraseEvidence(ImportanceSampling self, int id)

eraseEvidence(ImportanceSampling self, str nodeName)

Remove the evidence, if any, corresponding to the node Id or name.

Parameters: id (int) – a node Id nodeName (str) – a node name gum.IndexError – If the node does not belong to the Bayesian network
eraseTarget(ImportanceSampling self, int target)

eraseTarget(ImportanceSampling self, str nodeName)

Remove, if existing, the marginal target.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.IndexError – If one of the nodes does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
evidenceImpact(ImportanceSampling self, int target, PyObject * evs)

evidenceImpact(ImportanceSampling self, str target, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(target|evs) (for all instantiations of target and evs)

Parameters: target (set) – a set of targets ids or names. evs (set) – a set of nodes ids or names.

Warning

if some evs are d-separated, they are not included in the Potential.

Returns: a Potential for P(targets|evs) pyAgrum.Potential
hardEvidenceNodes(ImportanceSampling self)
Returns: the set of nodes with hard evidence set
hasEvidence(ImportanceSampling self, int id)

hasEvidence(ImportanceSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if some node(s) (or the one given in parameter) have received evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasHardEvidence(ImportanceSampling self, str nodeName)
Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received hard evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasSoftEvidence(ImportanceSampling self, int id)

hasSoftEvidence(ImportanceSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received soft evidence bool gum.IndexError – If the node does not belong to the Bayesian network
history(ImportanceSampling self)
Returns: the scheme history tuple gum.OperationNotAllowed – If the scheme has not been performed or if verbosity is set to False
isTarget(ImportanceSampling self, int variable)

isTarget(ImportanceSampling self, str nodeName) -> bool

Parameters: variable (int) – a node Id nodeName (str) – a node name True if variable is a (marginal) target bool gum.IndexError – If the node does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
makeInference(ImportanceSampling self)

Perform the heavy computations needed to compute the targets’ posteriors

In a Junction tree propagation scheme, for instance, the heavy computations are those of the messages sent in the JT. This is precisely what makeInference should compute. Later, the computations of the posteriors can be done ‘lightly’ by multiplying and projecting those messages.

maxIter(ImportanceSampling self)
Returns: the criterion on number of iterations int
maxTime(ImportanceSampling self)
Returns: the timeout(in seconds) double
messageApproximationScheme(ImportanceSampling self)
Returns: the approximation scheme message str
minEpsilonRate(ImportanceSampling self)
Returns: the value of the minimal epsilon rate double
nbrEvidence(ImportanceSampling self)
Returns: the number of evidence entered into the Bayesian network int
nbrHardEvidence(ImportanceSampling self)
Returns: the number of hard evidence entered into the Bayesian network int
nbrIterations(ImportanceSampling self)
Returns: the number of iterations int
nbrSoftEvidence(ImportanceSampling self)
Returns: the number of soft evidence entered into the Bayesian network int
nbrTargets(ImportanceSampling self)
Returns: the number of marginal targets int
periodSize(ImportanceSampling self)
Returns: the number of samples between two stopping tests int gum.OutOfLowerBound – If p<1
posterior(ImportanceSampling self, int var)

posterior(ImportanceSampling self, str nodeName) -> Potential

Computes and returns the posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
setEpsilon(ImportanceSampling self, double eps)
Parameters: eps (double) – the epsilon we want to use gum.OutOfLowerBound – If eps<0
setEvidence(evidces)

Erase all evidence and apply addEvidence(key,value) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
setMaxIter(ImportanceSampling self, int max)
Parameters: max (int) – the maximum number of iterations gum.OutOfLowerBound – If max <= 1
setMaxTime(ImportanceSampling self, double timeout)
Parameters: timeout (double) – stopping criterion on timeout (in seconds) gum.OutOfLowerBound – If timeout<=0.0
setMinEpsilonRate(ImportanceSampling self, double rate)
Parameters: rate (double) – the minimal epsilon rate
setPeriodSize(ImportanceSampling self, int p)
Parameters: p (int) – number of samples between two stopping tests gum.OutOfLowerBound – If p<1
setTargets(targets)

Remove all the targets and add the ones in parameter.

Parameters: targets (set) – a set of targets gum.UndefinedElement – If one target is not in the Bayes net
setVerbosity(ImportanceSampling self, bool v)
Parameters: v (bool) – verbosity
softEvidenceNodes(ImportanceSampling self)
Returns: the set of nodes with soft evidence set
targets(ImportanceSampling self)
Returns: the list of marginal targets list
updateEvidence(evidces)

Apply chgEvidence(key,value) for every pair in evidces (or addEvidence if the node has no evidence yet).

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
verbosity(ImportanceSampling self)
Returns: True if the verbosity is enabled bool

## Loopy sampling¶

### Loopy Gibbs Sampling¶

class pyAgrum.LoopyGibbsSampling(bn: pyAgrum.IBayesNet)

Class used for inferences using a loopy version of Gibbs sampling.

Available constructors:
LoopyGibbsSampling(bn) -> LoopyGibbsSampling
Parameters: bn (pyAgrum.BayesNet) – a Bayesian network
BN(LoopyGibbsSampling self)
Returns: A constant reference over the IBayesNet referenced by this class. pyAgrum.IBayesNet gum.UndefinedElement – If no Bayes net has been assigned to the inference.
H(LoopyGibbsSampling self, int X)

H(LoopyGibbsSampling self, str nodeName) -> double

Parameters: X (int) – a node Id nodeName (str) – a node name the computed Shannon’s entropy of a node given the observation double
addAllTargets(LoopyGibbsSampling self)

Add all the nodes as targets.

addEvidence(LoopyGibbsSampling self, int id, int val)

addEvidence(LoopyGibbsSampling self, str nodeName, int val) addEvidence(LoopyGibbsSampling self, int id, str val) addEvidence(LoopyGibbsSampling self, str nodeName, str val) addEvidence(LoopyGibbsSampling self, int id, Vector vals) addEvidence(LoopyGibbsSampling self, str nodeName, Vector vals)

Adds a new evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node already has evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
addTarget(LoopyGibbsSampling self, int target)

Add a marginal target to the list of targets.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.UndefinedElement – If target is not a NodeId in the Bayes net
burnIn(LoopyGibbsSampling self)
Returns: the size of the burn-in, in number of iterations int
chgEvidence(LoopyGibbsSampling self, int id, int val)

chgEvidence(LoopyGibbsSampling self, str nodeName, int val) chgEvidence(LoopyGibbsSampling self, int id, str val) chgEvidence(LoopyGibbsSampling self, str nodeName, str val) chgEvidence(LoopyGibbsSampling self, int id, Vector vals) chgEvidence(LoopyGibbsSampling self, str nodeName, Vector vals)

Change the value of an already existing evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node does not already have evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
currentPosterior(LoopyGibbsSampling self, int id)

currentPosterior(LoopyGibbsSampling self, str name) -> Potential

Computes and returns the current posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the current posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
currentTime(LoopyGibbsSampling self)
Returns: the current running time in seconds double
epsilon(LoopyGibbsSampling self)
Returns: the value of epsilon double
eraseAllEvidence(LoopyGibbsSampling self)

Removes all the evidence entered into the network.

eraseAllTargets(LoopyGibbsSampling self)

Clear all previously defined targets (marginal and joint targets).

As a result, no posterior can be computed (since we can only compute the posteriors of the marginal or joint targets that have been added by the user).

eraseEvidence(LoopyGibbsSampling self, int id)

eraseEvidence(LoopyGibbsSampling self, str nodeName)

Remove the evidence, if any, corresponding to the node Id or name.

Parameters: id (int) – a node Id nodeName (str) – a node name gum.IndexError – If the node does not belong to the Bayesian network
eraseTarget(LoopyGibbsSampling self, int target)

eraseTarget(LoopyGibbsSampling self, str nodeName)

Remove, if existing, the marginal target.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.IndexError – If one of the nodes does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
evidenceImpact(LoopyGibbsSampling self, int target, PyObject * evs)

evidenceImpact(LoopyGibbsSampling self, str target, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(target|evs) (for all instantiations of target and evs)

Parameters: target (set) – a set of targets ids or names. evs (set) – a set of nodes ids or names.

Warning

if some evs are d-separated, they are not included in the Potential.

Returns: a Potential for P(targets|evs) pyAgrum.Potential
hardEvidenceNodes(LoopyGibbsSampling self)
Returns: the set of nodes with hard evidence set
hasEvidence(LoopyGibbsSampling self, int id)

hasEvidence(LoopyGibbsSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if some node(s) (or the one given in parameter) have received evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasHardEvidence(LoopyGibbsSampling self, str nodeName)
Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received hard evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasSoftEvidence(LoopyGibbsSampling self, int id)

hasSoftEvidence(LoopyGibbsSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received soft evidence bool gum.IndexError – If the node does not belong to the Bayesian network
history(LoopyGibbsSampling self)
Returns: the scheme history tuple gum.OperationNotAllowed – If the scheme has not been performed or if verbosity is set to False
isDrawnAtRandom(LoopyGibbsSampling self)
Returns: True if variables are drawn at random bool
isTarget(LoopyGibbsSampling self, int variable)

isTarget(LoopyGibbsSampling self, str nodeName) -> bool

Parameters: variable (int) – a node Id nodeName (str) – a node name True if variable is a (marginal) target bool gum.IndexError – If the node does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
makeInference(LoopyGibbsSampling self)

Perform the heavy computations needed to compute the targets’ posteriors

In a Junction tree propagation scheme, for instance, the heavy computations are those of the messages sent in the JT. This is precisely what makeInference should compute. Later, the computations of the posteriors can be done ‘lightly’ by multiplying and projecting those messages.

maxIter(LoopyGibbsSampling self)
Returns: the criterion on number of iterations int
maxTime(LoopyGibbsSampling self)
Returns: the timeout(in seconds) double
messageApproximationScheme(LoopyGibbsSampling self)
Returns: the approximation scheme message str
minEpsilonRate(LoopyGibbsSampling self)
Returns: the value of the minimal epsilon rate double
nbrDrawnVar(LoopyGibbsSampling self)
Returns: the number of variables drawn at each iteration int
nbrEvidence(LoopyGibbsSampling self)
Returns: the number of evidence entered into the Bayesian network int
nbrHardEvidence(LoopyGibbsSampling self)
Returns: the number of hard evidence entered into the Bayesian network int
nbrIterations(LoopyGibbsSampling self)
Returns: the number of iterations int
nbrSoftEvidence(LoopyGibbsSampling self)
Returns: the number of soft evidence entered into the Bayesian network int
nbrTargets(LoopyGibbsSampling self)
Returns: the number of marginal targets int
periodSize(LoopyGibbsSampling self)
Returns: the number of samples between two stopping tests int gum.OutOfLowerBound – If p<1
posterior(LoopyGibbsSampling self, int var)

posterior(LoopyGibbsSampling self, str nodeName) -> Potential

Computes and returns the posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
setBurnIn(LoopyGibbsSampling self, int b)
Parameters: b (int) – the size of the burn-in, in number of iterations
setDrawnAtRandom(LoopyGibbsSampling self, bool _atRandom)
Parameters: _atRandom (bool) – indicates if variables should be drawn at random
setEpsilon(LoopyGibbsSampling self, double eps)
Parameters: eps (double) – the epsilon we want to use gum.OutOfLowerBound – If eps<0
setEvidence(evidces)

Erase all the evidence and apply addEvidence(key, value) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
setMaxIter(LoopyGibbsSampling self, int max)
Parameters: max (int) – the maximum number of iterations gum.OutOfLowerBound – If max <= 1
setMaxTime(LoopyGibbsSampling self, double timeout)
Parameters: timeout (double) – stopping criterion on timeout (in seconds) gum.OutOfLowerBound – If timeout<=0.0
setMinEpsilonRate(LoopyGibbsSampling self, double rate)
Parameters: rate (double) – the minimal epsilon rate
setNbrDrawnVar(LoopyGibbsSampling self, int _nbr)
Parameters: _nbr (int) – the number of variables to be drawn at each iteration
setPeriodSize(LoopyGibbsSampling self, int p)
Parameters: p (int) – the number of samples between two stopping tests gum.OutOfLowerBound – If p<1
setTargets(targets)

Remove all the targets and add the ones in parameter.

Parameters: targets (set) – a set of targets gum.UndefinedElement – If one target is not in the Bayes net
setVerbosity(LoopyGibbsSampling self, bool v)
Parameters: v (bool) – verbosity
setVirtualLBPSize(LoopyGibbsSampling self, double vlbpsize)
Parameters: vlbpsize (double) – the size of the virtual LBP
softEvidenceNodes(LoopyGibbsSampling self)
Returns: the set of nodes with soft evidence set
targets(LoopyGibbsSampling self)
Returns: the list of marginal targets list
updateEvidence(evidces)

Apply chgEvidence(key, value) (or addEvidence) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
verbosity(LoopyGibbsSampling self)
Returns: True if the verbosity is enabled bool

### Loopy Monte Carlo Sampling¶

class pyAgrum.LoopyMonteCarloSampling(bn: pyAgrum.IBayesNet)

Class used for inferences using a loopy version of Monte Carlo sampling (proxy of the C++ class pyAgrum.LoopySamplingInference< double, pyAgrum.MonteCarloSampling >).

BN(LoopyMonteCarloSampling self)
Returns: A constant reference over the IBayesNet referenced by this class. pyAgrum.IBayesNet gum.UndefinedElement – If no Bayes net has been assigned to the inference.
H(LoopyMonteCarloSampling self, int X)

H(LoopyMonteCarloSampling self, str nodeName) -> double

Parameters: X (int) – a node Id nodeName (str) – a node name the computed Shannon’s entropy of a node given the observation double
addAllTargets(LoopyMonteCarloSampling self)

Add all the nodes as targets.

addEvidence(LoopyMonteCarloSampling self, int id, int val)

addEvidence(LoopyMonteCarloSampling self, str nodeName, int val) addEvidence(LoopyMonteCarloSampling self, int id, str val) addEvidence(LoopyMonteCarloSampling self, str nodeName, str val) addEvidence(LoopyMonteCarloSampling self, int id, Vector vals) addEvidence(LoopyMonteCarloSampling self, str nodeName, Vector vals)

Adds a new evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node already has evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
addTarget(LoopyMonteCarloSampling self, int target)

Add a marginal target to the list of targets.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.UndefinedElement – If target is not a NodeId in the Bayes net
chgEvidence(LoopyMonteCarloSampling self, int id, int val)

chgEvidence(LoopyMonteCarloSampling self, str nodeName, int val) chgEvidence(LoopyMonteCarloSampling self, int id, str val) chgEvidence(LoopyMonteCarloSampling self, str nodeName, str val) chgEvidence(LoopyMonteCarloSampling self, int id, Vector vals) chgEvidence(LoopyMonteCarloSampling self, str nodeName, Vector vals)

Change the value of an already existing evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node does not already have evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
currentPosterior(LoopyMonteCarloSampling self, int id)

currentPosterior(LoopyMonteCarloSampling self, str name) -> Potential

Computes and returns the current posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the current posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
currentTime(LoopyMonteCarloSampling self)
Returns: the current running time in seconds double
epsilon(LoopyMonteCarloSampling self)
Returns: the value of epsilon double
eraseAllEvidence(LoopyMonteCarloSampling self)

Removes all the evidence entered into the network.

eraseAllTargets(LoopyMonteCarloSampling self)

Clear all previously defined targets (marginal and joint targets).

As a result, no posterior can be computed (since we can only compute the posteriors of the marginal or joint targets that have been added by the user).

eraseEvidence(LoopyMonteCarloSampling self, int id)

eraseEvidence(LoopyMonteCarloSampling self, str nodeName)

Remove the evidence, if any, corresponding to the node Id or name.

Parameters: id (int) – a node Id nodeName (str) – a node name gum.IndexError – If the node does not belong to the Bayesian network
eraseTarget(LoopyMonteCarloSampling self, int target)

eraseTarget(LoopyMonteCarloSampling self, str nodeName)

Remove, if existing, the marginal target.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.IndexError – If one of the nodes does not belong to the Bayesian network gum.UndefinedElement – If the node Id is not in the Bayesian network
evidenceImpact(LoopyMonteCarloSampling self, int target, PyObject * evs)

evidenceImpact(LoopyMonteCarloSampling self, str target, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(target|evs) (for all instantiations of target and evs).

Parameters: target (set) – a set of target ids or names. evs (set) – a set of node ids or names.

Warning

if some evs are d-separated, they are not included in the Potential.

Returns: a Potential for P(targets|evs) pyAgrum.Potential
hardEvidenceNodes(LoopyMonteCarloSampling self)
Returns: the set of nodes with hard evidence set
hasEvidence(LoopyMonteCarloSampling self, int id)

hasEvidence(LoopyMonteCarloSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if some node(s) (or the one in parameters) have received evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasHardEvidence(LoopyMonteCarloSampling self, str nodeName)
Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received hard evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasSoftEvidence(LoopyMonteCarloSampling self, int id)

hasSoftEvidence(LoopyMonteCarloSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received soft evidence bool gum.IndexError – If the node does not belong to the Bayesian network
history(LoopyMonteCarloSampling self)
Returns: the scheme history tuple gum.OperationNotAllowed – If the scheme has not been performed or if verbosity is set to false
isTarget(LoopyMonteCarloSampling self, int variable)

isTarget(LoopyMonteCarloSampling self, str nodeName) -> bool

Parameters: variable (int) – a node Id nodeName (str) – a node name True if variable is a (marginal) target bool gum.IndexError – If the node does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
makeInference(LoopyMonteCarloSampling self)

Perform the heavy computations needed to compute the targets’ posteriors

In a Junction tree propagation scheme, for instance, the heavy computations are those of the messages sent in the JT. This is precisely what makeInference should compute. Later, the computations of the posteriors can be done ‘lightly’ by multiplying and projecting those messages.

maxIter(LoopyMonteCarloSampling self)
Returns: the criterion on number of iterations int
maxTime(LoopyMonteCarloSampling self)
Returns: the timeout(in seconds) double
messageApproximationScheme(LoopyMonteCarloSampling self)
Returns: the approximation scheme message str
minEpsilonRate(LoopyMonteCarloSampling self)
Returns: the value of the minimal epsilon rate double
nbrEvidence(LoopyMonteCarloSampling self)
Returns: the number of evidence entered into the Bayesian network int
nbrHardEvidence(LoopyMonteCarloSampling self)
Returns: the number of hard evidence entered into the Bayesian network int
nbrIterations(LoopyMonteCarloSampling self)
Returns: the number of iterations int
nbrSoftEvidence(LoopyMonteCarloSampling self)
Returns: the number of soft evidence entered into the Bayesian network int
nbrTargets(LoopyMonteCarloSampling self)
Returns: the number of marginal targets int
periodSize(LoopyMonteCarloSampling self)
Returns: the number of samples between two stopping tests int
posterior(LoopyMonteCarloSampling self, int var)

posterior(LoopyMonteCarloSampling self, str nodeName) -> Potential

Computes and returns the posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
setEpsilon(LoopyMonteCarloSampling self, double eps)
Parameters: eps (double) – the epsilon we want to use gum.OutOfLowerBound – If eps<0
setEvidence(evidces)

Erase all the evidence and apply addEvidence(key, value) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
setMaxIter(LoopyMonteCarloSampling self, int max)
Parameters: max (int) – the maximum number of iterations gum.OutOfLowerBound – If max <= 1
setMaxTime(LoopyMonteCarloSampling self, double timeout)
Parameters: timeout (double) – stopping criterion on timeout (in seconds) gum.OutOfLowerBound – If timeout<=0.0
setMinEpsilonRate(LoopyMonteCarloSampling self, double rate)
Parameters: rate (double) – the minimal epsilon rate
setPeriodSize(LoopyMonteCarloSampling self, int p)
Parameters: p (int) – the number of samples between two stopping tests gum.OutOfLowerBound – If p<1
setTargets(targets)

Remove all the targets and add the ones in parameter.

Parameters: targets (set) – a set of targets gum.UndefinedElement – If one target is not in the Bayes net
setVerbosity(LoopyMonteCarloSampling self, bool v)
Parameters: v (bool) – verbosity
setVirtualLBPSize(LoopyMonteCarloSampling self, double vlbpsize)
Parameters: vlbpsize (double) – the size of the virtual LBP
softEvidenceNodes(LoopyMonteCarloSampling self)
Returns: the set of nodes with soft evidence set
targets(LoopyMonteCarloSampling self)
Returns: the list of marginal targets list
updateEvidence(evidces)

Apply chgEvidence(key, value) (or addEvidence) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
verbosity(LoopyMonteCarloSampling self)
Returns: True if the verbosity is enabled bool
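The evidence-management methods above (addEvidence, chgEvidence, eraseEvidence) can be chained on one inference object; each makeInference then reflects the current evidence. A minimal sketch, assuming pyAgrum is installed; the two-cause network and its labels are hypothetical.

```python
import pyAgrum as gum

# Hypothetical structure: two causes of a common effect.
bn = gum.fastBN("Rain->Wet;Sprinkler->Wet")
ie = gum.LoopyMonteCarloSampling(bn)

ie.addEvidence("Wet", 1)     # hard evidence: Wet takes its second label
ie.makeInference()
print(ie.posterior("Rain"))  # approximate P(Rain | Wet=1)

ie.chgEvidence("Wet", 0)     # change the same evidence to the first label
ie.makeInference()
print(ie.posterior("Rain"))  # approximate P(Rain | Wet=0)

ie.eraseEvidence("Wet")
print(ie.nbrEvidence())      # no evidence left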

### Loopy Weighted Sampling¶

class pyAgrum.LoopyWeightedSampling(bn: pyAgrum.IBayesNet)

Class used for inferences using a loopy version of weighted sampling.

Available constructors:
LoopyWeightedSampling(bn) -> LoopyWeightedSampling
Parameters: bn (pyAgrum.BayesNet) – a Bayesian network
BN(LoopyWeightedSampling self)
Returns: A constant reference over the IBayesNet referenced by this class. pyAgrum.IBayesNet gum.UndefinedElement – If no Bayes net has been assigned to the inference.
H(LoopyWeightedSampling self, int X)

H(LoopyWeightedSampling self, str nodeName) -> double

Parameters: X (int) – a node Id nodeName (str) – a node name the computed Shannon’s entropy of a node given the observation double
addAllTargets(LoopyWeightedSampling self)

Add all the nodes as targets.

addEvidence(LoopyWeightedSampling self, int id, int val)

addEvidence(LoopyWeightedSampling self, str nodeName, int val) addEvidence(LoopyWeightedSampling self, int id, str val) addEvidence(LoopyWeightedSampling self, str nodeName, str val) addEvidence(LoopyWeightedSampling self, int id, Vector vals) addEvidence(LoopyWeightedSampling self, str nodeName, Vector vals)

Adds a new evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node already has evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
addTarget(LoopyWeightedSampling self, int target)

Add a marginal target to the list of targets.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.UndefinedElement – If target is not a NodeId in the Bayes net
chgEvidence(LoopyWeightedSampling self, int id, int val)

chgEvidence(LoopyWeightedSampling self, str nodeName, int val) chgEvidence(LoopyWeightedSampling self, int id, str val) chgEvidence(LoopyWeightedSampling self, str nodeName, str val) chgEvidence(LoopyWeightedSampling self, int id, Vector vals) chgEvidence(LoopyWeightedSampling self, str nodeName, Vector vals)

Change the value of an already existing evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node does not already have evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
currentPosterior(LoopyWeightedSampling self, int id)

currentPosterior(LoopyWeightedSampling self, str name) -> Potential

Computes and returns the current posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the current posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
currentTime(LoopyWeightedSampling self)
Returns: the current running time in seconds double
epsilon(LoopyWeightedSampling self)
Returns: the value of epsilon double
eraseAllEvidence(LoopyWeightedSampling self)

Removes all the evidence entered into the network.

eraseAllTargets(LoopyWeightedSampling self)

Clear all previously defined targets (marginal and joint targets).

As a result, no posterior can be computed (since we can only compute the posteriors of the marginal or joint targets that have been added by the user).

eraseEvidence(LoopyWeightedSampling self, int id)

eraseEvidence(LoopyWeightedSampling self, str nodeName)

Remove the evidence, if any, corresponding to the node Id or name.

Parameters: id (int) – a node Id nodeName (str) – a node name gum.IndexError – If the node does not belong to the Bayesian network
eraseTarget(LoopyWeightedSampling self, int target)

eraseTarget(LoopyWeightedSampling self, str nodeName)

Remove, if existing, the marginal target.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.IndexError – If one of the nodes does not belong to the Bayesian network gum.UndefinedElement – If the node Id is not in the Bayesian network
evidenceImpact(LoopyWeightedSampling self, int target, PyObject * evs)

evidenceImpact(LoopyWeightedSampling self, str target, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(target|evs) (for all instantiations of target and evs).

Parameters: target (set) – a set of target ids or names. evs (set) – a set of node ids or names.

Warning

if some evs are d-separated, they are not included in the Potential.

Returns: a Potential for P(targets|evs) pyAgrum.Potential
hardEvidenceNodes(LoopyWeightedSampling self)
Returns: the set of nodes with hard evidence set
hasEvidence(LoopyWeightedSampling self, int id)

hasEvidence(LoopyWeightedSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if some node(s) (or the one in parameters) have received evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasHardEvidence(LoopyWeightedSampling self, str nodeName)
Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received hard evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasSoftEvidence(LoopyWeightedSampling self, int id)

hasSoftEvidence(LoopyWeightedSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received soft evidence bool gum.IndexError – If the node does not belong to the Bayesian network
history(LoopyWeightedSampling self)
Returns: the scheme history tuple gum.OperationNotAllowed – If the scheme has not been performed or if verbosity is set to false
isTarget(LoopyWeightedSampling self, int variable)

isTarget(LoopyWeightedSampling self, str nodeName) -> bool

Parameters: variable (int) – a node Id nodeName (str) – a node name True if variable is a (marginal) target bool gum.IndexError – If the node does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
makeInference(LoopyWeightedSampling self)

Perform the heavy computations needed to compute the targets’ posteriors

In a Junction tree propagation scheme, for instance, the heavy computations are those of the messages sent in the JT. This is precisely what makeInference should compute. Later, the computations of the posteriors can be done ‘lightly’ by multiplying and projecting those messages.

maxIter(LoopyWeightedSampling self)
Returns: the criterion on number of iterations int
maxTime(LoopyWeightedSampling self)
Returns: the timeout(in seconds) double
messageApproximationScheme(LoopyWeightedSampling self)
Returns: the approximation scheme message str
minEpsilonRate(LoopyWeightedSampling self)
Returns: the value of the minimal epsilon rate double
nbrEvidence(LoopyWeightedSampling self)
Returns: the number of evidence entered into the Bayesian network int
nbrHardEvidence(LoopyWeightedSampling self)
Returns: the number of hard evidence entered into the Bayesian network int
nbrIterations(LoopyWeightedSampling self)
Returns: the number of iterations int
nbrSoftEvidence(LoopyWeightedSampling self)
Returns: the number of soft evidence entered into the Bayesian network int
nbrTargets(LoopyWeightedSampling self)
Returns: the number of marginal targets int
periodSize(LoopyWeightedSampling self)
Returns: the number of samples between two stopping tests int
posterior(LoopyWeightedSampling self, int var)

posterior(LoopyWeightedSampling self, str nodeName) -> Potential

Computes and returns the posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
setEpsilon(LoopyWeightedSampling self, double eps)
Parameters: eps (double) – the epsilon we want to use gum.OutOfLowerBound – If eps<0
setEvidence(evidces)

Erase all the evidence and apply addEvidence(key, value) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
setMaxIter(LoopyWeightedSampling self, int max)
Parameters: max (int) – the maximum number of iterations gum.OutOfLowerBound – If max <= 1
setMaxTime(LoopyWeightedSampling self, double timeout)
Parameters: timeout (double) – stopping criterion on timeout (in seconds) gum.OutOfLowerBound – If timeout<=0.0
setMinEpsilonRate(LoopyWeightedSampling self, double rate)
Parameters: rate (double) – the minimal epsilon rate
setPeriodSize(LoopyWeightedSampling self, int p)
Parameters: p (int) – the number of samples between two stopping tests gum.OutOfLowerBound – If p<1
setTargets(targets)

Remove all the targets and add the ones in parameter.

Parameters: targets (set) – a set of targets gum.UndefinedElement – If one target is not in the Bayes net
setVerbosity(LoopyWeightedSampling self, bool v)
Parameters: v (bool) – verbosity
setVirtualLBPSize(LoopyWeightedSampling self, double vlbpsize)
Parameters: vlbpsize (double) – the size of the virtual LBP
softEvidenceNodes(LoopyWeightedSampling self)
Returns: the set of nodes with soft evidence set
targets(LoopyWeightedSampling self)
Returns: the list of marginal targets list
updateEvidence(evidces)

Apply chgEvidence(key, value) (or addEvidence) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence gum.InvalidArgument – If one value is not a value for the node gum.InvalidArgument – If the size of a value is different from the domain size of the node gum.FatalError – If one value is a vector of 0s gum.UndefinedElement – If one node does not belong to the Bayesian network
verbosity(LoopyWeightedSampling self)
Returns: True if the verbosity is enabled bool

### Loopy Importance Sampling¶

class pyAgrum.LoopyImportanceSampling(bn: pyAgrum.IBayesNet)

Class used for inferences using a loopy version of importance sampling.

Available constructors:
LoopyImportanceSampling(bn) -> LoopyImportanceSampling
Parameters: bn (pyAgrum.BayesNet) – a Bayesian network
BN(LoopyImportanceSampling self)
Returns: A constant reference over the IBayesNet referenced by this class. pyAgrum.IBayesNet gum.UndefinedElement – If no Bayes net has been assigned to the inference.
H(LoopyImportanceSampling self, int X)

H(LoopyImportanceSampling self, str nodeName) -> double

Parameters: X (int) – a node Id nodeName (str) – a node name the computed Shannon’s entropy of a node given the observation double
addAllTargets(LoopyImportanceSampling self)

Add all the nodes as targets.

addEvidence(LoopyImportanceSampling self, int id, int val)

addEvidence(LoopyImportanceSampling self, str nodeName, int val) addEvidence(LoopyImportanceSampling self, int id, str val) addEvidence(LoopyImportanceSampling self, str nodeName, str val) addEvidence(LoopyImportanceSampling self, int id, Vector vals) addEvidence(LoopyImportanceSampling self, str nodeName, Vector vals)

Adds a new evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node already has evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
addTarget(LoopyImportanceSampling self, int target)

Add a marginal target to the list of targets.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.UndefinedElement – If target is not a NodeId in the Bayes net
chgEvidence(LoopyImportanceSampling self, int id, int val)

chgEvidence(LoopyImportanceSampling self, str nodeName, int val) chgEvidence(LoopyImportanceSampling self, int id, str val) chgEvidence(LoopyImportanceSampling self, str nodeName, str val) chgEvidence(LoopyImportanceSampling self, int id, Vector vals) chgEvidence(LoopyImportanceSampling self, str nodeName, Vector vals)

Change the value of an already existing evidence on a node (might be soft or hard).

Parameters: id (int) – a node Id nodeName (str) – a node name val – (int) a node value val – (str) the label of the node value vals (list) – a list of values gum.InvalidArgument – If the node does not already have evidence gum.InvalidArgument – If val is not a value for the node gum.InvalidArgument – If the size of vals is different from the domain size of the node gum.FatalError – If vals is a vector of 0s gum.UndefinedElement – If the node does not belong to the Bayesian network
currentPosterior(LoopyImportanceSampling self, int id)

currentPosterior(LoopyImportanceSampling self, str name) -> Potential

Computes and returns the current posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the current posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
currentTime(LoopyImportanceSampling self)
Returns: the current running time in seconds double
epsilon(LoopyImportanceSampling self)
Returns: the value of epsilon double
eraseAllEvidence(LoopyImportanceSampling self)

Removes all the evidence entered into the network.

eraseAllTargets(LoopyImportanceSampling self)

Clear all previously defined targets (marginal and joint targets).

As a result, no posterior can be computed (since we can only compute the posteriors of the marginal or joint targets that have been added by the user).

eraseEvidence(LoopyImportanceSampling self, int id)

eraseEvidence(LoopyImportanceSampling self, str nodeName)

Remove the evidence, if any, corresponding to the node Id or name.

Parameters: id (int) – a node Id nodeName (str) – a node name gum.IndexError – If the node does not belong to the Bayesian network
eraseTarget(LoopyImportanceSampling self, int target)

eraseTarget(LoopyImportanceSampling self, str nodeName)

Remove, if existing, the marginal target.

Parameters: target (int) – a node Id nodeName (str) – a node name gum.IndexError – If one of the nodes does not belong to the Bayesian network gum.UndefinedElement – If the node Id is not in the Bayesian network
evidenceImpact(LoopyImportanceSampling self, int target, PyObject * evs)

evidenceImpact(LoopyImportanceSampling self, str target, Vector_string evs) -> Potential

Create a pyAgrum.Potential for P(target|evs) (for all instantiations of target and evs).

Parameters: target (set) – a set of target ids or names. evs (set) – a set of node ids or names.

Warning

if some evs are d-separated, they are not included in the Potential.

Returns: a Potential for P(targets|evs) pyAgrum.Potential
hardEvidenceNodes(LoopyImportanceSampling self)
Returns: the set of nodes with hard evidence set
hasEvidence(LoopyImportanceSampling self, int id)

hasEvidence(LoopyImportanceSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if some node(s) (or the one in parameters) have received evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasHardEvidence(LoopyImportanceSampling self, str nodeName)
Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received hard evidence bool gum.IndexError – If the node does not belong to the Bayesian network
hasSoftEvidence(LoopyImportanceSampling self, int id)

hasSoftEvidence(LoopyImportanceSampling self, str nodeName) -> bool

Parameters: id (int) – a node Id nodeName (str) – a node name True if the node has received soft evidence bool gum.IndexError – If the node does not belong to the Bayesian network
history(LoopyImportanceSampling self)
Returns: the scheme history tuple gum.OperationNotAllowed – If the scheme has not been performed or if verbosity is set to false
isTarget(LoopyImportanceSampling self, int variable)

isTarget(LoopyImportanceSampling self, str nodeName) -> bool

Parameters: variable (int) – a node Id nodeName (str) – a node name True if variable is a (marginal) target bool gum.IndexError – If the node does not belong to the Bayesian network gum.UndefinedElement – If node Id is not in the Bayesian network
makeInference(LoopyImportanceSampling self)

Perform the heavy computations needed to compute the targets’ posteriors

In a Junction tree propagation scheme, for instance, the heavy computations are those of the messages sent in the JT. This is precisely what makeInference should compute. Later, the computations of the posteriors can be done ‘lightly’ by multiplying and projecting those messages.

maxIter(LoopyImportanceSampling self)
Returns: the criterion on number of iterations int
maxTime(LoopyImportanceSampling self)
Returns: the timeout(in seconds) double
messageApproximationScheme(LoopyImportanceSampling self)
Returns: the approximation scheme message str
minEpsilonRate(LoopyImportanceSampling self)
Returns: the value of the minimal epsilon rate double
nbrEvidence(LoopyImportanceSampling self)
Returns: the number of evidence entered into the Bayesian network int
nbrHardEvidence(LoopyImportanceSampling self)
Returns: the number of hard evidence entered into the Bayesian network int
nbrIterations(LoopyImportanceSampling self)
Returns: the number of iterations int
nbrSoftEvidence(LoopyImportanceSampling self)
Returns: the number of soft evidence entered into the Bayesian network int
nbrTargets(LoopyImportanceSampling self)
Returns: the number of marginal targets int
periodSize(LoopyImportanceSampling self)
Returns: the number of samples between 2 stopping int gum.OutOfLowerBound – If p<1
posterior(LoopyImportanceSampling self, int var)

posterior(LoopyImportanceSampling self, str nodeName) -> Potential

Computes and returns the posterior of a node.

Parameters: var (int) – the node Id of the node for which we need a posterior probability nodeName (str) – the node name of the node for which we need a posterior probability a ref to the posterior probability of the node pyAgrum.Potential gum.UndefinedElement – If an element of nodes is not in targets
setEpsilon(LoopyImportanceSampling self, double eps)
Parameters: eps (double) – the epsilon we want to use gum.OutOfLowerBound – If eps<0
setEvidence(evidces)

Erase all the evidence, then apply addEvidence(key, value) for every pair in evidces.

Parameters: evidces (dict) – a dict of evidence
Raises: gum.InvalidArgument – If one value is not a value for the node
gum.InvalidArgument – If the size of a value is different from the domain size of the node
gum.FatalError – If one value is a vector of 0s
gum.UndefinedElement – If one node does not belong to the Bayesian network
setMaxIter(LoopyImportanceSampling self, int max)
Parameters: max (int) – the maximum number of iterations
Raises: gum.OutOfLowerBound – If max <= 1
setMaxTime(LoopyImportanceSampling self, double timeout)
Parameters: timeout (double) – stopping criterion on timeout, in seconds
Raises: gum.OutOfLowerBound – If timeout <= 0.0
setMinEpsilonRate(LoopyImportanceSampling self, double rate)
Parameters: rate (double) – the minimal epsilon rate
setPeriodSize(LoopyImportanceSampling self, int p)
Parameters: p (int) – the number of samples between two tests of the stopping criteria
Raises: gum.OutOfLowerBound – If p < 1
setTargets(targets)

Remove all the targets and add the ones in parameter.

Parameters: targets (set) – a set of targets
Raises: gum.UndefinedElement – If one target is not in the Bayes net
setVerbosity(LoopyImportanceSampling self, bool v)
Parameters: v (bool) – verbosity
setVirtualLBPSize(LoopyImportanceSampling self, double vlbpsize)
Parameters: vlbpsize (double) – the size of the virtual LBP
softEvidenceNodes(LoopyImportanceSampling self)
Returns: the set of nodes with soft evidence (set)
targets(LoopyImportanceSampling self)
Returns: the list of marginal targets (list)
updateEvidence(evidces)

Apply chgEvidence(key, value) for every pair in evidces; if a node has no evidence yet, addEvidence(key, value) is used instead.

Parameters: evidces (dict) – a dict of evidence
Raises: gum.InvalidArgument – If one value is not a value for the node
gum.InvalidArgument – If the size of a value is different from the domain size of the node
gum.FatalError – If one value is a vector of 0s
gum.UndefinedElement – If one node does not belong to the Bayesian network
verbosity(LoopyImportanceSampling self)
Returns: True if the verbosity is enabled (bool)