6.874/6.807/7.90 Computational functional genomics, lecture 17 (Jaakkola)

Causal Bayesian Networks

[Figure 1: Simple example. A four-node network over Ste7 (1), Kss1 (2), Fus3 (3), and Ste12 (4).]

While Bayesian networks should typically be viewed as acausal, it is possible to impose a causal interpretation on these models with additional care. What we get as a result is a probabilistic extension of the qualitative causal models. The extension is useful since the qualitative models discussed last time are limited in their ability to quantify interactions, e.g., to express that Ste7 may only have a certain probability of activating Kss1.

The modification necessary for maintaining a causal interpretation is exactly analogous to the qualitative models. Suppose we intervene and set the value of one variable, say we knock out Kss1. Then the mechanism through which Kss1 is activated by Ste7 can no longer be used for inference (it was not responsible for the value set in the intervention). Graphically, this just means deleting all the arrows into the variable(s) set in the intervention. More formally, the probability model associated with the graph in Figure 1 factors according to

    P(x1) P(x2|x1) P(x3|x1) P(x4|x2, x3)    (1)

If we now intervene with set(x2 = -1), i.e., knock out Kss1, then the probability model over the remaining variables is simply

    P(x1) P(x3|x1) P(x4|x2 = -1, x3)    (2)

Note that the parameters in the conditional tables are exactly as before; the only difference is that one of the conditionals is missing.

The estimation of a Bayesian network otherwise proceeds as before. For example, suppose we have two observed expression patterns, one arising from a causal intervention (knock-out), the other one not.
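A minimal sketch of the truncated factorization in equations (1)-(2). The conditional probability values below are hypothetical, purely for illustration (the lecture gives no numbers); variables are coded 1 (active) / -1 (knocked out) as in the text, with 0 also allowed in data.

```python
def joint(x, cpts, parents, intervened=()):
    """P(x) under the graph, dropping the factor for each intervened variable.

    x          : dict mapping variable name -> value
    cpts       : dict mapping variable name -> function(value, parent_values) -> prob
    parents    : dict mapping variable name -> tuple of parent variable names
    intervened : variables whose values were set by intervention
    """
    p = 1.0
    for v in cpts:
        if v in intervened:
            continue  # this mechanism was not responsible for the intervened value
        pa = tuple(x[u] for u in parents[v])
        p *= cpts[v](x[v], pa)
    return p

# Hypothetical CPTs for the network of Figure 1 (values chosen arbitrarily):
cpts = {
    "x1": lambda v, pa: 0.7 if v == 1 else 0.3,
    "x2": lambda v, pa: (0.8 if v == 1 else 0.2) if pa[0] == 1
                        else (0.1 if v == 1 else 0.9),
    "x3": lambda v, pa: (0.6 if v == 1 else 0.4) if pa[0] == 1
                        else (0.2 if v == 1 else 0.8),
    # x4 is likely active exactly when both x2 and x3 are active:
    "x4": lambda v, pa: 0.9 if (v == 1) == (pa[0] == 1 and pa[1] == 1) else 0.1,
}
parents = {"x1": (), "x2": ("x1",), "x3": ("x1",), "x4": ("x2", "x3")}

x = {"x1": 1, "x2": -1, "x3": 1, "x4": -1}
p_obs = joint(x, cpts, parents)                      # equation (1): all four factors
p_do = joint(x, cpts, parents, intervened=("x2",))   # equation (2): P(x2|x1) dropped
```

Note that `p_do` differs from `p_obs` only by the missing factor P(x2 = -1 | x1 = 1); the remaining conditional tables are untouched, exactly as the text states.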
                           x1    x2    x3    x4
    D1                      1     1     1     1
    D2 (set(x2 = -1))       0    -1     0     0

To estimate the parameters we just write down the log-likelihood of the observed data while taking into account that the model may have to be modified slightly when incorporating causal measurements: log P(...
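The maximum-likelihood counting this leads to can be sketched as follows. The encoding is hypothetical (not from the lecture), but the key point matches the text: an intervened variable contributes no count to its own conditional table, while it still supplies a parent value when counting its children.

```python
from collections import defaultdict

# Parent structure of the network in Figure 1.
parents = {"x1": (), "x2": ("x1",), "x3": ("x1",), "x4": ("x2", "x3")}

# Each case: (assignment, set of intervened variables), matching the table above.
data = [
    ({"x1": 1, "x2": 1, "x3": 1, "x4": 1}, set()),      # D1: purely observational
    ({"x1": 0, "x2": -1, "x3": 0, "x4": 0}, {"x2"}),    # D2: set(x2 = -1)
]

counts = defaultdict(float)  # (var, parent_values, value) -> count
norms = defaultdict(float)   # (var, parent_values) -> count

for x, intervened in data:
    for v in parents:
        if v in intervened:
            continue  # drop the conditional for the intervened variable
        pa = tuple(x[u] for u in parents[v])
        counts[(v, pa, x[v])] += 1.0
        norms[(v, pa)] += 1.0

def mle(v, pa, val):
    """Maximum-likelihood estimate of P(x_v = val | parents = pa)."""
    return counts[(v, pa, val)] / norms[(v, pa)]
```

With these two cases, x1 gets counts from both D1 and D2 (it was never intervened on), whereas the conditional for x2 sees only D1: the knock-out value in D2 tells us nothing about the mechanism P(x2 | x1).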