Wednesday 17 June 2015

Inference in Dynamic Bayesian Networks

Today, I will be talking about how inference works in Dynamic Bayesian Networks (DBNs).
We could apply the following methods:
1) Naive Method:- One option is to unroll the Bayesian network for as many time slices as we'd like and then apply the inference methods used for a standard Bayesian network. However, this method leads to exponentially large graphs, thereby increasing the time that inference takes.
(It was a surprise to me that we could do so, but it leads to a lot of problems.)
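To make the blow-up concrete, here is a minimal sketch of naive inference on the unrolled network, using a hypothetical two-state model (all the parameter values are made up for illustration). The sum ranges over all 2^T hidden-state sequences, so the cost is exponential in the number of slices.

```python
from itertools import product

# Hypothetical 2-state model, for illustration only.
prior = [0.6, 0.4]                 # P(x_1)
trans = [[0.7, 0.3], [0.4, 0.6]]   # P(x_t | x_{t-1})
emit  = [[0.9, 0.1], [0.2, 0.8]]   # P(y_t | x_t)

def joint(states, obs):
    """P(x_1..T, y_1..T) for one full assignment of hidden states."""
    p = prior[states[0]] * emit[states[0]][obs[0]]
    for t in range(1, len(states)):
        p *= trans[states[t - 1]][states[t]] * emit[states[t]][obs[t]]
    return p

def brute_force_likelihood(obs):
    # Sums over all 2**T hidden sequences: exponential in T.
    return sum(joint(s, obs) for s in product([0, 1], repeat=len(obs)))

print(brute_force_likelihood([0, 1, 1]))
```

Even for this tiny model, T slices mean 2^T terms in the sum, which is exactly the exponential cost the naive method suffers from.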
2) Forward and Backward Algorithm:- We could apply this algorithm, but it applies exclusively to hidden Markov models (HMMs). Using it on a DBN involves collapsing the variables of each slice into a single state space, which blows up the state size, again leading to huge complexity.
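For reference, the forward pass itself is cheap once the model is in HMM form; the cost comes from the collapsed state space being huge. A minimal sketch with hypothetical toy parameters:

```python
# Hypothetical 2-state HMM, for illustration only.
prior = [0.6, 0.4]                 # P(x_1)
trans = [[0.7, 0.3], [0.4, 0.6]]   # P(x_t | x_{t-1})
emit  = [[0.9, 0.1], [0.2, 0.8]]   # P(y_t | x_t)

def forward(obs, prior, trans, emit):
    """Forward pass: alpha_t(j) = P(y_1..t, x_t = j)."""
    n = len(prior)
    alpha = [prior[j] * emit[j][obs[0]] for j in range(n)]
    for y in obs[1:]:
        # O(n^2) per step; n is the collapsed state-space size,
        # which for a DBN slice with k binary nodes is 2^k.
        alpha = [sum(alpha[i] * trans[i][j] for i in range(n)) * emit[j][y]
                 for j in range(n)]
    return alpha

print(sum(forward([0, 1, 1], prior, trans, emit)))  # P(y_1..T)
```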
To reduce the complexity of inference, the methods are as follows:-
We could compute the belief state by recursive estimation.
Let's assume that the belief state at time t is B(X_t) = P(X_t | y_1:t), the distribution over the hidden state given all evidence seen so far.

Then we could propagate the state forward by the following recursion:

B(X_t+1) ∝ P(y_t+1 | X_t+1) · Σ_xt P(X_t+1 | x_t) B(x_t)

Applying this recursion repeatedly, one time slice at a time, is exactly the filtering algorithm.
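The recursion above can be sketched as a predict–update step, again with hypothetical toy parameters (any names and values below are assumptions for illustration, not part of the original post):

```python
def normalize(v):
    z = sum(v)
    return [x / z for x in v]

def filter_step(belief, y, trans, emit):
    """One predict-update step of recursive filtering."""
    n = len(belief)
    # Predict: P(x_t+1 | y_1:t) = sum_i P(x_t+1 | x_t = i) * B(i)
    predicted = [sum(trans[i][j] * belief[i] for i in range(n)) for j in range(n)]
    # Update: weight by the evidence likelihood and renormalize.
    return normalize([emit[j][y] * predicted[j] for j in range(n)])

# Hypothetical 2-state model, for illustration only.
prior = [0.6, 0.4]
trans = [[0.7, 0.3], [0.4, 0.6]]
emit  = [[0.9, 0.1], [0.2, 0.8]]

obs = [0, 1, 1]
belief = normalize([prior[j] * emit[j][obs[0]] for j in range(2)])
for y in obs[1:]:
    belief = filter_step(belief, y, trans, emit)
print(belief)  # B(X_T) = P(X_T | y_1:T)
```

Only the current belief state is kept in memory, so the cost per time step is constant rather than growing with the length of the unrolled network.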
Other algorithms are the Frontier Algorithm and the Interface Algorithm, which are more popular for inference at a large scale because they operate on a sliding set of nodes rather than the fully unrolled network.
Apart from that, the most probable path can be computed with a Viterbi-style dynamic program, and expectation maximization (EM) algorithms may help when the model parameters themselves have to be learned.
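For the most probable path specifically, the usual tool is a Viterbi-style dynamic program. A minimal sketch with a hypothetical two-state model (parameters invented for illustration):

```python
# Hypothetical 2-state model, for illustration only.
prior = [0.6, 0.4]
trans = [[0.7, 0.3], [0.4, 0.6]]
emit  = [[0.9, 0.1], [0.2, 0.8]]

def viterbi(obs, prior, trans, emit):
    """Most probable hidden path via dynamic programming."""
    n = len(prior)
    delta = [prior[j] * emit[j][obs[0]] for j in range(n)]
    back = []  # back[t][j]: best predecessor of state j at step t+1
    for y in obs[1:]:
        ptr, new = [], []
        for j in range(n):
            best_i = max(range(n), key=lambda i: delta[i] * trans[i][j])
            new.append(delta[best_i] * trans[best_i][j] * emit[j][y])
            ptr.append(best_i)
        delta = new
        back.append(ptr)
    # Backtrack from the best final state.
    path = [max(range(n), key=lambda j: delta[j])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi([0, 1, 1], prior, trans, emit))
```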
This is what I am planning to implement in a couple of weeks.

