Probability (Part-3)

  Probabilistic Inference : The computation of posterior probabilities for query propositions from observed evidence. In the simplest case, these posteriors are computed directly from the full Joint Probability Distribution.

               By Bayes' rule, P(A|B) = P(B|A) P(A) / P(B)

Here P(B|A) is the likelihood, P(A) is the prior probability, P(B) is the evidence (a normalizing term), and P(A|B) is the posterior probability.
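As a quick numeric sketch of Bayes' rule in Python, using invented numbers for a toy diagnostic test (A = "has disease", B = "test positive"):

```python
# Invented numbers for illustration only.
p_a = 0.01              # prior P(A)
p_b_given_a = 0.95      # likelihood P(B|A): test sensitivity
p_b_given_not_a = 0.05  # false-positive rate P(B|not A)

# Evidence P(B) by total probability.
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Posterior P(A|B) by Bayes' rule.
p_a_given_b = p_b_given_a * p_a / p_b
print(f"P(A|B) = {p_a_given_b:.4f}")  # ~0.1610
```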

General Inference Procedure : 

Let X be the query variable.

Let E be the set of evidence variables and e their observed values.

Let Y be the remaining unobserved (hidden) variables.

    P(X|e) = α P(X, e) = α Σ_y P(X, e, y)

where α is the normalization constant.
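A minimal sketch of this enumeration procedure in Python, assuming the full joint distribution is given explicitly as a table over Boolean variables (the probabilities below are invented for illustration):

```python
# Hypothetical joint distribution P(X, E, Y) over three Boolean variables,
# keyed by (x, e, y); the probabilities are invented and sum to 1.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def query(e_observed):
    """P(X | e): sum out the hidden variable Y, then normalize with alpha."""
    unnormalized = {x: sum(joint[(x, e_observed, y)] for y in (True, False))
                    for x in (True, False)}
    alpha = 1 / sum(unnormalized.values())
    return {x: alpha * p for x, p in unnormalized.items()}

print(query(True))  # {True: 0.6, False: 0.4}
```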

Bayesian Belief Network : It is a probabilistic graphical model. Its edges follow causality, meaning each edge points from a cause to its direct effect. The number of parameters of the full joint distribution is reduced by exploiting conditional independence: the joint factors into a product of each variable's distribution given only its parents.

Product Rule : Applies to two variables, expressing their joint probability as a conditional times a marginal.

                               P(A1A2) = P(A2|A1) P(A1)

Chain Rule : The generalized form of the product rule.

 P(A1A2...An) = P(An|A1A2...An-1) P(An-1|A1A2...An-2) ... P(A2|A1) P(A1)
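A small sketch that checks the chain rule numerically, assuming an invented joint table over three Boolean variables A1, A2, A3:

```python
from itertools import product

# Hypothetical joint table over (A1, A2, A3); the eight invented
# probabilities sum to 1.
probs = [0.10, 0.15, 0.05, 0.20, 0.08, 0.12, 0.06, 0.24]
joint = dict(zip(product([True, False], repeat=3), probs))

def marginal(fixed):
    """Sum the joint over all assignments matching `fixed` ({position: value})."""
    return sum(p for a, p in joint.items()
               if all(a[i] == v for i, v in fixed.items()))

a1, a2, a3 = True, True, True
p_a1 = marginal({0: a1})                                          # P(A1)
p_a2_given_a1 = marginal({0: a1, 1: a2}) / p_a1                   # P(A2|A1)
p_a3_given_a1a2 = joint[(a1, a2, a3)] / marginal({0: a1, 1: a2})  # P(A3|A1,A2)

# The chain-rule product reproduces the joint entry.
print(joint[(a1, a2, a3)], p_a3_given_a1a2 * p_a2_given_a1 * p_a1)  # both ~0.1
```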

Naive Bayes as a Bayesian Network



P(x1, ..., xn, C) = P(C) Π_{i=1}^{n} P(xi|C)

Here the number of parameters to learn is 2n + 1 (for Boolean variables): one for P(C), plus, for each of the n features, one for P(xi|C = true) and one for P(xi|C = false).
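A minimal sketch of this factorization, assuming Boolean features and invented parameter values; it also shows how the posterior P(C | x1, ..., xn) follows via the normalization constant α from the inference procedure above:

```python
# Naive Bayes with n = 3 Boolean features and a Boolean class C:
# 2n + 1 = 7 parameters (all values invented for illustration).
p_c = 0.3                            # 1 parameter: P(C = true)
p_x_given_c    = [0.8, 0.6, 0.7]     # n parameters: P(xi = true | C = true)
p_x_given_notc = [0.1, 0.3, 0.2]     # n parameters: P(xi = true | C = false)

def joint(xs, c):
    """P(x1, ..., xn, C) = P(C) * product over i of P(xi | C)."""
    prob = p_c if c else 1 - p_c
    for x, p in zip(xs, p_x_given_c if c else p_x_given_notc):
        prob *= p if x else 1 - p
    return prob

xs = [True, False, True]
num_true, num_false = joint(xs, True), joint(xs, False)
alpha = 1 / (num_true + num_false)
print(alpha * num_true)  # P(C = true | x1, x2, x3), ~0.8727
```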



