and indeed 1/4 of the mice have white fur (2 out of 8). This is an awkward way to calculate the probability, but it does show that the equation and the relationships between the probabilities work.
  
Once we accept the relationships between the probabilities, we can easily rearrange the system into the classical Bayesian equation
$$P(M|D) P(D) = P(D|M) P(M)\mbox{,}$$
$$P(M|D) = \frac{P(D|M) P(M)}{P(D)}\mbox{.}$$
Let's bring this into our example.
$$P(M_1|D) = \frac{P(D|M_1) P(M_1)}{P(D)}\mbox{.}$$
We saw above that $P(D) = P(M_1) P(D|M_1) + P(M_2) P(D|M_2)\mbox{,}$ which makes this a bit more intuitive when we substitute it in.
$$P(M_1|D) = \frac{P(D|M_1) P(M_1)}{P(D|M_1) P(M_1) + P(D|M_2) P(M_2)}$$
The equation is really a fraction out of the total of all possibilities (the probability of the data summed over all models).
$$P(M_1|D) = \frac{0.1 \times 0.05}{0.1 \times 0.05 + 1 \times 0.95} = \frac{0.005}{0.955} \approx 0.00524$$
  
This $P(M_1|D) \approx 0.00524$ is known as the posterior probability: the probability of our hypothesis after we have incorporated our prior and the newly observed data. So, the observation of a single +/+ offspring brought the probability of a $t$/+ father down from 5% to approximately 0.5%. Congratulations, you are doing Bayesian statistics.
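As a quick way to check or rerun this arithmetic, here is a minimal Python sketch of the two-model posterior calculation (the function ''posterior'' and its argument names are just illustrative, not part of the original example):
<code python>
# Posterior of model M1 under two competing models:
# P(M1|D) = P(D|M1) P(M1) / (P(D|M1) P(M1) + P(D|M2) P(M2))
def posterior(prior_m1, lik_m1, prior_m2, lik_m2):
    numerator = lik_m1 * prior_m1
    denominator = lik_m1 * prior_m1 + lik_m2 * prior_m2
    return numerator / denominator

# One +/+ offspring: P(D|M1) = 0.1, P(D|M2) = 1, priors 0.05 and 0.95.
print(posterior(0.05, 0.1, 0.95, 1.0))  # ~0.00524
</code>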
  
What if we observed a second +/+ offspring? The probability of the data is now $P(D|M_1) = 0.1^2 = 0.01$ under the first hypothesis and $P(D|M_2) = 1^2 = 1$ under the second.
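Reusing the illustrative ''posterior'' sketch above, the second observation just squares each per-offspring likelihood:
<code python>
# Two +/+ offspring: P(D|M1) = 0.1**2 = 0.01, P(D|M2) = 1**2 = 1.
print(posterior(0.05, 0.1**2, 0.95, 1.0**2))  # ~0.00053
</code>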
  
Let's say that we didn't see a heterozygous offspring; how many observations are enough to be reasonably confident that the male parent was a +/+ homozygote? The Bayes factor is one proposed way to address this. It is the ratio of the probabilities of the data under the two models (here $M_2$ is in the numerator because it has more support from the data).
$$K = \frac{P(D|M_2)}{P(D|M_1)}$$
So with the observation of a single +/+ offspring $K_1 = \frac{1}{0.1} = 10$.
Two +/+ offspring gives $K_2 = \frac{1}{0.01} = 100$.
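Because the offspring are independent, each additional +/+ observation multiplies the Bayes factor by another $\frac{1}{0.1} = 10$. A short Python sketch (again with illustrative names only) reproduces the values above:
<code python>
# K = P(D|M2) / P(D|M1) for n independent +/+ offspring,
# with per-offspring likelihoods of 1 under M2 and 0.1 under M1.
def bayes_factor(n_offspring, lik_m1_per_obs=0.1, lik_m2_per_obs=1.0):
    return (lik_m2_per_obs / lik_m1_per_obs) ** n_offspring

print(bayes_factor(1))  # 10.0, matching K_1
print(bayes_factor(2))  # 100.0, matching K_2
</code>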