Bayesian Inference with Beta Prior in Coin Toss Experiment


Suppose you have a Beta(4, 4) prior distribution on the probability of a coin yielding a head. After spinning the coin ten times and observing fewer than 3 heads, the exact posterior density is calculated up to a proportionality constant. The posterior distribution is plotted and analyzed, showing how the prior beliefs are updated in light of the new evidence.


Uploaded on Aug 03, 2024



Presentation Transcript


  1. Exercise 2.1 Posterior inference: Suppose you have a Beta(4, 4) prior distribution on the probability θ that a coin will yield a head when spun in a specified manner. The coin is independently spun ten times, and heads appear fewer than 3 times. You are not told how many heads were seen, only that the number is less than 3. Calculate your exact posterior density (up to a proportionality constant) for θ and sketch it.

  2. Exercise 2.1 Suppose you have a Beta(4, 4) prior distribution on the probability θ that a coin will yield a head when spun in a specified manner.

# ex. 1
theta = seq(0, 1, 0.01)
prior_dens = dbeta(theta, 4, 4)
plot(theta, prior_dens)

  3. Exercise 2.1 Suppose you have a Beta(4, 4) prior distribution on the probability θ that a coin will yield a head when spun in a specified manner. The coin is independently spun ten times, and heads appear fewer than 3 times.

  4. Exercise 2.1 Suppose you have a Beta(4, 4) prior distribution on the probability θ that a coin will yield a head when spun in a specified manner. The coin is independently spun ten times, and heads appear fewer than 3 times. Calculate your exact posterior density (up to a proportionality constant) for θ and sketch it.

p(θ | y < 3) ∝ p(θ) · Pr(y < 3 | θ) = θ^3 (1−θ)^3 · Σ_{y=0}^{2} C(10, y) θ^y (1−θ)^{10−y}

  5. Exercise 2.1

# ex. 1
theta = seq(0.001, 1, 0.01)
prior_dens = dbeta(theta, 4, 4)
plot(theta, prior_dens)

# unnormalized posterior: terms for y = 0, 1, 2 heads out of 10
posterior_dens = theta^3*(1-theta)^13 + 10*theta^4*(1-theta)^12 + 45*theta^5*(1-theta)^11
plot(theta, posterior_dens, col = 2)
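The expanded polynomial in the R code above can be cross-checked numerically. Below is a small sketch in Python (an addition, not part of the original slides) that builds the unnormalized posterior directly as prior × likelihood and verifies it matches the expanded form:

```python
from math import comb

# Unnormalized posterior for Exercise 2.1:
# prior Beta(4, 4) ~ theta^3 (1-theta)^3, data: y < 3 heads in n = 10 spins.
def posterior_unnorm(theta):
    prior = theta**3 * (1 - theta)**3
    lik = sum(comb(10, y) * theta**y * (1 - theta)**(10 - y) for y in range(3))
    return prior * lik

# Expanded form used in the R code on the slide:
def posterior_expanded(theta):
    return (theta**3 * (1 - theta)**13
            + 10 * theta**4 * (1 - theta)**12
            + 45 * theta**5 * (1 - theta)**11)

# The two expressions agree on a grid of theta values.
for i in range(101):
    t = i / 100
    assert abs(posterior_unnorm(t) - posterior_expanded(t)) < 1e-12
```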

  6. Exercise 2.1 [figure: plot of the unnormalized posterior density]

  7. Exercise 2.5 Posterior distribution as a compromise between prior information and data: let y be the number of heads in n spins of a coin, whose probability of heads is θ.
(a) If your prior distribution for θ is uniform on the range [0, 1], derive your prior predictive distribution for y,
Pr(y = k) = ∫_0^1 Pr(y = k | θ) dθ,
for each k = 0, 1, . . . , n.
(b) Suppose you assign a Beta(α, β) prior distribution for θ, and then you observe y heads out of n spins. Show algebraically that your posterior mean of θ always lies between your prior mean, α/(α+β), and the observed relative frequency of heads, y/n.
(c) Show that, if the prior distribution on θ is uniform, the posterior variance of θ is always less than the prior variance.
(d) Give an example of a Beta(α, β) prior distribution and data y, n, in which the posterior variance of θ is higher than the prior variance.

  8. Exercise 2.5 a Let y be the number of heads in n spins of a coin, whose probability of heads is θ.
(a) If your prior distribution for θ is uniform on the range [0, 1], derive your prior predictive distribution for y,
Pr(y = k) = ∫_0^1 Pr(y = k | θ) dθ,
for each k = 0, 1, . . . , n.

Pr(y = k) = ∫_0^1 p(k, θ) dθ = ∫_0^1 Pr(y = k | θ) p(θ) dθ = ∫_0^1 Pr(y = k | θ) dθ
= ∫_0^1 C(n, k) θ^k (1−θ)^{n−k} dθ
= C(n, k) · Γ(k+1) Γ(n−k+1) / Γ(n+2)        [Beta distribution integral]
= C(n, k) · k! (n−k)! / (n+1)!              [Γ(m) = (m−1)! when m is an integer]
= [n! / (k! (n−k)!)] · k! (n−k)! / (n+1)!
= 1 / (n+1)
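The closed form 1/(n+1) can be sanity-checked by numerical integration. A quick sketch in Python (an addition, not part of the slides):

```python
from math import comb

# Numerical check: with a uniform prior on theta,
# Pr(y = k) = integral_0^1 C(n, k) theta^k (1 - theta)^(n - k) dtheta = 1/(n+1).
def prior_predictive(k, n, grid=10000):
    # composite midpoint rule on [0, 1]
    h = 1.0 / grid
    c = comb(n, k)
    total = 0.0
    for i in range(grid):
        t = (i + 0.5) * h
        total += c * t**k * (1 - t)**(n - k) * h
    return total

n = 10
for k in range(n + 1):
    assert abs(prior_predictive(k, n) - 1 / (n + 1)) < 1e-5
```

The same probability 1/(n+1) falls out for every k, i.e. a uniform prior on θ makes every head count equally likely a priori.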

  9. Exercise 2.5 b Let y be the number of heads in n spins of a coin, whose probability of heads is θ.
b. Suppose you assign a Beta(α, β) prior distribution for θ, and then you observe y heads out of n spins. Show algebraically that your posterior mean of θ always lies between your prior mean, α/(α+β), and the observed relative frequency of heads, y/n.

p(θ | y) ∝ θ^y (1−θ)^{n−y} · [Γ(α+β) / (Γ(α) Γ(β))] θ^{α−1} (1−θ)^{β−1} ∝ θ^{α+y−1} (1−θ)^{β+n−y−1}
so θ | y ~ Beta(α+y, β+n−y), and the posterior mean of θ is E(θ | y) = (α+y) / (α+β+n).

  10. Exercise 2.5 b Let y be the number of heads in n spins of a coin, whose probability of heads is θ.
b. Suppose you assign a Beta(α, β) prior distribution for θ, and then you observe y heads out of n spins. Show algebraically that your posterior mean of θ always lies between your prior mean, α/(α+β), and the observed relative frequency of heads, y/n.

posterior mean of θ: E(θ | y) = (α+y) / (α+β+n)
We'll write (α+y)/(α+β+n) = λ · α/(α+β) + (1−λ) · y/n, and show that 0 ≤ λ ≤ 1.
Taking λ = (α+β)/(α+β+n), which is always between 0 and 1, gives
λ · α/(α+β) + (1−λ) · y/n = α/(α+β+n) + [n/(α+β+n)] · y/n = (α+y)/(α+β+n).
Hence the posterior mean is a weighted average of the prior mean and the data, so it lies between them.
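The weighted-average identity above is easy to verify numerically. A brief sketch in Python (an addition, not part of the slides), checked on a few arbitrary (α, β, y, n) combinations:

```python
# The posterior mean (alpha + y)/(alpha + beta + n) is a weighted average of
# the prior mean alpha/(alpha + beta) and the data mean y/n, with weight
# lam = (alpha + beta)/(alpha + beta + n).
def check(alpha, beta, y, n):
    post_mean = (alpha + y) / (alpha + beta + n)
    prior_mean = alpha / (alpha + beta)
    data_mean = y / n
    lam = (alpha + beta) / (alpha + beta + n)
    # weighted-average identity
    assert abs(post_mean - (lam * prior_mean + (1 - lam) * data_mean)) < 1e-12
    # hence the posterior mean lies between the prior mean and y/n
    lo, hi = sorted((prior_mean, data_mean))
    assert lo <= post_mean <= hi

# a few arbitrary example configurations
for alpha, beta, y, n in [(4, 4, 2, 10), (1, 1, 7, 10), (2, 5, 9, 12)]:
    check(alpha, beta, y, n)
```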

  11. Exercise 2.5 c Show that, if the prior distribution on θ is uniform, the posterior variance of θ is always less than the prior variance.

The uniform distribution is the same as Beta(1, 1), so the prior variance = 1/12.
p(θ | y) ∝ θ^y (1−θ)^{n−y}, i.e. θ | y ~ Beta(y+1, n−y+1).
Posterior variance:
(y+1)(n−y+1) / ((n+2)^2 (n+3)) = [(y+1)/(n+2)] · [(n−y+1)/(n+2)] · [1/(n+3)]
The first two factors sum to 1, so their product is at most 1/4, hence the posterior variance is at most (1/4) · (1/3) = 1/12, with strict inequality once n ≥ 1.

  12. Exercise 2.5 c
Proof that two factors which sum to 1 have product at most 1/4:
a + b = 1, so b = 1 − a and ab = a(1 − a).
To maximize: d/da (a − a^2) = 1 − 2a = 0 ⟹ a = 1/2, b = 1/2, so ab ≤ 1/4.
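The bound can also be checked by brute force. A short sketch in Python (an addition, not part of the slides) that sweeps all (y, n) pairs up to a modest n and confirms the Beta(y+1, n−y+1) posterior variance stays below the uniform prior's 1/12:

```python
# With a uniform (Beta(1, 1)) prior, the posterior is Beta(y+1, n-y+1); its
# variance should never reach the prior variance 1/12 once n >= 1.
def beta_var(a, b):
    # variance of a Beta(a, b) distribution: ab / ((a+b)^2 (a+b+1))
    return a * b / ((a + b) ** 2 * (a + b + 1))

prior_var = beta_var(1, 1)  # uniform prior, variance 1/12
assert abs(prior_var - 1 / 12) < 1e-15

for n in range(1, 30):
    for y in range(n + 1):
        assert beta_var(y + 1, n - y + 1) < prior_var
```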

  13. Exercise 2.5 c [slide repeats the derivation from slide 11]

  14. Exercise 2.5 d Give an example of a Beta(α, β) prior distribution and data y, n, in which the posterior variance of θ is higher than the prior variance.
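The transcript does not preserve the worked answer for part (d), so here is one possible example (an illustration chosen for this write-up, not taken from the slides): a skewed prior contradicted by the data, e.g. Beta(1, 5) followed by a single spin that lands heads.

```python
# Example where the posterior variance exceeds the prior variance: a skewed
# Beta(1, 5) prior, then y = 1 head in n = 1 spin. The surprising observation
# pulls the posterior toward the middle of [0, 1], increasing the variance.
def beta_var(a, b):
    # variance of a Beta(a, b) distribution: ab / ((a+b)^2 (a+b+1))
    return a * b / ((a + b) ** 2 * (a + b + 1))

alpha, beta, y, n = 1, 5, 1, 1
prior_var = beta_var(alpha, beta)             # Beta(1, 5): 5/252 ~ 0.0198
post_var = beta_var(alpha + y, beta + n - y)  # Beta(2, 5): 10/392 ~ 0.0255
assert post_var > prior_var
```

Intuitively, the prior concentrates mass near θ = 0, and an observed head moves the posterior toward the center of the interval, where a Beta distribution is more spread out.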

