Understanding Probability Density Functions for Continuous Random Variables

Probability density functions (PDFs) describe the likelihood of events for continuous random variables. Unlike discrete probability mass functions, PDFs work with integration rather than summation, and the total probability integrates to 1. A key subtlety is that every single outcome has probability 0, even when it is possible; the density instead gives the relative chances of the random variable landing near one value versus another.


Presentation Transcript


  1. Continuous Zoo CSE 312 Spring 24 Lecture 16

  2. Let's start with the pmf
  For discrete random variables, we defined the pmf: p_X(k) = P(X = k).
  We can't have a pmf quite like we did for discrete random variables. Let X be a random real number between 0 and 1. Then P(X = .1) would have to be something like 1/∞.
  Let's try to maintain as many rules as we can.
  Discrete: p_X(k) ≥ 0 and Σ_k p_X(k) = 1.
  Continuous: f_X(x) ≥ 0 and ∫_{-∞}^{∞} f_X(x) dx = 1.
  Use f_X instead of p_X to remember it's different.

  3. The probability density function
  For continuous random variables, the analogous object is the probability density function; we write f_X(x) instead of p_X(x).
  Idea: make it work right for events, since single outcomes don't make sense. Integrating is analogous to summing:
  P(a ≤ X ≤ b) = ∫_a^b f_X(x) dx

  5. PDF for uniform
  Let X be a uniform real number between 0 and 1. What should f_X(x) be to make all those events integrate to the right values?
  f_X(x) = 1 if 0 ≤ x ≤ 1, and f_X(x) = 0 if x < 0 or x > 1.
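
As a quick sanity check (a Python sketch, not part of the original slides), this density integrates to 1 and assigns each subinterval of [0, 1] a probability equal to its length:

```python
# Minimal check of the Unif(0,1) density described above.
from scipy.integrate import quad

def f_X(x):
    """PDF of a uniform real number between 0 and 1."""
    return 1.0 if 0 <= x <= 1 else 0.0

total, _ = quad(f_X, -1, 2, points=[0, 1])   # total probability: ~1.0
prob, _ = quad(f_X, 0.2, 0.7)                # P(0.2 <= X <= 0.7): ~0.5
print(total, prob)
```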

  6. Probability Density Function
  So what is P(X = .1)? The density gives f_X(.1) = 1, but the number that best represents P(X = .1) is 0. The probability is not the same thing as f_X(x).
  For continuous probability spaces: impossible events have probability 0, but some probability-0 events might be possible.
  So what is f_X(x)???

  7. Using the PDF
  Let's look at a different pdf. Compare the events X ≈ .2 and X ≈ .5, meaning (.2 - ε/2 ≤ X ≤ .2 + ε/2) and (.5 - ε/2 ≤ X ≤ .5 + ε/2).
  What will the pdf give? P(X ≈ .2) = ∫_{.2-ε/2}^{.2+ε/2} f_X(x) dx ≈ ε · f_X(.2).
  What happens if we look at the ratio P(X ≈ .2) / P(X ≈ .5)?

  8. Using the PDF
  The ratio of those two probabilities is
  P(X ≈ .2) / P(X ≈ .5) = P(.2 - ε/2 ≤ X ≤ .2 + ε/2) / P(.5 - ε/2 ≤ X ≤ .5 + ε/2) ≈ (ε · f_X(.2)) / (ε · f_X(.5)) = f_X(.2) / f_X(.5)
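
A numeric illustration of this ratio idea (a sketch; the density f(x) = 2x on [0, 1] is a made-up example, since the slide doesn't specify a pdf): the probability of a width-ε window around a point is roughly ε times the density there, so the ratio of window probabilities is roughly the ratio of densities.

```python
from scipy.integrate import quad

def f_X(x):
    # Hypothetical density for illustration: f(x) = 2x on [0, 1].
    return 2 * x if 0 <= x <= 1 else 0.0

eps = 1e-4
p2, _ = quad(f_X, 0.2 - eps / 2, 0.2 + eps / 2)   # P(X within eps/2 of .2)
p5, _ = quad(f_X, 0.5 - eps / 2, 0.5 + eps / 2)   # P(X within eps/2 of .5)
print(p2 / p5)                # ~0.4
print(f_X(0.2) / f_X(0.5))    # 0.4, the ratio of densities
```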

  9. So what's the pdf?
  It's the function that, when integrated over an event, gives the probability of that event.
  Equivalently, it's the function such that:
  - integrating over all real numbers gives 1;
  - comparing f_X(z) and f_X(z') gives the relative chances of X being near z or near z'.

  10. CDFs

  11. What's a CDF?
  The Cumulative Distribution Function F_X(t) = P(X ≤ t), analogous to the CDF for discrete variables.
  F_X(t) = P(X ≤ t) = ∫_{-∞}^{t} f_X(x) dx
  So how do I get from the CDF to the PDF? Take the derivative!
  (d/dt) F_X(t) = (d/dt) ∫_{-∞}^{t} f_X(x) dx = f_X(t)
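
A sketch of this derivative relationship, checked numerically for the Unif(0, 1) CDF (not from the slides): a central finite difference of F_X recovers f_X at interior points.

```python
def F_X(t):
    """CDF of Unif(0,1)."""
    if t < 0:
        return 0.0
    if t > 1:
        return 1.0
    return t

h = 1e-6
t = 0.3
pdf_estimate = (F_X(t + h) - F_X(t - h)) / (2 * h)   # central difference
print(pdf_estimate)   # ~1.0, which is f_X(0.3) for Unif(0,1)
```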

  12. Comparing Discrete and Continuous
  Probability 0: discrete: equivalent to impossible; continuous: all impossible events have probability 0, but not conversely.
  Relative chances: discrete: PMF, p_X(k) = P(X = k); continuous: PDF, f_X(x) gives chances relative to f_X(x').
  Events: discrete: sum the PMF over the event to get its probability; continuous: integrate the PDF over the event.
  Converting with the CDF: discrete: sum up the PMF to get the CDF, look for breakpoints in the CDF to get the PMF; continuous: integrate the PDF to get the CDF, differentiate the CDF to get the PDF.
  E[X]: discrete: Σ_x x · p_X(x); continuous: ∫_{-∞}^{∞} x · f_X(x) dx.
  E[g(X)]: discrete: Σ_x g(x) · p_X(x); continuous: ∫_{-∞}^{∞} g(x) · f_X(x) dx.
  Var(X): discrete: E[X²] - E[X]²; continuous: E[X²] - E[X]² = ∫_{-∞}^{∞} x² f_X(x) dx - E[X]².

  13. What about expectation?
  For a random variable X, we define: E[X] = ∫_{-∞}^{∞} x · f_X(x) dx
  Just replace summing over the pmf with integrating the pdf. It still represents the average value of X.
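
A sketch of this definition in action, assuming X ~ Unif(0, 1): integrating x · f_X(x) gives 1/2, matching the average of many simulated draws.

```python
import random
from scipy.integrate import quad

# E[X] for X ~ Unif(0,1): integrate x * f_X(x), with f_X(x) = 1 on [0, 1].
expectation, _ = quad(lambda x: x * 1.0, 0, 1)
print(expectation)                          # 0.5

# Compare against a simulated average.
samples = [random.uniform(0, 1) for _ in range(100_000)]
print(sum(samples) / len(samples))          # ~0.5
```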

  14. Expectation of a function
  For any function g and any continuous random variable X:
  E[g(X)] = ∫_{-∞}^{∞} g(x) · f_X(x) dx
  Again, analogous to the discrete case; just replace summation with integration and the pmf with the pdf.
  We're going to treat this as a definition. Technically, it's really a theorem: since f_X() is the pdf of X and only gives relative likelihoods for X, we need a proof to guarantee it works for g(X). It is sometimes called the Law of the Unconscious Statistician.
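
A sketch checking LOTUS on a concrete case, assuming X ~ Unif(0, 1) and g(x) = x²: the integral ∫ g(x) f_X(x) dx agrees with the simulated mean of g(X).

```python
import random
from scipy.integrate import quad

def g(x):
    return x ** 2

# E[g(X)] via LOTUS for X ~ Unif(0,1), where f_X(x) = 1 on [0, 1].
lotus_value, _ = quad(lambda x: g(x) * 1.0, 0, 1)
print(lotus_value)                          # 1/3

# E[g(X)] estimated by simulating X and averaging g(X).
samples = [g(random.uniform(0, 1)) for _ in range(100_000)]
print(sum(samples) / len(samples))          # ~0.333
```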

  15. Linearity of Expectation
  Still true! E[aX + bY + c] = a·E[X] + b·E[Y] + c for all X, Y, even if they're continuous.
  We won't show the full proof; for just E[aX + c], it's:
  E[aX + c] = ∫_{-∞}^{∞} (ax + c) f_X(x) dx
            = a ∫_{-∞}^{∞} x f_X(x) dx + c ∫_{-∞}^{∞} f_X(x) dx
            = a·E[X] + c · 1 = a·E[X] + c

  16. Variance
  No surprises here:
  Var(X) = E[(X - E[X])²] = ∫_{-∞}^{∞} (x - E[X])² f_X(x) dx

  17. Let's calculate an expectation
  Let X be a uniform random number between a and b.
  E[X] = ∫_{-∞}^{∞} x f_X(x) dx
       = ∫_{-∞}^{a} x · 0 dx + ∫_a^b x · 1/(b - a) dx + ∫_b^{∞} x · 0 dx
       = 0 + [x² / (2(b - a))]_{x=a}^{x=b} + 0
       = (b² - a²) / (2(b - a)) = (a + b)/2
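
The same calculation can be checked symbolically (a sketch using sympy; a and b are the endpoints from the slide):

```python
import sympy as sp

a, b, x = sp.symbols('a b x', positive=True)

# E[X] = integral of x * 1/(b - a) over [a, b]
EX = sp.integrate(x / (b - a), (x, a, b))
print(sp.simplify(EX))    # a/2 + b/2, i.e. (a + b)/2
```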

  18. What about E[X²]?
  Let X ~ Unif(a, b). What about E[X²]?
  E[X²] = ∫_{-∞}^{∞} x² f_X(x) dx
        = ∫_{-∞}^{a} x² · 0 dx + ∫_a^b x² · 1/(b - a) dx + ∫_b^{∞} x² · 0 dx
        = 0 + [x³ / (3(b - a))]_{x=a}^{x=b} + 0
        = (b³ - a³) / (3(b - a)) = (a² + ab + b²)/3

  19. Let's assemble the variance
  Var(X) = E[X²] - E[X]²
         = (a² + ab + b²)/3 - ((a + b)/2)²
         = 4(a² + ab + b²)/12 - 3(a² + 2ab + b²)/12
         = (a² - 2ab + b²)/12
         = (b - a)²/12
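
Continuing the symbolic sketch: plugging E[X²] and E[X] into Var(X) = E[X²] - E[X]² reproduces (b - a)²/12.

```python
import sympy as sp

a, b = sp.symbols('a b', positive=True)
EX = (a + b) / 2                    # from slide 17
EX2 = (a**2 + a*b + b**2) / 3       # from slide 18
variance = sp.factor(EX2 - EX**2)
print(variance)                     # (a - b)**2/12, the same as (b - a)**2/12
```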

  20. Continuous Uniform Distribution
  X ~ Unif(a, b) (uniform real number between a and b)
  PDF: f_X(x) = 1/(b - a) if a ≤ x ≤ b, and 0 otherwise.
  CDF: F_X(x) = 0 if x < a; (x - a)/(b - a) if a ≤ x ≤ b; 1 if x > b.
  E[X] = (a + b)/2
  Var(X) = (b - a)²/12
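
These summary formulas can be cross-checked against scipy's built-in uniform distribution (a sketch; scipy parameterizes Unif(a, b) as uniform(loc=a, scale=b - a)):

```python
from scipy.stats import uniform

a, b = 2.0, 5.0
X = uniform(loc=a, scale=b - a)          # Unif(2, 5)

print(X.mean(), (a + b) / 2)             # both 3.5
print(X.var(), (b - a) ** 2 / 12)        # both 0.75
print(X.cdf(3.0), (3.0 - a) / (b - a))   # both ~0.333
```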

  21. Continuous Zoo
  X ~ Unif(a, b): f_X(x) = 1/(b - a) for a ≤ x ≤ b; E[X] = (a + b)/2; Var(X) = (b - a)²/12.
  X ~ Exp(λ): f_X(x) = λ e^{-λx} for x ≥ 0; E[X] = 1/λ; Var(X) = 1/λ².
  X ~ N(μ, σ²): f_X(x) = (1/(σ√(2π))) e^{-(x-μ)²/(2σ²)}; E[X] = μ; Var(X) = σ².
  It's a smaller zoo, but it's just as much fun!

  22. Exponential Random Variable
  Like a geometric random variable, but in continuous time. How long do we wait until an event happens? (instead of "how many flips until a heads"), where waiting doesn't make the event happen any sooner.
  Geometric: P(X = k + 1 | X > 1) = P(X = k). When the first flip is tails, the coin doesn't remember it came up tails; you've made no progress.
  For an exponential random variable: P(X ≥ t + 1 | X ≥ 1) = P(X ≥ t).

  23. Exponential random variable
  If you take a Poisson random variable and ask "what's the time until the next event," you get an exponential distribution!
  Let's find the CDF for an exponential. Let X ~ Exp(λ) be the time until the first event, when we see an average of λ events per time unit.
  What's P(X > t)? What Poisson are we waiting on, and what event for it tells you that X > t?

  24. Exponential random variable
  What Poisson are we waiting on? For Y ~ Poi(λt), the event "X > t" is exactly the event "no arrivals by time t," i.e. Y = 0.
  P(X > t) = P(Y = 0) = (λt)^0 e^{-λt} / 0! = e^{-λt}
  So F_X(t) = P(X ≤ t) = 1 - e^{-λt} for t ≥ 0, and F_X(t) = 0 for t < 0.
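
A simulation sketch of this Poisson connection (λ = 2 and t = 0.7 are values chosen just for illustration): the fraction of exponential samples exceeding t, the fraction of Poi(λt) samples equal to 0, and e^{-λt} should all agree.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, n = 2.0, 0.7, 200_000

# P(X > t) for X ~ Exp(lam), estimated from samples.
exp_samples = rng.exponential(scale=1 / lam, size=n)
print(np.mean(exp_samples > t))

# P(Y = 0) for Y ~ Poi(lam * t): "no events have happened by time t".
poisson_samples = rng.poisson(lam * t, size=n)
print(np.mean(poisson_samples == 0))

# Both should be close to the closed form e^{-lam * t}.
print(np.exp(-lam * t))
```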

  25. Find the density
  We know the CDF: F_X(t) = P(X ≤ t) = 1 - e^{-λt}. What's the density f_X(t)?

  26. Find the density
  f_X(t) = (d/dt)[1 - e^{-λt}] = 0 - (-λ e^{-λt}) = λ e^{-λt}
  For t ≥ 0 it's that expression; for t < 0 it's just 0.

  27. Exponential PDF
  [Plot of the exponential pdf for three rates: red λ = 5, blue λ = 2, purple λ = 0.5]

  28. Memorylessness
  P(X ≥ t + 1 | X ≥ 1) = P(X ≥ t + 1 and X ≥ 1) / P(X ≥ 1) = P(X ≥ t + 1) / P(X ≥ 1)
                       = [1 - (1 - e^{-λ(t+1)})] / [1 - (1 - e^{-λ·1})] = e^{-λ(t+1)} / e^{-λ} = e^{-λt}
  What about P(X ≥ t) (without conditioning on the first step)? 1 - (1 - e^{-λt}) = e^{-λt}. It's the same!!!
  More generally, for an exponential rv X: P(X ≥ s + t | X ≥ s) = P(X ≥ t).
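
A simulation sketch of memorylessness (λ = 2, s = 1, t = 0.5 are arbitrary illustration values): among samples that already exceed s, the fraction exceeding s + t matches the unconditional fraction exceeding t.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, s, t = 2.0, 1.0, 0.5
x = rng.exponential(scale=1 / lam, size=500_000)

survivors = x[x >= s]                       # samples with X >= s
conditional = np.mean(survivors >= s + t)   # estimates P(X >= s + t | X >= s)
unconditional = np.mean(x >= t)             # estimates P(X >= t)
print(conditional, unconditional, np.exp(-lam * t))   # all ~0.368
```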

  29. Side note
  I hid a trick in that algebra: P(X ≥ 1) = 1 - P(X < 1) = 1 - P(X ≤ 1).
  The first step is the complementary law. The second step uses that ∫_1^1 f_X(x) dx = 0.
  In general, for continuous random variables we can switch ≤ and < without anything changing. We can't make those switches for discrete random variables.

  30. Expectation of an exponential
  Don't worry about the derivation (it's here if you're interested); you're not responsible for the derivation, just the value.
  Let X ~ Exp(λ).
  E[X] = ∫_{-∞}^{∞} x f_X(x) dx = ∫_0^{∞} x λ e^{-λx} dx
  Integrate by parts with u = x and dv = λ e^{-λx} dx (so du = dx and v = -e^{-λx}):
  ∫ x λ e^{-λx} dx = -x e^{-λx} + ∫ e^{-λx} dx = -x e^{-λx} - (1/λ) e^{-λx}
  Definite integral: [-x e^{-λx} - (1/λ) e^{-λx}]_{x=0}^{x→∞} = (lim_{x→∞} [-x e^{-λx} - (1/λ) e^{-λx}]) - (0 - 1/λ) = 0 + 1/λ = 1/λ
  (By L'Hopital's rule, lim_{x→∞} x e^{-λx} = lim_{x→∞} x / e^{λx} = lim_{x→∞} 1 / (λ e^{λx}) = 0.)
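
The integration-by-parts result can be confirmed symbolically (a sketch using sympy):

```python
import sympy as sp

lam, x = sp.symbols('lambda x', positive=True)

# E[X] = integral of x * lam * e^{-lam x} over [0, infinity)
EX = sp.integrate(x * lam * sp.exp(-lam * x), (x, 0, sp.oo))
print(EX)   # 1/lambda
```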

  31. Variance of an exponential
  If X ~ Exp(λ) then Var(X) = 1/λ². Similar calculus tricks will get you there.

  32. Exponential
  X ~ Exp(λ). The parameter λ > 0 is the average number of events in a unit of time.
  PDF: f_X(x) = λ e^{-λx} if x ≥ 0, and 0 otherwise.
  CDF: F_X(x) = 1 - e^{-λx} if x ≥ 0, and 0 otherwise.
  E[X] = 1/λ
  Var(X) = 1/λ²
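
And the summary values line up with scipy's exponential distribution (a sketch; scipy's expon uses scale = 1/λ):

```python
import math
from scipy.stats import expon

lam = 2.0
X = expon(scale=1 / lam)

print(X.mean(), 1 / lam)                      # both 0.5
print(X.var(), 1 / lam ** 2)                  # both 0.25
print(X.cdf(1.0), 1 - math.exp(-lam * 1.0))   # both ~0.865
```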
