2025-11-19
Scenario: Events are equally likely to happen anywhere or anytime in an interval of values
Shorthand: \(X \sim \text{U}[a,b]\)
\[ f_X(x) = \dfrac{1}{b-a}, \text{ for }a \leq x \leq b \]
\[ F_X(x) = \left\{ \begin{array}{ll} 0 & \quad x<a \quad \\ \dfrac{x-a}{b-a} & \quad a \leq x \leq b\quad \\ 1 & \quad x>b \quad \end{array} \right. \]
\[\text{E}(X) = \dfrac{a+b}{2} \text{, } \text{ Var}(X) = \dfrac{(b-a)^2}{12}\]
Look for some indication that all events are equally likely
Look for an interval
Time example: A customer in your store will approach the cash register in the next 30 minutes. Approaching the register at any time during the 30 minutes is equally likely.
Length example: You have a 12 inch string that you need to cut. You are equally likely to cut anywhere on the string.
Different from the discrete uniform
Discrete uniform: equal probability on a countable (finite) set of outcomes
Continuous uniform: values spread over an uncountable interval
Let’s say we’re looking at equally likely arrival times between 10 am and 11 am.
If we want to know the probability that someone arrives at 10:30am or earlier:
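A minimal sketch (assuming Python with scipy.stats; any software with a uniform CDF works), measuring arrival time in minutes after 10:00 so that \(X \sim \text{U}[0, 60]\):

```python
from scipy import stats

# Arrival time, in minutes after 10:00, is Uniform on [0, 60]
X = stats.uniform(loc=0, scale=60)

# P(X <= 30): arriving at 10:30 or earlier
print(X.cdf(30))  # 0.5
```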
If we want to know the time, say \(t\), where the probability of arriving at \(t\) or earlier is 0.35:
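The quantile function (inverse CDF) answers this; a sketch with the same assumed setup:

```python
from scipy import stats

X = stats.uniform(loc=0, scale=60)  # minutes after 10:00

# Find t with P(X <= t) = 0.35
print(X.ppf(0.35))  # 21.0, i.e. 10:21 am
```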
If we want to know the probability that someone arrives between 10:14 and 10:16 am:
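Interval probabilities come from differencing the CDF; a sketch:

```python
from scipy import stats

X = stats.uniform(loc=0, scale=60)  # minutes after 10:00

# P(14 <= X <= 16), i.e. between 10:14 and 10:16
print(X.cdf(16) - X.cdf(14))  # 2/60 ≈ 0.033
```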
If we want to sample 20 arrival times from the distribution:
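A sketch of simulating draws (the seed is only for reproducibility):

```python
from scipy import stats

X = stats.uniform(loc=0, scale=60)  # minutes after 10:00

# 20 simulated arrival times
print(X.rvs(size=20, random_state=42))
```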
Example 1
A bird lands at a location that is Uniformly distributed along an electrical wire of length 150 feet. The wire is stretched tightly between two poles. What is the probability that the bird is 20 feet or less from one or the other of the poles?
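A numerical check of this example (a sketch; the landing position is \(X \sim \text{U}[0, 150]\), and being within 20 feet of either pole means \(X \le 20\) or \(X \ge 130\)):

```python
from scipy import stats

X = stats.uniform(loc=0, scale=150)  # landing position in feet

# P(X <= 20) + P(X >= 130)
print(X.cdf(20) + (1 - X.cdf(130)))  # 40/150 ≈ 0.267
```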
Scenario: Modeling the time until the next (first) event
Continuous analog to the geometric distribution!
Shorthand: \(X \sim \text{Exp}(\lambda)\)
\[ f_X(x) = \lambda e^{-\lambda x}\text{ for } x>0, \lambda>0 \]
\[ F_X(x) = \left\{ \begin{array}{ll} 0 & \quad x<0 \quad \\ 1 - e^{-\lambda x} & \quad x\geq0 \\ \end{array} \right. \]
\[\text{E}(X) = \dfrac{1}{\lambda}\] \[\text{Var}(X) = \dfrac{1}{\lambda^2}\]
If \(a, b > 0\),
\[P(X > a + b \mid X > a) = P(X > b)\]
This can be interpreted as:
If you have waited \(a\) seconds (or any other measure of time) without a success
Then the probability that you have to wait \(b\) more seconds is the same as the probability of waiting \(b\) seconds initially (this is the memoryless property).
Look for time between events/successes
Look for a rate of the events over time period
How does it differ from the geometric distribution?
Geometric is number of trials until first success
Exponential is time until first success
Relation to the Poisson distribution: if the number of events in an interval of length \(t\) is Poisson(\(\lambda t\)), then the waiting time until the first event is Exp(\(\lambda\))
Let’s say we’re sitting at the bus stop, measuring the time until our bus arrives. We know the bus comes every 10 minutes on average.
If we want to know the probability that the bus arrives in the next 5 minutes:
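A mean wait of 10 minutes corresponds to rate \(\lambda = 1/10\) per minute. A sketch (scipy.stats assumed; note that scipy parameterizes the exponential by the scale \(1/\lambda\)):

```python
from scipy import stats

X = stats.expon(scale=10)  # scale = 1/lambda = mean wait of 10 minutes

# P(X <= 5): bus within the next 5 minutes
print(X.cdf(5))  # 1 - exp(-0.5) ≈ 0.393
```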
If we want to know the time, say \(t\), where the probability of the bus arriving at \(t\) or earlier is 0.35:
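Again via the quantile function; a sketch:

```python
from scipy import stats

X = stats.expon(scale=10)  # mean wait 10 minutes

# Find t with P(X <= t) = 0.35
print(X.ppf(0.35))  # ≈ 4.31 minutes
```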
If we want to know the probability that the bus arrives between 3 and 5 minutes:
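Differencing the CDF; a sketch:

```python
from scipy import stats

X = stats.expon(scale=10)  # mean wait 10 minutes

# P(3 <= X <= 5)
print(X.cdf(5) - X.cdf(3))  # ≈ 0.134
```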
If we want to sample 20 bus arrival times from the distribution:
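Simulating waits; a sketch:

```python
from scipy import stats

X = stats.expon(scale=10)  # mean wait 10 minutes

# 20 simulated waiting times (minutes)
print(X.rvs(size=20, random_state=42))
```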
Revisit after joint notes:
Example 1
Let \(X_i \sim \textrm{Exp}(\lambda_i)\) be independent RVs, for \(i=1 \ldots n\). Find the pdf for the first of the arrival times.
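For when this is revisited, one route to the answer (a sketch using independence and the exponential survival function), writing \(Y = \min(X_1, \ldots, X_n)\):
\[ P(Y > x) = \prod_{i=1}^{n} P(X_i > x) = \prod_{i=1}^{n} e^{-\lambda_i x} = e^{-\left(\sum_{i=1}^{n}\lambda_i\right) x}, \]
so \(f_Y(x) = \left(\sum_{i=1}^{n}\lambda_i\right)e^{-\left(\sum_{i=1}^{n}\lambda_i\right)x}\) for \(x > 0\); that is, \(Y \sim \text{Exp}\left(\sum_{i=1}^{n}\lambda_i\right)\).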
Scenario: Modeling the time until the \(r\)-th event
Shorthand: \(X \sim \text{Gamma}(r, \lambda)\)
\[ f_X(x) = \dfrac{\lambda^r}{\Gamma(r)}x^{r-1} e^{-\lambda x}\text{ for } x>0, \lambda>0, \text{ where } \Gamma(r) = (r-1)! \text{ for positive integer } r \]
\[ F_X(x) = \left\{ \begin{array}{ll} 0 & \quad x<0 \quad \\ 1 - e^{-\lambda x}\displaystyle\sum_{j=0}^{r-1}\dfrac{(\lambda x)^j}{j!} & \quad x\geq0 \\ \end{array} \right. \]
\[\text{E}(X) = \dfrac{r}{\lambda}\text{, }\text{ Var}(X) = \dfrac{r}{\lambda^2}\]
Common to see \(\alpha = r\) and \(\beta = \lambda\)
Gamma distribution with \(r=1\) is the same as the exponential distribution
Similar to the exponential, but we wait for the \(r\)-th event rather than the first
Let’s say we’re sitting at the bus stop, measuring the time until 4 buses arrive. We know the bus comes every 10 minutes on average.
If we want to know the time, say \(t\), where the probability of the 4 buses arriving at \(t\) or earlier is 0.35:
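Waiting for the 4th bus can be modeled as Gamma with \(r = 4\) and \(\lambda = 1/10\) per minute (treating bus arrivals as a Poisson-type process). A sketch (scipy uses a shape parameter and scale \(= 1/\lambda\)):

```python
from scipy import stats

T = stats.gamma(a=4, scale=10)  # shape r = 4, scale = 1/lambda = 10 minutes

# Find t with P(T <= t) = 0.35
print(T.ppf(0.35))  # ≈ 30 minutes
```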
If we want to know the probability that the 4 buses arrive between 30 and 50 minutes:
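Differencing the CDF again; a sketch:

```python
from scipy import stats

T = stats.gamma(a=4, scale=10)  # time until the 4th bus

# P(30 <= T <= 50)
print(T.cdf(50) - T.cdf(30))  # ≈ 0.38
```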
If we want to sample 20 arrival times for the 4 buses:
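Simulating; a sketch:

```python
from scipy import stats

T = stats.gamma(a=4, scale=10)  # time until the 4th bus

# 20 simulated times until the 4th bus (minutes)
print(T.rvs(size=20, random_state=42))
```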
The parameter \(r\) in a Gamma(\(r\),\(\lambda\)) distribution does NOT need to be a positive integer
When \(r\) is a positive integer, the distribution is sometimes called an Erlang(\(r\),\(\lambda\)) distribution
When \(r\) is any positive real number, we have a general gamma distribution that is usually instead parameterized by \(\alpha>0\) and \(\beta>0\), where:
\(\alpha = \text{shape parameter}\) : same as \(r\), the number of events we must wait for
\(\beta = \text{rate parameter}\) : same as \(\lambda\), the rate of events (some texts instead use a scale parameter equal to \(1/\lambda\))
Example 1
On average, someone sends a money order once per 15 minutes. What is the probability someone sends 10 money orders in less than 3 hours?
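One way to set this up (a sketch; interpreting the question as the waiting time until the 10th money order, so \(T \sim \text{Gamma}(10, \lambda = 1/15 \text{ per minute})\), and asking for \(P(T < 180)\)):

```python
from scipy import stats

T = stats.gamma(a=10, scale=15)  # shape r = 10, scale = 1/lambda = 15 minutes

# P(T < 180): 10th money order within 3 hours
print(T.cdf(180))  # ≈ 0.76
```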
No scenario description here because the Normal distribution is so universal
Shorthand: \(X \sim \text{Normal}(\mu, \sigma^2)\)
\[ f_X(x) = \dfrac{1}{\sqrt{2\pi \sigma^2}}e^{-(x-\mu)^2/(2\sigma^2)} \text{, for } -\infty < x < \infty \]
\[\text{E}(X) = \mu \] \[\text{Var}(X) = \sigma^2\]
Let’s say we’re measuring the high temperature today. The average high temperature on this day across many, many years is 50 degrees with a standard deviation of 4 degrees.
If we want to know the probability that the high temperature is below 45 degrees:
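Here \(X \sim \text{Normal}(50, 4^2)\). A sketch (scipy.stats assumed; its scale argument is the standard deviation, not the variance):

```python
from scipy import stats

X = stats.norm(loc=50, scale=4)  # mean 50, standard deviation 4

# P(X < 45)
print(X.cdf(45))  # ≈ 0.106
```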
If we want to know the temperature, say \(t\), where the probability that the temperature is \(t\) or lower is 0.35:
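Quantile function again; a sketch:

```python
from scipy import stats

X = stats.norm(loc=50, scale=4)

# Find t with P(X <= t) = 0.35
print(X.ppf(0.35))  # ≈ 48.5 degrees
```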
If we want to know the probability that the temperature is between 45 and 50 degrees:
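Differencing the CDF; a sketch:

```python
from scipy import stats

X = stats.norm(loc=50, scale=4)

# P(45 <= X <= 50)
print(X.cdf(50) - X.cdf(45))  # ≈ 0.394
```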
If we want to sample 20 days’ temperature (over the years) from the distribution:
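Simulating; a sketch:

```python
from scipy import stats

X = stats.norm(loc=50, scale=4)

# 20 simulated high temperatures (degrees)
print(X.rvs(size=20, random_state=42))
```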
Example 1
Children’s movies run an average of 98 minutes with a standard deviation of 10 minutes. You check out a random movie from the library to entertain your kids so you can study for your test. Assume that your kids will be occupied for the entire length of the movie.
What is the probability that your kids will be occupied for at least the 2 hours you would like to study?
What is the range for the bottom quartile (lowest 25%) of time they will be occupied?
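A sketch of both parts, with movie length \(X \sim \text{Normal}(98, 10^2)\) in minutes:

```python
from scipy import stats

X = stats.norm(loc=98, scale=10)  # movie length in minutes

# P(X >= 120): occupied for at least 2 hours
print(1 - X.cdf(120))  # ≈ 0.014

# Bottom quartile: lengths at or below the 25th percentile
print(X.ppf(0.25))  # ≈ 91.3 minutes
```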
\[ Z \sim \text{Normal}(\mu = 0, \sigma^2 = 1)\]
Used to be more helpful when computing was not as advanced
Use tables of the standard normal
You can convert any normal distribution to a standard normal through transformation
\(Z = \dfrac{X - \mu_X}{\sigma_X}\)
Comes from \(X = \sigma_X Z + \mu_X\)
Since \(\sigma_X\) and \(\mu_X\) are constants, \(E(X) = \sigma_X E(Z) + \mu_X = \mu_X\) and \(SD(X) = \sigma_X SD(Z) = \sigma_X\)
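A small numerical illustration of the equivalence (a sketch, reusing the temperature example with \(\mu_X = 50\), \(\sigma_X = 4\)):

```python
from scipy import stats

mu, sigma = 50, 4
x = 45
z = (x - mu) / sigma  # standardize: z = -1.25

# Same probability from X directly and from the standard normal
print(stats.norm(loc=mu, scale=sigma).cdf(x))  # ≈ 0.106
print(stats.norm(0, 1).cdf(z))                 # ≈ 0.106
```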
Lesson 18 Slides