Mean and Variance of Continuous Random Variables
With discrete random variables, we often calculated the probability that a trial would result in a particular outcome. For example, we might calculate the probability that a roll of three dice would have a sum of 5.
The situation is different for continuous random variables. For example, suppose we measure the length of time cars have to wait at an intersection for the green light. If the traffic light has a cycle lasting 30 seconds, then 8.192161 seconds is a possible outcome. However, it makes little sense to find the probability that a car will wait precisely 8.192161 seconds at the light. Even if we could meaningfully measure the waiting time to the nearest millionth of a second, it is inconceivable that we would ever get exactly 8.192161 seconds.
With a continuous random variable, we care only about the probability that the variable takes on a value in a particular interval. Continuous probability distributions are described by probability density functions, or PDFs. We calculate probabilities based not on sums of discrete values but on integrals of the PDF over a given interval.
In general, the probability that a continuous random variable will be between limits a and b is given by an integral: the area under the PDF between a and b.
\begin{align*}
P(a\lt X \lt b) = \displaystyle \int_a^b f(x)dx
\end{align*}
For a probability density function to be valid, no probabilities may be negative, and the total probability must be one. In other words, a valid PDF must satisfy two criteria:
\begin{align}%\label{}
\nonumber f(x) &\ge 0 \\ \textrm{ } \\
\nonumber \int\limits_{-\infty}^{\infty} f(x)dx &=1
\end{align}
An important conceptual difference between a PMF and a PDF is that the PDF can be, and often is, greater than 1 at some value of x. The integral of the PDF over any interval cannot exceed 1, but the density itself may be larger than 1 over a small region.
In practice, the support of the PDF rarely stretches to infinity. Typically the PDF is zero outside some finite interval, so integration is only needed over a small range of values.
The Cumulative Distribution Function
The Cumulative Distribution Function (CDF) for a continuous probability distribution is given by:
\begin{align}%\label{}
\nonumber F(x)= \textrm{P}(X\le x).
\end{align}
If the CDF is known, the PDF may be found by differentiation. If the PDF is known instead, the CDF may be found by integration.
\begin{align}%\label{}
\nonumber F(x) &= \int\limits_{-\infty}^{x} f(t) dt \\ \textrm{ } \\
f(x) &= \frac{d F(x)}{d x}
\end{align}
A continuous CDF is non-decreasing. In the limit, as \(x\to\infty\) the CDF approaches 1, and as \(x\to -\infty\) the CDF approaches 0.
We can find the probability of a range of values by subtracting CDFs evaluated at the two boundaries. For example, \(P(a\lt X \lt b) = F(b) - F(a)\).
An S-shaped cumulative probability graph is sometimes referred to as the ogive, or the ogee, because of the use of a similar shape in Gothic architecture. The apogee, or highest point of an arch or orbit, is a related word. An example of a typical CDF is shown at the right.
Expectation and Variance
The definitions of the expected value and the variance for a continuous random variable are the same as those in the discrete case, except the summations are replaced by integrals.
\begin{align*}
\mu = E(X) &= \int\limits_{-\infty}^{\infty}x\;f(x)dx \\ \textrm{ } \\
\sigma^2 = E(\;(X-\mu)^2\;) &= \int\limits_{-\infty}^{\infty}(x-\mu)^2\;f(x)dx \\ \textrm{ } \\
&= E(\;X^2\;) - (\;E(X)\;)^2\\ \textrm{ } \\
&= \int\limits_{-\infty}^{\infty}x^2\;f(x)dx - \left(\;\; \int\limits_{-\infty}^{\infty}x\;f(x)dx\right)^2
\end{align*}
As in the discrete case, the standard deviation is the square root of the variance.
\begin{align*}
\sigma = \sqrt{E(\;(X-\mu)^2\;)} &= \sqrt{E(\;X^2\;) - (\;E(X)\;)^2}
\end{align*}
Example
The random variable, X, has a probability density function given by:
\(f(x) =
\begin{cases}
\frac{3}{8}x^2 & \text{for } 0 \leq x \leq 2 \\
0 & \text{otherwise}
\end{cases} \)
a) Find the probability that X is between 1 and 2
b) Find the cumulative probability distribution function
c) Find the expected value of X
d) Find the variance and standard deviation of X
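The four parts above can be checked numerically. The sketch below uses the closed-form antiderivatives of \(f(x) = \frac{3}{8}x^2\) on \([0, 2]\); the function name `cdf` is just illustrative.

```python
# Numerical check of the example: f(x) = (3/8) x^2 on [0, 2], 0 elsewhere.

def cdf(x):
    """CDF F(x) = x^3 / 8 on [0, 2], found by integrating f from 0 to x."""
    if x < 0:
        return 0.0
    if x > 2:
        return 1.0
    return x**3 / 8

# a) P(1 < X < 2) = F(2) - F(1)
p = cdf(2) - cdf(1)          # 7/8 = 0.875

# c) E(X) = (3/8) * x^4/4 evaluated from 0 to 2
mean = (3 / 8) * 2**4 / 4    # 1.5

# d) E(X^2) = (3/8) * x^5/5 from 0 to 2; Var(X) = E(X^2) - mean^2
ex2 = (3 / 8) * 2**5 / 5     # 2.4
var = ex2 - mean**2          # 0.15
sd = var ** 0.5              # ≈ 0.387
```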
Uniform Distribution
A very common continuous probability distribution is the rectangular, or uniform distribution. In this case the probability is the same constant value throughout the range. In the discrete case, flipping a coin or rolling a single die would have a uniform distribution since every outcome is equally likely. In the continuous case, the classic example is the wait time for a person boarding a shuttle bus that comes once every hour. If the person doesn't know when the shuttle last arrived, the wait time follows a uniform distribution.
The PDF for a uniform distribution between the values \(a\) and \(b\) is given by
\(f(x) =
\begin{cases}
\displaystyle \frac{1}{b-a} & \text{for } a \leq x \leq b \\
0 & \text{otherwise}
\end{cases} \)
The CDF of the uniform distribution is given by
\(F(x) =
\begin{cases}
0 & \text{for } x\lt a \\
\displaystyle \frac{x-a}{b-a} & \text{for } a \leq x \leq b \\
1 & \text{for } x\gt b
\end{cases} \)
The expected value, variance, and standard deviation are
\begin{align*}
\mu = E(X) &= \int\limits_a^b \frac{x}{b-a}dx = \frac{1}{2}(a+b) \\ \textrm{ }\\
E(\;X^2\;) &= \int\limits_a^b \frac{x^2}{b-a}dx = \frac{1}{3}(a^2+ab+b^2) \\ \textrm{ }\\
\sigma^2 &= \frac{1}{12}(b-a)^2 \\ \textrm{ }\\
\sigma &= \frac{b-a}{\sqrt{12}}
\end{align*}
Example
An electrical voltage is determined by the probability density function
\(f(x) =
\begin{cases}
\displaystyle \frac{1}{2\pi} & \text{for } 0 \leq x \leq 2\pi \\
0 & \textrm{ otherwise }
\end{cases} \)
a) Find the mean and standard deviation of the probability distribution
b) Find the cumulative distribution
c) What is the probability that the voltage is within two standard deviations of the mean?
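Parts (a) and (c) can be sketched directly from the uniform formulas above, assuming the support \([0, 2\pi]\). Note that for a uniform distribution the two-sigma interval must be clipped to the support before computing the probability.

```python
import math

# Uniform distribution on [0, 2*pi]; closed-form results from the formulas above.
a, b = 0.0, 2 * math.pi
mean = (a + b) / 2               # pi
sd = (b - a) / math.sqrt(12)     # ≈ 1.814

# P(mean - 2*sd < X < mean + 2*sd): clip the interval to the support [a, b],
# then divide its length by (b - a).
lo = max(a, mean - 2 * sd)
hi = min(b, mean + 2 * sd)
p = (hi - lo) / (b - a)          # the 2-sigma interval covers all of [a, b], so p = 1.0
```

Because \(2\sigma \approx 3.63\) is larger than \(\mu = \pi\), the two-sigma interval extends past both ends of the support, so the probability is exactly 1.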
Exponential Distribution
Another fairly common continuous distribution is the exponential distribution:
\(f(x) =
\begin{cases}
\lambda\;e^{-\lambda x} & \text{for }x \ge 0 \\
0 & \text{for } x \lt 0
\end{cases} \)
where \(\lambda\) is a rate constant whose reciprocal, \(1/\lambda\), is both the mean and the standard deviation.
The cumulative distribution function may be found by integration:
\(F(x) = 1 - e^{-\lambda x} \quad \text{for } x \ge 0\)
The exponential distribution is similar to the Poisson distribution, which gives probabilities of discrete numbers of events occurring in a given interval of time. The exponential distribution gives the probabilities of a (continuous) amount of time between successive random events.
Note that it is often helpful to use the following expression when working with the exponential distribution:
\(\int\limits_{0}^{\infty} x^n e^{-ax} dx = n! a^{-(n+1)}\)
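A quick numerical spot check of this identity, using the illustrative values \(n = 2\) and \(a = 3\) and a simple trapezoidal approximation:

```python
import math

# Check: integral from 0 to infinity of x^n * exp(-a*x) dx = n! / a^(n+1).
n, a = 2, 3.0

# Trapezoidal rule; the integrand is negligible beyond x = 20/a.
steps, upper = 200_000, 20.0 / a
h = upper / steps
total = 0.0
for i in range(steps + 1):
    x = i * h
    weight = 0.5 if i in (0, steps) else 1.0
    total += weight * x**n * math.exp(-a * x)
integral = total * h

exact = math.factorial(n) / a ** (n + 1)   # 2/27 ≈ 0.0741
```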
As mentioned above, the mean of the exponential distribution is given by
\(\mu = E(X) = \displaystyle \frac{1}{\lambda} = \beta\)
(sometimes, the reciprocal of \(\lambda\) is denoted by the parameter \(\beta\))
The variance is given by
\(\sigma^2 = \displaystyle \frac{1}{\lambda^2} = \beta^2\)
and the standard deviation is
\(\sigma = \frac{1}{\lambda} = \beta.\)
Example
The time between arrivals of trucks at a warehouse is a continuous random variable, \(T\). The probability of the time between arrivals is given by the probability density function below
\(f(t) =
\begin{cases}
4\;e^{-4 t} & \text{for }t \ge 0 \\
0 & \text{for } t \lt 0
\end{cases} \)
where \(t\) is the time in hours.
a) What is the probability that the time between arrivals of the first and second trucks is less than 5 minutes?
b) Find the mean time between arrivals and the standard deviation, both in hours.
c) What is the probability that the waiting time will be within one standard deviation of the mean waiting time?
d) What is the probability that the waiting time will be within two standard deviations of the mean waiting time?
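All four parts follow from the exponential CDF \(F(t) = 1 - e^{-4t}\). A minimal sketch (the function name is illustrative; intervals extending below \(t = 0\) are clipped by the CDF itself):

```python
import math

# Exponential waiting times with rate lam = 4 arrivals per hour.
lam = 4.0

def cdf(t):
    """Exponential CDF: F(t) = 1 - exp(-lam*t) for t >= 0, else 0."""
    return 1 - math.exp(-lam * t) if t >= 0 else 0.0

# a) P(T < 5 minutes) = F(5/60 hours)
p_a = cdf(5 / 60)                              # ≈ 0.283

# b) mean and standard deviation are both 1/lam
mean = sd = 1 / lam                            # 0.25 hours

# c) within one sigma of the mean: [mean - sd, mean + sd] = [0, 0.5]
p_c = cdf(mean + sd) - cdf(mean - sd)          # ≈ 0.865

# d) within two sigmas: [mean - 2*sd, mean + 2*sd], clipped at 0 to [0, 0.75]
p_d = cdf(mean + 2 * sd) - cdf(mean - 2 * sd)  # ≈ 0.950
```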
Other Continuous Probability Distributions
The Weibull & Gamma Distributions
The exponential distribution is actually a special case of both the Weibull distribution, which has the following probability density function:
\(f(x) = \displaystyle k\;\lambda^k \; x^{k-1} \; e^{-(\lambda x)^k}\)
and the gamma distribution, which has the following probability density function:
\(f(x) = \displaystyle \frac{\lambda^k \; x^{k-1} \; e^{-\lambda x}}{(k-1)!}\)
where in both cases, \(\lambda\) is the rate of events (the reciprocal of the mean time between events), and \(k\) is a shape parameter; for the gamma distribution, \(k\) is the number of event occurrences. The Weibull distribution takes its name from Waloddi Weibull, who described it in 1951 (though it was identified and applied by others as early as 1927), and the gamma distribution takes its name from its denominator, \((k-1)! = \Gamma(k)\), the gamma function. The Weibull distribution models situations where the average rate changes over time, and the gamma distribution models situations where the average rate is constant.
The gamma distribution may be thought of as a sum of exponential waiting times. For example, it computes the probability that you have to wait less than 4 hours before catching 5 fish, when you expect to catch one fish every half hour on average. In that case, \(k = 5\) and \(\lambda = 2\) fish per hour.
The exponential distribution is a special case of both the gamma and Weibull distributions when \(k= 1\).
To find the cumulative gamma distribution, we can repeatedly integrate by parts, reducing the exponent by one each time until we're done. For example,
\(\int\limits_a^b t^5 e^t dt = (t^5-5t^4+20t^3-60t^2+120t-120)e^t \Big]_{a}^{b}\)
Though possible to integrate by hand, it is much more convenient to use a spreadsheet function and simply specify whether we want the PDF or the CDF. Note that Excel's GAMMA.DIST takes the scale parameter \(\beta = 1/\lambda\) as its third argument. To find the cumulative probability of waiting less than 4 hours before catching 5 fish, when you expect to get one fish every half hour on average, you would enter:
GAMMA.DIST(4, 5, 1/2, TRUE)
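For integer \(k\) (the Erlang special case), the gamma CDF also has a closed form that follows from the repeated integration by parts described above, and it computes the same value as the spreadsheet call. A minimal Python sketch:

```python
import math

# Erlang (integer-k gamma) CDF:
#   F(x) = 1 - exp(-lam*x) * sum_{n=0}^{k-1} (lam*x)^n / n!
def erlang_cdf(x, k, lam):
    if x <= 0:
        return 0.0
    partial = sum((lam * x) ** n / math.factorial(n) for n in range(k))
    return 1 - math.exp(-lam * x) * partial

# Fishing example: k = 5 fish, rate lam = 2 fish/hour, wait under x = 4 hours.
p = erlang_cdf(4, 5, 2.0)   # ≈ 0.90, matching GAMMA.DIST(4, 5, 1/2, TRUE)
```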
The Chi-squared Distribution
Later, we will use the chi-squared distribution, which is another special case of the gamma distribution, with \(k = \nu/2\) and \(\lambda = 1/2\), where \(\nu\) is the number of degrees of freedom.
The Normal Distribution
By far the most important continuous probability distribution is the Normal Distribution, which is covered in the next chapter. As \(k\to\infty\), the gamma distribution approaches the normal distribution.
Source: http://matcmath.org/textbooks/engineeringstats/pdf-mean-variance/