We now know that the sequence of inter-arrival times \(\bs{X} = (X_1, X_2, \ldots)\) in the Poisson process is a sequence of independent random variables, each having the exponential distribution with rate parameter \(r\), for some \(r \in (0, \infty)\). No other distribution gives the strong renewal assumption that we want: the property that the process probabilistically restarts, independently of the past, at each arrival time and at each fixed time. In this section we study the sequence of arrival times.
Let \(T_0 = 0\) and for \(n \in \N_+\) let \(T_n\) denote the \(n\)th arrival time. So \(\bs{T} = (T_0, T_1, \ldots)\) is the partial sum process associated with the sequence of inter-arrival times \(\bs{X} = (X_1, X_2, \ldots)\): \[ T_n = \sum_{i=1}^n X_i, \quad n \in \N \]
Recall that the common probability density function of the inter-arrival times is \[ f(t) = r e^{-r t}, \quad 0 \le t \lt \infty \] Our first goal is to describe the distribution of the \( n \)th arrival \( T_n \).
For \(n \in \N_+\), \(T_n\) has a continuous distribution with probability density function \( f_n \) given by \[ f_n(t) = r^n \frac{t^{n-1}}{(n - 1)!} e^{-r t}, \quad 0 \le t \lt \infty\]
(a) \( f_n \) increases and then decreases, with mode at \( (n - 1) / r \) (for \( n = 1 \), \( f_1 \) is decreasing, with mode at 0).
(b) \( f_1 \) is concave upward. \( f_2 \) is concave downward and then upward, with inflection point at \( 2 / r \). For \( n \ge 3 \), \( f_n \) is concave upward, then downward, then upward again, with inflection points at \( \left[(n - 1) \pm \sqrt{n - 1}\right] \big/ r \).
Since \(T_n\) is the sum of \(n\) independent variables, each with PDF \(f\), the PDF of \(T_n\) is the convolution power of \(f\) of order \(n\). That is, \(f_n = f^{*n}\). A simple induction argument shows that \(f_n\) has the form given above. For example, \[ f_2(t) = \int_0^t f(s) f(t - s) \, ds = \int_0^t r e^{-r s} r e^{-r(t-s)} \, ds = \int_0^t r^2 e^{-r t} \, ds = r^2 t \, e^{-r t}, \quad 0 \le t \lt \infty \] Parts (a) and (b) follow from standard calculus.
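The density formula is also easy to check by simulation. Here is a minimal sketch in Python; the particular values of \( r \), \( n \), and \( t \) are illustrative assumptions, not part of the text:

```python
import math
import numpy as np

# Simulate T_n as a sum of n independent exponential inter-arrival times
# and compare an empirical density estimate near t with the formula f_n(t).
rng = np.random.default_rng(0)
r, n, t = 2.0, 5, 3.0                                # assumed illustrative values
T = rng.exponential(scale=1/r, size=(200_000, n)).sum(axis=1)

f_formula = r**n * t**(n - 1) / math.factorial(n - 1) * math.exp(-r * t)
f_empirical = np.mean(np.abs(T - t) < 0.05) / 0.10   # fraction in a window of width 0.1
print(f_formula, f_empirical)                        # both approximately 0.268
```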
The distribution with this probability density function is known as the gamma distribution with shape parameter \(n\) and rate parameter \(r\). It is also known as the Erlang distribution, named for the Danish mathematician Agner Erlang. Again, \(1 / r\) is the scale parameter, and that term will be justified below. The term shape parameter for \( n \) clearly makes sense in light of parts (a) and (b) above. The term rate parameter for \( r \) is inherited from the inter-arrival times, and more generally from the underlying Poisson process itself: the random points are arriving at an average rate of \( r \) per unit time. A more general version of the gamma distribution, allowing non-integer shape parameters, is studied in the chapter on special distributions. Note that since the arrival times are continuous, the probability of an arrival at any given instant of time is 0.
In the gamma experiment, vary \(r\) and \(n\) with the scrollbars and watch how the shape of the probability density function changes. For various values of the parameters, run the experiment 1000 times and compare the empirical density function to the true probability density function.
The distribution function and the quantile function of the gamma distribution do not have simple, closed-form expressions. However, it's easy to write the distribution function as a sum.
For \( n \in \N_+ \), \( T_n \) has distribution function \( F_n \) given by \[ F_n(t) = 1 - \sum_{k=0}^{n-1} e^{-r t} \frac{(r t)^k}{k!}, \quad t \in [0, \infty) \]
Note that \[ F_n(t) = \int_0^t f_n(s) \, ds = \int_0^t r^n \frac{s^{n-1}}{(n - 1)!} e^{-r s} \, ds \] The result follows by repeated integration by parts.
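Since the distribution function involves only a finite sum, it is simple to evaluate directly. A quick numerical check in Python, comparing the sum with the gamma distribution function that scipy computes; the parameter values are again illustrative assumptions:

```python
import math
from scipy import stats

# Evaluate F_n(t) from the finite sum above and compare with scipy's gamma CDF.
r, n, t = 2.0, 5, 3.0                 # assumed illustrative values
F_sum = 1 - sum(math.exp(-r * t) * (r * t)**k / math.factorial(k) for k in range(n))
F_scipy = stats.gamma(a=n, scale=1/r).cdf(t)
print(F_sum, F_scipy)                 # both approximately 0.7149
```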
Open the quantile app, select the gamma distribution, and select CDF view. Vary the parameters and note the shape of the distribution and quantile functions. For selected values of the parameters, compute the quartiles.
The mean, variance, and moment generating function of \(T_n\) can be found easily from the representation of \(T_n\) as a sum of independent exponential variables.
The mean and variance of \(T_n\) are
(a) \( \E(T_n) = n / r \)
(b) \( \var(T_n) = n / r^2 \)
Recall that the exponential distribution with rate parameter \(r\) has mean \(1 / r\) and variance \(1 / r^2\).
For \(k \in \N\), the moment of order \(k\) of \(T_n\) is \[ \E\left(T_n^k\right) = \frac{(k + n - 1)!}{(n - 1)!} \frac{1}{r^k} \]
Using the standard change of variables theorem, \[ \E\left(T_n^k\right) = \int_0^\infty t^k f_n(t) \, dt = \frac{r^{n-1}}{(n - 1)!} \int_0^\infty t^{k + n - 1} r e^{-r t} \, dt \] But the integral on the right is the moment of order \(k + n - 1\) for the exponential distribution, which is \((k + n - 1)! \big/ r^{k + n - 1}\). Simplifying gives the result.
More generally, the moment of order \(k \gt 0\) (not necessarily an integer) is \[ \E\left(T_n^k\right) = \frac{\Gamma(k + n)}{\Gamma(n)} \frac{1}{r^k} \] where \(\Gamma\) is the gamma function.
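The general moment formula can be spot-checked by simulation. A short sketch in Python; the values of \( r \), \( n \), and the non-integer order \( k \) are illustrative assumptions:

```python
import numpy as np
from scipy.special import gamma as Gamma

# Monte Carlo check of E(T_n^k) = Gamma(k + n) / (Gamma(n) r^k) for non-integer k.
rng = np.random.default_rng(1)
r, n, k = 2.0, 5, 2.5                     # assumed illustrative values
T = rng.gamma(shape=n, scale=1/r, size=500_000)
print(Gamma(k + n) / (Gamma(n) * r**k))   # the formula, about 13.78
print(np.mean(T**k))                      # simulation estimate, close to the formula
```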
In the gamma experiment, vary \(r\) and \(n\) with the scrollbars and watch how the size and location of the mean\( \pm \)standard deviation bar changes. For various values of \(r\) and \(n\), run the experiment 1000 times and compare the empirical moments to the true moments.
Our next result gives the skewness and kurtosis of the gamma distribution.
The skewness and kurtosis of \( T_n \) are
(a) \( \skw(T_n) = 2 / \sqrt{n} \)
(b) \( \kur(T_n) = 3 + 6 / n \)
In particular, note that the gamma distribution is positively skewed but \( \skw(T_n) \to 0 \) as \( n \to \infty \). Recall also that the excess kurtosis is \( \kur(T_n) - 3 = \frac{6}{n} \to 0 \) as \( n \to \infty \). This result is related to the convergence of the gamma distribution to the normal, discussed below. Finally, note that the skewness and kurtosis do not depend on the rate parameter \( r \). This is because, as we show below, \( 1 / r \) is a scale parameter.
The moment generating function of \(T_n\) is \[ M_n(s) = \E\left(e^{s T_n}\right) = \left(\frac{r}{r - s}\right)^n, \quad -\infty \lt s \lt r \]
Recall that the MGF of a sum of independent variables is the product of the corresponding MGFs. The exponential distribution with parameter \(r\) has MGF \(s \mapsto r / (r - s)\) for \(-\infty \lt s \lt r\).
The moment generating function can also be used to derive the moments of the gamma distribution given above; recall that \(M_n^{(k)}(0) = \E\left(T_n^k\right)\).
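For example, differentiating once and evaluating at \( s = 0 \) recovers the mean: \[ M_n^\prime(s) = \frac{n \, r^n}{(r - s)^{n+1}}, \quad M_n^\prime(0) = \frac{n}{r} = \E(T_n) \]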
In many practical situations, the rate \(r\) of the process is unknown and must be estimated based on data from the process. We start with a natural estimate of the scale parameter \(1 / r\). Note that \[ M_n = \frac{T_n}{n} = \frac{1}{n} \sum_{i=1}^n X_i \] is the sample mean of the first \(n\) inter-arrival times \((X_1, X_2, \ldots, X_n)\). In statistical terms, this sequence is a random sample of size \(n\) from the exponential distribution with rate \(r\).
\(M_n\) satisfies the following properties:
(a) \( \E(M_n) = 1 / r \)
(b) \( \var(M_n) = 1 / (n r^2) \)
(c) \( M_n \to 1 / r \) as \( n \to \infty \) with probability 1
Parts (a) and (b) follow from the expected value and variance of \(T_n\) given above, and standard properties of expected value and variance. Part (c) is the strong law of large numbers.
These results are standard properties of the sample mean as an estimator of the distribution mean. In statistical terms, part (a) means that \(M_n\) is an unbiased estimator of \(1 / r\), and hence the variance in part (b) is the mean square error. Part (b) means that \(M_n\) is a consistent estimator of \(1 / r\), since \(\var(M_n) \downarrow 0\) as \(n \to \infty\). Part (c) is a stronger form of consistency. In general, the sample mean of a random sample from a distribution is an unbiased and consistent estimator of the distribution mean. On the other hand, a natural estimator of \(r\) itself is \(1 / M_n = n / T_n\). However, this estimator is positively biased.
\(\E(n / T_n) \ge r\).
This follows immediately from Jensen's inequality since \(x \mapsto 1 / x\) is concave upward on \((0, \infty)\).
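In fact, for \( n \ge 2 \) the bias can be computed exactly (a short calculation not carried out in the text): \[ \E\left(\frac{n}{T_n}\right) = n \int_0^\infty \frac{1}{t} f_n(t) \, dt = n \int_0^\infty r^n \frac{t^{n-2}}{(n - 1)!} e^{-r t} \, dt = \frac{n}{n - 1} r \] so the bias is \( r / (n - 1) \), which disappears as \( n \to \infty \).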
As noted above, the gamma distribution is a scale family.
Suppose that \( T \) has the gamma distribution with rate parameter \( r \in (0, \infty) \) and shape parameter \( n \in \N_+ \). If \( c \in (0, \infty) \) then \( c T \) has the gamma distribution with rate parameter \( r / c \) and shape parameter \( n \).
The moment generating function of \( c T \) is \[ \E[e^{s (c T)}] = \E[e^{(c s) T}] = \left(\frac{r}{r - cs}\right)^n = \left(\frac{r / c}{r / c - s}\right)^n, \quad s \lt \frac{r}{c} \]
The scaling property also follows from the fact that the gamma distribution governs the arrival times in the Poisson process. A time change in a Poisson process clearly does not change the strong renewal property, and hence results in a new Poisson process.
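The scaling property is easy to confirm by simulation; here is a brief sketch in Python, with parameter values that are illustrative assumptions:

```python
import numpy as np
from scipy import stats

# If T is gamma with shape n and rate r, then c T should be gamma
# with shape n and rate r / c; compare via a Kolmogorov-Smirnov test.
rng = np.random.default_rng(2)
n, r, c = 4, 3.0, 2.5                 # assumed illustrative values
cT = c * rng.gamma(shape=n, scale=1/r, size=100_000)
ks = stats.kstest(cT, stats.gamma(a=n, scale=c/r).cdf)
print(ks.pvalue)                      # typically not small, consistent with the claim
```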
The gamma distribution is also a member of the general exponential family of distributions.
Suppose that \( T \) has the gamma distribution with shape parameter \( n \in \N_+ \) and rate parameter \( r \in (0, \infty) \). Then \( T \) has a two parameter general exponential distribution with natural parameters \( n - 1 \) and \( -r \), and natural statistics \( \ln(T) \) and \( T \).
A number of important properties flow from the fact that the sequence of arrival times \(\bs{T} = (T_0, T_1, \ldots)\) is the partial sum process associated with the sequence of independent, identically distributed inter-arrival times \(\bs{X} = (X_1, X_2, \ldots)\).
The arrival time sequence \(\bs{T}\) has stationary, independent increments:
(a) If \( m \lt n \) then \( T_n - T_m \) has the same distribution as \( T_{n - m} \), namely the gamma distribution with shape parameter \( n - m \) and rate parameter \( r \).
(b) If \( n_1 \lt n_2 \lt n_3 \lt \cdots \) then \( \left(T_{n_1}, T_{n_2} - T_{n_1}, T_{n_3} - T_{n_2}, \ldots\right) \) is a sequence of independent random variables.
The stationary and independent increments properties hold for any partial sum process associated with an independent, identically distributed sequence.
Of course, the stationary and independent increments properties are related to the fundamental renewal
assumption that we started with. If we fix \(n \in \N_+\), then \((T_n - T_n, T_{n+1} - T_n, T_{n+2} - T_n, \ldots)\) is independent of \((T_1, T_2, \ldots, T_n)\) and has the same distribution as \((T_0, T_1, T_2, \ldots)\). That is, if we restart the clock
at time \(T_n\), then the process in the future looks just like the original process (in a probabilistic sense) and is independent of the past. Thus, we have our second characterization of the Poisson process.
A process of random points in time is a Poisson process with rate \( r \in (0, \infty) \) if and only if the arrival time sequence \( \bs{T} \) has stationary, independent increments, and for \( n \in \N_+ \), \( T_n \) has the gamma distribution with shape parameter \( n \) and rate parameter \( r \).
The gamma distribution is closed with respect to sums of independent variables, as long as the rate parameter is fixed.
Suppose that \(V\) has the gamma distribution with shape parameter \(m \in \N_+\) and rate parameter \(r \gt 0\), \(W\) has the gamma distribution with shape parameter \(n \in \N_+\) and rate parameter \(r\), and that \(V\) and \(W\) are independent. Then \(V + W\) has the gamma distribution with shape parameter \(m + n\) and rate parameter \(r\).
There are at least three different proofs of this fundamental result. Perhaps the best is a probabilistic proof based on the Poisson process. We start with a sequence \(\bs{X}\) of independent exponentially distributed variables, each with rate parameter \(r\). Then we can associate \(V\) with \(T_m\) and \(W\) with \(T_{m + n} - T_m\), so that \(V + W\) becomes \(T_{m + n}\). The result now follows from the stationary, independent increments property above.
Another simple proof uses moment generating functions. Recall again that the MGF of \(V + W\) is the product of the MGFs of \(V\) and of \(W\). A third, analytic proof uses convolution. Recall again that the PDF of \(V + W\) is the convolution of the PDFs of \(V\) and of \(W\).
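The closure property can likewise be checked by simulation. A sketch in Python, with shape and rate values that are illustrative assumptions:

```python
import numpy as np
from scipy import stats

# V ~ gamma(m, rate r) plus an independent W ~ gamma(n, rate r)
# should have the gamma distribution with shape m + n and rate r.
rng = np.random.default_rng(3)
m, n, r = 3, 4, 2.0                   # assumed illustrative values
V = rng.gamma(shape=m, scale=1/r, size=100_000)
W = rng.gamma(shape=n, scale=1/r, size=100_000)
ks = stats.kstest(V + W, stats.gamma(a=m + n, scale=1/r).cdf)
print(ks.pvalue)                      # typically not small, consistent with the claim
```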
In the gamma experiment, vary \(r\) and \(n\) with the scrollbars and watch how the shape of the probability density function changes. Now set \(n = 10\) and for various values of \(r\) run the experiment 1000 times and compare the empirical density function to the true probability density function.
Even though you are restricted to relatively small values of \(n\) in the app, note that the probability density function of the \(n\)th arrival time becomes more bell-shaped as \(n\) increases (for \(r\) fixed). This is yet another application of the central limit theorem, since \(T_n\) is the sum of \(n\) independent, identically distributed random variables (the inter-arrival times).
The distribution of the random variable \(Z_n\) below converges to the standard normal distribution as \(n \to \infty\): \[ Z_n = \frac{r\,T_n - n}{\sqrt{n}} \]
\(Z_n\) is the standard score associated with \(T_n\), so the result follows from the central limit theorem.
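The normal limit can be illustrated numerically as well. A brief Python sketch; the values of \( n \) and \( r \) are illustrative assumptions:

```python
import numpy as np
from scipy import stats

# For large n, the standard score Z_n = (r T_n - n) / sqrt(n)
# should be approximately standard normal.
rng = np.random.default_rng(4)
n, r = 100, 2.0                       # assumed illustrative values
T = rng.gamma(shape=n, scale=1/r, size=200_000)
Z = (r * T - n) / np.sqrt(n)
print(np.mean(Z <= 1.0))              # simulation, about 0.84
print(stats.norm.cdf(1.0))            # standard normal value, about 0.8413
```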
We return to the analogy between the Bernoulli trials process and the Poisson process that started in the introduction and continued in the last section on the exponential distribution. If we think of the successes in a sequence of Bernoulli trials as random points in discrete time, then the process has the same strong renewal property as the Poisson process, but restricted to discrete time: at each fixed time and at each arrival time, the process starts over, independently of the past. In Bernoulli trials, the time of the \( n \)th arrival has the negative binomial distribution with parameters \( n \) and \( p \) (the success probability), while in the Poisson process, as we now know, the time of the \( n \)th arrival has the gamma distribution with parameters \( n \) and \( r \) (the rate). Because of this strong analogy, we expect a relationship between these two processes. In fact, we have the same type of limit as with the geometric and exponential distributions.
Fix \( n \in \N_+ \) and suppose that for each \( m \in \N_+ \), \( T_{m,n} \) has the negative binomial distribution with parameters \( n \) and \( p_m \in (0, 1) \), where \( m p_m \to r \in (0, \infty) \) as \( m \to \infty \). Then the distribution of \( T_{m,n} \big/ m \) converges to the gamma distribution with parameters \( n \) and \( r \) as \( m \to \infty \).
Suppose that \( X_m \) has the geometric distribution on \( \N_+ \) with success parameter \( p_m \). We know from the convergence result in the last section that the distribution of \( X_m / m \) converges to the exponential distribution with rate parameter \( r \) as \( m \to \infty \). It follows that if \( M_m \) denotes the moment generating function of \( X_m / m \), then \( M_m(s) \to r / (r - s) \) as \( m \to \infty \) for \( s \lt r \). But then \( M_m^n \) is the MGF of \( T_{m,n} \big/ m \) and clearly \[ M_m^n(s) \to \left(\frac{r}{r - s}\right)^n \] as \( m \to \infty \) for \( s \lt r \). The expression on the right is the MGF of the gamma distribution with shape parameter \( n \) and rate parameter \( r \).
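The convergence can also be seen numerically. A sketch in Python; the values of \( n \), \( r \), \( m \), and \( t \) are illustrative assumptions, and note that scipy's nbinom counts the failures before the \( n \)th success rather than the trial number of that success:

```python
from scipy import stats

# Compare P(T_{m,n} / m <= t) under the negative binomial model, with p_m = r / m,
# to the gamma limit. T_{m,n} <= m t means at most m t - n failures occur.
n, r, m, t = 5, 2.0, 1000, 3.0        # assumed illustrative values
p = r / m
k = int(m * t) - n                    # allowed failures before the n-th success
print(stats.nbinom(n, p).cdf(k))           # nearly equal to the gamma value below
print(stats.gamma(a=n, scale=1/r).cdf(t))  # gamma limit, approximately 0.7149
```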
Suppose that customers arrive at a service station according to the Poisson model, at a rate of \(r = 3\) per hour. Relative to a given starting time, find the probability that the second customer arrives sometime after 1 hour.
0.1991
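This follows from the distribution function above with \( n = 2 \) and \( r = 3 \): \[ \P(T_2 \gt 1) = e^{-3} \sum_{k=0}^{1} \frac{3^k}{k!} = 4 e^{-3} \approx 0.1991 \]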
Defects in a type of wire follow the Poisson model, with rate 1 per 100 meters. Find the probability that the 5th defect is located between 450 and 550 meters.
0.1746
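Measuring distance in units of 100 meters, the position \( T_5 \) of the 5th defect has the gamma distribution with \( n = 5 \) and \( r = 1 \), so \[ \P(4.5 \lt T_5 \lt 5.5) = \sum_{k=0}^{4} \left( e^{-4.5} \frac{4.5^k}{k!} - e^{-5.5} \frac{5.5^k}{k!} \right) \approx 0.1746 \]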
Suppose that requests to a web server follow the Poisson model with rate \(r = 5\). Relative to a given starting time, compute the mean and standard deviation of the time of the 10th request.
2, 0.6325
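These are the mean and standard deviation of \( T_{10} \): \[ \E(T_{10}) = \frac{10}{5} = 2, \quad \sqrt{\var(T_{10})} = \frac{\sqrt{10}}{5} \approx 0.6325 \]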
Suppose that \(Y\) has a gamma distribution with mean 40 and standard deviation 20. Find the shape parameter \(n\) and the rate parameter \(r\).
\(r = 1 / 10\), \(n = 4\)
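These follow from the moment results above: \( \E(Y) = n / r = 40 \) and \( \var(Y) = n / r^2 = 400 \), so \( r = 40 / 400 = 1 / 10 \) and \( n = 40 r = 4 \).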
Suppose that accidents at an intersection occur according to the Poisson model, at a rate of 8 per year. Compute the normal approximation to the probability that the 10th accident (relative to a given starting time) occurs within 2 years.
0.9711
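With \( n = 10 \) and \( r = 8 \), the standard score of \( t = 2 \) is \( (8 \cdot 2 - 10) \big/ \sqrt{10} \approx 1.897 \), so \[ \P(T_{10} \le 2) \approx \Phi(1.897) \approx 0.9711 \] where \( \Phi \) is the standard normal distribution function.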
In the gamma experiment, set \(n = 5\) and \(r = 2\). Run the experiment 1000 times and compute the following:
Suppose that requests to a web server follow the Poisson model. Starting at 12:00 noon on a certain day, the requests are logged. The 100th request comes at 12:15. Estimate the rate of the process.
\(r \approx 6.67\) requests per minute
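The estimate is the reciprocal of the sample mean of the first 100 inter-arrival times: \( r \approx 100 / 15 \approx 6.67 \) requests per minute.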