\(\newcommand{\P}{\mathbb{P}}\)
\(\newcommand{\E}{\mathbb{E}}\)
\(\newcommand{\R}{\mathbb{R}}\)
\(\newcommand{\N}{\mathbb{N}}\)
\(\newcommand{\bs}{\boldsymbol}\)
\(\newcommand{\var}{\text{var}}\)
\(\newcommand{\sd}{\text{sd}}\)
\(\newcommand{\cov}{\text{cov}}\)

We start with a renewal process as constructed in the introduction. Thus, \( \bs{X} = (X_1, X_2, \ldots) \) is the sequence of interarrival times. These are independent, identically distributed, nonnegative variables with common distribution function \( F \) (satisfying \( F(0) \lt 1 \)) and common mean \( \mu \). When \(\mu = \infty\), we let \(1 / \mu = 0\). When \(\mu \lt \infty\), we let \(\sigma\) denote the common standard deviation. Recall also that \( F^c = 1 - F \) is the right distribution function (or reliability function). Then, \( \bs{T} = (T_0, T_1, \ldots) \) is the arrival time sequence, where \( T_0 = 0 \) and \[ T_n = \sum_{i=1}^n X_i \] is the time of the \( n \)th arrival for \( n \in \N_+ \). Finally, \( \bs{N} = \{N_t: t \in [0, \infty)\} \) is the counting process, where for \( t \in [0, \infty) \), \[ N_t = \sum_{n=1}^\infty \bs{1}(T_n \le t) \] is the number of arrivals in \( [0, t] \). The renewal function \( M \) is defined by \( M(t) = \E\left(N_t\right) \) for \( t \in [0, \infty) \).
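These definitions translate directly into a short simulation. The following is a minimal Python sketch (the helper names `arrival_times` and `count` are ours, not standard): the interarrival times are drawn independently from a common distribution, the arrival times are their partial sums, and \( N_t \) counts the arrivals in \( [0, t] \).

```python
import random

def arrival_times(interarrivals):
    """Arrival times T_1, T_2, ... as partial sums of the interarrival times."""
    times, total = [], 0.0
    for x in interarrivals:
        total += x
        times.append(total)
    return times

def count(arrivals, t):
    """N_t: the number of arrival times in [0, t]."""
    return sum(1 for s in arrivals if s <= t)

random.seed(1)
# Interarrival times uniform on (0, 2), so mu = 1 and N_t should be near t
X = [random.uniform(0.0, 2.0) for _ in range(1000)]
T = arrival_times(X)
print(count(T, 100.0))
```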

We noted earlier that the arrival time process and the counting process are inverses, in a sense. The arrival time process is the partial sum process for a sequence of independent, identically distributed variables. Thus, it seems reasonable that the fundamental limit theorems for partial sum processes (the law of large numbers and the central limit theorem), should have analogs for the counting process. That is indeed the case, and the purpose of this section is to explore the limiting behavior of renewal processes. The main results that we will study, known appropriately enough as renewal theorems, are important for other stochastic processes, particularly Markov chains.

Our first result is a strong law of large numbers for the renewal counting process, which comes, as you might guess, from the law of large numbers for the sequence of arrival times.

If \(\mu \lt \infty\) then \( N_t / t \to 1 / \mu \) as \( t \to \infty \) with probability 1.

Recall that \( T_{N_t} \le t \lt T_{N_t + 1} \) for \( t \gt 0 \). Hence, if \( N_t \gt 0 \), \[ \frac{T_{N_t}}{N_t} \le \frac{t}{N_t} \lt \frac{T_{N_t + 1}}{N_t}\] Recall that \( N_t \to \infty \) as \( t \to \infty \) with probability 1. Recall also that, by the strong law of large numbers, \( T_n / n \to \mu \) as \( n \to \infty \) with probability 1. It follows that \( T_{N_t} \big/ N_t \to \mu \) as \( t \to \infty \) with probability 1. Also, \( (N_t + 1) \big/ N_t \to 1\) as \( t \to \infty \) with probability 1. Therefore \[ \frac{T_{N_t + 1}}{N_t} = \frac{T_{N_t + 1}}{N_t + 1} \frac{N_t + 1}{N_t} \to \mu \] as \( t \to \infty \) with probability 1. Hence by the squeeze theorem for limits, \( t \big/ N_t \to \mu \) as \( t \to \infty \) with probability 1.

Thus, \( 1 / \mu \) is the limiting average rate of arrivals per unit time.

Open the renewal experiment and set \( t = 50 \). For a variety of interarrival distributions, run the simulation 1000 times and note how the empirical distribution is concentrated near \( t / \mu \).
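The law of large numbers can also be checked with a quick Monte Carlo sketch (the helper `n_t`, which simulates a single counting variable by drawing interarrival times until they exceed \( t \), is ours). With interarrival times uniform on \( (1, 2) \) we have \( \mu = 3/2 \), so \( N_t / t \) should be close to \( 2/3 \) for large \( t \).

```python
import random

def n_t(t, draw):
    """Simulate the counting variable N_t for interarrival sampler `draw`."""
    n, total = 0, 0.0
    while True:
        total += draw()
        if total > t:
            return n
        n += 1

random.seed(2)
t = 5000.0
mu = 1.5  # mean of the uniform(1, 2) interarrival distribution
rate = n_t(t, lambda: random.uniform(1.0, 2.0)) / t
print(rate, 1 / mu)  # the two values should be close for large t
```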

Our next goal is to show that the counting variable \( N_t \) is asymptotically normal.

Suppose that \( \mu \) and \( \sigma \) are finite, and let \[ Z_t = \frac{N_t - t / \mu}{\sigma \sqrt{t / \mu^3}}, \quad t \gt 0 \] The distribution of \( Z_t \) converges to the standard normal distribution as \( t \to \infty \).

For \( n \in \N_+ \), let \[ W_n = \frac{T_n - n \mu}{\sigma \sqrt{n}} \] The distribution of \( W_n \) converges to the standard normal distribution as \( n \to \infty \), by the ordinary central limit theorem. Next, for \( z \in \R \), \( \P(Z_t \le z) = \P\left(T_{n(z,t)} \gt t\right) \) where \( n(z, t) = \left\lfloor t / \mu + z \sigma \sqrt{t / \mu^3} \right\rfloor \). Also, \( \P(Z_t \le z) = \P\left[W_{n(z,t)} \gt w(z, t)\right] \) where \[ w(z, t) = -\frac{z}{\sqrt{1 + z \sigma \big/ \sqrt{t \mu}}} \] But \( n(z, t) \to \infty \) and \( w(z, t) \to -z \) as \( t \to \infty \). Recall that \( 1 - \Phi(-z) = \Phi(z) \), where as usual, \( \Phi \) is the standard normal distribution function. Thus, we conclude that \( \P(Z_t \le z) \to \Phi(z) \) as \( t \to \infty \).

Open the renewal experiment and set \( t = 50 \). For a variety of interarrival distributions, run the simulation 1000 times and note the normal shape of the empirical distribution. Compare the empirical mean and standard deviation to \( t / \mu \) and \( \sigma \sqrt{t / \mu^3} \), respectively.
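Here is a simulation sketch of the central limit theorem, using exponential interarrival times with rate 1 (so \( \mu = \sigma = 1 \)); the helper `n_t` simulates one counting variable. The sample mean and standard deviation of the standardized scores \( Z_t \) should be close to 0 and 1.

```python
import math
import random

def n_t(t, draw):
    """Simulate the counting variable N_t for interarrival sampler `draw`."""
    n, total = 0, 0.0
    while True:
        total += draw()
        if total > t:
            return n
        n += 1

random.seed(3)
t, reps = 200.0, 2000
mu = sigma = 1.0  # exponential interarrivals with rate 1
scale = sigma * math.sqrt(t / mu ** 3)
z = [(n_t(t, lambda: random.expovariate(1.0)) - t / mu) / scale
     for _ in range(reps)]
mean = sum(z) / reps
sd = math.sqrt(sum((v - mean) ** 2 for v in z) / (reps - 1))
print(round(mean, 2), round(sd, 2))  # should be near 0 and 1
```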

The elementary renewal theorem states that the basic limit in the law of large numbers above holds in mean, as well as with probability 1. That is, the limiting mean average rate of arrivals is \( 1 / \mu \). The elementary renewal theorem is of fundamental importance in the study of the limiting behavior of Markov chains, but the proof is not as easy as one might hope. In particular, recall that convergence with probability 1 does not imply convergence in mean, so the elementary renewal theorem does not follow from the law of large numbers.

\( M(t) / t \to 1 / \mu \) as \( t \to \infty \).

We first show that \( \liminf_{t \to \infty} M(t) / t \ge 1 / \mu \). Note first that this result is trivial if \( \mu = \infty \), so assume that \( \mu \lt \infty \). Next, recall that \( N_t + 1 \) is a stopping time for the sequence of interarrival times \( \bs{X} \). Recall also that \( T_{N_t + 1} \gt t \) for \( t \gt 0 \). From Wald's equation it follows that \[ \E\left(T_{N_t + 1}\right) = \E(N_t + 1) \mu = [M(t) + 1] \mu \gt t \] Therefore \( M(t) / t \gt 1 / \mu - 1 / t \) for \( t \gt 0 \). Hence \( \liminf_{t \to \infty} M(t) / t \ge 1 / \mu \).

Next we show that \( \limsup_{t \to \infty} M(t) / t \le 1 / \mu \). For this part of the proof, we need to truncate the arrival times, and use the basic comparison method. For \( a \gt 0 \), let \[ X_{a,i} = \begin{cases} X_i, & X_i \le a \\ a, & X_i \gt a \end{cases} \] and consider the renewal process with the sequence of interarrival times \( \bs{X}_a = \left(X_{a,1}, X_{a,2}, \ldots\right) \). We will use the standard notation developed in the introductory section. First note that \( T_{a, N_{a,t} + 1} \le t + a \) for \( t \gt 0 \) and \( a \gt 0 \). From Wald's equation again, it follows that \( \left[M_a(t) + 1\right] \mu_a \le t + a \). Therefore \[ \frac{M_a(t)}{t} \le \left(\frac{1}{\mu_a} + \frac{a}{t \mu_a}\right) - \frac{1}{t}, \quad a, \; t \gt 0 \] But \( M(t) \le M_a(t) \) for \( t \gt 0 \) and \( a \gt 0 \) and therefore \[ \frac{M(t)}{t} \le \left(\frac{1}{\mu_a} + \frac{a}{t \mu_a} \right) - \frac{1}{t}, \quad a, \; t \gt 0 \] Hence \( \limsup_{t \to \infty} M(t) / t \le 1 / \mu_a \) for \( a \gt 0 \). Finally, \( \mu_a \to \mu \) as \( a \to \infty \) by the monotone convergence theorem, so it follows that \( \limsup_{t \to \infty} M(t) / t \le 1 / \mu \).

Open the renewal experiment and set \( t = 50 \). For a variety of interarrival distributions, run the experiment 1000 times and once again compare the empirical mean and standard deviation to \( t / \mu \) and \( \sigma \sqrt{t / \mu^3} \), respectively.
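The elementary renewal theorem can likewise be illustrated by estimating \( M(t) \) as a sample average of simulated counting variables. This sketch (the helper `n_t` is ours) uses interarrival times that are sums of two uniforms on \( (0, 2) \), so \( \mu = 2 \) and the estimates of \( M(t) / t \) should approach \( 1/2 \).

```python
import random

def n_t(t, draw):
    """Simulate the counting variable N_t for interarrival sampler `draw`."""
    n, total = 0, 0.0
    while True:
        total += draw()
        if total > t:
            return n
        n += 1

random.seed(4)
mu = 2.0  # a sum of two uniform(0, 2) variables has mean 2
draw = lambda: random.uniform(0.0, 2.0) + random.uniform(0.0, 2.0)
estimates = {}
for t in (10.0, 100.0, 1000.0):
    m = sum(n_t(t, draw) for _ in range(400)) / 400  # Monte Carlo M(t)
    estimates[t] = m / t
print(estimates)  # values should approach 1/mu = 0.5 as t grows
```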

The renewal theorem states that the expected number of renewals in an interval is asymptotically proportional to the length of the interval; the proportionality constant is \( 1 / \mu \). The precise statement is different, depending on whether the renewal process is arithmetic or not. Recall that for an arithmetic renewal process, the interarrival times take values in a set of the form \( \{n d: n \in \N\} \) for some \( d \in (0, \infty) \), and the largest such \( d \) is the span of the distribution.

For \( h \gt 0 \), \( M(t, t + h] = M(t + h) - M(t) \to \frac{h}{\mu} \) as \( t \to \infty \) in each of the following cases:

- The renewal process is non-arithmetic
- The renewal process is arithmetic with span \( d \), and \( h \) is a multiple of \( d \)

The renewal theorem is also known as Blackwell's theorem in honor of David Blackwell. The final limit theorem we will study is the most useful, but before we can state the theorem, we need to define and study the class of functions to which it applies.

Recall that in the ordinary theory of Riemann integration, the integral of a function on the interval \( [0, t] \) exists if the upper and lower Riemann sums converge to a common number as the partition is refined. Then, the integral of the function on \( [0, \infty) \) is defined to be the limit of the integral on \( [0, t] \), as \( t \to \infty \). For our new definition, a function is said to be *directly* Riemann integrable if the lower and upper Riemann sums on the entire unbounded interval \( [0, \infty) \) converge to a common number as the partition is refined, a more restrictive definition than the usual one.

Suppose that \( g: [0, \infty) \to [0, \infty) \). For \( h \in (0, \infty) \) and \( k \in \N \), let \( m_k(g, h) = \inf\{g(t): t \in [k h, (k + 1) h)\} \) and \( M_k(g, h) = \sup\{g(t): t \in [k h, (k + 1) h)\} \). The lower and upper Riemann sums of \( g \) on \( [0, \infty) \) corresponding to \( h \) are \[ L_g(h) = h \sum_{k=0}^\infty m_k(g, h), \quad U_g(h) = h \sum_{k=0}^\infty M_k(g, h) \] The sums exist in \( [0, \infty] \) and satisfy the following properties:

- \( L_g(h) \le U_g(h) \) for \( h \gt 0 \)
- \( L_g(h) \) increases as \( h \) decreases
- \( U_g(h) \) decreases as \( h \) decreases

It follows that \( \lim_{h \downarrow 0} L_g(h) \) and \( \lim_{h \downarrow 0} U_g(h) \) exist in \( [0, \infty] \) and \( \lim_{h \downarrow 0} L_g(h) \le \lim_{h \downarrow 0} U_g(h) \). Naturally, the case where the limits are finite and agree is what we're after.

A function \( g: [0, \infty) \to [0, \infty) \) is *directly Riemann integrable* if \( U_g(h) \lt \infty \) for every \( h \gt 0 \) and
\[ \lim_{h \downarrow 0} L_g(h) = \lim_{h \downarrow 0} U_g(h) \]
The common value is \( \int_0^\infty g(t) \, dt \).
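For a decreasing function the lower and upper sums are easy to compute, since the infimum and supremum on each block occur at the endpoints. The following sketch (truncating the infinite sums at a large index, which is harmless when \( g \) decays quickly) illustrates the definition with \( g(t) = e^{-t} \), whose integral over \( [0, \infty) \) is 1.

```python
import math

def riemann_sums(g, h, k_max=100_000):
    """Lower and upper Riemann sums on [0, infinity) for a DECREASING g.

    Since g is decreasing, the infimum on [k h, (k + 1) h) is g((k + 1) h)
    and the supremum is g(k h); the infinite sums are truncated at k_max.
    """
    lower = h * sum(g((k + 1) * h) for k in range(k_max))
    upper = h * sum(g(k * h) for k in range(k_max))
    return lower, upper

g = lambda t: math.exp(-t)
for h in (1.0, 0.1, 0.01):
    lo, up = riemann_sums(g, h)
    print(h, round(lo, 4), round(up, 4))  # both approach 1 as h decreases
```

Note that \( U_g(h) - L_g(h) = h \, g(0) \) here, so the gap closes linearly in \( h \).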

Ordinary Riemann integrability on \( [0, \infty) \) allows functions that are unbounded and oscillate wildly as \( t \to \infty \), and these are the types of functions that we want to exclude for the renewal theorems. The following result connects ordinary Riemann integrability with direct Riemann integrability.

If \( g: [0, \infty) \to [0, \infty) \) is integrable (in the ordinary Riemann sense) on \( [0, t] \) for every \( t \in [0, \infty) \) and if \( U_g(h) \lt \infty \) for some \( h \in (0, \infty) \) then \( g \) is directly Riemann integrable.

Here is a simple and useful class of functions that are directly Riemann integrable.

Suppose that \( g: [0, \infty) \to [0, \infty) \) is decreasing with \( \int_0^\infty g(t) \, dt \lt \infty \). Then \( g \) is directly Riemann integrable.

The key renewal theorem is an integral version of the renewal theorem, and is the most useful of the various limit theorems.

Suppose that the renewal process is non-arithmetic and that \( g: [0, \infty) \to [0, \infty) \) is directly Riemann integrable. Then \[ (g * M)(t) = \int_0^t g(t - s) \, dM(s) \to \frac{1}{\mu} \int_0^\infty g(x) \, dx \text{ as } t \to \infty \]
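As a quick sanity check, consider the Poisson process with rate \( r \) (treated in the examples below), for which \( \mu = 1 / r \) and \( M(t) = r t \), so that \( dM(s) = r \, ds \). For any directly Riemann integrable \( g \), \[ (g * M)(t) = \int_0^t g(t - s) \, r \, ds = r \int_0^t g(x) \, dx \to r \int_0^\infty g(x) \, dx = \frac{1}{\mu} \int_0^\infty g(x) \, dx \text{ as } t \to \infty \] in agreement with the key renewal theorem.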

Our next goal is to see how the various renewal theorems relate.

The renewal theorem implies the elementary renewal theorem:

Let \( a_n = M(n, n + 1] \) for \( n \in \N \). From the renewal theorem, \( a_n \to 1 / \mu \) as \( n \to \infty \). Therefore \( \frac{1}{n} \sum_{k=0}^{n-1} a_k \to \frac{1}{\mu} \) as \( n \to \infty \). It follows that \( M(n) / n \to 1 / \mu \) as \( n \to \infty \). But the renewal function is increasing so for \( t \gt 0 \), \[ \frac{\lfloor t \rfloor}{t} \frac{M(\lfloor t \rfloor)}{\lfloor t \rfloor} \le \frac{M(t)}{t} \le \frac{\lceil t \rceil}{t} \frac{M(\lceil t \rceil)}{\lceil t \rceil} \] From the squeeze theorem for limits it follows that \( M(t) / t \to 1 / \mu \) as \( t \to \infty \).

Conversely, the elementary renewal theorem *almost* implies the renewal theorem.

Assume that \( g(x) = \lim_{t \to \infty} [M(t + x) - M(t)] \) exists for each \( x \gt 0 \). (This assumption is the reason that the proof is incomplete.) Note that \[ M(t + x + y) - M(t) = [M(t + x + y) - M(t + x)] + [M(t + x) - M(t)] \] Let \( t \to \infty \) to conclude that \( g(x + y) = g(x) + g(y) \) for all \( x \ge 0 \) and \( y \ge 0 \). It follows that \( g \) is increasing and \( g(x) = c x \) for \( x \ge 0 \) where \( c \) is a constant. Exactly as in the proof of the previous theorem, it follows that \( M(n) / n \to c \) as \( n \to \infty \). From the elementary renewal theorem, we can conclude that \( c = 1 / \mu \).

The key renewal theorem implies the renewal theorem.

This result follows by applying the key renewal theorem to the function \( g_h(x) = \bs{1}(0 \le x \le h) \) where \( h \gt 0 \): in this case \( (g_h * M)(t) = M(t - h, t] \) for \( t \ge h \) and \( \int_0^\infty g_h(x) \, dx = h \).

Conversely, the renewal theorem implies the key renewal theorem.

The key renewal theorem can be used to find the limiting distributions of the current and remaining age. Recall that for \( t \in [0, \infty) \) the current life at time \( t \) is \( C_t = t - T_{N_t} \) and the remaining life at time \( t \) is \( R_t = T_{N_t + 1} - t \).

If the renewal process is non-arithmetic, then \[ \P(R_t \gt x) \to \frac{1}{\mu} \int_x^\infty F^c(y) \, dy \text{ as } t \to \infty, \quad x \in [0, \infty) \]

Recall that \[ \P(R_t \gt x) = F^c(t + x) + \int_0^t F^c(t + x - s) \, dM(s), \quad x \in [0, \infty) \] But \( F^c(t + x) \to 0 \) as \( t \to \infty \), and by the key renewal theorem, the integral converges to \( \frac{1}{\mu} \int_0^\infty F^c(x + y) \, dy \). Finally, a change of variables in the limiting integral gives the result.

If the renewal process is non-arithmetic, then \[ \P(C_t \gt x) \to \frac{1}{\mu} \int_x^\infty F^c(y) \, dy \text{ as } t \to \infty, \quad x \in [0, \infty) \]

Recall that \[ \P(C_t \gt x) = F^c(t) + \int_0^{t - x} F^c(t - s) \, dM(s), \quad x \in [0, t] \] Again, \( F^c(t) \to 0 \) as \( t \to \infty \). The change of variables \( u = t - x \) turns the integral into \( \int_0^u F^c(u + x - s) \, dM(s) \), which by the key renewal theorem converges to \( \frac{1}{\mu} \int_0^\infty F^c(y + x) \, dy = \frac{1}{\mu} \int_x^\infty F^c(y) \, dy \) as \( t \to \infty \).

The current and remaining life have the same limiting distribution. In particular, \[ \lim_{t \to \infty} \P(C_t \le x) = \lim_{t \to \infty} \P(R_t \le x) = \frac{1}{\mu} \int_0^x F^c(y) \, dy, \quad x \in [0, \infty) \]

By the previous two theorems, the limiting right distribution functions of \( R_t \) and \( C_t \) are the same. The ordinary (left) limiting distribution function is \[ 1 - \frac{1}{\mu} \int_x^\infty F^c(y) \, dy = \frac{1}{\mu} \left(\mu - \int_x^\infty F^c(y) \, dy \right) \] But recall that \( \mu = \int_0^\infty F^c(y) \, dy \), so the result follows since \( \int_0^\infty F^c(y) \, dy - \int_x^\infty F^c(y) \, dy = \int_0^x F^c(y) \, dy \).
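This limiting distribution is easy to check by simulation. With interarrival times uniform on \( (0, 1) \) we have \( \mu = 1/2 \) and \( F^c(y) = 1 - y \) for \( y \in [0, 1] \), so the limiting distribution function of the remaining life is \( 2 x - x^2 \) on \( [0, 1] \). A minimal sketch (the helper `remaining_life` is ours):

```python
import random

def remaining_life(t, draw):
    """Simulate R_t = T_{N_t + 1} - t for interarrival sampler `draw`."""
    total = 0.0
    while total <= t:
        total += draw()
    return total - t

random.seed(6)
t, reps = 100.0, 5000
sample = [remaining_life(t, random.random) for _ in range(reps)]
# Uniform(0, 1) interarrivals: mu = 1/2 and F^c(y) = 1 - y on [0, 1],
# so the limiting distribution function of R_t is 2 x - x^2 on [0, 1]
x = 0.5
empirical = sum(1 for r in sample if r <= x) / reps
print(empirical, 2 * x - x * x)  # both should be near 0.75
```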

The fact that the current and remaining age processes have the same limiting distribution may seem surprising at first, but there is a simple intuitive explanation. After a long period of time, the renewal process looks just about the same backward in time as forward in time. But reversing the direction of time reverses the roles of current and remaining age.

Recall that the Poisson process, the most important of all renewal processes, has interarrival times that are exponentially distributed with rate parameter \( r \gt 0 \). Thus, the interarrival distribution function is \( F(x) = 1 - e^{-r x} \) for \( x \ge 0 \) and the mean interarrival time is \( \mu = 1 / r \).

Verify each of the following directly:

- The law of large numbers for the counting process.
- The central limit theorem for the counting process.
- The elementary renewal theorem.
- The renewal theorem.
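Here is a sketch of these verifications, using the fact that \( N_t \) has the Poisson distribution with parameter \( r t \), and hence mean and variance \( r t \):

- Law of large numbers: \( \E(N_t / t) = r \) and \( \var(N_t / t) = r / t \to 0 \), and in fact \( N_t / t \to r = 1 / \mu \) as \( t \to \infty \) with probability 1.
- Central limit theorem: with \( \mu = \sigma = 1 / r \), the scaling constant is \( \sigma \sqrt{t / \mu^3} = \sqrt{r t} = \sd(N_t) \), so \( Z_t \) is the standard score of \( N_t \), and the Poisson distribution with large parameter is approximately normal.
- Elementary renewal theorem: \( M(t) = \E(N_t) = r t \), so \( M(t) / t = r = 1 / \mu \) for every \( t \gt 0 \), not just in the limit.
- Renewal theorem: \( M(t, t + h] = r(t + h) - r t = r h = h / \mu \) exactly, for every \( t \) and \( h \).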

Suppose that \( \bs{X} = (X_1, X_2, \ldots) \) is a sequence of Bernoulli trials with success parameter \( p \in (0, 1) \). Recall that \( \bs{X} \) is a sequence of independent, identically distributed indicator variables with \( p = \P(X = 1) \). We have studied a number of random processes derived from \( \bs{X} \):

Random processes associated with Bernoulli trials.

- \( \bs{Y} = (Y_0, Y_1, \ldots) \) where \( Y_n \) is the number of successes in the first \( n \) trials. The sequence \( \bs{Y} \) is the partial sum process associated with \( \bs{X} \). The variable \( Y_n \) has the binomial distribution with parameters \( n \) and \( p \).
- \( \bs{U} = (U_1, U_2, \ldots) \) where \( U_n \) is the number of trials needed to go from success number \( n - 1 \) to success number \( n \). These are independent variables, each having the geometric distribution on \( \N_+ \) with parameter \( p \).
- \( \bs{V} = (V_0, V_1, \ldots) \) where \( V_n \) is the trial number of success \( n \). The sequence \( \bs{V} \) is the partial sum process associated with \( \bs{U} \). The variable \( V_n \) has the negative binomial distribution with parameters \( n \) and \( p \).

Consider the renewal process with interarrival sequence \( \bs{U} \). Thus, \( \mu = 1 / p \) is the mean interarrival time, and \( \bs{Y} \) is the counting process. Verify each of the following directly:

- The law of large numbers for the counting process.
- The central limit theorem for the counting process.
- The elementary renewal theorem.

Consider the renewal process with interarrival sequence \( \bs{X} \). Thus, the mean interarrival time is \( \mu = p \) and the number of arrivals in the interval \( [0, n] \) is \( V_{n+1} - 1 \) for \( n \in \N \). Verify each of the following directly:

- The law of large numbers for the counting process.
- The central limit theorem for the counting process.
- The elementary renewal theorem.
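A sketch, using \( \mu = p \) and \( \sigma^2 = p (1 - p) \) for the Bernoulli interarrival distribution, and the fact that \( V_{n+1} \) has the negative binomial distribution with parameters \( n + 1 \) and \( p \), so that \( \E(V_{n+1}) = (n + 1) / p \) and \( \var(V_{n+1}) = (n + 1)(1 - p) / p^2 \):

- Law of large numbers: \( (V_{n+1} - 1) / n \to 1 / p = 1 / \mu \) with probability 1, since \( V_{n+1} / (n + 1) \to 1 / p \) by the strong law of large numbers applied to \( \bs{U} \).
- Central limit theorem: the scaling constant is \( \sigma \sqrt{n / \mu^3} = \sqrt{p (1 - p)} \sqrt{n / p^3} = \sqrt{n (1 - p)} \big/ p \), while \( \sd(V_{n+1}) = \sqrt{(n + 1)(1 - p)} \big/ p \); the ratio tends to 1, so the asymptotic normality of the counting variable follows from that of the negative binomial distribution (a sum of independent geometric variables).
- Elementary renewal theorem: \( M(n) = \E(V_{n+1} - 1) = (n + 1) / p - 1 \), so \( M(n) / n \to 1 / p = 1 / \mu \) as \( n \to \infty \).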