\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\var}{\text{var}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\ms}{\mathscr}\) \(\newcommand{\lfrta}{\leftrightarrow}\) \(\newcommand{\bs}{\boldsymbol}\)

3. Finite Paths

Preliminaries

For \(m \in \N_+\), let \(\N_m = \{0, 1, \ldots, m\}\), and define the relation \(\lfrta\) on \(\N_m\) by \(x \lfrta x + 1\) for \(x \in \{0, 1, \ldots m - 1\}\) and \(x \lfrta x - 1\) for \(x \in \{1, 2, \ldots m\}\).

So the graph \((\N_m, \lfrta)\) is the symmetric (undirected) simple path \((0, 1, \ldots, m)\) (with no loops). Since the graph is discrete, we have our reference measure space \((\N_m, \ms P(\N_m), \#)\) in the background. All of our objects of study depend on \(m\), but we will suppress this from the notation unless necessary. Given the symmetry of the graph, it's not surprising that various functions on the domain will be symmetric about the midpoint.

A function \(\varphi\) defined on \(\N_m\) is symmetric if \(\varphi(x) = \varphi(m - x)\) for \(x \in \N_m\).

The walk function \(w_n\) of order \(n \in \N\) for \((\N_m, \lfrta)\) is symmetric and is given as follows: \(w_0(x) = 1\) for \(x \in \N_m\) and then recursively, \begin{align*} w_{n + 1}(0) &= w_n(1)\\ w_{n + 1}(x) &= w_n(x - 1) + w_n(x + 1), \quad x \in \{1, 2, \ldots, m - 1\} \\ w_{n + 1}(m) &= w_n(m - 1) \end{align*}

Details:

By definition, \(w_0(x) = 1\) for \(x \in \N_m\) and \(w_{n + 1} = w_n L\) where as usual, \(L\) is the adjacency matrix of \((\N_m, \lfrta)\). In this case, for \(u: \N_m \to \R\), \begin{align*} u L(0) &= u(1)\\ u L(x) &= u(x - 1) + u(x + 1), \quad x \in \{1, 2, \ldots, m - 1\}\\ u L(m) &= u(m - 1) \end{align*} So the result follows.
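The recursion is easy to implement directly. Here is a minimal Python sketch (the helper name `walk_functions` is ours, purely for illustration):

```python
# Minimal sketch: compute the walk functions w_0, w_1, ..., w_n for the path
# (N_m, <->) directly from the recursion above.

def walk_functions(m, n_max):
    """Return [w_0, w_1, ..., w_{n_max}], each w_n a list indexed by x = 0, ..., m."""
    w = [1] * (m + 1)                # w_0(x) = 1 for all x
    history = [w]
    for _ in range(n_max):
        nxt = [0] * (m + 1)
        nxt[0] = w[1]                # w_{n+1}(0) = w_n(1)
        for x in range(1, m):
            nxt[x] = w[x - 1] + w[x + 1]
        nxt[m] = w[m - 1]            # w_{n+1}(m) = w_n(m - 1)
        history.append(nxt)
        w = nxt
    return history

for n, w in enumerate(walk_functions(4, 5)):
    print(n, w)                      # w_n(x) for the path (0, 1, 2, 3, 4)
```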

The coefficients \(w_n(x)\) for \(m \in \N_+\), \(n \in \N\), and \(x \in \N_m\) are interesting, and are objects of study in graph theory. By definition, \(w_n(x)\) is the number of walks of length \(n\) terminating in \(x \in \N_m\) in the graph \((\N_m, \lfrta)\). By symmetry, \(w_n(x)\) is also the number of walks of length \(n\) starting at \(x\). For most values of \(m\) and \(x\), it seems difficult to give a closed-form expression for \(w_n(x)\) for \(n \in \N\) and \(x \in \N_m\). The walk generating function \(W\) for \((\N_m, \lfrta)\), which encodes the same information, is easier to study.

For \(x \in \N_m\), the generating function \(W(x, t)\) is defined for \(|t| \lt \frac{1}{2}\). The function \(x \mapsto W(x, t)\) is symmetric and is given by \begin{align*} W(0, t) &= 1 + t W(1, t) \\ W(x, t) &= 1 + t W(x - 1, t) + t W(x + 1, t), \quad x \in \{1, 2, \ldots, m - 1\} \end{align*}

Details:

Recall that \(w_n(x)\) can be interpreted as the number of walks of length \(n \in \N\) starting from \(x \in \N_m\). Since there are at most 2 neighbors of each state, \(w_n(x) \le 2^n\). So the results follow from the definition of the generating function \[W(x, t) = \sum_{n = 0}^\infty w_n(x) t^n, \quad x \in \N_m\] and from the recursive relations for \(w_n\) above.

So \(W\) satisfies the relation given in Section 2 for the infinite path, but only for \(x \in \{0, 1, \ldots, m - 1\}\).
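Equivalently, the relations above say that the vector \((W(x, t))_{x \in \N_m}\) solves the linear system \((I - t L) W(\cdot, t) = \bs 1\). A quick numerical sketch, assuming numpy is available (the helper names are ours):

```python
# Sketch: W(., t) solves (I - t L) W = 1, where L is the adjacency matrix of the path.
import numpy as np

def adjacency(m):
    L = np.zeros((m + 1, m + 1))
    for x in range(m):
        L[x, x + 1] = L[x + 1, x] = 1.0
    return L

def walk_gf(m, t):
    """The vector (W(0, t), ..., W(m, t))."""
    return np.linalg.solve(np.eye(m + 1) - t * adjacency(m), np.ones(m + 1))

t = 0.3
print(walk_gf(2, t))                                             # numerical values
print((1 + t) / (1 - 2 * t**2), (1 + 2 * t) / (1 - 2 * t**2))    # closed form for m = 2 below
```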

Consider the case \(m = 2\) so that \((\N_2, \lfrta)\) is the undirected simple path \((0, 1, 2)\). Find each of the following explicitly:

  1. The walk functions.
  2. The generating function.
Details:
  1. Let \(k \in \N_+\). Then \begin{align*} w_{2 k}(x) &= 2^k, \quad x \in \N_2 \\ w_{2 k + 1}(0) &= w_{2 k + 1}(2) = 2^k\\ w_{2 k + 1}(1) &= 2^{k + 1} \end{align*}
  2. For \(|t| \lt 1 / \sqrt{2}\), \begin{align*} W(0, t) &= W(2, t) = \frac{1 + t}{1 - 2 t^2} \\ W(1, t) &= \frac{1 + 2 t}{1 - 2 t^2} \end{align*}

Consider the case \(m = 3\) so that \((\N_3, \lfrta)\) is the undirected simple path \((0, 1, 2, 3)\). Find each of the following explicitly:

  1. The walk functions.
  2. The generating function.
Details:
  1. The walk functions have an interesting representation in terms of the Fibonacci numbers. For \(n \in \N\), let \(c_n\) denote the \(n\)th Fibonacci number, so that \[\bs{c} = (0, 1, 1, 2, 3, 5, 8, 13, 21, \ldots)\] For \(n \in \N\), \begin{align*} w_n(0) &= w_n(3) = c_{n + 1}\\ w_n(1) &= w_n(2) = c_{n + 2} \end{align*}
  2. For \(|t| \lt (\sqrt{5} - 1) / 2\), \begin{align*} W(0, t) &= W(3, t) = \frac{1}{1 - t - t^2} \\ W(1, t) &= W(2, t) = \frac{1 + t}{1 - t - t^2} \end{align*}
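A quick numerical check of the Fibonacci representation in part (a), iterating the walk-function recursion directly (a sketch in plain Python):

```python
# Sketch: check that for m = 3, w_n(0) = c_{n+1} and w_n(1) = c_{n+2}, where c_n
# is the nth Fibonacci number (c_0 = 0, c_1 = 1).

def fib(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

w = [1, 1, 1, 1]                    # w_0 on {0, 1, 2, 3}
for n in range(10):
    assert w[0] == fib(n + 1) and w[1] == fib(n + 2)
    w = [w[1], w[0] + w[2], w[1] + w[3], w[2]]
print("Fibonacci representation verified for n = 0, ..., 9")
```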

Consider the case \(m = 4\) so that \((\N_4, \lfrta)\) is the undirected simple path \((0, 1, 2, 3, 4)\). Find each of the following explicitly:

  1. The walk functions.
  2. The generating function.
Details:
  1. For \(k \in \N_+\), \begin{align*} w_{2 k}(0) &= w_{2 k}(4) = 2 \cdot 3^{k - 1}, \quad w_{2 k + 1}(0) = w_{2 k + 1}(4) = 3^k\\ w_{2 k}(1) &= w_{2 k}(3) = 3^k, \quad w_{2 k + 1}(1) = w_{2 k + 1}(3) = 2 \cdot 3^k\\ w_{2 k}(2) &= 4 \cdot 3^{k - 1}, \quad w_{2 k + 1}(2) = 2 \cdot 3^k \end{align*}
  2. For \(|t| \lt 1 / \sqrt{3}\), \begin{align*} W(0, t) &= W(4, t) = \frac{1 + t - t^2}{1 - 3 t^2} \\ W(1, t) &= W(3, t) = \frac{1 + 2 t}{1 - 3 t^2} \\ W(2, t) &= \frac{(1 + t)^2}{1 - 3 t^2} \end{align*}

Recall that the \(\sigma\)-algebra associated with a graph is the \(\sigma\)-algebra generated by the neighbor sets.

For \(m \in \N_+\) and \(m \ne 2\), the \(\sigma\)-algebra associated with \((\N_m, \lfrta)\) is \(\ms P(\N_m)\). The case \(m = 2\) is exceptional and in this case, the associated \(\sigma\)-algebra is \[\sigma(\{1\}) = \{\emptyset, \{1\}, \{0, 2\}, \{0, 1, 2\}\}\]

Details:

Let \(A_x = \{y \in \N_m: x \lfrta y\}\) denote the neighbor set of \(x \in \N_m\) so that \(\ms A = \sigma(\{A_x: x \in \N_m\})\) is the associated \(\sigma\)-algebra. The case \(m = 1\) is trivial since the neighbor sets are \(A_0 = \{1\}\) and \(A_1 = \{0\}\). When \(m = 2\) the neighbor sets are \(A_0 = A_2 = \{1\}\) and \(A_1 = \{0, 2\}\). Suppose that \(m \ge 3\). Then \(A_x \cap A_{x + 2} = \{x + 1\}\) for \(x \in \{0, 1, \ldots, m - 2\}\). Hence \(\{x\} \in \ms A\) for \(x \in \{1, 2, \ldots, m - 1\}\). But also \(\{0\} = A_1 \setminus A_3\) and \(\{m\} = A_{m - 1} \setminus A_{m - 3}\). Hence \(\ms A = \ms P(\N_m)\).

Our next goal is to identify the (right) eigenvalues and eigenfunctions of \((\N_m, \lfrta)\), and once again, the polynomials in Section 1 will play a critical role.

The characteristic polynomial of \((\N_m, \lfrta)\) is \(P_{m+1}\). If \(\beta \in \R\) is an eigenvalue, then an eigenfunction \(g\) is given by \(g(x) = P_x(\beta)\) for \(x \in \N_m\).

Details:

Let \(L\) denote the adjacency matrix of \((\N_m, \lfrta)\) as usual. Then \(\beta \in \R\) is an eigenvalue and \(g: \N_m \to \R\) a corresponding eigenfunction if \(g\) is nonzero and \(L g = \beta g\), or equivalently,

  1. \( g(1) = \beta g(0) \)
  2. \( g(x + 1) = \beta g(x) - g(x - 1), \quad x \in \{1, 2, \ldots, m - 1\} \)
  3. \( g(m - 1) = \beta g(m) \)

Equivalently, we can consider functions \(g: \N \to \R\) satisfying (a), and (b) for all \(x \in \N_+\), with the boundary condition \(g(m + 1) = 0\), which then gives (c). By the results in Section 2, the solution of (a) and (b) is given by \(g(x) = g(0) P_x(\beta)\) for \(x \in \N\), and the requirement that \(g(m + 1) = 0\) means that \(\beta\) is a root of \(P_{m + 1}\).

The eigenvalues of \((\N_m, \lfrta)\) are \[\beta_k = 2 \cos\left(\frac{k}{m + 2} \pi\right), \quad k \in \{1, 2, \ldots, m + 1\}\] The eigenvalues are simple and for \(k \in \{1, 2, \ldots, m + 1\}\), an eigenfunction \(g_k\) corresponding to \(\beta_k\) is given by \[g_k(x) = \frac{1}{\sin\left(\frac{k}{m + 2} \pi\right)} \sin\left[\frac{(x + 1) k}{m + 2} \pi\right], \quad x \in \N_m\]

Details:

These results follow from the previous result and from the results in Section 1.
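These eigenvalues and eigenfunctions are easy to confirm numerically. A minimal sketch, assuming numpy (the variable names are ours):

```python
# Sketch: compare numpy's eigenvalues of the adjacency matrix of (N_m, <->) with
# the closed form 2 cos(k pi / (m + 2)), k = 1, ..., m + 1.
import numpy as np

m = 5
L = np.zeros((m + 1, m + 1))
for x in range(m):
    L[x, x + 1] = L[x + 1, x] = 1.0

numeric = np.sort(np.linalg.eigvalsh(L))
closed = np.sort([2 * np.cos(k * np.pi / (m + 2)) for k in range(1, m + 2)])
print(np.allclose(numeric, closed))     # True

# Check that g_k is an eigenfunction for k = 1 (the largest eigenvalue).
k = 1
g = np.sin((np.arange(m + 1) + 1) * k * np.pi / (m + 2)) / np.sin(k * np.pi / (m + 2))
print(np.allclose(L @ g, 2 * np.cos(k * np.pi / (m + 2)) * g))   # True
```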

Probability

Suppose now that \(X\) is a random variable in \(\N_m\) with density function \(f\) so that \(f(x) = \P(X = x)\) for \(x \in \N_m\).

The reliability function \(F\) of \(X\) for \((\N_m, \lfrta)\) is given by \begin{align*} F(0) &= f(1)\\ F(x) &= f(x - 1) + f(x + 1), \quad x \in \{1, 2, \ldots, m - 1\}\\ F(m) &= f(m - 1) \end{align*}

As usual, our support assumption that \(F(x) \gt 0\) for \(x \in \N_m\) is in effect.

The rate function \(r\) of \(X\) for \((\N_m, \lfrta)\) is given by \begin{align*} r(0) &= \frac{f(0)}{f(1)}\\ r(x) &= \frac{f(x)}{f(x - 1) + f(x + 1)}, \quad x \in \{1, 2, \ldots, m - 1\}\\ r(m) &= \frac{f(m)}{f(m - 1)} \end{align*}
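Both functions are straightforward to compute from a given density. A minimal Python sketch (the helper names `reliability` and `rate` are ours):

```python
# Sketch: reliability function F and rate function r of a density f on the path (N_m, <->).

def reliability(f):
    m = len(f) - 1
    return [f[1]] + [f[x - 1] + f[x + 1] for x in range(1, m)] + [f[m - 1]]

def rate(f):
    return [fx / Fx for fx, Fx in zip(f, reliability(f))]

f = [0.1, 0.2, 0.4, 0.2, 0.1]        # an arbitrary density on {0, ..., 4}
print(reliability(f))                 # [0.2, 0.5, 0.4, 0.5, 0.2]
print(rate(f))                        # [0.5, 0.4, 1.0, 0.4, 0.5]
```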

The reliability function \(F\) uniquely determines the density function \(f\) for some values of \(m \in \N_+\), but not others.

Suppose that \(F\) is the reliability function for \((\N_m, \lfrta)\) corresponding to a density function \(f\). Then \(F\) uniquely determines \(f\) except when \(m = 2 \bmod 4\).

Details:

From the eigenvalue result above, note that 0 is an eigenvalue if and only if \(m\) is even. In this case, an eigenfunction \(g\) is given by \begin{align*} g(2 x) &= (-1)^x, \quad x \in \{0, 1, \ldots, m / 2\} \\ g(2 x + 1) &= 0, \quad x \in \{0, 1, \ldots, m / 2 - 1\} \end{align*} If \(m\) is a multiple of \(4\) then \(\sum_{x = 0}^m g(x) = 1\). If \(m = 2 \bmod 4\) then \(\sum_{x = 0}^m g(x) = 0\). So the result follows from Section 1.3.

From the previous results it follows that the graph \((\N_m, \lfrta)\) is stochastic except when \(m = 2 \bmod 4\). The graph is stochastic in the exceptional case \(m = 2\) as well. If \(P\) is a probability measure on the associated \(\sigma\)-algebra \(\ms A\) above, with reliability function \(F\), then \(P(\{1\}) = F(0) = F(2)\) and \(P(\{0, 2\}) = F(1)\). But of course a reliability function \(F\) of a probability measure on the reference \(\sigma\)-algebra \(\ms P(\N_2)\) does not uniquely determine that measure in general. When \(m = 2 \bmod 4\) and \(m \gt 2\), the graph \((\N_m, \lfrta)\) is non-stochastic: the associated \(\sigma\)-algebra is \(\ms P(\N_m)\), but the reliability function of a probability measure on \(\ms P(\N_m)\) does not uniquely determine that measure in general. The following result gives more detail on how to recover the density function \(f\) from the reliability function \(F\).

Suppose that \(m \in \N_+\) and that \(f\) is a probability density function on \(\N_m\) with reliability function \(F\) for the graph \((\N_m, \lfrta)\).

  1. If \(m\) is odd, \begin{align*} f(2 x + 1) &= \sum_{k = 0}^x (-1)^{x - k} F(2 k), \quad x \in \{0, 1, \ldots, (m - 1) / 2\} \\ f[m - (2 x + 1)] &= \sum_{k = 0}^x (-1)^{x - k} F(m - 2 k), \quad x \in \{0, 1, \ldots, (m - 1) / 2\} \end{align*}
  2. If \(m\) is even, \begin{align*} f(2 x + 1) &= \sum_{k = 0}^x (-1)^{x - k} F(2 k), \quad x \in \{0, 1, \ldots m / 2 - 1\} \\ f(2 x) &= (-1)^x f(0) + \sum_{k = 0}^{x - 1} (-1)^{x - 1 - k} F(2 k + 1), \quad x \in \{0, 1, \ldots, m / 2\} \end{align*}
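As a sanity check of part (a), the following sketch recovers an arbitrary density on \(\N_5\) from its reliability function (numpy assumed; the variable names are ours):

```python
# Sketch: numerical check of part (a) for an odd m. Pick an arbitrary density on
# {0, ..., 5}, compute its reliability function, and recover the density.
import numpy as np

m = 5
rng = np.random.default_rng(0)
f = rng.random(m + 1)
f /= f.sum()

F = np.empty(m + 1)
F[0], F[m] = f[1], f[m - 1]
F[1:m] = f[0:m - 1] + f[2:m + 1]

f_rec = np.empty(m + 1)
for x in range((m + 1) // 2):
    f_rec[2 * x + 1] = sum((-1) ** (x - k) * F[2 * k] for k in range(x + 1))
    f_rec[m - (2 * x + 1)] = sum((-1) ** (x - k) * F[m - 2 * k] for k in range(x + 1))
print(np.allclose(f, f_rec))        # True
```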

In part (a), the first equation gives \(f\) at an odd vertex in terms of \(F\) at even vertices, while the second equation gives \(f\) at an even vertex in terms of \(F\) at odd vertices. In particular, when \(m\) is odd, \(F\) uniquely determines \(f\). In part (b), the first equation once again gives \(f\) at an odd vertex in terms of \(F\) at even vertices. The second equation gives \(f\) at an even vertex in terms of \(f(0)\) and \(F\) at odd vertices. When \(m\) is not of the form \(2 \bmod 4\), these equations together with the normalizing equation \(\sum_{x = 0}^m f(x) = 1\) uniquely determine \(f\). But when \(m = 4 k + 2\) for some \(k \in \N\), the normalizing equation gives no additional information since it's equivalent to \[\sum_{j = 0}^k [F(4 j) + F(4 j + 1)] = 1\] The following simple example gives two distributions with the same reliability function in the case \(m = 6\).

Define the probability density functions \(f\) and \(g\) on \(\N_6\) by \[f(1) = f(3) = f(5) = 1 / 6, \, f(0) = f(2) = f(4) = f(6) = 1 / 8 \] \[g(1) = g(3) = g(5) = 1 / 6, \, g(0) = g(4) = 1 / 12, \, g(2) = g(6) = 1 / 6 \] Then \(f\) and \(g\) have the same reliability function \(F\) for \((\N_6, \lfrta)\), given by \[F(0) = F(6) = 1 / 6, \, F(2) = F(4) = 1 / 3, \, F(1) = F(3) = F(5) = 1 / 4 \]
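The example is easy to verify exactly, for instance with Python's `fractions` module:

```python
# Sketch: verify that f and g above have the same reliability function on N_6.
from fractions import Fraction as Fr

f = [Fr(1, 8), Fr(1, 6), Fr(1, 8), Fr(1, 6), Fr(1, 8), Fr(1, 6), Fr(1, 8)]
g = [Fr(1, 12), Fr(1, 6), Fr(1, 6), Fr(1, 6), Fr(1, 12), Fr(1, 6), Fr(1, 6)]

def reliability(d):
    m = len(d) - 1
    return [d[1]] + [d[x - 1] + d[x + 1] for x in range(1, m)] + [d[m - 1]]

print(reliability(f) == reliability(g))      # True
print(reliability(f))                        # Fraction values 1/6, 1/4, 1/3, 1/4, 1/3, 1/4, 1/6
```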

Constant Rate Distributions

As always, we are particularly interested in constant rate distributions. The equations for the density \(f\) to have constant rate \(\alpha \in (0, \infty)\) for \((\N_m, \lfrta)\) are \begin{align} f(0) &= \alpha f(1) \\ f(x) &= \alpha[f(x - 1) + f(x + 1)], \quad x \in \{1, 2, \ldots, m - 1\} \\ f(m) &= \alpha f(m - 1) \end{align} and of course we also need \(f(x) \ge 0\) for \(x \in \N_m\) and \(\sum_{x=0}^m f(x) = 1\). Here is our main result:

For each \(m \in \N_+\), there exists a unique constant rate distribution for \((\N_m, \lfrta)\). The rate is \[\alpha = \frac{1}{2 \cos\left(\frac{\pi}{m+2}\right)}\] and the density function \(f\) is given by \[f(x) = \frac{\sin\left(\frac{\pi}{m + 2}\right)}{1 + \cos\left(\frac{\pi}{m + 2}\right)} \sin\left(\frac{x + 1}{m + 2} \pi\right), \quad x \in \N_m\]

Details:

The eigenvalues of the adjacency matrix \(L\) are given above. The largest eigenvalue is \[2 \cos\left(\frac{\pi}{m + 2}\right)\] and a corresponding eigenfunction \(g\) is given by \[g(x) = \sin\left(\frac{x + 1}{m + 2} \pi\right), \quad x \in \N_m \] So the result follows from the Perron-Frobenius theorem as discussed in Section 1.5. The probability density function \(f\) is the normalized eigenfunction.

Note that the density function \(f\) of the constant rate distribution is symmetric. Clearly also \(\alpha \in \left(\frac 1 2, 1\right]\) for \(m \in \N_+\), with \(\alpha = 1\) only in the case \(m = 1\), and \(\alpha \to \frac 1 2\) as \(m \to \infty\). By symmetry, the mean of the constant rate distribution is \(m / 2\), but the variance and other higher-order moments do not have simple expressions.
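A minimal numerical sketch that builds the constant rate density and checks the constant rate property \(f = \alpha F\) (numpy assumed; helper names are ours):

```python
# Sketch: the constant rate density on (N_m, <->) and a check that f = alpha * F,
# where F is the reliability function of f.
import numpy as np

def constant_rate_density(m):
    theta = np.pi / (m + 2)
    x = np.arange(m + 1)
    return np.sin(theta) / (1 + np.cos(theta)) * np.sin((x + 1) * theta)

m = 10
f = constant_rate_density(m)
alpha = 1 / (2 * np.cos(np.pi / (m + 2)))

F = np.empty(m + 1)
F[0], F[m] = f[1], f[m - 1]
F[1:m] = f[0:m - 1] + f[2:m + 1]

print(np.isclose(f.sum(), 1.0))          # True: f is a density
print(np.allclose(f, alpha * F))         # True: constant rate alpha
```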

The app below is a simulation of the constant rate distribution on the path graph \((\N_m, \lfrta)\). The parameter \(m\) can be varied with the scroll bar.

Consider the case \(m = 2\) so that \((\N_2, \lfrta)\) is the undirected simple path \((0, 1, 2)\). Find each of the following explicitly:

  1. The rate constant \(\alpha\) and the probability density function \(f\) of the constant rate distribution.
  2. The mean and variance of the constant rate distribution.
Details:
  1. \(\alpha = \frac{1}{\sqrt{2}} \approx 0.707\). \begin{align*} f(0) &= f(2) = \frac{1}{2 + \sqrt{2}} \approx 0.293\\ f(1) &= \frac{\sqrt{2}}{2 + \sqrt{2}} \approx 0.414 \end{align*}
  2. The mean is \(\mu = 1\) and the variance is \(\sigma^2 = 2 - \sqrt{2} \approx 0.586\).

Consider the case \(m = 3\) so that \((\N_3, \lfrta)\) is the undirected simple path \((0, 1, 2, 3)\). Find each of the following explicitly:

  1. The rate constant \(\alpha\) and the probability density function \(f\) of the constant rate distribution.
  2. The mean and variance of the constant rate distribution.
Details:
  1. \(\alpha = (\sqrt{5} - 1) / 2 \approx 0.618\). \begin{align*} f(0) &= f(3) = \frac{3 - \sqrt{5}}{4} \approx 0.191\\ f(1) &= f(2) = \frac{\sqrt{5} - 1}{4} \approx 0.309 \end{align*}
  2. The mean is \(\mu = \frac{3}{2}\) and the variance is \(\sigma^2 = \frac{13}{4} - \sqrt{5} \approx 1.014\). Note that the Fibonacci numbers referenced in the example above can be written in terms of the constant rate \(\alpha\). For \(n \in \N\), \[c_n = \frac{\alpha^{-n}+(-1)^{n+1} \alpha^n}{2 \alpha + 1}\] This is essentially Binet's formula.

Consider the case \(m = 4\) so that \((\N_4, \lfrta)\) is the undirected simple path \((0, 1, 2, 3, 4)\). Find each of the following explicitly:

  1. The rate constant \(\alpha\) and the probability density function \(f\) of the constant rate distribution.
  2. The mean and variance of the constant rate distribution.
Details:
  1. \(\alpha = 1 / \sqrt{3} \approx 0.577\). \begin{align*} f(0) &= f(4) = \frac{1}{4 + 2 \sqrt{3}} \approx 0.134\\ f(1) &= f(3) = \frac{3}{6 + 4 \sqrt{3}} \approx 0.232\\ f(2) &= \frac{1}{2 + \sqrt{3}} \approx 0.268 \end{align*}
  2. The mean is \(\mu = 2\) and the variance is \(\sigma^2 = 5 - 2 \sqrt{3} \approx 1.536\).

Our next result gives the asymptotic behavior of the constant rate distributions.

Suppose that \(X_m\) has the constant rate distribution on \((\N_m, \lfrta)\) for each \(m \in \N_+\). Then as \(m \to \infty\), the distribution of \(X_m / m\) converges (in the usual sense) to Gilbert's sine distribution on \([0, 1]\), with density function \[x \mapsto \frac{\pi}{2} \sin(\pi x), \quad x \in [0, 1]\]

Details:

Let \(x \in [0, 1]\). Then \begin{align*} \P(X_m / m \le x) &= \sum_{k = 0}^m \bs{1}\left(\frac{k}{m} \le x\right) f(k) \\ &= \frac{\sin\left(\frac{\pi}{m + 2}\right)}{1 + \cos\left(\frac{\pi}{m + 2}\right)} \sum_{k = 0}^m \bs{1}\left(\frac{k}{m} \le x\right) \sin\left(\frac{k + 1}{m + 2} \pi\right) \\ &= \left[m \frac{\sin\left(\frac{\pi}{m + 2}\right)}{1 + \cos\left(\frac{\pi}{m + 2}\right)}\right] \left[\frac{1}{m} \sum_{k = 0}^m \bs{1}\left(\frac{k}{m} \le x\right) \sin\left(\frac{k + 1}{m + 2} \pi\right) \right] \end{align*} The factor in the first set of square brackets converges to \(\pi / 2\) as \(m \to \infty\) by simple calculus, and the factor in the second set of square brackets is a Riemann sum for \(\int_0^x \sin(\pi u) \, du\). Hence \[\P(X_m / m \le x) \to \int_0^x \frac{\pi}{2} \sin(\pi u) \, du \text{ as } m \to \infty\] As a function of \(x \in [0, 1]\), the right side is the distribution function of Gilbert's sine distribution.
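A quick numerical illustration of the convergence, comparing \(\P(X_m / m \le 1/2)\) with the limiting value \(\frac{1}{2}[1 - \cos(\pi / 2)] = 1/2\) (a sketch, assuming numpy):

```python
# Sketch: compare P(X_m / m <= 1/2) with the sine distribution CDF at 1/2 as m grows.
import numpy as np

def constant_rate_density(m):
    theta = np.pi / (m + 2)
    x = np.arange(m + 1)
    return np.sin(theta) / (1 + np.cos(theta)) * np.sin((x + 1) * theta)

x = 0.5
for m in [10, 100, 1000]:
    f = constant_rate_density(m)
    k = np.arange(m + 1)
    print(m, f[k <= m * x].sum())        # tends to (1 - cos(pi * x)) / 2 = 0.5
```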

The app below is a simulation of the sine distribution.

The sine distribution is unimodal and is symmetric about \(\frac 1 2\). The density function is concave downward. Here are some other simple properties:

Suppose that \(X\) has the sine distribution. Then

  1. \(X\) has distribution function \(x \mapsto \frac{1}{2} [1 - \cos(\pi x)]\) on \([0, 1]\).
  2. \(X\) has quantile function \(p \mapsto \frac{1}{\pi} \arccos(1 - 2 p)\) on \([0, 1]\).
  3. \(\E(X) = \frac 1 2\)
  4. \(\var(X) = \frac 1 4 - \frac{2}{\pi^2}\)
  5. \(X\) has moment generating function given by \[\E\left(e^{t X}\right) = \frac{\pi^2 (1 + e^t)}{2(t^2 + \pi^2)}, \quad t \in \R\]
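Part (b) gives a direct way to simulate the sine distribution by applying the quantile function to uniform variables. A minimal sketch (numpy assumed), which also checks parts (c) and (d) by simulation:

```python
# Sketch: simulate the sine distribution via the quantile function in part (b),
# then compare the sample mean and variance with parts (c) and (d).
import numpy as np

rng = np.random.default_rng(0)
u = rng.random(1_000_000)
x = np.arccos(1 - 2 * u) / np.pi         # quantile function applied to uniforms

print(x.mean(), 0.5)                     # approximately 1/2
print(x.var(), 0.25 - 2 / np.pi**2)      # approximately 1/4 - 2/pi^2 ~ 0.0474
```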

Random Walk

Suppose now that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the path \((\N_m, \lfrta)\) associated with the constant rate distribution. So \(Y_1\) has the constant rate distribution, and the transition probability matrix \(P\) is given by \(P(0, 1) = 1\), \(P(m, m - 1) = 1\), and \[P(x, x - 1) = \alpha \frac{f(x - 1)}{f(x)}, \; P(x, x + 1) = \alpha \frac{f(x + 1)}{f(x)}, \quad x \in \{1, 2, \ldots, m - 1\}\] where \(\alpha\) and \(f\) are given above.
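A minimal sketch that builds the transition matrix \(P\) from the constant rate density (numpy assumed; helper names are ours):

```python
# Sketch: the transition matrix P of the random walk associated with the
# constant rate distribution on (N_m, <->).
import numpy as np

def constant_rate_density(m):
    theta = np.pi / (m + 2)
    x = np.arange(m + 1)
    return np.sin(theta) / (1 + np.cos(theta)) * np.sin((x + 1) * theta)

def transition_matrix(m):
    f = constant_rate_density(m)
    alpha = 1 / (2 * np.cos(np.pi / (m + 2)))
    P = np.zeros((m + 1, m + 1))
    P[0, 1] = P[m, m - 1] = 1.0
    for x in range(1, m):
        P[x, x - 1] = alpha * f[x - 1] / f[x]
        P[x, x + 1] = alpha * f[x + 1] / f[x]
    return P

P = transition_matrix(4)
print(np.allclose(P.sum(axis=1), 1.0))   # True: rows sum to 1
```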

The random walk \(\bs Y\) has a unique invariant distribution with density function \(h\) given by \[h(x) = \frac{2}{m + 2} \sin^2\left(\frac{x + 1}{m + 2} \pi\right), \quad x \in \N_m\]

Details:

Since the path is symmetric, it follows from our general theory that the square \(f^2\) of the constant rate density \(f\) is invariant for \(\bs Y\). Equivalently, the square \(g^2\) of the eigenfunction \(g\) corresponding to the largest eigenvalue is invariant. After some algebra, the normalizing constant for \(g^2\) is \[\sum_{x = 0}^m g^2(x) = \sum_{x = 0}^m \sin^2\left(\frac{x + 1}{m + 2} \pi\right) = \frac{m + 2}{2}\] and so the result follows.

Of course, \(h\) is also symmetric and hence has mean \(m / 2\). Once again, the variance and higher-order moments do not have simple expressions.
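A quick numerical check that \(h\) is indeed invariant, that is \(h P = h\) (a sketch, assuming numpy):

```python
# Sketch: verify numerically that h is invariant for the random walk, i.e. h P = h.
import numpy as np

m = 7
theta = np.pi / (m + 2)
x = np.arange(m + 1)
f = np.sin(theta) / (1 + np.cos(theta)) * np.sin((x + 1) * theta)   # constant rate density
h = 2 / (m + 2) * np.sin((x + 1) * theta) ** 2                      # claimed invariant density
alpha = 1 / (2 * np.cos(theta))

P = np.zeros((m + 1, m + 1))
P[0, 1] = P[m, m - 1] = 1.0
for y in range(1, m):
    P[y, y - 1] = alpha * f[y - 1] / f[y]
    P[y, y + 1] = alpha * f[y + 1] / f[y]

print(np.isclose(h.sum(), 1.0))          # True
print(np.allclose(h @ P, h))             # True: h is invariant
```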

The app below is a simulation of the random walk on \((\N_m, \lfrta)\) corresponding to the constant rate distribution. The parameter \(m\) can be varied with the scroll bar. The initial state is chosen at random.

Consider the case \(m = 2\) so that \((\N_2, \lfrta)\) is the undirected simple path \((0, 1, 2)\). Find each of the following explicitly:

  1. The probability density function \(f_n\) of \(Y_n\) for \(n \in \N_+\)
  2. The invariant probability density function.
Details:
  1. For \(k \in \N_+\), \begin{align*} f_{2 k}(0) &= f_{2 k}(2) = \frac{1}{\sqrt{2}}f(0) = \frac{1}{2 + 2 \sqrt{2}} \approx 0.207\\ f_{2 k}(1) &= \sqrt{2} f(1) = \frac{2}{2 + \sqrt{2}} \approx 0.586 \end{align*} and \(f_{2 k + 1} = f\). So the distribution of \(Y_n\) depends on \(n \in \N_+\) only through parity (odd or even).
  2. \(h(0) = h(2) = \frac 1 4\), \(h(1) = \frac 1 2\).

Consider the case \(m = 3\) so that \((\N_3, \lfrta)\) is the undirected simple path \((0, 1, 2, 3)\). Find each of the following explicitly:

  1. The probability density function \(f_n\) of \(Y_n\) for \(n \in \N_+\).
  2. The invariant probability density function.
Details:
  1. For \(n \in \N_+\), \begin{align*} f_n(0) &= f_n(3) = \frac{\alpha + (-1)^{n + 1} \alpha^{2 n + 1}}{4 \alpha + 2}\\ f_n(1) &= f_n(2) = \frac{1 + \alpha + (-1)^n \alpha^{2 n + 1}}{4 \alpha + 2} \end{align*} Note that \begin{align*} f_n(0) &= f_n(3) \to \frac{\alpha}{4 \alpha + 2} = \frac{5 - \sqrt{5}}{20} \approx 0.138 \text{ as } n \to \infty\\ f_n(1) &= f_n(2) \to \frac{1 + \alpha}{4 \alpha + 2} = \frac{5 + \sqrt{5}}{20} \approx 0.362 \text{ as } n \to \infty \end{align*}
  2. \(h(0) = h(3) = \frac 1 4 \left(1 - \frac{1}{\sqrt{5}}\right)\), \(h(1) = h(2) = \frac 1 4 \left(1 + \frac{1}{\sqrt{5}}\right)\)

Consider the case \(m = 4\) so that \((\N_4, \lfrta)\) is the undirected simple path \((0, 1, 2, 3, 4)\). Find each of the following explicitly:

  1. The probability density function \(f_n\) of \(Y_n\) for \(n \in \N_+\).
  2. The invariant probability density function.
Details:
  1. Let \(k \in \N_+\). Then \begin{align*} f_{2 k}(0) &= f_{2k}(4) = \frac{1}{6 + 4 \sqrt{3}} \approx 0.0773\\ f_{2 k}(1) &= f_{2k}(3) = \frac{1}{2 + \sqrt{3}} \approx 0.268 \\ f_{2 k}(2) &= \frac{3}{3 + 2 \sqrt{3}} \approx 0.309 \\ f_{2 k + 1}(0) &= f_{2 k + 1}(4) = \frac{1}{6 + 3 \sqrt{3}} \approx 0.0893\\ f_{2 k + 1}(1) &= f_{2 k + 1}(3) = \frac{3}{6 + 4 \sqrt{3}} \approx 0.232\\ f_{2 k + 1}(2) &= \frac{4}{6 + 3 \sqrt{3}} \approx 0.357 \end{align*} So there are three distinct distributions, corresponding to \(n = 1\), odd \(n \ge 3\), and even \(n\).
  2. \(h(0) = h(4) = \frac{1}{12}\), \(h(1) = h(3) = \frac 1 4\), \(h(2) = \frac 1 3\).

Our last task is to find the limit of the invariant distributions as \(m \to \infty\).

Suppose that \(X_m\) has the invariant distribution above for each \(m \in \N_+\). Then as \(m \to \infty\), the distribution of \(X_m / m\) converges (in the usual sense) to the sine square distribution on \([0, 1]\), with density function \[x \mapsto 2 \sin^2(\pi x), \quad x \in [0, 1]\]

Details:

Let \(x \in [0, 1]\). Then \begin{align*} \P(X_m / m \le x) &= \sum_{k = 0}^m \bs{1}\left(\frac{k}{m} \le x\right) h(k) \\ &= \frac{2}{m + 2} \sum_{k = 0}^m \bs{1}\left(\frac{k}{m} \le x\right) \sin^2\left(\frac{k + 1}{m + 2} \pi\right) \\ &= \frac{2 m}{m + 2} \left[\frac{1}{m} \sum_{k = 0}^m \bs{1}\left(\frac{k}{m} \le x\right) \sin^2\left(\frac{k + 1}{m + 2} \pi\right) \right] \end{align*} The first factor converges to 2 as \(m \to \infty\), and the factor in square brackets is a Riemann sum for \(\int_0^x \sin^2(\pi u) \, du\). Hence \[\P(X_m / m \le x) \to \int_0^x 2 \sin^2 (\pi u) \, du \text{ as } m \to \infty\] As a function of \(x \in [0, 1]\), the right side is the distribution function of the sine square distribution.
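A quick numerical illustration of the convergence, comparing \(\P(X_m / m \le 1/4)\) with the limiting value \(1/4 - \sin(\pi / 2) / (2 \pi)\) (a sketch, assuming numpy):

```python
# Sketch: compare P(X_m / m <= 1/4) with the sine square CDF at 1/4 as m grows.
import numpy as np

x = 0.25
for m in [10, 100, 1000]:
    k = np.arange(m + 1)
    h = 2 / (m + 2) * np.sin((k + 1) * np.pi / (m + 2)) ** 2
    print(m, h[k <= m * x].sum())        # tends to x - sin(2 pi x) / (2 pi) ~ 0.0908
```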

The app below is a simulation of the sine square distribution.

The sine square distribution, like the sine distribution, is unimodal and symmetric about \(\frac 1 2\). The density function is concave upward, then downward, then upward again, with inflection points at \(x = \frac 1 4\) and \(x = \frac 3 4\). Here are some other simple properties.

Suppose that \(X\) has the sine square distribution. Then

  1. \(X\) has distribution function \(x \mapsto x - \frac{1}{2 \pi} \sin(2 \pi x)\) on \([0, 1]\)
  2. \(\E(X) = \frac 1 2\)
  3. \(\var(X) = \frac {1}{12} - \frac{1}{ 2 \pi^2}\)
  4. \(X\) has moment generating function given by \[\E\left(e^{t X}\right) = \frac{4 \pi^2 (e^t - 1)}{t (4 \pi^2 + t^2)}, \quad t \in \R\]
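These moments are easy to check numerically with a midpoint Riemann sum on \([0, 1]\) (a sketch, assuming numpy):

```python
# Sketch: numerical check of the variance and moment generating function of the
# sine square distribution, using a midpoint Riemann sum on [0, 1].
import numpy as np

n = 100_000
u = (np.arange(n) + 0.5) / n                  # midpoints of a fine grid on [0, 1]
pdf = 2 * np.sin(np.pi * u) ** 2              # density of the sine square distribution

mean = np.mean(u * pdf)                       # approximates E(X)
var = np.mean(u**2 * pdf) - mean**2           # approximates var(X)
print(mean, var, 1 / 12 - 1 / (2 * np.pi**2))   # ~ 0.5, ~ 0.0327, 0.0327

t = 1.5
mgf = np.mean(np.exp(t * u) * pdf)            # approximates E(e^{tX})
print(mgf, 4 * np.pi**2 * (np.exp(t) - 1) / (t * (4 * np.pi**2 + t**2)))
```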