\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\ms}{\mathscr}\) \(\newcommand{\bs}{\boldsymbol}\) \(\newcommand{\rta}{\rightarrow}\) \(\newcommand{\upa}{\uparrow}\) \(\newcommand{\Upa}{\Uparrow}\) \(\newcommand{\lfrta}{\leftrightarrow}\) \(\newcommand{\Lfrta}{\Leftrightarrow}\) \(\newcommand{\lrta}{\looparrowright}\)

6. Reflexive Closure

Basics

In this brief section, we study a simple construction that is helpful for the analysis of discrete graphs. To begin, suppose that \((S, \rta)\) is a discrete, irreflexive graph; that is, in the combinatorial sense, a graph with no loops.

Recall that the reflexive closure of \((S, \rta)\) is the graph \((S, \lrta)\) where \(x \lrta y\) if and only if \(x = y\) or \(x \rta y\) for \(x, \, y \in S\).

That is, we simply add a loop at each vertex, so that \((S, \lrta)\) is the union of \((S, \rta)\) and \((S, =)\). Equivalently, we could start with a discrete, reflexive graph \((S, \lrta)\) and remove the loops to form the irreflexive graph \((S, \rta)\). Recall that this is the irreflexive reduction of \((S, \lrta)\), and is the difference between \((S, \lrta)\) and \((S, =)\). Recall in particular that if \((S, \prec)\) is a strict partial order graph (anti-reflexive, asymmetric, and transitive) then the reflexive closure is \((S, \preceq)\), the corresponding partial order graph (reflexive, anti-symmetric, and transitive). We will use our usual notation for mathematical objects related to the graphs \((S, \rta)\) and \((S, \lrta)\), but with the addition of a circumflex for the latter. The following result connects the path functions for \((S, \rta)\) and \((S, \lrta)\).

Let \(\gamma_n\) and \(\hat \gamma_n\) denote the path functions of order \(n \in \N\), for \((S, \rta)\) and \((S, \lrta)\) respectively. Then for \(x \in S\), \begin{align*} \hat \gamma_n(x) &= \sum_{k=0}^n \binom{n}{k} \gamma_k(x) \\ \gamma_n(x) &= \sum_{k=0}^n (-1)^{n - k} \binom{n}{k} \hat \gamma_k(x) \end{align*}

Details:

Let \(n \in \N_+\) and \(x \in S\) (the case \(n = 0\) is trivial, since \(\hat \gamma_0 = \gamma_0 = 1\)). Consider a path \(\bs{x} = (x_1, x_2, \ldots, x_k, x_{k+1})\) of length \(k\) in \((S, \rta)\), where \(k \in \N\), \(k \le n\), and \(x_{k+1} = x\) (so the path ends at \(x\)). Since \(\rta\) is anti-reflexive, \(x_{j+1} \ne x_j\) for \(j \in \{1, 2, \ldots, k\}\). To construct a path \(\bs y\) of length \(n\) in \((S, \lrta)\) that ends in \(x\), we can add \(n - k\) loops to the path \(\bs x\). Conversely, every path of length \(n\) in \((S, \lrta)\) that ends in \(x\) must have this form for some \(k \le n\). In the path \(\bs y\), let \(a_j\) be the number of repetitions (loops) of \(x_j\) for \(j \in \{1, 2, \ldots, k + 1\}\). Then \(a_j\) is a nonnegative integer and \(\sum_{j=1}^{k+1} a_j = n - k\). The number of solutions is \[\binom{(k + 1) + (n - k) - 1}{n - k} = \binom{n}{k}\] By definition, there are \(\gamma_k(x)\) paths of length \(k\) in \((S, \rta)\) that end in \(x\), so it follows that the number of paths of length \(n\) for \((S, \lrta)\) that end in \(x\) is \[\hat \gamma_n(x) = \sum_{k=0}^n \binom{n}{k} \gamma_k(x)\] The second result follows from the first by the binomial inversion formula.
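
The identity is easy to check by brute force. Here is a minimal sketch in Python, using a small irreflexive graph chosen purely for illustration (the vertex set and edge set below are assumptions, not part of the text); it counts paths directly and compares the two sides of the first formula.

```python
from itertools import product
from math import comb

# A small irreflexive graph chosen only for illustration.
S = [0, 1, 2]
edges = {(0, 1), (1, 2), (2, 0), (0, 2)}
closure = edges | {(x, x) for x in S}   # reflexive closure: add a loop at each vertex

def paths_ending_at(x, n, relation):
    """Number of paths (x_1, ..., x_n, x) of length n for the given relation."""
    count = 0
    for seq in product(S, repeat=n):
        walk = list(seq) + [x]
        if all((walk[i], walk[i + 1]) in relation for i in range(n)):
            count += 1
    return count

# Check gamma_hat_n(x) = sum_k C(n, k) gamma_k(x) for small n.
for x in S:
    for n in range(5):
        lhs = paths_ending_at(x, n, closure)
        rhs = sum(comb(n, k) * paths_ending_at(x, k, edges) for k in range(n + 1))
        assert lhs == rhs
print("path function identity verified on the example graph")
```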

The generating functions are also related in a simple way.

Suppose that \(\Gamma\) and \(\hat \Gamma\) denote the path generating functions for \((S, \rta)\) and \((S, \lrta)\), with convergence functions \(\rho\) and \(\hat \rho\), respectively. Then for \(x \in S\), \begin{align*} \hat \Gamma(x, t) & = \frac{1}{1-t} \Gamma \left(x, \frac{t}{1-t}\right), \quad \hat \rho(x) = \frac{\rho(x)}{1 + \rho(x)} \\ \Gamma(x, t) & = \frac{1}{1 + t} \hat \Gamma\left(x, \frac{t}{1 + t}\right), \quad \rho(x) = \frac{\hat \rho(x)}{1 - \hat \rho(x)} \end{align*}

Details:

For \(x \in S\), \begin{align*} \hat \Gamma(x, t) &= \sum_{n=0}^\infty \hat \gamma_n(x) t^n = \sum_{n=0}^\infty \sum_{k=0}^n \binom{n}{k} \gamma_k(x) t^n \\ &= \sum_{k=0}^\infty \gamma_k(x) \sum_{n=k}^\infty \binom{n}{k} t^n = \sum_{k=0}^\infty \gamma_k(x) \frac{t^k}{(1 - t)^{k+1}} \\ &= \frac{1}{1 - t} \sum_{k=0}^\infty \gamma_k(x) \left(\frac{t}{1 - t}\right)^k = \frac{1}{1-t} \Gamma\left(x, \frac{t}{1 - t}\right) \end{align*} The series converges when \(|t / (1 - t)| \le \rho(x)\). Solving \(t / (1 - t) = \rho(x)\) for \(t\) gives \(\hat \rho(x) = \rho(x) / [1 + \rho(x)]\). It should be understood that \(\hat \rho(x) = 1\) if \(\rho(x) = \infty\).

The second result follows from the first. For \(t \in \R\), let \(s = t / (1 + t)\) so that \(t = s / (1 - s)\). Then \[\Gamma(x, t) = \Gamma\left(x, \frac{s}{1-s}\right) = (1 - s) \hat \Gamma(x, s) = \frac{1}{1 + t} \hat \Gamma\left(x, \frac{t}{1 + t}\right)\] Solving \(t / (1 + t) = \hat \rho(x)\) for \(t\) gives \(\rho(x) = \hat \rho(x) / [1 - \hat \rho(x)]\). It should be understood that \(\rho(x) = \infty\) if \(\hat \rho(x) = 1\).
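
As a quick symbolic sanity check, the two substitutions are mutual inverses. The sketch below uses the illustrative generating function \(1/(1 - 2t)\) (which would arise from a 2-regular graph, as in the exercises later in this section); the choice of function is an assumption made only for the check.

```python
import sympy as sp

t = sp.symbols('t')
Gamma = 1 / (1 - 2 * t)          # illustrative path generating function

# Forward transform to the reflexive closure, then the inverse transform back.
Gamma_hat = sp.cancel(Gamma.subs(t, t / (1 - t)) / (1 - t))
Gamma_back = sp.cancel(Gamma_hat.subs(t, t / (1 + t)) / (1 + t))

print(Gamma_hat)                               # equals 1/(1 - 3*t)
assert sp.simplify(Gamma_back - Gamma) == 0    # we recover the original function
```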

Probability

Suppose now that \(X\) is a random variable in \(S\). There are very simple relationships between the reliability functions, rate functions, and generating functions of \(X\) with respect to \((S, \rta)\) and \((S, \lrta)\). This in turn leads to a simple relationship between the constant rate distributions for the two graphs (if they exist). As usual, we assume that \(X\) is supported by the graph \((S, \rta)\) (and as we will see below, \(X\) is then supported by \((S, \lrta)\) as well).

Suppose that \(X\) is supported by \((S, \rta)\).

  1. Let \(f\) denote the density function of \(X\), and let \(F\) and \(\hat F\) denote the reliability functions of \(X\) for \((S, \rta)\) and \((S, \lrta)\), respectively. Then \[\hat F = f + F, \quad F = \hat F - f\]
  2. Let \(r\) and \(\hat r\) denote the rate functions of \(X\) for \((S, \rta)\) and \((S, \lrta)\), respectively. Then \[\hat r = r / (1 + r), \quad r = \hat r / (1 - \hat r)\]
  3. Let \(\Lambda\) and \(\hat \Lambda\) denote the generating functions of \(X\) for \((S, \rta)\) and \((S, \lrta)\), respectively. Then \begin{align*} \hat \Lambda(t) &= \frac{1}{1 - t} \Lambda\left(\frac{t}{1 - t}\right) \\ \Lambda(t) & = \frac{1}{1 + t} \hat \Lambda\left(\frac{t}{1 + t}\right) \end{align*}
Details:
  1. Note that \[\hat F(x) = \P(x \lrta X) = \P(X = x) + \P(x \rta X) = f(x) + F(x), \quad x \in S\]
  2. From (a), \[\hat r = \frac{f}{\hat F} = \frac{f}{f + F} = \frac{f / F}{f / F + 1} = \frac{r}{r + 1}\]
  3. This follows from the result above relating the path generating functions \(\Gamma\) and \(\hat \Gamma\).
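
Parts (a) and (b) are easy to verify numerically. The sketch below reuses the small illustrative graph from the path-function check above, together with an arbitrary density function; both the graph and the density are assumptions made only for illustration.

```python
import numpy as np

# Adjacency matrix of the illustrative graph 0 -> 1, 1 -> 2, 2 -> 0, 0 -> 2,
# and an arbitrary strictly positive density.
A = np.array([[0, 1, 1],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
f = np.array([0.5, 0.3, 0.2])

F = A @ f                      # F(x) = P(x -> X)
F_hat = (A + np.eye(3)) @ f    # reliability function for the reflexive closure
assert np.allclose(F_hat, f + F)               # part (a)

r, r_hat = f / F, f / F_hat
assert np.allclose(r_hat, r / (1 + r))         # part (b)
assert np.allclose(r, r_hat / (1 - r_hat))
print(r, r_hat)
```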

As usual, we are particularly interested in constant rate distributions.

Suppose again that \(X\) is supported by \((S, \rta)\). Then

  1. If \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\), then \(X\) has constant rate \(\beta = \alpha / (1 + \alpha)\) for \((S, \lrta)\).
  2. If \(X\) has constant rate \(\beta \in (0, 1)\) for \((S, \lrta)\), then \(X\) has constant rate \(\alpha = \beta / (1 - \beta)\) for \((S, \rta)\).

Note that \(\hat r \lt 1\). Also, there is a one-to-one correspondence between constant rate distributions for \((S, \rta)\) and \((S, \lrta)\). The respective rate constants \(\alpha\) and \(\beta\) satisfy \(1 / \beta - 1 / \alpha = 1\), a bit reminiscent of conjugate exponents. In particular, a constant rate distribution exists for \((S, \rta)\) if and only if a constant rate distribution exists for \((S, \lrta)\).

Random Walks

Suppose that \(X\) is a random variable on \(S\) as above. Let \(\bs Y = (Y_1, Y_2, \ldots)\) and \(\bs{\hat Y} = (\hat Y_1, \hat Y_2, \ldots)\) denote the random walks on \((S, \rta)\) and \((S, \lrta)\) associated with \(X\), respectively. The transition densities are also related in a simple way.

Let \(P\) and \(\hat P\) denote the transition densities of \(\bs Y\) and \(\bs{\hat Y}\), respectively.

  1. Let \(r\) denote the rate function of \(X\) for \((S, \rta)\). Then \[\hat P(x, x) = \frac{r(x)}{r(x) + 1}, \quad x \in S; \quad \quad \hat P(x, y) = \frac{1}{r(x) + 1} P(x, y), \quad x \rta y\]
  2. Let \(\hat r\) denote the rate function of \(X\) for \((S, \lrta)\). Then \[P(x, y) = \frac{1}{1 - \hat r(x)} \hat P(x, y), \quad x \rta y\]
Details:

As before, let \(F\) and \(\hat F\) denote the reliability functions of \(X\) for \((S, \rta)\) and \((S, \lrta)\), respectively. If \(x \lrta y\) then \[\hat P(x, y) = \frac{f(y)}{\hat F(x)} = \frac{f(y)}{f(x) + F(x)} = \frac{f(y) / F(x)}{f(x) / F(x) + 1} = \frac{f(y) / F(x)}{r(x) + 1}\] Part (a) follows since \(x \lrta y\) if and only if \(x = y\) or \(x \rta y\), and \(f(x) / F(x) = r(x)\). Part (b) follows from part (a).
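
Continuing the same illustrative example (graph and density as assumed above), the next sketch builds both transition matrices directly from the density and checks the relations in parts (a) and (b).

```python
import numpy as np

A = np.array([[0, 1, 1],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
f = np.array([0.5, 0.3, 0.2])
F, F_hat = A @ f, (A + np.eye(3)) @ f
r = f / F

# Transition matrices: P(x, y) = f(y) / F(x) for x -> y, and similarly for the closure.
P = A * (f[None, :] / F[:, None])
P_hat = (A + np.eye(3)) * (f[None, :] / F_hat[:, None])

assert np.allclose(np.diag(P_hat), r / (r + 1))                    # part (a), loops
assert np.allclose(A * P_hat, (A * P) / (r[:, None] + 1))          # part (a), x -> y
assert np.allclose(A * P, (A * P_hat) / (1 - f / F_hat)[:, None])  # part (b)
print(P_hat.sum(axis=1))   # each row sums to 1
```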

As a corollary, the higher-order density functions for \((S, \lrta)\) and \((S, \rta)\) have a simple relationship when the underlying random variable has constant rate.

For \(n \in \N_+\), let \(f_n\) and \(\hat f_n\) denote the density functions of \(Y_n\) and \(\hat Y_n\), respectively.

  1. If \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) then \[\hat f_n = \frac{1}{(1 + \alpha)^{n-1}} \sum_{j=1}^n \binom{n - 1}{j - 1} \alpha^{n-j} f_j\]
  2. If \(X\) has constant rate \(\beta\) for \((S, \lrta)\) then \[f_n = \frac{1}{(1 - \beta)^{n-1}} \sum_{j=1}^n \binom{n-1}{j-1} (-\beta)^{n-j} \hat f_j\]
Details:

The proof uses the general formula for the higher-order densities in Section 5.

  1. If \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) then \(X\) has constant rate \(\alpha / (\alpha + 1)\) for \((S, \lrta)\). Hence \[\hat f_n = \left(\frac{\alpha}{\alpha + 1}\right)^n \hat \gamma_{n-1} \hat F\] But \(\hat \gamma_{n-1} = \sum_{k=0}^{n-1} \binom{n - 1}{k} \gamma_k\) and \(\hat F = f + F = \alpha F + F = (\alpha + 1) F\). Substituting and simplifying gives the result.
  2. The proof is similar to (a).
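
As a numerical check of part (a), the sketch below uses the diamond graph from the exercises at the end of this section, assuming (consistently with the values given there) that vertices 2 and 4 are joined by the diagonal edge. The constant rate density is obtained from the Perron eigenvector of the adjacency matrix.

```python
import numpy as np
from math import comb

# Diamond graph on vertices 1, 2, 3, 4 (index j-1 in the code), with diagonal 2-4.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [1, 1, 1, 0]], dtype=float)
A_hat = A + np.eye(4)

# Constant rate density for (S, ->): normalized Perron eigenvector; rate alpha = 1/lambda.
evals, evecs = np.linalg.eigh(A)
f = np.abs(evecs[:, -1]); f /= f.sum()
alpha = 1 / evals[-1]

F, F_hat = A @ f, A_hat @ f
P = A * (f[None, :] / F[:, None])                 # random walk for (S, ->)
P_hat = A_hat * (f[None, :] / F_hat[:, None])     # random walk for (S, <->)

def densities(P, n_max):
    """Densities of Y_1, ..., Y_{n_max} for the walk with transition matrix P."""
    out = [f]
    for _ in range(n_max - 1):
        out.append(out[-1] @ P)
    return out

fs, fs_hat = densities(P, 6), densities(P_hat, 6)
for n in range(1, 7):
    rhs = sum(comb(n - 1, j - 1) * alpha ** (n - j) * fs[j - 1]
              for j in range(1, n + 1)) / (1 + alpha) ** (n - 1)
    assert np.allclose(fs_hat[n - 1], rhs)
print("higher-order density identity verified on the diamond graph")
```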

Finally, recall that \(\bs N = \{N_A: A \in \ms S\}\) and \(\bs{\hat N} = \{\hat N_A: A \in \ms S\}\) are the point processes corresponding to the random paths \(\bs Y\) and \(\bs{\hat Y}\), respectively.

Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). Then \[\E(\hat N_A) = (1 + \alpha) \E(N_A), \quad A \in \ms S\]

Details:

Since \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\), \(X\) has constant rate \(\beta = \alpha / (\alpha + 1) \in (0, 1)\) for \((S, \lrta)\). Hence, using the result in Section 5 together with the generating function result above, and noting that \(\beta / (1 - \beta) = \alpha\) and \(1 / (1 - \beta) = 1 + \alpha\), \begin{align*} \E(\hat N_A) &= \E[\hat \Gamma(X, \beta), X \in A] = \E\left[\frac{1}{1 - \beta} \Gamma\left(X, \frac{\beta}{1 - \beta}\right), X \in A\right]\\ &= (1 + \alpha) \E[\Gamma(X, \alpha), X \in A] = (1 + \alpha) \E(N_A) \end{align*}

Examples and Exercises

Regular Graphs

Suppose that the finite, irreflexive graph \((S, \rta)\) is regular of degree \(k \in \N_+\), so that the reflexive closure \((S, \lrta)\) is regular of degree \(k + 1\). In this simple case, verify each of the following:

  1. The proposition above on the path functions.
  2. The proposition above on the path generating functions.
  3. The proposition above on constant rate distributions.
Details:
  1. The path functions of order \(n \in \N\) are given by \(\gamma_n(x) = k^n\) and \(\hat \gamma_n(x) = (k + 1)^n\) for \(x \in S\). So the proposition holds by simple applications of the binomial theorem, since \(\sum_{j=0}^n \binom{n}{j} k^j = (k + 1)^n\).
  2. The path generating functions are given by \begin{align*} \Gamma(x, t) &= \frac{1}{1 - k t}, \quad x \in S, \; |t| \lt \frac{1}{k} \\ \hat \Gamma(x, t) &= \frac{1}{1 - (k + 1) t}, \quad x \in S, \; |t| \lt \frac{1}{k + 1} \end{align*} so the proposition holds by simple algebra, with convergence functions \(\rho(x) = 1 / k\) and \(\hat \rho(x) = 1 / (k + 1)\).
  3. The uniform distribution on \(S\) has constant rate for both graphs, with rate constants \(\alpha = 1 / k\) and \(\beta = 1 / (k + 1)\). So the proposition holds, since \(\beta = \alpha / (1 + \alpha)\).
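
Here is a minimal numerical instance of the regular case, using the 5-cycle (which is 2-regular) as an illustrative choice.

```python
import numpy as np

m, k = 5, 2                      # the 5-cycle is 2-regular
A = np.zeros((m, m))
for i in range(m):
    A[i, (i + 1) % m] = A[i, (i - 1) % m] = 1
A_hat = A + np.eye(m)

ones = np.ones(m)
for n in range(6):
    assert np.allclose(np.linalg.matrix_power(A, n) @ ones, k ** n)             # gamma_n
    assert np.allclose(np.linalg.matrix_power(A_hat, n) @ ones, (k + 1) ** n)   # gamma_hat_n

f = ones / m                     # uniform density
alpha, beta = (f / (A @ f))[0], (f / (A_hat @ f))[0]
assert np.isclose(alpha, 1 / k) and np.isclose(beta, 1 / (k + 1))
assert np.isclose(beta, alpha / (1 + alpha))
print(alpha, beta)
```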

Suppose that \(S\) is a nonempty countable set. The graph \((S, \ne)\) is the complete irreflexive graph on \(S\). The reflexive closure \((S, \equiv)\) is the complete reflexive graph on \(S\), so that \(x \equiv y\) for all \(x, \, y \in S\). Find the \(\sigma\)-algebra associated with each graph.

Details:

For \(x \in S\), the neighbor set of \((S, \ne)\) at \(x\) is \(A_x = S - \{x\}\) while the neighbor set of \((S, \equiv)\) at \(x\) is \(B_x = S\). So the \(\sigma\)-algebra associated with \((S, \ne)\) is \(\ms A = \ms P(S)\), the collection of all subsets of \(S\). The \(\sigma\)-algebra associated with \((S, \equiv)\) is the trivial \(\sigma\)-algebra \(\ms B = \{\emptyset, S\}\).

In general, there is no simple relationship between the \(\sigma\)-algebra associated with the original irreflexive graph and the \(\sigma\)-algebra associated with the reflexive closure. If \(S\) in the example above is finite with \(k \in \N_+\) points, then \((S, \ne)\) is regular of degree \(k - 1\) while \((S, \equiv)\) is regular of degree \(k\), so the results for regular graphs above apply.

Partial Order Graphs

Suppose that \((S, \preceq)\) is a discrete partial order graph. As noted earlier, \((S, \prec)\) is the corresponding strict partial order graph, and \((S, \preceq)\) is the reflexive closure of \((S, \prec)\). Recall that the covering graph \((S, \upa)\) is defined by \(x \upa y\) if and only if \(y\) covers \(x\) for \(x, \, y \in S\); that is, \(x \prec y\) and no \(z \in S\) satisfies \(x \prec z \prec y\). Let \((S, \Upa)\) denote the reflexive closure of \((S, \upa)\), so that \(x \Upa y\) if and only if \(x = y\) or \(y\) covers \(x\). In Section 2 we showed that if \((S, \preceq)\) is well founded then the \(\sigma\)-algebra associated with \((S, \preceq)\) is \(\ms P(S)\), the reference \(\sigma\)-algebra. The same is true of \((S, \Upa)\).

If \((S, \preceq)\) is well founded then the \(\sigma\)-algebra associated with \((S, \Upa)\) is \(\ms P(S)\).

Details:

The neighbor set of \(x \in S\) for \((S, \Upa)\) is \(A_x = \{x\} \cup \{y \in S: x \upa y\}\). Since the graph is well founded, \[ \{x\} = A_x \setminus \bigcup_{x \upa y} A_y, \quad x \in S\]

Suppose now that \((S, \preceq)\) has minimum element \(e\). Let \(S_+ = S - \{e\}\). Then \((S_+, \preceq)\) is a sub-graph of \((S, \preceq)\) and \((S_+, \prec)\) is a sub-graph of \((S, \prec)\), and the former is the reflexive closure of the latter.

Suppose now that \(X\) is a random variable on \(S\) with probability density function \(f\).

  1. The conditional density function \(g\) given \(X \in S_+\) is defined by \[g(x) = \frac{f(x)}{1 - f(e)}, \quad x \in S_+\]
  2. Let \(F\) denote the reliability function of \(X\) for \((S, \prec)\). Then the reliability function of \(X\) for \((S, \preceq)\) is \(\hat F = f + F\).
  3. For the conditional distribution of \(X\) given \(X \in S_+\), the reliability function \(G\) for \((S_+, \prec)\) is defined by \[G(x) = \frac{F(x)}{1 - f(e)}, \quad x \in S_+\]
  4. The reliability function of the conditional distribution for \((S_+, \preceq)\) is \(\hat G = G + g\) or equivalently, \[\hat G(x) = \frac{\hat F(x)}{1 - f(e)}, \quad x \in S_+\]

With this notation in place, the following corollary summarizes our results:

Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \prec)\).

  1. The conditional distribution of \(X\) given \(X \in S_+\) has constant rate \(\alpha\) for \((S_+, \prec)\).
  2. \(X\) has constant rate \(\beta = \alpha / (1 + \alpha)\) for \((S, \preceq)\).
  3. The conditional distribution of \(X\) given \(X \in S_+\) has constant rate \(\beta\) for \((S_+, \preceq)\).
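
As a concrete illustration (the geometric example below is an assumption of this sketch, not part of the statement), take \((S, \preceq) = (\N, \le)\) with minimum element \(e = 0\). The geometric density \(f(n) = p (1 - p)^n\) has constant rate \(\alpha = p / (1 - p)\) for \((\N, \lt)\), so the corollary predicts rate \(\beta = p\) for \((\N, \le)\), and the same rates for the conditional distribution given \(X \in S_+ = \{1, 2, \ldots\}\).

```python
import numpy as np

# Geometric density on a truncation of N; the truncation error is negligible
# for the indices tested below.
p = 0.3
n = np.arange(200)
f = p * (1 - p) ** n

F = np.array([f[k + 1:].sum() for k in n])    # reliability for (N, <):  P(X > x)
F_hat = f + F                                 # reliability for (N, <=): P(X >= x)
assert np.allclose(f[:100] / F[:100], p / (1 - p))    # constant rate alpha for (N, <)
assert np.allclose(f[:100] / F_hat[:100], p)          # part (b): rate beta = p for (N, <=)

# Conditional distribution given X in S_+ = {1, 2, ...}
g = f[1:] / (1 - f[0])
G = F[1:] / (1 - f[0])
assert np.allclose(g[:100] / G[:100], p / (1 - p))    # part (a)
assert np.allclose(g[:100] / (g + G)[:100], p)        # part (c)
print("constant rate relations verified for the geometric example")
```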

The Diamond Graph

Consider the reflexive closure \((S, \Lfrta)\) of the diamond graph \((S, \lfrta)\) introduced in Section 1 and shown below.

  1. Find the path function of order \(n \in \N\).
  2. Find the rate constant and the density function of the constant rate distribution.
Details:
  1. The path function of order \(n \in \N\) is given by \begin{align*} \hat \gamma_n(1) = \hat \gamma_n(3) &= \frac{1}{\sqrt{17}}\left[\left(\frac{3 + \sqrt{17}}{2}\right)^{n+1} - \left(\frac{3 - \sqrt{17}}{2}\right)^{n+1}\right] \\ \hat \gamma_n(2) = \hat \gamma_n(4) &= \frac{\sqrt{17} - 5}{2 \sqrt{17}} \left(\frac{3 - \sqrt{17}}{2}\right)^n + \frac{\sqrt{17} + 5}{2 \sqrt{17}} \left(\frac{3 + \sqrt{17}}{2}\right)^n \end{align*}
  2. The rate constant is \(\beta = 2 / (3 + \sqrt{17})\). The constant rate distribution is the same as for the original diamond graph \((S, \lfrta)\), with density function \(f\) given by \[f(1) = f(3) = \frac{5 - \sqrt{17}}{4}, \quad f(2) = f(4) = \frac{\sqrt{17} - 3}{4} \]
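
The closed-form expressions can be checked against matrix powers of the adjacency matrix. The sketch below assumes the diamond graph has vertices 1, 2, 3, 4 with the diagonal edge joining 2 and 4 (consistent with the values above); vertex \(j\) corresponds to index \(j - 1\) in the code.

```python
import numpy as np

A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [1, 1, 1, 0]], dtype=float)
A_hat = A + np.eye(4)          # reflexive closure
s = np.sqrt(17)

# Part (a): walk counts from matrix powers versus the closed formula.
for n in range(8):
    gamma_hat = np.linalg.matrix_power(A_hat, n) @ np.ones(4)
    c13 = (((3 + s) / 2) ** (n + 1) - ((3 - s) / 2) ** (n + 1)) / s
    c24 = (s - 5) / (2 * s) * ((3 - s) / 2) ** n + (s + 5) / (2 * s) * ((3 + s) / 2) ** n
    assert np.allclose(gamma_hat, [c13, c24, c13, c24])

# Part (b): constant rate density from the Perron eigenvector; rate = 1/eigenvalue.
evals, evecs = np.linalg.eigh(A_hat)
f = np.abs(evecs[:, -1]); f /= f.sum()
assert np.isclose(1 / evals[-1], 2 / (3 + s))
assert np.allclose(f, [(5 - s) / 4, (s - 3) / 4, (5 - s) / 4, (s - 3) / 4])
print("diamond graph results verified")
```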