\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\ms}{\mathscr}\) \(\newcommand{\bs}{\boldsymbol}\) \(\newcommand{\rta}{\rightarrow}\) \(\newcommand{\upa}{\uparrow}\) \(\newcommand{\nea}{\nearrow}\)

9. Lexicographic Sums of Graphs

In this section we generalize the lexicographic product of graphs studied in Section 8.

Basics

Suppose that \((S, \rta)\) is a discrete, irreflexive graph (so a graph without loops in the combinatorial sense) and that \((T_x, \upa_x)\) is a measurable graph for each \(x \in S\). Underlying \((S, \rta)\) is the discrete measure space \((S, \ms S, \#)\) of course, where \(\ms S\) is the power set of \(S\) and \( \#\) is counting measure. Underlying \((T_x, \upa_x)\) is a \(\sigma\)-finite measure space \((T_x, \ms T_x, \mu_x)\), as usual. Let \(T = \bigcup_{x \in S} T_x\). Here is the new space we will need:

Define the \(\sigma\)-finite measure space \((U, \ms U, \lambda)\) as follows:

  1. \[U = \bigcup_{x \in S} \left(\{x\} \times T_x\right)\]
  2. \[\ms U = \left\{\bigcup_{x \in S} \left(\{x\} \times A_x\right): A_x \in \ms T_x \text{ for all } x \in S\right\}\]
  3. \[\lambda\left[\bigcup_{x \in S} \left(\{x\} \times A_x\right)\right] = \sum_{x \in S} \mu_x(A_x), \quad A_x \in \ms T_x \text{ for } x \in S\]

Note that \(U \subseteq S \times T\). In the special case that \((T_x, \ms T_x, \mu_x)\) is the common measure space \((T, \ms T, \mu)\) for all \(x \in S\), the measure space \((U, \ms U, \lambda)\) is the product space \((S \times T, \ms S \times \ms T, \# \times \mu)\). Next is the graph we will study.
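To make the construction concrete, here is a minimal Python sketch of the space \((U, \lambda)\) in the case of finite sets with counting measure; the particular choices of \(S\) and the sets \(T_x\) are illustrative only.

```python
# A minimal sketch of (U, lambda) with finite pieces and counting measure.
# The vertex set S and the sets T_x are illustrative choices, not from the text.

S = ["a", "b", "c"]                                  # discrete set S
T = {"a": [0, 1], "b": [0], "c": [0, 1, 2]}          # the sets T_x, x in S

# U is the disjoint union of the sets T_x, tagged by the index x
U = [(x, y) for x in S for y in T[x]]

def lam(A):
    """lambda(A) = sum_x mu_x(A_x); with mu_x = counting measure this is #(A)."""
    return len(A)

print(U)        # [('a', 0), ('a', 1), ('b', 0), ('c', 0), ('c', 1), ('c', 2)]
print(lam(U))   # 6 = #(T_a) + #(T_b) + #(T_c)
```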

The lexicographic sum of the graphs \((T_x, \upa_x)\) over \((S, \rta)\) is the graph \((U, \nea)\) where for \((u, v), \, (x, y) \in U\), \[(u, v) \nea (x, y) \text{ if and only if } u \rta x \text{ or } (u = x \text{ and } v \upa_x y)\]

Details:

Since \((S, \rta)\) is irreflexive, the two conditions defining \(\nea\) are mutually exclusive. The measurability of \((U, \nea)\) is clear since \(\nea\) can be written as a countable union of measurable sets in \(U^2\).
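For a concrete instance of the relation itself, here is a small Python sketch of \(\nea\) built from an irreflexive relation on \(S\) and a relation on each \(T_x\); all of the particular graphs are illustrative (the relation on each \(T_x\) below is simply equality).

```python
# A minimal sketch of the lexicographic sum relation nea for finite pieces.
# The graphs below (the relation < on S and equality on each T_x) are
# illustrative, not from the text.

S = [0, 1, 2]
S_edges = {(0, 1), (0, 2), (1, 2)}                         # irreflexive: u -> x iff u < x
T = {0: ["p", "q"], 1: ["p"], 2: ["p", "q", "r"]}
T_edges = {x: {(v, v) for v in T[x]} for x in S}           # relation up_x on T_x (here: =)

U = [(x, y) for x in S for y in T[x]]

def nea(uv, xy):
    """(u, v) nea (x, y) iff u -> x, or u = x and v up_x y."""
    (u, v), (x, y) = uv, xy
    return (u, x) in S_edges or (u == x and (v, y) in T_edges[x])

print(nea((0, "q"), (2, "r")))   # True:  0 -> 2
print(nea((2, "p"), (2, "p")))   # True:  same S-vertex and p = p
print(nea((2, "p"), (0, "q")))   # False: 2 does not point to 0 and 2 != 0
```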

In the special case that \((T_x, \upa_x)\) is the common graph \((T, \upa)\) (on the common measure space) for all \(x \in S\), the graph \((S \times T, \nea)\) is the lexicographic product of \((S, \rta)\) and \((T, \upa)\) as studied in Section 8. The lexicographic sum (and particularly the lexicographic product) is a common construction in ordinary combinatorial graph theory. It is also common in the study of partial orders, and it is in this context that the graph gets its name.

Suppose that \((S, \prec_0)\) is a discrete, strict partial order graph.

  1. If \((T_x, \prec_x)\) is a strict partial order graph for each \(x \in S\), then the lexicographic sum \((U, \prec)\) is also a strict partial order graph: \[(u, v) \prec (x, y) \text{ if and only if } u \prec_0 x \text{ or } (u = x \text{ and } v \prec_x y)\]
  2. If \((T_x, \preceq_x)\) is a partial order graph for each \(x \in S\), then the lexicographic sum \((U, \preceq)\) is also a partial order graph: \[(u, v) \preceq (x, y) \text{ if and only if } u \prec_0 x \text{ or } (u = x \text{ and } v \preceq_x y)\]
Details:

The proofs are simple, following from the various cases in the definition.

  1. First \((x, y) \not \prec (x, y)\) for \(x \in S, \, y \in T_x\), so \(\prec\) is irreflexive.
  2. If \((u, v) \prec (x, y)\) and \((x, y) \prec (z, w)\) for \((u, v), \, (x, y), \, (z, w) \in U\), then considering the cases \(u \prec_0 x\) and \(u = x\) separately shows that \((u, v) \prec (z, w)\), so \(\prec\) is transitive.
  3. It's clear that \(\preceq\) is the partial order associated with the strict partial order \(\prec\) in (a), when \(\preceq_x\) is the partial order associated with the strict partial order \(\prec_x\) for each \(x \in S\).
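As a quick computational check of part (a) of the result above, the following Python sketch verifies that \(\prec\) is irreflexive and transitive on a small, illustrative example; both the order on \(S\) and the orders on the sets \(T_x\) below are ordinary strict total orders.

```python
# Check on a small example that the lexicographic sum of strict partial orders
# is again a strict partial order. All of the orders below are illustrative.
from itertools import product

S = [0, 1, 2]
S_lt = {(0, 1), (0, 2), (1, 2)}                            # strict order prec_0 on S
T = {0: ["a", "b"], 1: ["a"], 2: ["a", "b", "c"]}
T_lt = {x: {(u, v) for u in T[x] for v in T[x] if u < v} for x in S}   # prec_x

U = [(x, y) for x in S for y in T[x]]

def prec(uv, xy):
    (u, v), (x, y) = uv, xy
    return (u, x) in S_lt or (u == x and (v, y) in T_lt[x])

irreflexive = all(not prec(p, p) for p in U)
transitive = all(prec(p, r) for p, q, r in product(U, repeat=3)
                 if prec(p, q) and prec(q, r))
print(irreflexive, transitive)   # True True
```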

Returning to the general setting, there is a simple relationship between the walk functions of order 1. The relationship is more complicated for higher orders, but the order 1 relationship is important for the constant rate distributions that we will discuss below. For \(x \in S\) let \(a_x\) denote the walk function of order 1 for \((T_x, \upa_x)\), and let \(b\) and \(c\) denote the walk functions of order 1 for \((S, \rta)\) and \((U, \nea)\), respectively.

The left walk functions of order 1 are related as follows: \[c(x, y) = \sum_{u \rta x} \mu_u(T_u) + a_x(y), \quad x \in S, \, y \in T_x\]

Details:

This follows from the definition of the walk function. \begin{align*} c(x, y) &= \lambda\{(u, v) \in U: (u, v) \nea (x, y)\}\\ &= \lambda\{(u, v): u \in S, v \in T_u, u \rta x\} + \lambda\{(x, v): v \in T_x, v \upa_x y\}\\ &= \sum_{u \rta x} \mu_u(T_u) + a_x(y), \quad x \in S, \, y \in T_x \end{align*}

Note that \(c(x, y) \lt \infty\) implies \(a_x(y) \lt \infty\) and \(\sum_{u \rta x} \mu_u(T_u) \lt \infty\). Conversely, if \(a_x(y) \lt \infty\), \(b(x) \lt \infty\) and \(\mu_u(T_u) \lt \infty\) for every \(u \rta x\) then \(c(x, y) \lt \infty\).
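The relationship can be checked numerically. The Python sketch below does so for small finite graphs with counting measure, so that \(\mu_u(T_u) = \#(T_u)\); the particular graphs are illustrative choices.

```python
# Numerical check of c(x, y) = sum_{u -> x} mu_u(T_u) + a_x(y) with counting
# measure on finite, illustrative graphs.

S = [0, 1, 2, 3]
S_edges = {(u, x) for u in S for x in S if u < x}          # irreflexive: u -> x iff u < x
T = {x: list(range(x + 1)) for x in S}                     # T_x = {0, ..., x}
T_edges = {x: {(v, y) for v in T[x] for y in T[x] if v <= y} for x in S}   # up_x

U = [(x, y) for x in S for y in T[x]]

def nea(uv, xy):
    (u, v), (x, y) = uv, xy
    return (u, x) in S_edges or (u == x and (v, y) in T_edges[x])

def a(x, y):
    """Left walk function of order 1 for (T_x, up_x)."""
    return sum(1 for v in T[x] if (v, y) in T_edges[x])

def c(x, y):
    """Left walk function of order 1 for (U, nea)."""
    return sum(1 for uv in U if nea(uv, (x, y)))

for (x, y) in U:
    assert c(x, y) == sum(len(T[u]) for u in S if (u, x) in S_edges) + a(x, y)
print("walk function identity verified at all", len(U), "points of U")
```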

Probability

Suppose now that \((X, Y)\) is a random variable in \(U\), so that \(X\) has values in \(S\), and given \(X = x\), random variable \(Y\) has values in \(T_x\). Unconditionally, \(Y\) has values in \(T = \bigcup_{x \in S} T_x\). To set up the notation, let \(F\) denote the reliability function of \(X\) for \((S, \rta)\), and for \(x \in S\), let \(G(\cdot \mid x)\) denote the conditional reliability function of \(Y\) for \((T_x, \upa_x)\), given that \(X = x\). Let \(H\) denote the reliability function of \((X, Y)\) for the lexicographic sum graph \((U, \nea)\). Finally, let \(f\) denote the density function of \(X\), so that \(f(x) = \P(X = x)\) for \(x \in S\). We assume that \(f(x) \gt 0\) for \(x \in S\).

The reliability functions are related as follows: \[H(x, y) = F(x) + f(x) G(y \mid x), \quad x \in S, \, y \in T_x\]

Details:

This follows from the definition of the lexicographic relation \(\nea\): \begin{align*} H(x, y) &= \P[(x, y) \nea (X, Y)] = \P(x \rta X) + \P(X = x, y \upa_x Y) \\ &=\P(x \rta X) + \P(X = x) \P(y \upa_x Y \mid X = x) = F(x) + f(x) G(y \mid x), \quad x \in S, \, y \in T_x \end{align*} Note that since the discrete graph \((S, \rta)\) is irreflexive, the events \(\{x \rta X\}\) and \(\{X = x\}\) are disjoint.

The reason for requiring that the graph \((S, \rta)\) be discrete is clear: if \(X\) has a continuous distribution on an uncountable set \(S\) then \(\P(X = x) = 0\) for \(x \in S\), and so the lexicographic relation is irrelevant from a probabilistic viewpoint. The following result on the density function and rate function of \((X, Y)\) is a simple corollary:

Suppose that given \(X = x \in S\), random variable \(Y\) has density function \(g( \cdot \mid x)\) on \(T_x\).

  1. \((X, Y)\) has density \(h\) given by \(h(x, y) = f(x) g( y \mid x)\) for \(x \in S, \, y \in T_x\).
  2. \((X, Y)\) has rate function for \((U, \nea)\) given by \[ (x, y) \mapsto \frac{f(x) g(y \mid x)}{F(x) + f(x) G(y \mid x)}, \quad x \in S, \, y \in T_x \]
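Here is a brief Python sketch that checks the reliability relationship and the rate function formula by brute force on a small finite example; the graphs and the joint distribution of \((X, Y)\) below are arbitrary, illustrative choices.

```python
# Brute-force check of H = F + f G and of the rate function formula on a small,
# illustrative example with an arbitrary distribution for (X, Y).
import random

random.seed(1)
S = [0, 1, 2]
S_edges = {(0, 1), (0, 2), (1, 2)}                         # irreflexive: u -> x iff u < x
T = {0: ["a", "b"], 1: ["a", "b", "c"], 2: ["a"]}
T_edges = {x: {(v, y) for v in T[x] for y in T[x] if v <= y} for x in S}   # up_x

U = [(x, y) for x in S for y in T[x]]
w = [random.random() for _ in U]
h = {p: wi / sum(w) for p, wi in zip(U, w)}                # joint density of (X, Y)

f = {x: sum(h[(x, y)] for y in T[x]) for x in S}           # density of X
F = {x: sum(f[u] for u in S if (x, u) in S_edges) for x in S}      # F(x) = P(x -> X)
G = {(x, y): sum(h[(x, v)] for v in T[x] if (y, v) in T_edges[x]) / f[x]
     for (x, y) in U}                                      # G(y | x) = P(y up_x Y | X = x)

def nea(uv, xy):
    (u, v), (x, y) = uv, xy
    return (u, x) in S_edges or (u == x and (v, y) in T_edges[x])

for p in U:
    x, y = p
    H = sum(h[q] for q in U if nea(p, q))                  # H(x, y) = P[(x, y) nea (X, Y)]
    assert abs(H - (F[x] + f[x] * G[(x, y)])) < 1e-12      # reliability relationship
    g = h[p] / f[x]                                        # conditional density g(y | x)
    assert abs(h[p] / H - f[x] * g / (F[x] + f[x] * G[(x, y)])) < 1e-12   # rate formula
print("reliability and rate identities verified at all", len(U), "points of U")
```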

As usual, we have special interest in constant rate distributions. Recall that the existence of a constant rate distribution for a graph implies that the walk function is finite.

Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\), and that given \(X = x \in S\), the conditional density function of \(Y\) and the conditional reliability function of \(Y\) for \((T_x, \upa_x)\) are related by \[g(y \mid x) = \beta[1 + \alpha G(y \mid x)], \quad y \in T_x\] for some \(\beta \in (0, \infty)\). Then \((X, Y)\) has constant rate \(\alpha \beta\) for \((U, \nea)\).

Details:

The assumptions are \(f(x) = \alpha F(x)\) for \(x \in S\) and \(g(y \mid x) = \beta[1 + \alpha G(y \mid x)]\) for \(x \in S\) and \(y \in T_x\). The result then follows from the relationship between the reliability functions above: \begin{align*} h(x, y) &= f(x) g(y \mid x) = \alpha F(x) \beta [1 + \alpha G(y \mid x)] \\ &= \alpha \beta [F(x) + \alpha F(x) G(y \mid x)] = \alpha \beta [F(x) + f(x) G(y \mid x)] = \alpha \beta H(x, y), \quad x \in S, \, y \in T_x \end{align*}

The following is a simple example.

Consider the special case of the lexicographic sum \((U, \nea)\) of the discrete graphs \((T_x, =)\) over the discrete, irreflexive graph \((S, \rta)\). So \((u, v) \nea (x, y)\) if and only if either \(u \rta x\) or \((u, v) = (x, y)\).

  1. The walk function \(\rho\) of order 1 for \((U, \nea)\) is given by \[\rho(x, y) = \sum_{u \rta x} \#(T_u) + 1, \quad x \in S, \, y \in T_x\]
  2. Suppose that \(\#(T_x) = k \in \N_+\) for all \(x \in S\) so that the sets all have the same cardinality. Suppose also that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\), and that given \(X = x \in S\), random variable \(Y\) is uniformly distributed over \(T_x\). Then \((X, Y)\) has constant rate \(\alpha / (\alpha + k)\) for \((U, \nea)\).
Details:
  1. This follows from the relationship between the walk functions of order 1 above, since the underlying measure on each \(T_u\) is counting measure and the walk function of order 1 for \((T_x, =)\) is identically 1.
  2. Note that given \(X = x \in S\), the conditional density function of \(Y\) and the conditional reliability function of \(Y\) for \((T_x, =)\) are the same: \[G(y \mid x) = g(y \mid x) = \P(Y = y \mid X = x) = \frac{1}{k}, \quad y \in T_x\] So the condition in the constant rate result above is satisfied with \(\beta = 1 / (\alpha + k)\), since \(1 / k = \beta (1 + \alpha / k)\), and hence \((X, Y)\) has constant rate \(\alpha \beta = \alpha / (\alpha + k)\); a numerical sketch is given below.
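Here is a numerical sketch of the example, using the standard fact that the geometric distribution on \(\N\) has constant rate \(\alpha = p / (1 - p)\) for the graph \((\N, \lt)\); the particular values of \(p\) and \(k\) below are illustrative.

```python
# Numerical illustration of the example: X geometric on N (constant rate
# alpha = p / (1 - p) for (N, <)), and given X = x, Y uniform on a set of size k.
# The values of p and k are illustrative.

p, k = 0.3, 4
alpha = p / (1 - p)                        # constant rate of X for (N, <)

def H(x):
    """Reliability of (X, Y) at (x, y): P(X > x) + P(X = x, Y = y)."""
    return (1 - p) ** (x + 1) + p * (1 - p) ** x / k

def h(x):
    """Density of (X, Y) at (x, y): P(X = x) / k."""
    return p * (1 - p) ** x / k

# Neither h nor H depends on y, so it suffices to vary x: the rate h / H
# should be the constant alpha / (alpha + k).
target = alpha / (alpha + k)
assert all(abs(h(x) / H(x) - target) < 1e-12 for x in range(50))
print("constant rate:", target)            # alpha / (alpha + k)
```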

Let's return to the general case of a lexicographic sum and consider some consequences of a constant rate distribution on the lexicographic sum graph.

Suppose that \((X, Y)\) has constant rate \(\alpha \in (0, \infty)\) for \((U, \nea)\). Then

  1. The density function \(f\) of \(X\) is given by \[f(x) = \frac{\alpha F(x) \mu_x(T_x)}{1 - \alpha \E[a_x(Y) \mid X = x]}, \quad x \in S\] where, as before, \(a_x\) is the walk function of order 1 for \((T_x, \upa_x)\).
  2. For \(x \in S\), a conditional density of \(Y\) given \(X = x\) is defined by \[g(y \mid x) = \alpha \left[\frac{F(x)}{f(x)} + G(y \mid x)\right], \quad y \in T_x\]
  3. \(Y\) has density function \(g\) defined by \[g(y) = \alpha \left(\E[b(X)] + \E[G(y \mid X)]\right), \quad y \in T\] where, as before, \(b\) is the walk function of order 1 for \((S, \rta)\).
Details:

By assumption, \(h = \alpha H\) is a probability density function of \((X, Y)\).

  1. This follows from the basic moment result in Section 3: \begin{align*} f(x) &= \int_{T_x} h(x, y) \, d\mu_x(y) = \alpha F(x) \mu_x(T_x) + \alpha f(x) \int_{T_x} G(y \mid x) \, d\mu_x(y)\\ &= \alpha F(x) \mu_x(T_x) + \alpha f(x) \E[a_x(Y) \mid X = x], \quad x \in S \end{align*} Solving for \(f(x)\) gives the result.
  2. For \(x \in S\), \[g(y \mid x) = \frac{h(x, y)}{f(x)} = \alpha \frac{F(x) + f(x) G(y \mid x)}{f(x)}, \quad y \in T_x\] Simplifying gives the result.
  3. Again we use the basic moment result in Section 3: \[g(y) = \sum_{x \in S} h(x, y) = \alpha \sum_{x \in S} F(x) + \alpha \sum_{x \in S} f(x) G( y \mid x) = \alpha \E[b(X)] + \alpha \E[G(y \mid X)], \quad y \in T\]
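As a final check, the Python sketch below verifies parts 1-3 numerically in the geometric / uniform example above, where \((X, Y)\) has constant rate \(\alpha / (\alpha + k)\) for \((U, \nea)\); here \(a_x \equiv 1\) on \(T_x\), \(b(x) = x\) for \((\N, \lt)\), and the values of \(p\), \(k\), and the truncation level are illustrative.

```python
# Numerical check of parts 1-3 in the geometric / uniform example, where (X, Y)
# has constant rate alpha_U = alpha / (alpha + k) for (U, nea). Here the walk
# function a_x is identically 1 on (T_x, =), and b(x) = x for (N, <).
# The values of p, k, and the truncation level N are illustrative.

p, k, N = 0.3, 4, 200
alpha = p / (1 - p)                         # rate of X for (N, <)
alpha_U = alpha / (alpha + k)               # rate of (X, Y) for (U, nea)

def f(x): return p * (1 - p) ** x           # density of X
def F(x): return (1 - p) ** (x + 1)         # F(x) = P(X > x)
G = 1 / k                                   # G(y | x) = P(Y = y | X = x)

# Part 1: f(x) = alpha_U F(x) mu_x(T_x) / (1 - alpha_U E[a_x(Y) | X = x])
for x in range(20):
    assert abs(f(x) - alpha_U * F(x) * k / (1 - alpha_U * 1)) < 1e-12

# Part 2: g(y | x) = alpha_U [F(x) / f(x) + G(y | x)], which should equal 1 / k;
# F(x) / f(x) = (1 - p) / p for every x, so checking x = 0 suffices.
assert abs(alpha_U * (F(0) / f(0) + G) - 1 / k) < 1e-12

# Part 3: g(y) = alpha_U (E[b(X)] + E[G(y | X)]), which should also equal 1 / k
Eb = sum(x * f(x) for x in range(N))        # E[b(X)] = E[X], truncated at N
assert abs(alpha_U * (Eb + G) - 1 / k) < 1e-9
print("constant rate consequences verified")
```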