\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\ms}{\mathscr}\) \(\newcommand{\Z}{\mathbb{Z}}\) \(\newcommand{\rta}{\rightarrow}\) \(\newcommand{\bs}{\boldsymbol}\)

8. Semigroup Quotients

Algebra

Groups

Before we consider the general semigroup setting of this text, it might help to review the simple quotient space structure in a group. Suppose that \((S, \cdot)\) is a group with identity \(e\) and that \((T, \cdot)\) is a subgroup.

The right coset of \(T\) by \(x \in S\) is \(T x\). The collection of right cosets partitions \(S\).

Details:

Suppose that \(x, \, y \in S\) and that \(z \in T x \cap T y\). Then \(z = s x = t y\) for some \(s, \, t \in T\) and hence \(x = s^{-1} t y\). Next if \(u \in T x\) then \(u = a x\) for some \(a \in T\). But then \(u = a s^{-1} t y\) and since \(a s^{-1} t \in T\) it follows that \(u \in T y\). Hence \(T x \subseteq T y\). By a symmetric argument, \(T y \subseteq T x\). So either \(T x = T y\) or \(T x \cap T y = \emptyset\). Finally, \(x = e x \in T x\) for \(x \in S\) and hence \(\bigcup_{x \in S} T x = S\).

The right coset partition above corresponds to the equivalence relation \(\equiv\) on \(S\) defined by \(x \equiv y\) if and only if \(x y^{-1} \in T\).

Details:

First we show that \(\equiv\) is an equivalence relation. Note that \(x x^{-1} = e \in T\) so \(x \equiv x\) for \(x \in S\). If \(x \equiv y\) for \(x, \, y \in S\) then \(x y^{-1} \in T\). But then \((x y^{-1})^{-1} = y x^{-1} \in T\) and hence \(y \equiv x\). If \(x \equiv y\) and \(y \equiv z\) for \(x, \, y, \, z \in S\) then \(x y^{-1} \in T\) and \(y z^{-1} \in T\). But then \((x y^{-1})(y z^{-1}) = x z^{-1} \in T\) and hence \(x \equiv z\). Next, the equivalence class generated by \(x \in S\) is \[\{y \in S: y \equiv x\} = \{y \in S: y x^{-1} \in T\} = \{y \in S: y \in T x\} = T x\]
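
As a concrete illustration, take the commutative group \((\Z, +)\) with subgroup \(T = 3 \Z\). The right cosets (written additively) are \[3 \Z + 0, \quad 3 \Z + 1, \quad 3 \Z + 2\] and \(x \equiv y\) if and only if \(x - y \in 3 \Z\), that is, if and only if \(x\) and \(y\) are congruent modulo 3.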

Here is the result that we want to extend to the semigroup setting:

Suppose now that \(S / T \subseteq S\) is a set that contains exactly one element from each of the distinct right cosets of \(T\). Then every \(x \in S\) can be decomposed uniquely as \(x = y z\) where \(y \in T\) and \(z \in S / T\).

Details:

The assumptions mean that \(T z \cap T w = \emptyset\) for distinct \(z, \, w \in S / T\) and that \(\bigcup_{z \in S / T} T z = S\). So if \(x \in S\) then \(x \in T z\) for a unique \(z \in S / T\) and hence \(x = y z\) for some \(y \in T\). The element \(y \in T\) is also unique since \(y = x z^{-1}\).
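
Continuing the illustration with \((\Z, +)\) and \(T = 3 \Z\), we may take \(S / T = \{0, 1, 2\}\), one representative from each coset. The unique decomposition of \(x \in \Z\) is \(x = y + z\) where \(z \in \{0, 1, 2\}\) is the remainder of \(x\) modulo 3 and \(y = x - z \in 3 \Z\); for instance \[7 = 6 + 1, \qquad -4 = -6 + 2\]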

There is one last result that is worth mentioning in the group setting.

Suppose that \(T\) is a normal subgroup so that \(T x = x T\) for every \(x \in S\). Then the collection of cosets forms a group, the quotient group defined by the operation \[(T x) \cdot (T y) = T x y, \quad x, \, y \in S\] The identity is \(T\) and for \(x \in S\), the inverse of \(T x\) is \(T x^{-1}\).

Details:

We need to show that the operation is well defined and does not depend on the particular element that generates a coset. That is, we need to show that if \(x \equiv u\) and \(y \equiv v\) for \(x, \, y, \, u, \, v \in S\) then \(T x y = T u v\). Since \(u x^{-1} \in T\) and \(y^{-1} v \in T\) we have \(T u x^{-1} = T\) and \(y^{-1} v T = T\). Therefore \[T x y = T u x^{-1} x y = T u y = u y T = u y y^{-1} v T = u v T = T u v\] The rest is easy. The identity coset is \(T = T e\) since \(T T x = T^2 x = T x\) for \(x \in S\). The inverse of \(T x\) is \(T x^{-1}\) since \((T x) (T x^{-1}) = T x x^{-1} = T e = T\).

Of course this result applies to a commutative group \((S, +)\) since every subgroup \((T, +)\) is normal: \(T + x = x + T\) for \(x \in S\).
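
In the running illustration, \(3 \Z\) is normal in \((\Z, +)\), and the quotient group is \(\Z / 3 \Z = \{3 \Z, \, 3 \Z + 1, \, 3 \Z + 2\}\) with operation \[(3 \Z + x) + (3 \Z + y) = 3 \Z + (x + y)\] so that, for instance, \((3 \Z + 2) + (3 \Z + 2) = 3 \Z + 4 = 3 \Z + 1\).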

Semigroups

Suppose that \((S, \cdot)\) is a positive semigroup with identity element \(e\) and that \((T, \cdot)\) is a positive sub-semigroup of \((S, \cdot)\). As usual, we assume only the left cancellation law, so the two semigroups are not necessarily embedded in a group. Let \(\preceq\) denote the relation on \(S\) associated with \((S, \cdot)\), and \(\preceq_T\) the relation on \(S\) associated with \((S, \cdot)\) and the set \(T\). So \(x \preceq y\) if and only if \(y \in x S\), and \(x \preceq_T y\) if and only if \(y \in x T\) for \((x, y) \in S^2\). Recall that \(\preceq\) and \(\preceq_T\) are partial orders on \(S\), and \(\preceq_T\) is a sub-order of \(\preceq\). That is, \(x \preceq_T y\) implies \(x \preceq y\) for \((x, y) \in S^2\). Recall also that \(S_+ = S - \{e\}\) and similarly \(T_+ = T - \{e\}\). Our goal in this section is to define a set \(S / T\) such that each \(x \in S\) can be factored uniquely as \(x = y z\) where \(y \in T\) and \(z \in S / T\).

Define the quotient of \((S, \cdot)\) by \((T, \cdot)\) to be the set \[S / T = \bigcap_{t \in T_+} (t S)^c\]

Details:

As motivation for the definition, suppose that \(U \subseteq S\) is a set such that each \(x \in S\) can be factored uniquely as \(x = y z\) where \(y \in T\) and \(z \in U\). If \(z \in U \cap t S\) for some \(t \in T_+\) then \(z = t w\) for some \(w \in S\). But then \(w = s u\) for some \(s \in T\) and \(u \in U\). So we have \(z = e z = (t s) u\), with \(e, \, t s \in T\) and \(z, \, u \in U\). Note that \(t s \neq e\), so \(z\) has two distinct factorings, contradicting uniqueness. It follows that any such set \(U\) must be contained in \(S / T\).
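
To preview the first example in the Examples subsection, let \(S = [0, \infty)\) with ordinary addition and let \(T = \N\). Then \(T_+ = \{1, 2, \ldots\}\) and \(n + S = [n, \infty)\) for \(n \in T_+\), so \[S / T = \bigcap_{n = 1}^\infty [n, \infty)^c = \bigcap_{n = 1}^\infty [0, n) = [0, 1)\]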

In general, \(S / T\) is not closed under the semigroup operation \(\cdot\) and so is not a sub-semigroup.

\(T \cap (S / T) = \{e\}\).

Details:

First, \(e \in T \cap (S / T)\) since \(e \in T\) and \(e \notin t S\) for every \(t \in T_+\). Conversely, if \(t \in T_+\) then \(t \in t S\), so \(t \notin S / T\).

The following result essentially restates the definition in terms of the order relation.

\(z \in S / T\) if and only if \(t \in T\) and \(t \preceq z\) imply \(t = e\).

Details:

Note that \(t \preceq z\) means that \(z \in t S\). So \(z \in S / T\) if and only if \(z \notin t S\) for every \(t \in T_+\), that is, if and only if the only \(t \in T\) with \(t \preceq z\) is \(t = e\).

For \(x \in S\) recall the interval notation \([e, x] = \{y \in S: y \preceq x\}\).

Suppose that \(x \in S\). Then \(x = y z\) for some \(y \in T\) and \(z \in S / T\) if and only if \(y\) is a maximal element of \([e, x] \cap T\) with respect to \(\preceq_T\) (and then \(z = y^{-1} x\)).

Details:

Suppose that \(x = y z\) for some \(y \in T\) and \(z \in S / T\). Then \(y \preceq x\) by definition, so \(y \in [e, x] \cap T\). Suppose that \(t \in [e, x] \cap T\) and \(y \preceq_T t\). There exists \(a \in S\) and \(b \in T\) such that \(x = t a\) and \(t = y b\). Hence \(x = y b a\). By the left cancellation rule, \(z = b a\), so \(b \preceq z\). But \(z \in S / T\) so \(b = e\) and hence \(t = y\). Therefore \(y\) is a maximal element of \([e, x] \cap T\) with respect to \(\preceq_T\). Conversely, suppose that \(y\) is a maximal element of \([e, x] \cap T\) with respect to \(\preceq_T\). Then \(y \preceq x\) so \(x = y z\) for some \(z \in S\). Suppose that \(t \in T\) and \(t \preceq z\). Then \(z = t b\) for some \(b \in S\) so \(x = y t b\). Hence \(y t \preceq x\) and \(y t \in T\). Since \(y\) is maximal, \(y t = y\) and so \(t = e\). Therefore \(z \in S / T\).

Since we want a unique factoring, we impose the following assumption:

For each \(x \in S\), there exists a unique maximal element \(y \in [e, x] \cap T\) with respect to \(\preceq_T\).

Thus each \(x \in S\) can be factored uniquely as \(x = y z\) where \(y \in T\) and \(z \in S / T\). Equivalently, the function \((y, z) \mapsto y z\) maps \(T \times (S / T)\) one-to-one onto \(S\). Put yet another way, the collection of right semigroup cosets \(\{T z: z \in S / T\}\) partitions \(S\).
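
For a quick illustration, consider again \(S = [0, \infty)\) with addition and \(T = \N\), and let \(x = 3.7\). Then \([0, x] \cap T = \{0, 1, 2, 3\}\), whose maximal element with respect to \(\preceq_T\) is \(y = 3\), and the unique factoring is \[3.7 = 3 + 0.7, \quad 3 \in \N, \, 0.7 \in S / T = [0, 1)\]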

Measure

Suppose now that \((S, \ms S)\) is a measure space and that the positive semigroup \((S, \cdot)\) and the positive sub-semigroup \((T, \cdot)\) are measurable. Let \(\ms T\) denote the induced \(\sigma\)-algebra on \(T\). Suppose that \(\lambda\) is a \(\sigma\)-finite measure on \((S, \ms S)\) that is left invariant for \((S, \cdot)\), and that \(\mu\) is a \(\sigma\)-finite measure on \((T, \ms T)\) that is left invariant for \((T, \cdot)\), both unique up to multiplication by positive constants. As noted above, the function \((y, z) \mapsto y z\) maps \(T \times (S / T)\) one-to-one onto \(S\). We want this equivalence to extend to the measure-theoretic structure as well, so we make the following additional assumption:

The quotient space \(S / T\) belongs to \(\ms S\), and the induced \(\sigma\)-algebra on \(S / T\) is denoted by \(\ms S / \ms T\). There exists a \(\sigma\)-finite measure \(\nu\) on \((S / T, \ms S / \ms T)\) such that the mapping \((y, z) \mapsto y z\) is an isomorphism from the product measure space \((T \times (S / T), \ms T \times (\ms S / \ms T), \mu \times \nu)\) to the measure space \((S, \ms S, \lambda)\).
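
Concretely, since the map \((y, z) \mapsto y z\) is one-to-one, the assumption implies that \[\lambda(A B) = \mu(A) \nu(B), \quad A \in \ms T, \, B \in \ms S / \ms T\] where \(A B = \{y z: y \in A, \, z \in B\}\). This is the form in which the assumption is used below.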

If the sub-semigroup is discrete, the measure-theoretic conditions automatically hold.

Suppose that \((T, \cdot)\) is discrete and that the unique factoring assumption holds. Then the measure-theoretic assumption also holds.

Details:

First, \(S / T = \bigcap_{t \in T_+} (t S)^c \in \ms S\) since \(t S \in \ms S\) and \(T_+\) is countable. Let \(\varphi(y, z) = y z\) for \((y, z) \in T \times (S / T)\) so that by assumption, \(\varphi\) maps \(T \times (S / T)\) one-to-one onto \(S\). If \(A \in \ms T \times (\ms S / \ms T)\) then \(A = \bigcup_{y \in T} \{y\} \times A_y\) where \(A_y = \{z \in S / T: (y, z) \in A\}\) is the cross section of \(A\) at \(y \in T\). Hence \[\varphi(A) = \varphi\left(\bigcup_{y \in T} \{y\} \times A_y\right) = \bigcup_{y \in T} \varphi\left(\{y\} \times A_y\right) = \bigcup_{y \in T} y A_y \in \ms S\] again since \(y A_y \in \ms S\) for \(y \in T\) and \(T\) is countable. Similarly if \(A \in \ms S\) then \[\varphi^{-1}(A) = \bigcup_{y \in T} \{y\} \times [(y^{-1} A) \cap (S / T)] \in \ms T \times (\ms S / \ms T)\] Finally, suppose that \(\lambda\) is a \(\sigma\)-finite measure on \((S, \ms S)\) that is left invariant for \((S, \cdot)\). Of course, counting measure \(\#\) is left invariant for \((T, \cdot)\) and is essentially unique. Let \(\nu\) be \(\lambda\) restricted to \(\ms S / \ms T\). If \(A \in \ms T\) and \(B \in \ms S / \ms T\) then \[\lambda(A B) = \sum_{y \in A} \lambda(y B) = \sum_{y \in A} \lambda(B) = \#(A) \nu(B)\] so the measure-theoretic assumption holds with \(\mu = \#\) and with \(\nu\) the restriction of \(\lambda\) to \(\ms S / \ms T\).

Examples

The following examples should help clarify the assumptions.

Let \((S, \cdot)\) be a positive semigroup. For \(t \in S_+\) let \(\langle t \rangle = \{t^n: n \in \N\}\), the positive sub-semigroup generated by \(t\). So \((\langle t \rangle, \cdot)\) is isomorphic to the standard, discrete positive semigroup \((\N, +)\), under the mapping \(n \mapsto t^n\). The quotient space is \[S / \langle t \rangle = \bigcap_{n=1}^\infty (t^n S)^c = (t S)^c = \{z \in S: t \not \preceq z\}\] Suppose that \(\{n \in \N: t^n \preceq x\}\) is finite for each \(x \in S\). The set is clearly nonempty since \(t^0 = e \preceq x\). Hence the set has a maximum element \(n_t(x) = \max\{n \in \N: t^n \preceq x\}\). So the basic assumptions are satisfied and each \(x \in S\) has a unique factoring as \[x = t^n z, \quad n \in \N, \, z \in (t S)^c\] Suppose that \(\lambda\) is a left-invariant measure for \((S, \cdot)\). Of course, counting measure \(\#\) is left invariant for \((\langle t \rangle, \cdot)\). The measure-theoretic assumptions hold with \(\nu\) being the restriction of \(\lambda\) to \(S / \langle t \rangle\). That is, \[\lambda(A B) = \#(A) \lambda(B), \quad A \subseteq \langle t \rangle, \, B \subseteq (t S)^c \text{ with } B \in \ms S\]
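
As a concrete instance, let \(S = [1, \infty)\) under ordinary multiplication, a positive semigroup isomorphic to \(([0, \infty), +)\) via the logarithm, and let \(t = 2\). Then \(\langle 2 \rangle = \{2^n: n \in \N\}\), \(2 S = [2, \infty)\), so \(S / \langle 2 \rangle = [1, 2)\) and \(n_2(x) = \lfloor \log_2 x \rfloor\). The unique factoring of \(x \in [1, \infty)\) is \[x = 2^n z, \quad n = \lfloor \log_2 x \rfloor, \, z \in [1, 2)\] for instance \(12 = 2^3 \cdot 1.5\).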

Consider the standard continuous positive semigroup \(([0, \infty)^k, +)\) where \(k \in \N_+\), with the usual Borel measure structure. The associated order \(\le\) is the ordinary (product) order. The space \((T, +) = (\N^k, +)\) is a discrete, positive sub-semigroup of \((S, +) = ([0, \infty)^k, +)\) with quotient space \(S / T = [0, 1)^k\). So each \(\bs{x} \in [0, \infty)^k\) can be decomposed uniquely as \[\bs x = \bs n + \bs t, \quad \bs n \in \N^k, \, \bs t \in [0, 1)^k\] where \(\bs n\) is the vector of integer parts of \(\bs x\) and \(\bs t\) is the vector of remainders. The left-invariant measure on \([0, \infty)^k\) is \(k\)-dimensional Lebesgue measure \(\lambda\), and the left-invariant measure on \(T\) is counting measure \(\#\). The reference measure \(\nu\) on \(S / T\) is also \(k\)-dimensional Lebesgue measure. Moreover, the partial order graph \((S, \le)\) is the lexicographic product of \((T, \lt)\) with \((S / T, \le)\), as discussed in Section 1.8. Of course, \(([0, \infty)^k, +)\) is a sub-semigroup of the group \((\R^k, +)\) and similarly \((\N^k, +)\) is a sub-semigroup of the group \((\Z^k, +)\).
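
As a numerical check of the product formula in this example with \(k = 1\), take \(A = \{0, 1, 2\} \subseteq \N\) and \(B = [0, \tfrac12) \subseteq [0, 1)\). Then \[A + B = [0, \tfrac12) \cup [1, \tfrac32) \cup [2, \tfrac52), \quad \text{so} \quad \lambda(A + B) = \tfrac32 = \#(A) \, \lambda(B)\]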

Consider the direct product \((S, \cdot)\) of positive semigroups \((S_1, \cdot)\) and \((S_2, \cdot)\), with identity elements \(e_1\) and \(e_2\), and with left-invariant measures \(\lambda_1\) and \(\lambda_2\), respectively, as discussed in Section 7. Let \[T_1 =\{(x_1, e_2): x_1 \in S_1\}\] Then \((T_1, \cdot)\) is a positive sub-semigroup of \((S, \cdot)\) with quotient space \[T_2 := S/T_1 = \{(e_1, x_2): x_2 \in S_2\}\] In this case, \((T_2, \cdot)\) is also a positive sub-semigroup and the spaces \((T_1, \cdot)\) and \((T_2, \cdot)\) are symmetric; the first is isomorphic to \((S_1, \cdot)\) and the second is isomorphic to \((S_2, \cdot)\). The unique factoring is simply \((x_1, x_2) = (x_1, e_2)(e_1, x_2)\). The measures \(\mu\) and \(\nu\) in the measure-theoretic assumption above are given by \begin{align*} \mu(A) &= \lambda_1(A_1), \quad A \in \ms T_1 \\ \nu(B) &= \lambda_2(B_2), \quad B \in \ms T_2 \end{align*} where \(A_1 = \{x_1 \in S_1: (x_1, e_2) \in A\}\) and \(B_2 = \{x_2 \in S_2: (e_1, x_2) \in B\}\).

Probability

We return to the general setting of a positive semigroup \((S, \cdot)\) on an underlying measurable space \((S, \ms S)\). As usual, let \(\preceq\) denote the partial order associated with \((S, \cdot)\). Suppose next that \((T, \cdot)\) is a sub-semigroup of \((S, \cdot)\) and that \(X\) is a random variable in \(S\) with \(\P(X \in T) > 0\). Recall from Section 6 that if \(X\) has an exponential distribution on \((S, \cdot)\) then the conditional distribution of \(X\) given \(X \in T\) is exponential on \((T, \cdot)\), and the reliability function of \(X\) given \(X \in T\) is the restriction to \(T\) of the reliability function of \(X\): \[\P(X \succeq_T x \mid X \in T) = \P(X \succeq x), \quad x \in T\] We will generalize and extend this basic result. Suppose that \(S / T\) is the quotient space of \((T, \cdot)\) relative to \((S, \cdot)\) as discussed above, and in particular with the unique factoring and measure-theoretic assumptions in place. Then the random variable \(X\) can be decomposed uniquely as \(X = Y_T Z_T\) where \(Y_T \in T\) and \(Z_T \in S / T\). Our goal is to study the random variables \(Y_T\) and \(Z_T\). When \(T = \langle t \rangle\) for \(t \in S_+\), as in the first example above, we simplify the notation to \(Y_t\) and \(Z_t\). In this case, note that \(Y_t = t^{N_t}\) where \(N_t \in \N\). The following proposition is our first main result.

Suppose that \(X = Y_T Z_T\) has an exponential distribution on \((S, \cdot)\). Then

  1. \(Y_T\) has an exponential distribution on \((T, \cdot)\).
  2. The reliability function of \(Y_T\) for \((T, \cdot)\) is the restriction to \(T\) of the reliability function of \(X\) for \((S, \cdot)\).
  3. \(Y_T\) and \(Z_T\) are independent.
Details:

Let \(y \in T\), \(A \in \ms T\), and \(B \in \ms S / \ms T\). Then by the uniqueness of the factorization and since \(X\) has an exponential distribution, we have the basic equation \begin{align*} \P(Y_T \in y A, Z_T \in B) &= \P (X \in y A B) = \P(X \succeq y) \P (X \in AB) \\ &= \P(X \succeq y) \P(Y_T \in A, Z_T \in B) \end{align*} Substituting \(A = T\) and \(B = S/T\) gives \[\P(Y_T \succeq_T y) = \P(X \succeq y), \quad y \in T\] so the reliability function of \(Y_T\) is the restriction to \(T\) of the reliability function of \(X\). Returning to the basic equation with general \(A \in \ms T\) and \(B = S / T\) we have \[\P(Y_T \in y A) = \P(Y_T \succeq_T y) \P(Y_T \in A), \quad y \in T, \, A \in \ms T\] so \(Y_T\) has an exponential distribution on \((T, \cdot)\). Finally, returning to the basic equation with \(A = T\) and general \(B \in \ms S / \ms T\) we have \[\P(Y_T \succeq_T y, Z_T \in B) = \P(Y_T \succeq_T y) \P(Z_T \in B), \quad y \in T, \, B \in \ms S / \ms T\] so the events \(\{Y_T \succeq_T y\}\) and \(\{Z_T \in B\}\) are independent. To get full independence, recall that \(X\) has constant rate \(\alpha\) with respect to \(\lambda\), so \(X\) has density \(f\) with respect to \(\lambda\) given by \(f(x) = \alpha \P(X \succeq x)\) for \(x \in S\). Similarly, \(Y_T\) has constant rate with respect to \(\mu\) on \(T\). Thus the function \(g\) on \(T \times (S / T)\) given by \(g(y, z) = f(y z)\) is a density function of \((Y_T, Z_T)\) with respect to \(\mu \times \nu\). By the memoryless property, \[g(y,z) = f(y z) = \alpha \P (X \succeq y z) = \alpha \P(X \succeq y) \P(X \succeq z), \quad y \in T, \, z \in S/T\] and so by the standard factorization theorem, \(Y_T\) and \(Z_T\) are independent.
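
For the most familiar special case, suppose that \(X\) has the exponential distribution on \(([0, \infty), +)\) with rate \(r \in (0, \infty)\), so that \(\P(X \succeq x) = e^{-r x}\) for \(x \in [0, \infty)\), and let \(T = \N\) as in the example above. Then \(Y_T = \lfloor X \rfloor\) and \(Z_T = X - \lfloor X \rfloor\), and the theorem reduces to a classical fact: the integer part and the fractional part of an exponential variable are independent, with \[\P(Y_T = n) = e^{-r n}(1 - e^{-r}), \quad n \in \N; \qquad Z_T \text{ has density } z \mapsto \frac{r e^{-r z}}{1 - e^{-r}}, \quad z \in [0, 1)\] In particular, \(\P(Y_T \succeq_T n) = e^{-r n} = \P(X \succeq n)\) for \(n \in \N\), in agreement with part 2 of the theorem.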

Consider the direct product of positive semigroups \((S_1, \cdot)\) and \((S_2, \cdot)\) with the sub-semigroup and quotient space described in the example above. In this case, the theorem gives another proof of the characterization of exponential distributions in Section 7: \((X_1, X_2)\) is exponential on \((S_1 \times S_2, \cdot)\) if and only if \(X_1\) is exponential on \((S_1, \cdot)\), \(X_2\) is exponential on \((S_2, \cdot)\), and \(X_1, \, X_2\) are independent.

Suppose again that \(X\) is a random variable in \(S\), with the factoring \(X = Y_T Z_T\) where \(Y_T \in T\) and \(Z_T \in S / T\).

  1. If \(\P(X \in T) > 0\) then the conditional distribution of \(X\) given \(X \in T\) is the same as the distribution of \(Y_T\) if and only if \(Y_T\) and \(\{Z_T = e\}\) are independent.
  2. If \(\P(X \in S / T) > 0\) then the conditional distribution of \(X\) given \(X \in S / T\) is the same as the distribution of \(Z_T\) if and only if \(Z_T\) and \(\{Y_T = e\}\) are independent.
Details:

Suppose that \(A \in \ms T\). Then \(\{X \in A\} = \{Y_T \in A, Z_T = e\}\) and in particular, \(\{X \in T\} = \{Z_T = e\}\). Thus, \[\P(X \in A \mid X \in T) = \P(Y_T \in A \mid Z_T = e)\] This agrees with \(\P(Y_T \in A)\) for every \(A \in \ms T\) if and only if \(Y_T\) and the event \(\{Z_T = e\}\) are independent, which gives the first result. The proof of the second result is analogous.

Suppose that \(X\) has an exponential distribution on \((S, \cdot)\), with the factoring \(X = Y_T Z_T\) where \(Y_T \in T\) and \(Z_T \in S / T\).

  1. If \(\P(X \in T) \gt 0\) then the conditional distribution of \(X\) given \(X \in T\) is the same as the distribution of \(Y_T\), and this distribution is exponential on \(T\).
  2. If \(\P(X \in S / T) \gt 0\) then the conditional distribution of \(X\) given \(X \in S / T\) is the same as the distribution of \(Z_T\).

For the following corollary, recall the sub-semigroup \((\langle x \rangle, \cdot)\) for \(x \in S_+\) where \(\langle x \rangle = \{x^n: n \in \N\}\), as in the first example above. We assume that \(\{n \in \N: x^n \preceq y\}\) is finite for each \(x \in S_+\) and \(y \in S\), and hence has a maximum element \(n_x(y)\). In this case, the quotient space is \(S / \langle x \rangle = (x S)^c\) and the factoring for a random variable \(X\) is \(X = x^{N_x} Z_x\) where \(N_x \in \N\) and \(Z_x \in S - x S\).

Suppose that \(X\) has an exponential distribution on \((S, \cdot)\). Then

  1. \(N_x\) has the geometric distribution on \(\N\) with success parameter \(p_x = 1 - \P(X \succeq x)\) for \(x \in S_+\).
  2. \((1 - p_x)(1 - p_y) = 1 - p_{x y}\) for \(x, \, y \in S_+\).
  3. \(N_x\) and \(Z_x\) are independent for \(x \in S_+\).
Details:

Most of the proof follows directly from the theorem above.

  1. Let \(x \in S_+\). Then \(Y_x\) has an exponential distribution on \((\langle x \rangle, \cdot)\) and therefore \(N_x\) has a geometric distribution on \(\N\), since \((\langle x \rangle, \cdot)\) and \((\N, +)\) are isomorphic. Next, \(N_x \ge 1\) if and only if \(Y_x \succeq x\) if and only if \(X \succeq x\). Thus, the rate (or success) parameter of the geometric distribution is \[p_x = 1 - \P(N_x \ge 1) = 1 - \P(X \succeq x)\] so that \(\P(N_x = n) = p_x (1 - p_x)^n\) for \(n \in \N\).
  2. By the memoryless property, \[1 - p_{x y} = \P(X \succeq x y) = \P(X \succeq x) \P(X \succeq y) = (1 - p_x)(1 - p_y), \quad x, \, y \in S_+\]
  3. \(Y_x\) and \(Z_x\) are independent, and hence \(N_x\) and \(Z_x\) are independent for \(x \in S_+\).
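
The corollary can be checked by simulation in the most familiar case: the exponential distribution on \(([0, \infty), +)\), with \(t = 1\) so that \(\langle 1 \rangle = \N\), \(N_1 = \lfloor X \rfloor\), and \(Z_1 = X - \lfloor X \rfloor\). The short Python sketch below (the rate, sample size, and seed are arbitrary choices, not part of the text) estimates the success parameter \(p_1 = 1 - \P(X \succeq 1)\) and compares a joint probability with the product of marginals as a rough check of independence. It is an illustration only, not part of the formal development.

```python
import numpy as np

rng = np.random.default_rng(17)
rate = 0.8            # assumed rate parameter of the exponential distribution (arbitrary choice)
n_samples = 100_000   # assumed sample size (arbitrary choice)

# Sample X from the exponential distribution on ([0, infinity), +) with the given rate
x = rng.exponential(scale=1 / rate, size=n_samples)

# Unique factoring X = N_1 * 1 + Z_1 with N_1 in <1> = N and Z_1 in the quotient space [0, 1)
n = np.floor(x)       # N_1, the number of copies of t = 1
z = x - n             # Z_1, the quotient component in [0, 1)

# Part 1: N_1 should be geometric on N with success parameter p_1 = 1 - P(X >= 1) = 1 - exp(-rate)
p_theory = 1 - np.exp(-rate)
p_estimate = np.mean(n == 0)   # for a geometric distribution on N, P(N_1 = 0) = p_1
print(f"success parameter: theory {p_theory:.4f}, simulation {p_estimate:.4f}")

# Part 3: N_1 and Z_1 should be independent; compare a joint probability with the product of marginals
joint = np.mean((n >= 2) & (z < 0.5))
product = np.mean(n >= 2) * np.mean(z < 0.5)
print(f"P(N_1 >= 2, Z_1 < 1/2): {joint:.4f}  vs  P(N_1 >= 2) P(Z_1 < 1/2): {product:.4f}")
```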

As noted in the details above, and as will be shown in Chapter 4, the geometric distribution on \(\N\) with success parameter \(p_x\) is the exponential distribution on \((\N, +)\) with rate parameter \(p_x\). The following theorem is our second main result, and gives a partial converse to the corollary above.

Suppose \(X\) is a random variable in \(S\), and that \(N_x\) and \(Z_x\) are independent and \(N_x\) has a geometric distribution on \(\N\) for each \(x \in S_+\). Then \(X\) has an exponential distribution on \((S, \cdot)\).

Details:

Let \(p_x\) denote the parameter of the geometric distribution of \(N_x\) for \(x \in S_+\), so that \[\P(N_x = n) = p_x (1 - p_x)^n, \quad n \in \N\] Let \(A \in \ms S\). Then \[A = \bigcup_{n = 0}^\infty [A \cap \{y \in S: n_x(y) = n\}] = \bigcup_{n = 0}^\infty x^n B_n\] where \(B_n = \{z \in S / \langle x \rangle: x^n z \in A\} = \{x^{-n} y: n_x(y) = n, \, y \in A\}\). The collection \(\{x^n B_n: n \in \N\}\) is disjoint. Similarly, \(x A = \bigcup_{n = 0}^\infty x^{n + 1} B_n\) and the collection \(\{x^{n + 1} B_n: n \in \N\}\) is disjoint. From the hypotheses, \begin{align*} \P(X \in x A) &= \sum_{n = 0}^\infty \P(X \in x^{n + 1} B_n) = \sum_{n = 0}^\infty \P(N_x = n + 1, Z_x \in B_n)\\ &= \sum_{n = 0}^\infty p_x (1 - p_x)^{n+1} \P(Z_x \in B_n). \end{align*} But also \(1 - p_x = \P(N_x \ge 1) = \P(Y_x \succeq x) = \P(X \succeq x)\) so \begin{align*} \P(X \succeq x) \P(X \in A) &= (1 - p_x) \sum_{n = 0}^\infty \P(X \in x^n B_n) \\ &= (1 - p_x) \sum_{n = 0}^\infty \P(N_x = n, Z_x \in B_n) \\ &= (1 - p_x) \sum_{n = 0}^\infty p_x (1 - p_x)^n \P(Z_x \in B_n). \end{align*} It follows that \(\P(X \in x A) = \P(X \succeq x) \P(X \in A)\) and hence \(X\) has an exponential distribution on \((S, \cdot)\).

As a consequence of the previous theorem and the corollary above, the hypotheses of the theorem imply that \(1 - p_{x y} = (1 - p_x)(1 - p_y)\) for \(x, \, y \in S_+\). Here is a related result with different hypotheses and a slightly weaker conclusion.

Suppose again that \(X\) is a random variable in \(S\) and that \(N_x\) has a geometric distribution on \(\N\) with success parameter \(p_x\) for \(x \in S_+\). Suppose also that \[1 - p_{x y} = (1 - p_x)(1 - p_y), \quad x, \, y \in S_+\] Then \(X\) has a memoryless distribution for \((S, \cdot)\).

Details:

Once again, \(\P(X \succeq t) = \P(Y_t \succeq t) = \P(N_t \ge 1) = 1 - p_t\) for \(t \in S_+\). Hence for \(x, \, y \in S_+\), \[\P(X \succeq x y) = 1 - p_{x y} = (1 - p_x)(1 - p_y) = \P(X \succeq x) \P(X \succeq y)\] so \(X\) has a memoryless distribution for \((S, \cdot)\).