C.2 Convergence of stochastic sequences
We now consider stochastic sequences. While various definitions of stochastic sequences have been given in Appendix B, how to determine the convergence of a given stochastic sequence has not yet been discussed. We next present an important class of stochastic sequences called martingales. If a sequence can be classified as a martingale (or one of its variants), then the convergence of the sequence immediately follows.
Convergence of martingale sequences
Definition: A stochastic sequence $\{X_k\}_{k=1}^{\infty}$ is called a martingale if $\mathbb{E}[|X_k|] < \infty$ and
$$\mathbb{E}[X_{k+1} \mid X_1, \dots, X_k] = X_k \qquad \text{(C.3)}$$
almost surely for all $k$.
Here, $\mathbb{E}[X_{k+1} \mid X_1, \dots, X_k]$ is a random variable rather than a deterministic value. The term "almost surely" in the second condition is needed because such a conditional expectation is itself a random variable. In addition, $\mathbb{E}[X_{k+1} \mid X_1, \dots, X_k]$ is often written as $\mathbb{E}[X_{k+1} \mid \mathcal{F}_k]$ for short, where $\mathcal{F}_k = \{X_1, \dots, X_k\}$ represents the "history" of the sequence. $\mathcal{F}_k$ has a specific name: it is called a filtration. More information can be found in [96, Chapter 14] and [104].
Example: A classic example of a martingale is the random walk, a stochastic process describing the position of a point that moves randomly. Specifically, let $X_k$ denote the position of the point at time step $k$. Starting from $X_k$, the expectation of the next position $X_{k+1}$ equals $X_k$ if the mean of the one-step displacement is zero. In this case, we have $\mathbb{E}[X_{k+1} \mid X_1, \dots, X_k] = X_k$ and hence $\{X_k\}$ is a martingale.
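The random walk example is easy to check numerically. The following Python sketch is not part of the original text; the $\pm 1$ step distribution, the number of simulated paths, and the random seed are illustrative choices. It simulates many independent walks starting from $X_1 = 0$ and shows that the sample mean of $X_k$ stays near the starting position, consistent with the martingale property.

```python
import numpy as np

# Minimal sketch (illustrative, not from the text): a symmetric random walk
# with i.i.d. +1/-1 steps of probability 1/2 each, so the one-step
# displacement has zero mean and {X_k} is a martingale.
rng = np.random.default_rng(0)
num_paths, num_steps = 100_000, 50

steps = rng.choice([-1.0, 1.0], size=(num_paths, num_steps))
X = np.concatenate([np.zeros((num_paths, 1)), np.cumsum(steps, axis=1)], axis=1)

# The sample mean of X_k stays near the starting position 0 for every k,
# consistent with E[X_k] = E[X_1] for a martingale.
print(np.round(X.mean(axis=0)[::10], 3))
```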
A basic property of martingales is that
$$\mathbb{E}[X_{k+1}] = \mathbb{E}[X_k]$$
for all $k$ and hence
$$\mathbb{E}[X_k] = \mathbb{E}[X_1] \quad \text{for all } k.$$
This result can be obtained by calculating the expectation on both sides of (C.3) based on property (b) in Lemma B.2.
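To make the step explicit, suppose property (b) in Lemma B.2 is the law of total expectation $\mathbb{E}\big[\mathbb{E}[X \mid Y]\big] = \mathbb{E}[X]$ (our reading of the cited lemma). Taking the total expectation of both sides of (C.3) then gives
$$\mathbb{E}[X_{k+1}] = \mathbb{E}\big[\mathbb{E}[X_{k+1} \mid X_1, \dots, X_k]\big] = \mathbb{E}[X_k],$$
and applying this relation repeatedly yields $\mathbb{E}[X_k] = \mathbb{E}[X_1]$.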
While the expectation of a martingale is constant, we next extend martingales to submartingales and supermartingales, whose expectations vary monotonically.
Definition: A stochastic sequence $\{X_k\}_{k=1}^{\infty}$ is called a submartingale if it satisfies $\mathbb{E}[|X_k|] < \infty$ and
$$\mathbb{E}[X_{k+1} \mid X_1, \dots, X_k] \ge X_k \qquad \text{(C.4)}$$
for all $k$.
Taking the expectation on both sides of (C.4) yields $\mathbb{E}\big[\mathbb{E}[X_{k+1} \mid X_1, \dots, X_k]\big] \ge \mathbb{E}[X_k]$. In particular, the left-hand side equals $\mathbb{E}[X_{k+1}]$ due to property (b) in Lemma B.2, so $\mathbb{E}[X_{k+1}] \ge \mathbb{E}[X_k]$. By induction, we have
$$\mathbb{E}[X_{k+1}] \ge \mathbb{E}[X_k] \ge \dots \ge \mathbb{E}[X_1].$$
Therefore, the expectation of a submartingale is nondecreasing.
It may be worth mentioning that, for two random variables $X$ and $Y$, $X \ge Y$ means $X(\omega) \ge Y(\omega)$ for every outcome $\omega$. It does not mean that the maximum of $Y$ is less than the minimum of $X$.
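The nondecreasing expectation of a submartingale can also be observed numerically. The following Python sketch is illustrative and not part of the original text; the step distribution, sample sizes, and seed are arbitrary choices. It simulates a random walk whose steps have positive mean, which makes the walk a submartingale, and checks that the sample mean of $X_k$ grows with $k$.

```python
import numpy as np

# Illustrative sketch (not from the text): a random walk whose one-step
# displacement is +1 with probability 0.55 and -1 with probability 0.45,
# so E[X_{k+1} | X_1, ..., X_k] = X_k + 0.1 >= X_k, i.e., a submartingale.
rng = np.random.default_rng(1)
num_paths, num_steps = 100_000, 30

steps = rng.choice([1.0, -1.0], size=(num_paths, num_steps), p=[0.55, 0.45])
X = np.cumsum(steps, axis=1)

# The sample mean grows by roughly 0.1 per step: E[X_k] is nondecreasing.
mean_X = X.mean(axis=0)
print(np.round(mean_X[::5], 2))
```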
Definition: A stochastic sequence $\{X_k\}_{k=1}^{\infty}$ is called a supermartingale if it satisfies $\mathbb{E}[|X_k|] < \infty$ and
$$\mathbb{E}[X_{k+1} \mid X_1, \dots, X_k] \le X_k \qquad \text{(C.5)}$$
for all $k$.
Taking the expectation on both sides of (C.5) gives $\mathbb{E}[X_{k+1}] \le \mathbb{E}[X_k]$. By induction, we have
$$\mathbb{E}[X_{k+1}] \le \mathbb{E}[X_k] \le \dots \le \mathbb{E}[X_1].$$
Therefore, the expectation of a supermartingale is nonincreasing.
The names "submartingale" and "supermartingale" are standard, but it may not be easy for beginners to distinguish them. Some tricks can be employed to do so. For example, since "supermartingale" has a letter "p" that points down, its expectation decreases; since "submartingale" has a letter "b" that points up, its expectation increases [104].
A supermartingale or submartingale is comparable to a deterministic monotonic sequence. While the convergence result for monotonic sequences has been given in Theorem C.1, we provide a similar convergence result for martingales as follows.
Theorem C.3 (Martingale convergence theorem). If $\{X_k\}_{k=1}^{\infty}$ is a submartingale (or supermartingale) satisfying $\sup_k \mathbb{E}[|X_k|] < \infty$, then there is a finite random variable $X$ such that $X_k \to X$ almost surely.
The proof is omitted. A comprehensive introduction to martingales can be found in [96, Chapter 14] and [104].
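Theorem C.3 can be illustrated numerically. The following Python sketch is not part of the original text; the multiplicative construction, the threshold, and the sample sizes are illustrative choices. It simulates a nonnegative martingale with $\sup_k \mathbb{E}[|X_k|] = 1$ and shows that individual paths settle to a limit (which happens to be 0 here).

```python
import numpy as np

# Illustrative sketch (not from the text): a nonnegative martingale
# X_{k+1} = X_k * U_k, X_0 = 1, with U_k i.i.d. uniform on [0, 2].
# Then E[X_{k+1} | X_1, ..., X_k] = X_k and sup_k E[|X_k|] = 1, so
# Theorem C.3 guarantees almost sure convergence (the limit is 0 here).
rng = np.random.default_rng(2)
num_paths, num_steps = 50_000, 200

U = rng.uniform(0.0, 2.0, size=(num_paths, num_steps))
X = np.cumprod(U, axis=1)  # X[:, k-1] is the value after k multiplicative steps

# The fraction of paths that have already settled below a small threshold
# grows toward 1 as k increases, illustrating pathwise (almost sure) convergence.
for k in (10, 50, 100, 200):
    print(f"k = {k:3d}: fraction of paths with X_k < 1e-3 = "
          f"{np.mean(X[:, k - 1] < 1e-3):.2f}")
```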
Convergence of quasimartingale sequences
We next introduce quasimartingales, which generalize martingales, submartingales, and supermartingales in that their expectations need not be monotonic. They are comparable to nonmonotonic deterministic sequences. The rigorous definition and convergence analysis of quasimartingales are nontrivial; we merely list some useful results.
The event $\Lambda_k$ is defined as $\Lambda_k = \{\mathbb{E}[X_{k+1} \mid \mathcal{F}_k] - X_k > 0\}$, where $\mathcal{F}_k = \{X_1, \dots, X_k\}$. Intuitively, $\Lambda_k$ indicates that $X_{k+1}$ is greater than $X_k$ in expectation. Let $\mathbb{1}_A$ be an indicator function:
$$\mathbb{1}_A = \begin{cases} 1, & \text{if } A \text{ holds}, \\ 0, & \text{otherwise}. \end{cases}$$
The indicator function has the property that
$$\mathbb{1}_A + \mathbb{1}_{\bar{A}} = 1$$
for any event $A$, where $\bar{A}$ denotes the complementary event of $A$. As a result, it holds for any random variable $X$ that
$$X = X\mathbb{1}_A + X\mathbb{1}_{\bar{A}}.$$
Although quasimartingales do not have monotonic expectations, their convergence is still ensured under some mild conditions as shown below.
Theorem C.4 (Quasimartingale convergence theorem). For a nonnegative stochastic sequence $\{X_k\}_{k=1}^{\infty}$, if
$$\sum_{k=1}^{\infty} \mathbb{E}\Big[\big(\mathbb{E}[X_{k+1} \mid \mathcal{F}_k] - X_k\big)\mathbb{1}_{\Lambda_k}\Big] < \infty,$$
then $\sum_{k=1}^{\infty} \big|\mathbb{E}[X_{k+1} - X_k \mid \mathcal{F}_k]\big| < \infty$ almost surely, and there is a finite random variable $X$ such that $X_k \to X$ almost surely as $k \to \infty$.
Theorem C.4 can be viewed as a stochastic counterpart of Theorem C.2, which is for nonmonotonic deterministic sequences. The proof of this theorem can be found in [105, Proposition 9.5]. Note that $\{X_k\}$ here is required to be nonnegative. As a result, the boundedness of $\sum_{k=1}^{\infty} \mathbb{E}\big[(\mathbb{E}[X_{k+1} \mid \mathcal{F}_k] - X_k)\mathbb{1}_{\Lambda_k}\big]$ implies the boundedness of $\mathbb{E}[|X_k|]$.
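As a sanity check of Theorem C.4, the following Python sketch (not part of the original text; the specific sequence, noise distribution, constants, and seed are illustrative choices) simulates a nonnegative sequence whose conditional mean oscillates around the current value, numerically estimates the partial sums of the expected upward increments, and verifies that individual paths settle down.

```python
import numpy as np

# Illustrative sketch (not from the text). Consider the nonnegative sequence
# X_{k+1} = X_k * exp((-1)^k / k^2 + Z_k / k^2), X_1 = 1, Z_k ~ N(0, 1).
# Its conditional expectation oscillates around X_k (neither a sub- nor a
# supermartingale), but the expected upward increments are summable, so the
# summability condition of Theorem C.4 holds and every path converges.
rng = np.random.default_rng(3)
num_paths, num_steps = 5_000, 1_000

k = np.arange(1, num_steps + 1)
drift = (-1.0) ** k / k**2                       # sign-alternating deterministic part
noise = rng.standard_normal((num_paths, num_steps)) / k**2
X = np.exp(np.cumsum(drift + noise, axis=1))     # columns hold X_2, X_3, ...; X_1 = 1

# Sample estimate of the Theorem C.4 condition: the partial sums of
# E[X_k] * (exp(drift_k + 1/(2 k^4)) - 1)^+ level off (remain bounded).
mean_X = np.concatenate([[1.0], X.mean(axis=0)[:-1]])   # E[X_1], E[X_2], ...
upward = mean_X * np.maximum(np.exp(drift + 0.5 / k**4) - 1.0, 0.0)
print("partial sums of expected upward increments:",
      np.round(np.cumsum(upward)[[9, 99, 999]], 3))

# Pathwise (almost sure) convergence: every path barely moves late in the run.
late = X[:, 500:]
rel_fluct = (late.max(axis=1) - late.min(axis=1)) / late[:, -1]
print(f"largest relative fluctuation over the last 500 steps: {rel_fluct.max():.1e}")
```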
Summary and comparison
We finally summarize and compare the results for deterministic and stochastic sequences.
Deterministic sequences:
Monotonic sequences: As shown in Theorem C.1, if a sequence is monotonic and bounded, then it converges.
Nonmonotonic sequences: As shown in Theorem C.2, given a nonnegative sequence, even if it is nonmonotonic, it can still converge as long as its variation is damped in the sense that $\sum_{k=1}^{\infty} \max(x_{k+1} - x_k,\, 0) < \infty$.
Stochastic sequences:
Supermartingale/submartingale sequences: The expectation of a supermartingale or submartingale is monotonic. As shown in Theorem C.3, if such a sequence also satisfies $\sup_k \mathbb{E}[|X_k|] < \infty$, then it converges almost surely.
Quasimartingale sequences: As shown in Theorem C.4, even if a nonnegative sequence's expectation is nonmonotonic, it can still converge almost surely as long as its variation is damped in the sense that $\sum_{k=1}^{\infty} \mathbb{E}\big[(\mathbb{E}[X_{k+1} \mid \mathcal{F}_k] - X_k)\mathbb{1}_{\Lambda_k}\big] < \infty$.
The above properties are summarized in Table C.1.
Table C.1: Summary of the monotonicity of different variants of martingales.