Harris chain
In the mathematical study of stochastic processes, a Harris chain is a Markov chain satisfying an additional property.
Definition
A Markov chain {Xn} on state space Ω with stochastic kernel K is a Harris chain if there exist A, B ⊆ Ω, ε > 0, and a probability measure ρ with ρ(B) = 1 such that
- (1) If τA := inf{n ≥ 0 : Xn ∈ A}, then P(τA < ∞ | X0 = x) > 0 for all x ∈ Ω.
- (2) If x ∈ A and C ⊆ B, then K(x, C) ≥ ερ(C).
In essence, this technical definition can be rephrased as follows: given two points x1 and x2 in A, there is at least an ε chance that they can be moved together to the same point at the next time step.
Another way to say it is this: suppose that x and y are in A. At the next time step, first flip a Bernoulli coin with parameter ε. If it comes up one, move both points to a common point chosen using ρ. If it comes up zero, let the points move independently, with x moving according to P(Xn+1 ∈ C | Xn = x) = (K(x, C) − ερ(C))/(1 − ε) and y moving according to P(Yn+1 ∈ C | Yn = y) = (K(y, C) − ερ(C))/(1 − ε).
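A minimal sketch of this splitting step in Python (an illustration, not part of the article): it assumes hypothetical samplers sample_rho(), which draws a point from ρ, and sample_residual(x), which draws from the normalized residual kernel (K(x, ·) − ερ(·))/(1 − ε). Two copies of the chain started in A either coalesce with probability ε or move independently.

    import random

    def coupled_step(x, y, eps, sample_rho, sample_residual):
        # One step of the splitting construction for two copies started in A.
        # With probability eps both chains jump to the same point drawn from rho,
        # so they coalesce; otherwise each moves independently according to the
        # residual kernel (K(., C) - eps*rho(C)) / (1 - eps).  Either way the
        # marginal law of each move is exactly K.
        if random.random() < eps:
            z = sample_rho()          # common jump chosen using rho
            return z, z
        return sample_residual(x), sample_residual(y)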
Example 1: Countable state space
Given a countable set S and a pair (A′, B′) satisfying (1) and (2) in the above definition, we can without loss of generality take B′ to be a single point b: since S is countable and ρ(B′) = 1, some b ∈ B′ has ρ({b}) > 0, and (2) still holds with ρ replaced by the point mass at b and ε replaced by ερ({b}). Upon setting A = {b}, pick c such that K(b, c) > 0 and set B = {c}. Then (1) and (2) hold with A and B as singletons.
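To make this concrete, here is a minimal Python sketch (an illustration, not from the article) for a small finite chain: with A = {b}, B = {c} and ρ the point mass at c, condition (2) amounts to K(b, c) ≥ ε, and condition (1) to b being reachable with positive probability from every state. The 3-state matrix and the choice of b and c are assumptions made for this example.

    import numpy as np

    # A small 3-state chain (states 0, 1, 2); the matrix is an assumption for illustration.
    K = np.array([[0.5, 0.5, 0.0],
                  [0.2, 0.3, 0.5],
                  [0.0, 0.6, 0.4]])

    b, c = 1, 2          # singletons A = {b}, B = {c}, rho = point mass at c
    eps = K[b, c]        # condition (2): K(b, {c}) >= eps * rho({c})

    # Condition (1): b must be reachable with positive probability from every state.
    # For a finite chain this is a reachability question on the graph of positive entries.
    n = K.shape[0]
    adj = (K > 0).astype(int)
    reach = np.linalg.matrix_power(np.eye(n, dtype=int) + adj, n)  # entry (x, b) > 0
    # iff b can be reached from x in at most n steps
    assert all(reach[x, b] > 0 for x in range(n)), "condition (1) fails"
    print("Harris pair: A = {%d}, B = {%d}, eps = %.2f" % (b, c, eps))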
Example 2: Chains with continuous densities
Let {Xn}, Xn ∈ Rd, be a Markov chain with a kernel that is absolutely continuous with respect to Lebesgue measure:
- K(x, dy) = K(x, y) dy
such that K(x, y) is a continuous function.
Pick (x0, y0) such that K(x0, y0) > 0, and let A and B be open sets containing x0 and y0 respectively that are sufficiently small so that K(x, y) ≥ ε > 0 on A × B. Letting ρ(C) = |B ∩ C|/|B|, where |B| is the Lebesgue measure of B, we have that (2) in the above definition holds. If (1) holds, then {Xn} is a Harris chain.
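The following Python sketch (an illustration under assumed parameters, not from the article) carries out this construction numerically for a one-dimensional Gaussian random walk, whose kernel density K(x, y) is continuous and strictly positive: it takes small boxes A and B around (x0, y0) = (0, 0), approximates the minimum m of K(x, y) on A × B on a grid, and sets ε = m·|B|, so that K(x, C) ≥ ερ(C) for C ⊆ B with ρ the uniform measure on B.

    import math

    def kernel_density(x, y):
        # Transition density of a Gaussian random walk X_{n+1} = X_n + N(0, 1);
        # K(x, y) is continuous and strictly positive, as required in the example.
        return math.exp(-0.5 * (y - x) ** 2) / math.sqrt(2.0 * math.pi)

    x0, y0 = 0.0, 0.0                  # any point with K(x0, y0) > 0 will do
    delta = 0.1                        # assumed half-width of the open boxes A and B
    A = (x0 - delta, x0 + delta)
    B = (y0 - delta, y0 + delta)

    # Grid approximation of the lower bound m of K(x, y) on A x B (continuity keeps
    # this honest for small boxes).  With rho = uniform measure on B, condition (2)
    # reads K(x, C) >= m*|C| = (m*|B|) * rho(C), so take eps = m*|B|.
    xs = [A[0] + (A[1] - A[0]) * i / 50 for i in range(51)]
    ys = [B[0] + (B[1] - B[0]) * j / 50 for j in range(51)]
    m = min(kernel_density(x, y) for x in xs for y in ys)
    eps = m * (B[1] - B[0])
    print("A =", A, " B =", B, " eps =", round(eps, 4))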
Reducibility and periodicity
In the following, R := inf{n ≥ 1 : Xn ∈ A}; i.e. R is the first time after time 0 that the process enters the region A.
Definition: If for all L(X0), P(R < ∞ | X0 ∈ A) = 1, then the Harris chain is called recurrent.
Definition: A recurrent Harris chain Xn is aperiodic if ∃N such that ∀n ≥ N, ∀L(X0), P(Xn ∈ A | X0 ∈ A) > 0.
Theorem: Let Xn be an aperiodic recurrent Harris chain with stationary distribution π. If P(R < ∞ | X0 = x) = 1, then as n → ∞, distTV(L(Xn | X0 = x), π) → 0.
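As an informal numerical illustration of the theorem (a sketch with an assumed 3-state transition matrix, not from the article), the Python fragment below computes the stationary distribution π as the left eigenvector of K for eigenvalue 1 and prints the total variation distance between L(Xn | X0 = 0) and π, which decays toward 0 as n grows.

    import numpy as np

    # An assumed 3-state transition matrix, used only to illustrate the theorem;
    # a finite chain in which some state is reachable from everywhere is a Harris
    # chain with singleton A and B, as in Example 1.
    K = np.array([[0.5, 0.5, 0.0],
                  [0.2, 0.3, 0.5],
                  [0.1, 0.5, 0.4]])

    # Stationary distribution pi: left eigenvector of K for eigenvalue 1.
    w, v = np.linalg.eig(K.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    pi = pi / pi.sum()

    dist = np.array([1.0, 0.0, 0.0])        # L(X_n | X_0 = 0), starting at n = 0
    for n in range(1, 21):
        dist = dist @ K                     # one step of the chain
        tv = 0.5 * np.abs(dist - pi).sum()  # total variation distance to pi
        if n % 5 == 0:
            print(f"n = {n:2d}  dist_TV = {tv:.2e}")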