Department of Electrical Engineering and Computer Sciences
EECS 126: Probability and Random Processes
Discussion 4
1. Reversible Distributions
Let (Xn)n∈N be a Markov chain with state space S.
(a) Show that for every m,k ∈ N, with m ≥ 1, we have
P(Xk = i0 | Xk+1 = i1,…,Xk+m = im) = P(Xk = i0 | Xk+1 = i1),
for all states i0,i1,…,im ∈ S. This is the backwards Markov property.
(b) In general, is the reversed chain (i.e. the chain Yk := X−k for k ∈ −N) a temporally homogeneous Markov chain? If not, provide a counterexample.
(c) Show that if, in addition, the chain is reversible and started from a stationary distribution X0 ∼ π, then
(X0,…,Xn) and (Xn,…,X0) are equal in distribution.
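As a numerical sanity check (not part of the problem), one can verify on a small example that detailed balance implies the path-reversal symmetry of part (c). The 3-state birth-death chain and its stationary distribution below are illustrative choices, not from the worksheet:

```python
import numpy as np
from itertools import product

# Illustrative reversible 3-state birth-death chain (assumed example).
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = np.array([0.25, 0.50, 0.25])  # its stationary distribution

# Detailed balance: pi[i] P[i,j] == pi[j] P[j,i] for all i, j.
flows = pi[:, None] * P
assert np.allclose(flows, flows.T)

# Path symmetry: P(X0=i, X1=j, X2=k) == P(X0=k, X1=j, X2=i) when X0 ~ pi.
for i, j, k in product(range(3), repeat=3):
    forward = pi[i] * P[i, j] * P[j, k]
    backward = pi[k] * P[k, j] * P[j, i]
    assert np.isclose(forward, backward)
```

The same triple-loop check extends to paths of any fixed length n by induction, which mirrors the structure of the proof.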
2. Markov Chain Practice
Consider a Markov chain with three states 0, 1, and 2. The transition probabilities are P(0,1) = P(0,2) = 1/2, P(1,0) = P(1,1) = 1/2, and P(2,0) = 2/3, P(2,2) = 1/3.
(b) In the long run, what fraction of time does the chain spend in state 1?
(c) Suppose that X0 is chosen according to the steady state distribution. What is P(X0 = 0 | X2 = 2)?
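The answers to (b) and (c) can be checked numerically. A minimal sketch, assuming numpy is available; it finds the stationary distribution by solving πP = π with the normalization constraint, then applies Bayes' rule for the conditional probability:

```python
import numpy as np

# Transition matrix from Problem 2 (states 0, 1, 2).
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.5, 0.0],
              [2/3, 0.0, 1/3]])

# Stationary distribution: solve pi P = pi together with sum(pi) = 1
# as an (overdetermined) least-squares system.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Part (c) via Bayes' rule: P(X0=0 | X2=2) = pi[0] * P2[0,2] / P(X2=2),
# where P2 = P @ P and P(X2=2) = pi[2] because X0 ~ pi keeps the chain stationary.
P2 = P @ P
cond = pi[0] * P2[0, 2] / pi[2]
print(pi, cond)
```

By the ergodic theorem, the long-run fraction of time in state 1 equals the stationary probability π(1) computed above.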
3. More Almost Sure Convergence
(a) Suppose that, with probability 1, the sequence (Xn)n∈N oscillates between two values a ≠ b infinitely often. Is this enough to prove that (Xn)n∈N does not converge almost surely? Justify your answer.
(b) Suppose that Y is uniform on [−1,1], and Xn has distribution
.
Does (Xn)n∈N converge a.s.?
(c) Define random variables (Xn)n∈N in the following way: first, set each Xn to 0. Then, for each k ∈ N, pick j uniformly at random in {2^k, …, 2^(k+1) − 1} and set Xj = 2^k. Does the sequence (Xn)n∈N converge a.s.?
(d) Does the sequence (Xn)n∈N from the previous part converge in probability to some X? If so, is it true that E[Xn] → E[X] as n → ∞?
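To make the construction in (c) concrete, here is a sketch that generates one realization of the sequence; the number of dyadic blocks K and the seeded generator are illustrative choices, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
K = 12  # number of dyadic blocks to generate (illustrative choice)

N = 2 ** (K + 1)
X = np.zeros(N)
for k in range(K + 1):
    # One uniformly random index j in {2^k, ..., 2^(k+1) - 1} gets value 2^k;
    # every other index in the block keeps the value 0.
    j = rng.integers(2 ** k, 2 ** (k + 1))
    X[j] = 2 ** k
```

Note that within the k-th block exactly one of the 2^k entries is nonzero, so P(Xn ≠ 0) = 2^(−k) for n in that block, while the nonzero value 2^k grows; comparing these two facts is the heart of parts (c) and (d).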