Department of Electrical Engineering and Computer Sciences
EECS 126: Probability and Random Processes
Problem Set 14 (Optional)
1. Balls in Bins Estimation
We throw n ≥ 1 balls into m ≥ 2 bins, each ball landing in a uniformly random bin independently of the others. Let X and Y denote the numbers of balls that land in bins 1 and 2, respectively.
(a) Calculate E[Y | X].
(b) What are L[Y | X] and Q[Y | X] (where Q[Y | X] is the best quadratic estimator of Y given X)?
Hint: Your justification should be no more than two or three sentences, no calculations necessary! Think carefully about the meaning of the MMSE.
(c) Unfortunately, your friend is not convinced by your answer to the previous part. Compute E[X] and E[Y ].
(d) Compute var(X).
(e) Compute cov(X,Y ).
(f) Compute L[Y | X] using the formula. Ensure that your answer is the same as your answer to part (b).
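A quick Monte Carlo sketch for checking parts (c)–(e), under the standard assumption that each ball lands in a uniformly random bin independently; the values n = 100 and m = 5 are arbitrary choices for illustration:

```python
import numpy as np

# Balls-in-bins simulation: n balls thrown uniformly at random into m bins.
# X = count in bin 1, Y = count in bin 2. (n and m are arbitrary choices.)
n, m, trials = 100, 5, 200_000
rng = np.random.default_rng(0)

# Each trial's bin counts follow a Multinomial(n, (1/m, ..., 1/m)) law.
counts = rng.multinomial(n, [1 / m] * m, size=trials)
X, Y = counts[:, 0], counts[:, 1]

# Compare these estimates against your answers to (c)-(e).
print("E[X]      ~", X.mean())
print("var(X)    ~", X.var())
print("cov(X, Y) ~", np.cov(X, Y)[0, 1])
```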
2. MMSE and Conditional Expectation
Let X, Y1,…,Yn be square integrable random variables. The MMSE of X given (Y1,…,Yn) is defined as the function φ(Y1,…,Yn) that minimizes the mean square error

E[(X − φ(Y1,…,Yn))^2].
(a) For this part, assume n = 1. Show that the MMSE is precisely the conditional expectation E[X | Y]. Hint: expand the difference as (X − E[X|Y] + E[X|Y] − φ(Y)).
(b) Argue that

E[(X − E[X | Y1,…,Yn])^2] ≤ (1/n) Σ_{i=1}^n E[(X − E[X | Yi])^2].

That is, the MMSE given all of (Y1,…,Yn) does at least as well as the average of the individual estimates given each Yi.
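The inequality in part (b) can be sanity-checked numerically. A minimal sketch under one assumed model (Y1, Y2 i.i.d. standard normal and X = Y1 + Y2, so n = 2; the model is a made-up illustration, not part of the problem):

```python
import numpy as np

# Hypothetical model for illustration: Y1, Y2 i.i.d. N(0, 1), X = Y1 + Y2.
# Here E[X | Y1, Y2] = Y1 + Y2, while E[X | Yi] = Yi (the other term has mean 0).
rng = np.random.default_rng(1)
samples = 100_000
Y1, Y2 = rng.standard_normal(samples), rng.standard_normal(samples)
X = Y1 + Y2

mse_joint = np.mean((X - (Y1 + Y2)) ** 2)   # MSE of E[X | Y1, Y2]
mse_avg = 0.5 * (np.mean((X - Y1) ** 2) + np.mean((X - Y2) ** 2))

print("joint MSE:", mse_joint, "  average individual MSE:", mse_avg)
```

In this toy model the joint MMSE is exact (zero error), while each single-variable estimate still carries the variance of the unobserved term, so the inequality holds strictly.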
3. Geometric MMSE
Let N be a geometric random variable with parameter 1 − p, and let (Xi)i≥1 be i.i.d. exponential random variables with parameter λ, independent of N. Let T = X1 + ··· + XN. Compute the LLSE and MMSE of N given T.
Hint: Compute the MMSE first.
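To check an answer numerically, one can simulate (N, T) pairs and fit the empirical LLSE of N given T. The parameter choices p = 0.5 and λ = 1 below are arbitrary; rng.geometric(1 - p) draws N on {1, 2, ...} with success probability 1 − p, matching the problem's convention:

```python
import numpy as np

# Arbitrary illustrative parameters.
p, lam = 0.5, 1.0
rng = np.random.default_rng(2)
samples = 200_000

# N ~ Geometric(1 - p) on {1, 2, ...}; given N, T is a sum of N Exp(lam)
# variables, i.e. Gamma(shape=N, scale=1/lam).
N = rng.geometric(1 - p, size=samples)
T = rng.gamma(shape=N, scale=1 / lam)

# Empirical LLSE coefficients: L[N | T] = a*T + b.
a = np.cov(N, T)[0, 1] / T.var()
b = N.mean() - a * T.mean()
print(f"empirical LLSE: L[N | T] ~= {a:.3f} * T + {b:.3f}")
```

Comparing the fitted coefficients against your closed-form answer is a good consistency check.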
4. Gaussian Random Vector MMSE
Let (X, Y) be a Gaussian random vector. Let

W = 1 if Y > 0, 0 if Y = 0, and −1 if Y < 0

be the sign of Y. Find E[WX | Y]. Is the LLSE the same as the MMSE?
5. Gaussian Sine
Let X, Y, Z be jointly Gaussian random variables with mean vector [0, 2, 0] and covariance matrix

[4 1 0]
[1 4 1]
[0 1 4].

Compute E[(sin X) Y (sin Z)]. Hint: Condition on (X, Z).
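A Monte Carlo check for this expectation, assuming the mean vector [0, 2, 0] and the covariance matrix [[4, 1, 0], [1, 4, 1], [0, 1, 4]] as read from the problem statement:

```python
import numpy as np

# Moments as assumed from the problem statement.
mean = np.array([0.0, 2.0, 0.0])
cov = np.array([[4.0, 1.0, 0.0],
                [1.0, 4.0, 1.0],
                [0.0, 1.0, 4.0]])

rng = np.random.default_rng(3)
# Draw joint samples of (X, Y, Z) and average the target function.
X, Y, Z = rng.multivariate_normal(mean, cov, size=500_000).T
estimate = np.mean(np.sin(X) * Y * np.sin(Z))
print("Monte Carlo estimate of E[(sin X) Y (sin Z)]:", estimate)
```

The estimate should agree with your conditioning-on-(X, Z) calculation up to sampling noise on the order of a few thousandths.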
6. Error of the Kalman Filter for a Linear Stochastic System
The linear stochastic system, with state Xk = (X1,k, X2,k), starts from an arbitrary (known) initial condition, and the system noise variables (wk, k ≥ 0) are i.i.d. normal with mean 0 and variance 1.
The state variables are not directly observable. However, we can observe
Yk = X1,k + X2,k, k ≥ 0.
Let X̂k|k denote the minimum mean square error estimator of Xk given (Y0,…,Yk).
Determine the asymptotic behavior of the covariance matrix of the estimation error.
Note: This problem needs thought. Since there is no observation noise, the assumption used in deriving the Kalman filter equations (that the covariance matrix of the observation noise is positive definite) no longer holds.
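One way to explore the asymptotics numerically is to iterate the Kalman error-covariance (Riccati) recursion directly with the observation-noise variance set to zero. The matrices A and B below are hypothetical stand-ins, not the ones from the original problem; only the observation map C = [1, 1] and the unit-variance scalar noise come from the statement:

```python
import numpy as np

# Hypothetical system for illustration: X_{k+1} = A X_k + B w_k, Y_k = C X_k.
# A and B are made-up stand-ins; C and the unit-variance scalar noise w_k
# follow the problem statement.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 1.0]])
Q = B @ B.T               # process-noise covariance (w_k has variance 1)

P = np.eye(2)             # error covariance of the filtered estimate
for _ in range(50):
    P_pred = A @ P @ A.T + Q         # prediction step
    S = C @ P_pred @ C.T             # innovation variance (no observation noise!)
    K = P_pred @ C.T / S             # Kalman gain, valid while S > 0
    P = P_pred - K @ C @ P_pred      # measurement update
print("steady-state error covariance:\n", P)
```

For this particular (A, B) the error covariance collapses to zero: since C B ≠ 0, each noise sample can be recovered exactly from the next noiseless observation. Whether something similar happens for the actual system in the problem is exactly the question to think about.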