Department of Electrical Engineering and Computer Sciences
EECS 126: Probability and Random Processes
Problem Set 12
1. Flipping Coins and Hypothesizing
You flip a coin until you see heads, and let Y be the number of flips. Let

X = 1, if the bias of the coin is q > p,
X = 0, if the bias of the coin is p.

Find a decision rule X̂(Y) that maximizes P[X̂ = 1 | X = 1] subject to P[X̂ = 1 | X = 0] ≤ β for β ∈ [0, 1]. Remember to calculate the randomization constant γ.
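Since Y is geometric, the likelihood ratio P[Y = y | X = 1] / P[Y = y | X = 0] = (q/p)((1 − q)/(1 − p))^(y−1) is decreasing in y, so the Neyman-Pearson rule declares 1 for small Y and randomizes at the threshold. A minimal Python sketch of the threshold computation (the helper np_test_geometric is illustrative, not part of the problem set):

    def np_test_geometric(p, q, beta):
        """Threshold k and randomization constant gamma for the NP test:
        declare X-hat = 1 if Y < k, and if Y == k declare 1 with prob. gamma.
        (Illustrative helper; assumes q > p, so the likelihood ratio is
        decreasing in Y and small Y favors X = 1.)"""
        assert 0 < p < q < 1 and 0 <= beta < 1
        cdf = 0.0  # P[Y <= y - 1 | X = 0]
        y = 0
        while True:
            y += 1
            pmf = p * (1 - p) ** (y - 1)  # P[Y = y | X = 0], Y ~ Geometric(p)
            if cdf + pmf > beta:
                gamma = (beta - cdf) / pmf  # makes the false alarm exactly beta
                return y, gamma
            cdf += pmf

For example, np_test_geometric(0.3, 0.6, 0.05) returns k = 1 and γ = 1/6: never declare 1 for Y > 1, and declare 1 with probability 1/6 when Y = 1, giving a false alarm probability of exactly 0.3 · (1/6) = 0.05.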
2. Gaussian Hypothesis Testing
Consider a hypothesis testing problem in which, if X = 0, you observe a sample of N(µ0, σ²), and if X = 1, you observe a sample of N(µ1, σ²), where µ0, µ1 ∈ R and σ² > 0. Find the Neyman-Pearson test with probability of false alarm at most α ∈ (0, 1), that is, P(X̂ = 1 | X = 0) ≤ α.
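For intuition: when µ1 > µ0, the likelihood ratio is increasing in the observation, so the NP test simply thresholds the raw sample at µ0 + σQ⁻¹(α), where Q is the standard normal tail. A minimal sketch using SciPy's inverse normal CDF (np_test_gaussian is a hypothetical helper, and it assumes µ1 > µ0):

    from scipy.stats import norm

    def np_test_gaussian(y, mu0, mu1, sigma, alpha):
        """Declare X-hat = 1 iff y exceeds mu0 + sigma * Q^{-1}(alpha).
        (Sketch assuming mu1 > mu0; flip the comparison if mu1 < mu0.)"""
        threshold = mu0 + sigma * norm.ppf(1 - alpha)  # P(N(mu0, s^2) > t) = alpha
        return int(y > threshold)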
3. BSC Hypothesis Testing
Consider a BSC with some error probability ε ∈ (0, 0.5). Given n inputs and outputs (xi, yi) of the BSC, solve a hypothesis testing problem to detect that ε > 0.1 with a probability of false alarm at most equal to 0.05. Assume that n is very large and use the CLT.
Hint: The null hypothesis is ε = 0.1. The alternate hypothesis is ε > 0.1, which is a composite hypothesis (this means that under the alternate hypothesis, the probability distribution of the observation is not completely determined; compare this to a simple hypothesis such as ε = 0.3, which does completely determine the probability distribution of the observation). The Neyman-Pearson Lemma we learned in class applies for the case of a simple null hypothesis and a simple alternate hypothesis, so it does not directly apply here.
To fix this, fix some specific ε1 > 0.1 and use the Neyman-Pearson Lemma to find the optimal hypothesis test for the hypotheses ε = 0.1 vs. ε = ε1. Then, argue that the optimal decision rule does not depend on the specific choice of ε1; thus, the decision rule you derive will be simultaneously optimal for testing ε = 0.1 vs. ε = ε1 for all ε1 > 0.1.
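As a sanity check on whatever rule you derive: the natural statistic is the flip count S = #{i : xi ≠ yi}, which is Binomial(n, ε), and under the null hypothesis the CLT gives S ≈ N(0.1n, 0.09n). A minimal Python sketch of the resulting detector (detect_bsc is an illustrative helper, not from the course):

    import numpy as np
    from scipy.stats import norm

    def detect_bsc(x, y, eps0=0.1, alpha=0.05):
        """Declare eps > eps0 when the flip count S = #{i : x_i != y_i} exceeds
        the CLT threshold; under the null, S ~ Binomial(n, eps0), which is
        approximately N(n*eps0, n*eps0*(1 - eps0)) for large n."""
        x, y = np.asarray(x), np.asarray(y)
        n = len(x)
        s = int(np.sum(x != y))  # number of bits flipped by the channel
        t = n * eps0 + np.sqrt(n * eps0 * (1 - eps0)) * norm.ppf(1 - alpha)
        return int(s > t)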
4. Basic Properties of Jointly Gaussian Random Variables
Let (X1,…,Xn) be a collection of jointly Gaussian random variables. Their joint density is given by (for x ∈ R^n)

f(x) = (2π)^(−n/2) det(C)^(−1/2) exp(−(1/2)(x − µ)ᵀ C⁻¹ (x − µ)),

where µ is the mean vector and C is the covariance matrix.
(a) Show that X1,…,Xn are independent if and only if they are pairwise uncorrelated.
(b) Show that any linear combination of these random variables will also be a Gaussian random variable.
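Part (b) can be sanity-checked numerically: for any fixed a ∈ R^n, aᵀX should be Gaussian with mean aᵀµ and variance aᵀCa. A short simulation sketch (the particular µ, C, and a are made-up example values):

    import numpy as np

    # Numerical sanity check for part (b): compare the sample moments of
    # a^T X against the predicted mean a^T mu and variance a^T C a.
    rng = np.random.default_rng(0)
    mu = np.array([1.0, -2.0, 0.5])
    C = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])
    a = np.array([1.0, 2.0, -1.0])

    z = rng.multivariate_normal(mu, C, size=100_000) @ a
    print(z.mean(), a @ mu)    # sample mean vs. a^T mu
    print(z.var(), a @ C @ a)  # sample variance vs. a^T C a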
5. Independent Gaussians
Let X = (X, Y) be a jointly Gaussian random vector with mean vector [0, 0]ᵀ and covariance matrix Σ.

Find a 2 × 2 matrix U such that UX = (X′, Y′), where X′ and Y′ are independent.
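One standard construction, sketched below for a generic positive definite covariance Σ (the 2 × 2 matrix used here is a made-up example): diagonalize Σ and rotate into its eigenbasis.

    import numpy as np

    # Diagonalize Sigma = Q Lambda Q^T and take U = Q^T, so that
    # Cov(UX) = Q^T Sigma Q = Lambda is diagonal.
    # (Sigma below is a made-up example; use the matrix from the problem.)
    Sigma = np.array([[2.0, 1.0],
                      [1.0, 2.0]])
    _, Q = np.linalg.eigh(Sigma)  # columns of Q are eigenvectors of Sigma
    U = Q.T
    print(U @ Sigma @ U.T)  # diagonal up to floating-point error

Since Cov(UX) is diagonal, X′ and Y′ are uncorrelated, and by problem 4(a) uncorrelated jointly Gaussian random variables are independent.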
