READ: Sections 3.4-3.5 of Srinath, and Chapter 5 through p. 156 on estimation.

We will skip 3.6 until after we study estimation, and put off 3.9 until later.

We will skip Chapter 4 until after we have studied the Karhunen-Loeve expansion.

THIS WEEK: Binary and m-ary hypothesis testing with more complex decision regions.

- Under H_{0}, random variable r has a unit (normalized) Gaussian distribution: r~N(0,1).

Under H_{1}, r has pdf p_{r}(R)=0.5e^{-|R|}. Simplify the LRT as much as possible.

There are three possible ranges of the threshold eta, leading to 1, 3, or 5 decision regions.
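A quick numerical sketch of the three cases. Taking the log of the LRT gives ln L(R) = ln(sqrt(2*pi)/2) + R^2/2 - |R|, which is even in R with a minimum at |R| = 1; sweeping a grid of R and counting sign changes of the decision shows 1, 3, or 5 regions. The three threshold values below are illustrative choices, not part of the problem statement.

```python
import numpy as np

# Log-likelihood ratio for H1: p1(R) = 0.5*exp(-|R|) vs H0: p0(R) = N(0,1):
# ln L(R) = ln(sqrt(2*pi)/2) + R^2/2 - |R|
def log_lrt(R):
    return np.log(np.sqrt(2 * np.pi) / 2) + R**2 / 2 - np.abs(R)

def count_decision_regions(ln_eta, grid=np.linspace(-6, 6, 100001)):
    """Count contiguous intervals of the R-axis assigned to H1 or H0."""
    decide_h1 = log_lrt(grid) > ln_eta
    # Each change of decision along the axis starts a new region.
    switches = np.sum(decide_h1[1:] != decide_h1[:-1])
    return int(switches) + 1

# Illustrative thresholds, one from each range (assumed values):
for ln_eta in [-1.0, 0.0, 0.5]:
    print(ln_eta, count_decision_regions(ln_eta))
```

Because ln L has its minimum at |R| = 1 and a local maximum at R = 0, a threshold below the minimum gives one region, one between the minimum and ln L(0) gives five, and one above ln L(0) gives three.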

- Under H_{0}, random variable r has a unit (normalized) Gaussian distribution: r~N(0,1).

Under H_{1}, r is Gaussian: mean m and variance s^{2}. Simplify the LRT as much as possible.

Consider these special cases: (1) m=0; (2) m=infinity; (3) s=0; (4) s=1; (5) s=infinity.
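A small sketch of the simplification. The log-LRT here is ln L(R) = -ln s + R^2/2 - (R-m)^2/(2s^2); the code checks the special case s = 1, where the quadratic terms cancel and the test becomes linear in R (a single threshold on R). The values of R and m are assumed for illustration.

```python
import numpy as np

def log_lrt(R, m, s):
    """ln L(R) for H1: N(m, s^2) vs H0: N(0, 1)."""
    return -np.log(s) + R**2 / 2 - (R - m)**2 / (2 * s**2)

# Special case (4), s = 1: the R^2 terms cancel and
# ln L(R) = m*R - m^2/2, linear in R.
R = np.linspace(-3, 3, 7)
m = 1.5
assert np.allclose(log_lrt(R, m, 1.0), m * R - m**2 / 2)
```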

- Denote the likelihood ratio L(R) = p_{r|H_{1}}(R|H_{1}) / p_{r|H_{0}}(R|H_{0}).

Consider L(R) to be a function of random variable r. Prove the following:

(1) E[L^{n}|H_{1}]=E[L^{n+1}|H_{0}]; (2) E[L|H_{0}]=1; (3) E[L|H_{1}]-E[L|H_{0}]=Var[L|H_{0}].

The point of this problem is to get you used to the idea of L(R) as a random variable.
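A Monte Carlo illustration of properties (2) and (3) for one concrete pair of hypotheses (the Gaussian pair below is an assumed example, not part of the problem): with H0: r~N(0,1) and H1: r~N(1,1), the likelihood ratio is L(R) = exp(R - 1/2), and sample averages of L under each hypothesis track the stated identities.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 2_000_000

# Assumed example: H0: r ~ N(0,1), H1: r ~ N(1,1), so L(R) = exp(R - 1/2).
L = lambda R: np.exp(R - 0.5)

r0 = rng.standard_normal(N)          # samples of r under H0
r1 = rng.standard_normal(N) + 1.0    # samples of r under H1

E_L_H0 = L(r0).mean()     # property (2): E[L|H0] = 1
E_L_H1 = L(r1).mean()
Var_L_H0 = L(r0).var()

# property (3): E[L|H1] - E[L|H0] = Var[L|H0]  (both equal e - 1 here)
print(E_L_H0, E_L_H1 - E_L_H0, Var_L_H0)
```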

- We know random variable r is Gaussian with variance s^{2}. Its mean could be any of:

H_{1}: -2m; H_{2}: -m; H_{3}: 0; H_{4}: m; H_{5}: 2m, where m is known.

The criterion is MEP (min Pr[error]) and the 5 hypotheses are equally likely *a priori*.

(1) Draw the decision regions on the R-axis. (2) Compute the attained Pr[error].
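A numerical check of the result you should get. With equal priors and equal variances the MEP rule picks the nearest mean, so the boundaries fall at the midpoints +/- m/2 and +/- 3m/2; each interior hypothesis errs with probability 2Q(m/2s) and each end hypothesis with Q(m/2s), giving Pr[error] = (8/5)Q(m/2s). The values of m and s are assumed for illustration.

```python
import numpy as np
from math import erfc, sqrt

m, s = 2.0, 1.0                      # assumed values for illustration
means = np.array([-2 * m, -m, 0.0, m, 2 * m])

def Q(x):
    """Gaussian tail probability Pr[N(0,1) > x]."""
    return 0.5 * erfc(x / sqrt(2))

# (1/5)(3 hypotheses * 2Q + 2 end hypotheses * Q) = (8/5) Q(m/2s)
p_err_analytic = (8 / 5) * Q(m / (2 * s))

# Monte Carlo: draw a hypothesis uniformly, observe r, decide nearest mean.
rng = np.random.default_rng(1)
N = 1_000_000
true_idx = rng.integers(0, 5, N)
r = means[true_idx] + s * rng.standard_normal(N)
decided = np.argmin(np.abs(r[:, None] - means[None, :]), axis=1)
p_err_mc = np.mean(decided != true_idx)
print(p_err_analytic, p_err_mc)
```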

- We wish to estimate a=Pr[heads] of a coin from observation r=#heads in n independent flips.
- Compute a_{MLE}(R). Show it is unbiased and compute the mean square error.
- Compute the Cramer-Rao bound. Use it to show that a_{MLE}(R) is efficient.
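A Monte Carlo sanity check of the answers you should obtain: a_{MLE}(R) = R/n, its bias is zero, and its mean square error equals the Cramer-Rao bound a(1-a)/n. The particular values of a and n below are assumed for illustration.

```python
import numpy as np

a, n = 0.3, 50                        # assumed true Pr[heads] and flip count
rng = np.random.default_rng(2)

# R = number of heads in n flips, repeated many times; a_MLE = R/n.
R = rng.binomial(n, a, size=1_000_000)
a_hat = R / n

bias = a_hat.mean() - a               # ~ 0: the MLE is unbiased
mse = np.mean((a_hat - a) ** 2)       # mean square error of the MLE
crb = a * (1 - a) / n                 # Cramer-Rao bound (Fisher info n/(a(1-a)))
print(bias, mse, crb)                 # mse ~ crb: the MLE is efficient
```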


"An atheist is someone who watches Notre Dame and Southern Methodist University play football

and doesn't care who wins"--Dwight D. Eisenhower.