- THIS WEEK: Discrete-time random processes and convergence of sequences of RVs.
- Use the result of Problem Set #5, Problem #6 in this problem (R_{x}(i,j) = autocorrelation):
- Two zero-mean random sequences x(n) and y(n) have the interesting property:

R_{x}(n,n)+R_{y}(n,n)=2R_{xy}(n,n). What can you say about sequences x(n) and y(n)?

- A zero-mean WSS random sequence x(n) has R_{x}(T)=R_{x}(0) for some constant T.

Show (with probability one) that x(n) and R_{x}(n) are both *periodic* with period T.
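For the identity R_{x}(n,n)+R_{y}(n,n)=2R_{xy}(n,n), a one-line expansion sketches why it is so restrictive (this assumes, as a guess, that the cited Problem Set #5, Problem #6 result is that a zero-mean random variable with zero variance equals zero with probability one):

```latex
\mathbb{E}\!\left[\big(x(n)-y(n)\big)^{2}\right]
  = R_{x}(n,n) - 2R_{xy}(n,n) + R_{y}(n,n) = 0,
```

so for each n the identity forces x(n) = y(n) with probability one.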
- Stark and Woods #7.15. ALSO: (d) What happens to K_{x}(m,n) when p approaches one?

Let e=1-p approach zero and use: (1-e)^{|m-n|} ≈ 1-e|m-n| (for small e) and (m+n)-|m-n| = 2 MIN[m,n].

Compare this to the result from lecture: An independent-increments (II) process is the running sum of an iid process.

HINT: x(n)=px(n-1)+w(n),x(0)=0 becomes x(n)=h(0)w(n)+...+h(n-1)w(1), h(n)=p^{n}.
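The unrolling in the HINT can be checked numerically (a sketch; the function names and the Gaussian driving noise are illustrative assumptions, not part of Stark and Woods #7.15):

```python
# Check that the recursion x(n) = p*x(n-1) + w(n), x(0) = 0, unrolls to the
# convolution form x(n) = h(0)w(n) + ... + h(n-1)w(1) with h(j) = p**j.
import random

def simulate(p, w):
    """Run the recursion over the noise samples w(1), ..., w(n)."""
    x = 0.0
    for wn in w:
        x = p * x + wn
    return x

def unrolled(p, w):
    """Convolution form: sum over k = 1..n of h(n-k) * w(k)."""
    n = len(w)
    return sum((p ** (n - k)) * w[k - 1] for k in range(1, n + 1))

random.seed(0)
p = 0.9
w = [random.gauss(0, 1) for _ in range(50)]  # illustrative iid noise
assert abs(simulate(p, w) - unrolled(p, w)) < 1e-9
```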

- Let x(n) be a 1-sided Bernoulli process, starting at n=1, with p NOT equal to ½.

Define two new random sequences N(n)=x(1)+x(2)+...+x(n) and Y(n)=(-1)^{N(n)}.

Compute the mean function and the covariance function of both N(n) and Y(n).

Are either of the random sequences N(n) or Y(n) WSS or asymptotically WSS?
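As a sanity check on the Y(n) mean computation, brute-force enumeration of all sample paths can be compared against a product form (a sketch; `mean_Y` and the closed form (1-2p)**n are illustrative things to verify, not givens of the problem):

```python
# Exact E[Y(n)] for Y(n) = (-1)**N(n), N(n) = x(1)+...+x(n), x(k) iid Bernoulli(p),
# by summing over all 2**n sample paths.
from itertools import product

def mean_Y(n, p):
    """Exact E[Y(n)] by enumeration (feasible only for small n)."""
    total = 0.0
    for path in product([0, 1], repeat=n):
        prob = 1.0
        for b in path:
            prob *= p if b == 1 else (1 - p)
        total += ((-1) ** sum(path)) * prob
    return total

p = 0.3  # any p NOT equal to 1/2
# Independence suggests E[Y(n)] = (E[(-1)**x(1)])**n = (1-2p)**n; check it:
for n in range(1, 8):
    assert abs(mean_Y(n, p) - (1 - 2 * p) ** n) < 1e-12
```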

- A random sequence x(1),x(2)... is defined using x(n,s)=n if 0 < s < 1/n; x(n,s)=0 otherwise.

x(n,s) is the basic definition of a random process: A mapping with domain (time) × (sample space).

In x(n,s) the random variable s has pdf f_{s}(S)=1 if 0 < S < 1; f_{s}(S)=0 otherwise.

- Note the pmf for x(n) is given by: Pr[x(n)=n]=1/n; Pr[x(n)=0]=1-1/n.
- Show that x(n) converges to 0 in probability (you don't need x(n,s) for this; only the pmf).
- Show that x(n) converges to 0 with probability one (you DO need x(n,s) for this).
- Show that x(n) does NOT converge to 0 in mean square (you don't need x(n,s) for this).

HINT: There is virtually NO computation in this problem! Just THINK (uh-oh!).
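The two pmf computations behind these parts can be spelled out in a few lines (a sketch; the function names are illustrative):

```python
# Sketch of the two moments of x(n): Pr[x(n)=n] = 1/n, Pr[x(n)=0] = 1 - 1/n.
def prob_exceeds(n):
    # For any 0 < eps < n: Pr[|x(n) - 0| > eps] = Pr[x(n) = n] = 1/n
    return 1.0 / n

def second_moment(n):
    # E[(x(n) - 0)**2] = n**2 * (1/n) + 0**2 * (1 - 1/n) = n
    return n ** 2 * (1.0 / n)

# The tail probability vanishes (convergence in probability) ...
assert prob_exceeds(1000) < prob_exceeds(10)
# ... but the mean-square error grows without bound (no m.s. convergence).
assert second_moment(1000) > second_moment(10)
```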

- Let x(n), n=1,2,... have zero mean and bounded autocorrelation R(i,j)=K_{x}(i,j) (since 0-mean).

Let M(n)=[x(1)+x(2)+...+x(n)]/n=sample mean, and C(n)=[R(1,n)+R(2,n)+...+R(n,n)]/n.

PROVE that M(n) converges *in mean square* to E[x]=0 IFF C(n) goes to 0 for large n.

This *ergodic theorem* doesn't require x(n) to be iid, just *asymptotically uncorrelated*.

Compare this to the weak law of large numbers: M(n) converges *in probability* to E[x] if x(n) iid.

HINT: ONLY IF: Apply the Cauchy-Schwarz inequality to C(n)=E[M(n)x(n)].

HINT: IF: Show Var[M(n)] = [R(1,1)+...+R(1,n)+R(2,1)+...+R(n,n)]/n^{2} (see (6.1-10) on p. 273) and
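To see the "IF" mechanism concretely, here is a numeric sketch with an assumed asymptotically uncorrelated covariance R(i,j) = rho**|m-n| (an illustrative choice, not from the problem): both the double-sum form of Var[M(n)] and C(n) shrink as n grows.

```python
# Numeric sketch: for R(i,j) = rho**|i-j|, both Var[M(n)] (double-sum form)
# and C(n) = [R(1,n)+...+R(n,n)]/n go to 0 as n grows.
def R(i, j, rho=0.5):
    # Illustrative asymptotically uncorrelated covariance model.
    return rho ** abs(i - j)

def var_M(n):
    # Var[M(n)] = (1/n**2) * sum over all i, j of R(i,j)  (x(n) zero-mean)
    return sum(R(i, j) for i in range(1, n + 1) for j in range(1, n + 1)) / n ** 2

def C(n):
    return sum(R(i, n) for i in range(1, n + 1)) / n

# Both quantities decrease toward 0 together, as the theorem predicts.
assert var_M(400) < var_M(100) < var_M(25)
assert C(400) < C(100) < C(25)
```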