Random Processes

Each $\omega$ gives a function; the process is random only because we don't know $\omega$ ahead of time.
View
$x(t,\omega)$, $t$ and $\omega$ free         random waveform (the process itself)
$x(t_1,\omega)$, $t_1$ fixed         random variable
$x(t,\omega_1)$, $\omega_1$ fixed         deterministic waveform (sample function)
$x(t_1,\omega_1)$, both fixed         constant
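A minimal sketch (not from the notes, assuming Python/NumPy) of these four views; the random-phase sinusoid and all names in it are my own illustrative choices.

\begin{verbatim}
# Illustrative sketch: a random-phase sinusoid as x(t, omega).
import numpy as np

rng = np.random.default_rng(0)

def x(t, omega):
    # Deterministic rule: once omega (here, a phase) is fixed, x is fixed.
    return np.cos(2 * np.pi * t + omega)

t = np.linspace(0, 2, 500)                      # time axis
omegas = rng.uniform(0, 2 * np.pi, size=1000)   # outcomes from the sample space

ensemble = x(t[None, :], omegas[:, None])   # t, omega both free: random waveform
rv_at_t1 = x(0.3, omegas)                   # t fixed, omega free: random variable
sample_path = x(t, omegas[0])               # omega fixed, t free: deterministic waveform
number = x(0.3, omegas[0])                  # both fixed: a constant
\end{verbatim}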

Some definitions
Def
Statistics of a Random Process $x(t)$ is the set of joint distributions on $(t_1,t_2,\ldots,t_n)$ for any $n>0$ and any values of $t_i\in R$, i.e. we have available
$f_{X(t_1)X(t_2)\ldots X(t_n)}(x_1,\ldots,x_n)$ for any $n>0$ and $t_i\in R$
Def
mth order statistics $\rightarrow$ we only know the joint distributions up to $m$ time instances. We sing and dance when $m=2$ (a Gaussian random process has the property that it is completely described by its $m=2$ order stats)

Def Mean

\begin{displaymath}E(X(t))=m_X(t)=\int_{-\infty}^{\infty}x f_{X(t)}(x)dx\end{displaymath}

Where does it hang out?

Def Autocorr

\begin{displaymath}R_{XX}(t_1,t_2)=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}x_1x_2 f_{X(t_1),X(t_2)}(x_1,x_2)dx_1dx_2\end{displaymath}

Def Autocov

\begin{displaymath}C_{XX}(t_1,t_2)=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}(x_1-m_X(t_1))(x_2-m_X(t_2)) f_{X(t_1),X(t_2)}(x_1,x_2)dx_1dx_2\end{displaymath}
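A minimal numerical sketch of these three definitions (not from the notes, assuming Python/NumPy; the sinusoid with random amplitude and phase is my own illustrative choice): mean, autocorrelation and autocovariance are estimated by averaging over an ensemble of sample paths.

\begin{verbatim}
# Illustrative sketch: Monte Carlo estimates of m_X(t), R_XX(t1,t2), C_XX(t1,t2).
import numpy as np

rng = np.random.default_rng(1)
n_paths = 20000
t = np.linspace(0, 1, 101)

A = rng.normal(1.0, 0.2, size=(n_paths, 1))            # random amplitude (assumed)
theta = rng.uniform(0, 2 * np.pi, size=(n_paths, 1))   # random phase (assumed)
X = A * np.cos(2 * np.pi * 5 * t + theta)              # ensemble: (n_paths, len(t))

m_hat = X.mean(axis=0)                        # estimate of m_X(t) at each t

i1, i2 = 10, 40                               # indices of t1, t2
R_hat = np.mean(X[:, i1] * X[:, i2])          # estimate of R_XX(t1, t2)
C_hat = R_hat - m_hat[i1] * m_hat[i2]         # estimate of C_XX(t1, t2)
\end{verbatim}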

Stationarity
Def Strict sense stationarity (SSS)

\begin{displaymath}f_{X(t_1)\ldots X(t_N)}(x_1,\ldots,x_N)=f_{X(t_1+\Delta)\ldots X(t_N+\Delta)}(x_1,\ldots,x_N)\end{displaymath}

$\forall \Delta\in R \qquad \forall N>0\quad t_i\in R$
i.e. relative statistics look the same no matter what absolute time you look at.
mth order stationarity $\Rightarrow$ the above, but only for $N\le m$

Def Wide sense stationarity (WSS)
$m_X(t)=$ constant
$R_{XX}(t_1,t_2)=R_{XX}(t_1-t_2)=R_{XX}(\tau)$: only the time difference matters
($R_X(\tau)$ for short)
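As a sanity check of WSS (a sketch under my own assumptions, not part of the notes): a sinusoid with uniformly distributed random phase has constant mean, and the autocorrelation estimate below depends only on the lag.

\begin{verbatim}
# Illustrative sketch: empirical WSS check for a random-phase sinusoid.
import numpy as np

rng = np.random.default_rng(2)
n_paths = 50000
t = np.linspace(0, 1, 101)                       # dt = 0.01
theta = rng.uniform(0, 2 * np.pi, size=(n_paths, 1))
X = np.cos(2 * np.pi * 5 * t + theta)

print(np.max(np.abs(X.mean(axis=0))))            # ~0 for every t: constant mean

# same lag tau = 0.2 at two different absolute positions
R_a = np.mean(X[:, 10] * X[:, 30])
R_b = np.mean(X[:, 50] * X[:, 70])
print(R_a, R_b)    # both ~0.5*cos(2*pi*5*tau): only the difference matters
\end{verbatim}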

Def Cyclostationarity
$m_X(t)=m_X(t+kT)$ for some $T\in R$ and every integer $k$

\begin{displaymath}R_X(t+\tau+kT,t+kT)=R_X(t+\tau,t)\end{displaymath}

where $t,\tau,T\in R$ and $k$ is any integer
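A minimal sketch of a cyclostationary process (my own example, not from the notes, assuming Python/NumPy): $X(t)=N\cos(2\pi t/T)$ with a random amplitude $N$ of nonzero mean has a periodic mean and an autocorrelation that repeats when both arguments shift by $T$.

\begin{verbatim}
# Illustrative sketch: cyclostationary process X(t) = N * cos(2*pi*t/T).
import numpy as np

rng = np.random.default_rng(3)
T, n_paths = 0.5, 50000
t = np.linspace(0, 2, 201)                       # dt = 0.01, so T = 50 samples
N = rng.normal(1.0, 0.5, size=(n_paths, 1))      # random amplitude, mean 1 (assumed)
X = N * np.cos(2 * np.pi * t / T)

m_hat = X.mean(axis=0)
print(m_hat[0], m_hat[50])                       # m_X(t) ~ m_X(t+T), both ~1

R_1 = np.mean(X[:, 10] * X[:, 20])               # R_X(t+tau, t)
R_2 = np.mean(X[:, 60] * X[:, 70])               # both arguments shifted by T
print(R_1, R_2)                                  # approximately equal
\end{verbatim}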

Autocorr of Stationary Process

1.
$R_X(\tau)=R_X(-\tau)$        (process must be real)
pf.      $R_X(\tau)=E(X(t)X(t+\tau))=E(X(t-\tau)X(t))=R_X(-\tau)$
2.
$R_X(0)\ge\vert R_X(\tau)\vert$     (checked numerically in the sketch after this list)
pf.      $E[(X(t)\pm X(t-\tau))^2]\ge 0$
Therefore,

\begin{displaymath}\underbrace{E(X^2(t))}_{R_X(0)}+\underbrace{E(X^2(t-\tau))}_{R_X(0)}\pm 2 R_X(\tau)\ge 0\end{displaymath}

$R_X(0) \ge -R_X(\tau)$
$R_X(0) \ge +R_X(\tau)$
$R_X(0)\ge\vert R_X(\tau)\vert$
3.
If $R_X(0)=R_X(T)$ then $R_X(kT)=R_X(0)$ for every integer $k$
pf. more involved than it's worth for a review of a factoid.
The book uses induction + Cauchy-Schwarz:

\begin{displaymath}\left\vert\int_{-\infty}^{\infty}x(t)y^*(t)dt\right\vert^2\le\int\vert x(t)\vert^2 dt \int\vert y(t)\vert^2 dt\end{displaymath}
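The numerical sketch below (my own illustration, assuming Python/NumPy; a noisy random-phase sinusoid) checks property 2: the estimated $R_X(\tau)$ never exceeds the estimated $R_X(0)$.

\begin{verbatim}
# Illustrative sketch: check R_X(0) >= |R_X(tau)| on an estimated autocorrelation.
import numpy as np

rng = np.random.default_rng(4)
n_paths, n_t, dt = 20000, 200, 0.01
t = np.arange(n_t) * dt
theta = rng.uniform(0, 2 * np.pi, size=(n_paths, 1))
X = np.cos(2 * np.pi * 5 * t + theta) + 0.3 * rng.normal(size=(n_paths, n_t))

lags = np.arange(100)
R_hat = np.array([np.mean(X[:, 0] * X[:, k]) for k in lags])  # R_X(k*dt) estimates
print(R_hat[0], np.abs(R_hat[1:]).max())          # R_X(0) dominates every lag
\end{verbatim}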

Ergodicity
(Strictly stationary) (simple argument)
Define averages first

1.
$E(g(X(t)))=\int_{-\infty}^{\infty}g(x)f_{X(t)}(x)dx$     ensemble average over the possible values
2.

\begin{displaymath}\overline{g(x(t,w))}=\lim_{T\rightarrow\infty}\frac{1}{T}\int_{-T/2}^{T/2}g(x(t,w))dt\end{displaymath}

time average for a given sample function
if $\overline{g(x(t,\omega))}=E(g(X(t)))$ for (almost) every $\omega\in\Omega$,
then x(t) is ergodic $\rightarrow$ we only need to look at a single $x(t,\omega)$ to determine its properties.
Problem: We don't live forever, so we can't test ergodicity for an arbitrary process.
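A minimal sketch contrasting an ergodic-in-the-mean process with one that is not (my own examples, not from the notes, assuming Python/NumPy): the time average of one sample path of a random-phase sinusoid matches the ensemble mean, while a "random DC level" process stays stuck at its own level.

\begin{verbatim}
# Illustrative sketch: time average of one sample path vs. ensemble average.
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(0, 200, 0.01)                  # long (but finite) observation window

# Random-phase sinusoid: ensemble mean 0, and each path time-averages to ~0.
theta = rng.uniform(0, 2 * np.pi)
x1 = np.cos(2 * np.pi * t + theta)
print(x1.mean())                             # ~0 = E(X(t)): consistent with ergodicity

# Random DC level: ensemble mean 0, but a single path time-averages to its own A.
A = rng.normal(0.0, 1.0)
x2 = np.full_like(t, A)
print(x2.mean(), "(ensemble mean is 0)")     # stuck at A: not ergodic in the mean
\end{verbatim}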

Power and Energy (general)

\begin{displaymath}E_i=\int_{-\infty}^{\infty}x^2(t,\omega_i)dt\end{displaymath}

energy in $x(t,\omega_i)$

\begin{displaymath}P_i=\lim_{T\rightarrow\infty}\frac{1}{T}\int_{-T/2}^{T/2}x^2(t,\omega_i)dt\end{displaymath}

power in $x(t,\omega_i)$

Def $\mathcal{P}_X,\mathcal{E}_X$ random variables

CDFs:

\begin{displaymath}F_{\mathcal{P}_X}(p)=Prob(\{\omega_i: P_i\le p\})\end{displaymath}


\begin{displaymath}F_{\mathcal{E}_X}(e)=Prob(\{\omega_i: E_i\le e\})\end{displaymath}

Def
$P_X=E(\mathcal{P}_X)\qquad\qquad E_X=E(\mathcal{E}_X)$

\begin{displaymath}E_X=\int_{-\infty}^{\infty}E(X^2(t))dt=\int_{-\infty}^{\infty}R_X(t,t)dt\end{displaymath}


\begin{displaymath}P_X=\lim_{T\rightarrow\infty}\frac{1}{T}\int_{-T/2}^{T/2}E(X^2(t))dt=\lim_{T\rightarrow\infty}\frac{1}{T}\int_{-T/2}^{T/2}R_X(t,t)dt\end{displaymath}

Stationary $\Rightarrow R_X(t,t)=R_X(0)$

\begin{displaymath}P_X=R_X(0) \Rightarrow E_X \mbox{ infinite for a stationary process}\end{displaymath}

Ergodic? $\Rightarrow P_i=P_X$
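A small numerical sketch of these power definitions (my own illustration, not from the notes, assuming Python/NumPy): for the random-phase sinusoid, $P_X=R_X(0)=1/2$, and the time-averaged power of a single sample path comes out the same, as the ergodicity remark suggests.

\begin{verbatim}
# Illustrative sketch: time-averaged power of one sample path vs. P_X = R_X(0).
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(0, 500, 0.01)                  # long observation window
theta = rng.uniform(0, 2 * np.pi)
x = np.cos(2 * np.pi * t + theta)

P_time = np.mean(x ** 2)                     # ~ (1/T) * integral of x^2(t, omega_i) dt
print(P_time, "vs R_X(0) = 0.5")
\end{verbatim}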
More than one process
$x(t,\omega_i) \qquad y(t,\omega_i), \omega_i \in\Omega$ (need to be defined on the same sample space)

Def

1.
X(t), Y(t) indept. if

\begin{displaymath}Prob(X(t_1)\le x,Y(t_2)\le y)=Prob(X(t_1)\le x)Prob(Y(t_2)\le y) \quad \forall t_1,t_2\end{displaymath}

2.
$X(t_1)$, $Y(t_2)$ uncorrelated if

\begin{displaymath}E(X(t_1)Y(t_2))=E(X(t_1))E(Y(t_2)) \quad \forall t_1,t_2\end{displaymath}

3.
Cross correlation

\begin{displaymath}R_{XY}(t_1,t_2)=E(X(t_1)Y(t_2))=R_{YX}(t_2,t_1)\end{displaymath}

4.
Jointly WSS if X(t), Y(t) each WSS and

\begin{displaymath}R_{XY}(t_1,t_2)=R_{XY}(t_1-t_2)=R_{XY}(\tau)\end{displaymath}
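A minimal sketch of joint wide-sense stationarity (my own example, not from the notes, assuming Python/NumPy): take $Y(t)=X(t-d)$ for the random-phase sinusoid; the estimated cross-correlation at two pairs of times with the same difference agrees.

\begin{verbatim}
# Illustrative sketch: estimated R_XY(t1, t2) depends only on t1 - t2.
import numpy as np

rng = np.random.default_rng(7)
n_paths, d = 50000, 0.05
t = np.linspace(0, 1, 101)                       # dt = 0.01
theta = rng.uniform(0, 2 * np.pi, size=(n_paths, 1))
X = np.cos(2 * np.pi * 5 * t + theta)
Y = np.cos(2 * np.pi * 5 * (t - d) + theta)      # Y(t) = X(t - d)

# same difference t2 - t1 = 0.17 at two different absolute times
R_a = np.mean(X[:, 10] * Y[:, 27])
R_b = np.mean(X[:, 40] * Y[:, 57])
print(R_a, R_b)                                  # approximately equal
\end{verbatim}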

END
