# PART 3 Random Processes

Huseyin Bilgekul
EENG571 Probability and Stochastic Processes
Department of Electrical and Electronic Engineering
Eastern Mediterranean University
## Kinds of Random Processes
## Random Processes

- A **random variable** X is a rule for assigning to every outcome w of an experiment a number X(w).
  - Note: X denotes a random variable and X(w) denotes a particular value.
- A **random process** X(t) is a rule for assigning to every outcome w a function X(t, w).
  - Note: for notational simplicity we often omit the dependence on w.
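The distinction can be sketched in code: a random variable maps an outcome w to a single number, while a random process maps w to an entire time function. A minimal sketch, using a hypothetical cosine-with-random-phase process (the functions `X_rv` and `X_rp` are illustrative, not from the slides):

```python
import math
import random

random.seed(0)

# A random variable: a rule assigning a number X(w) to each outcome w.
def X_rv(w):
    return w ** 2

# A random process: a rule assigning a whole time function X(t, w) to each w.
# Here the outcome w is a random phase, giving the sample function
# t -> cos(2*pi*t + w).
def X_rp(t, w):
    return math.cos(2 * math.pi * t + w)

w = random.uniform(0, 2 * math.pi)                       # one experimental outcome
sample_function = [X_rp(t / 10, w) for t in range(10)]   # one realization over time
print(len(sample_function))
```

Fixing w picks out one sample function; fixing t instead turns `X_rp(t, .)` into a random variable, which is the dual view used throughout these slides.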
## Conceptual Representation of RP

## Ensemble of Sample Functions

The set of all possible sample functions is called the **ensemble**.
## Random Processes

A general random (stochastic) process can be described as:

- a collection of time functions (signals) corresponding to various outcomes of random experiments, or
- a collection of random variables observed at different times.

Examples of random processes in communications:

- channel noise,
- information generated by a source,
- interference.
## Random Processes

Let $\xi$ denote the random outcome of an experiment. To every such outcome suppose a waveform $X(t, \xi)$ is assigned. The collection of such waveforms $X(t, \xi_1), X(t, \xi_2), \ldots, X(t, \xi_k), \ldots, X(t, \xi_n)$ forms a stochastic process. The set of outcomes $\{\xi_k\}$ and the time index $t$ can be continuous or discrete (countably infinite or finite).

For fixed $\xi_i \in S$ (the set of all experimental outcomes), $X(t, \xi_i)$ is a specific time function. For fixed $t = t_1$,

$$X_1 = X(t_1, \xi_i)$$

is a random variable. The ensemble of all such realizations $X(t, \xi)$ over time represents the stochastic process.
## Random Process for a Continuous Sample Space

## Wiener Process Sample Function

## Sample Sequence for Random Walk

## Sample Function of the Poisson Process

## Random Binary Waveform

## Autocorrelation Function of the Random Binary Signal

## Example
## Random Processes: Introduction (1)
## Introduction

A random process is a process (i.e., a variation in time or one-dimensional space) whose behavior is not completely predictable but can be characterized by statistical laws.

Examples of random processes:

- daily stream flow,
- hourly rainfall of storm events,
- stock indices.
## Random Variable

A random variable is a mapping that assigns outcomes of a random experiment to real numbers. Occurrence of the outcomes follows a certain probability distribution; therefore, a random variable is completely characterized by its probability density function (PDF).
## STOCHASTIC PROCESS

The term "stochastic process" appears mostly in statistical textbooks, whereas the term "random process" is frequently used in books on engineering applications.
## DENSITY OF STOCHASTIC PROCESSES

**First-order densities of a random process**

A stochastic process is defined to be completely (totally) characterized if the joint densities of the random variables $X(t_1), X(t_2), \ldots, X(t_n)$ are known for all times $t_1, t_2, \ldots, t_n$ and all $n$.

In general, a complete characterization is practically impossible, except in rare cases. As a result, it is desirable to define and work with various partial characterizations. Depending on the objectives of the application, a partial characterization often suffices to ensure the desired outputs.
## DENSITY OF STOCHASTIC PROCESSES

For a specific $t$, $X(t)$ is a random variable with distribution

$$F(x, t) = P[X(t) \le x]$$

The function $F(x, t)$ is defined as the first-order distribution of the random variable $X(t)$. Its derivative with respect to $x$,

$$f(x, t) = \frac{\partial F(x, t)}{\partial x},$$

is the first-order density of $X(t)$.
## DENSITY OF STOCHASTIC PROCESSES

- If the first-order densities defined for all times $t$, i.e. $f(x, t)$, are all the same, then $f(x, t)$ does not depend on $t$ and we call the resulting density the first-order density of the random process $X(t)$; otherwise, we have a family of first-order densities.
- The first-order densities (or distributions) are only a partial characterization of the random process, as they contain no information about the joint densities of the random variables defined at two or more different times.
## MEAN AND VARIANCE OF RP

**Mean and variance of a random process**

The first-order density of a random process, $f(x, t)$, gives the probability density of the random variable $X(t)$ defined for each time $t$. The mean of a random process, $m_X(t)$, is thus a function of time, specified by

$$m_X(t) = E[X(t)] = E[X_t] = \int_{-\infty}^{\infty} x_t \, f(x_t, t) \, dx_t$$

For the case where the mean of $X(t)$ does not depend on $t$, we have

$$m_X(t) = E[X(t)] = m_X \quad \text{(a constant)}.$$

The variance of a random process, also a function of time, is defined by

$$\sigma_X^2(t) = E\big[(X(t) - m_X(t))^2\big] = E[X_t^2] - [m_X(t)]^2$$
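These ensemble definitions can be checked numerically: fix a time $t$, draw many realizations, and average across the ensemble. A minimal sketch, assuming a hypothetical process $X(t, w) = A(w)\,t$ with random gain $A \sim \mathrm{Uniform}(0, 2)$, so that $m_X(t) = t$ and $\sigma_X^2(t) = t^2/3$:

```python
import random

random.seed(1)

# Ensemble estimates of m_X(t) and var_X(t) at a fixed time t, for the
# hypothetical process X(t, w) = A(w) * t with A ~ Uniform(0, 2).
# Theory: m_X(t) = E[A] * t = t, and var_X(t) = Var(A) * t^2 = t^2 / 3.
N = 200_000          # number of realizations in the ensemble
t = 2.0              # fixed observation time

samples = [random.uniform(0.0, 2.0) * t for _ in range(N)]   # X(t, w_i)
mean_est = sum(samples) / N
var_est = sum((x - mean_est) ** 2 for x in samples) / N

print(mean_est)      # close to t = 2.0
print(var_est)       # close to t^2 / 3 = 4/3
```

Repeating this at several values of $t$ traces out the full functions $m_X(t)$ and $\sigma_X^2(t)$, both of which vary with time for this process.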
## HIGHER ORDER DENSITY OF RP

- **Second-order densities of a random process.** For any pair of random variables $X(t_1)$ and $X(t_2)$, we define the second-order densities of a random process as $f(x_1, x_2; t_1, t_2)$, or simply $f(x_1, x_2)$.
- **Nth-order densities of a random process.** The $n$th-order density functions of $X(t)$ at times $t_1, t_2, \ldots, t_n$ are given by $f(x_1, x_2, \ldots, x_n; t_1, t_2, \ldots, t_n)$, or simply $f(x_1, x_2, \ldots, x_n)$.
## Autocorrelation Function of RP

Given two random variables $X(t_1)$ and $X(t_2)$, a measure of the linear relationship between them is specified by $E[X(t_1)X(t_2)]$. For a random process, $t_1$ and $t_2$ go through all possible values; therefore $E[X(t_1)X(t_2)]$ can change and is a function of $t_1$ and $t_2$. The autocorrelation function of a random process is thus defined by

$$R(t_1, t_2) = E[X(t_1)\,X(t_2)] = R(t_2, t_1)$$
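Like the mean, $R(t_1, t_2)$ can be estimated by averaging the product $X(t_1)X(t_2)$ over an ensemble of realizations. A sketch under an assumed process $X(t, w) = A(w)\cos t$ with $A \sim \mathrm{Uniform}(-1, 1)$, for which $R(t_1, t_2) = E[A^2]\cos t_1 \cos t_2 = \tfrac{1}{3}\cos t_1 \cos t_2$:

```python
import math
import random

random.seed(2)

# Ensemble estimate of R(t1, t2) = E[X(t1) X(t2)] for the hypothetical
# process X(t, w) = A(w) * cos(t), with A ~ Uniform(-1, 1).
N = 200_000
t1, t2 = 0.5, 1.5

acc = 0.0
for _ in range(N):
    a = random.uniform(-1.0, 1.0)                      # one outcome -> amplitude A(w)
    acc += (a * math.cos(t1)) * (a * math.cos(t2))     # X(t1, w) * X(t2, w)
R_est = acc / N

R_true = (1.0 / 3.0) * math.cos(t1) * math.cos(t2)
print(R_est, R_true)
```

Note that swapping `t1` and `t2` leaves the estimate unchanged, matching the symmetry $R(t_1, t_2) = R(t_2, t_1)$.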
## Autocovariance Functions of RP
## Stationarity of Random Processes

A random process is strict-sense stationary if its joint densities are invariant under a time shift $\tau$:

$$f(x_1, x_2, \ldots, x_n;\, t_1, t_2, \ldots, t_n) = f(x_1, x_2, \ldots, x_n;\, t_1 + \tau, t_2 + \tau, \ldots, t_n + \tau)$$

Strict-sense stationarity seldom holds for random processes, except for some Gaussian processes. Therefore, weaker forms of stationarity are needed.
## Stationarity of Random Processes

[Figure: PDF of X(t) versus time t]
## Wide-Sense Stationarity (WSS) of Random Processes

A random process is wide-sense stationary if

$$E[X(t)] = m \quad \text{(constant) for all } t,$$

$$R(t_1, t_2) = R(t_2 - t_1) \quad \text{for all } t_1 \text{ and } t_2.$$
## Equality and Continuity of RP

**Equality**

Note that "$x(t, w_i) = y(t, w_i)$ for every $w_i$" is not the same as "$x(t, w_i) = y(t, w_i)$ with probability 1".

## Mean Square Equality of RP
## Random Processes: Introduction (2)
## Stochastic Continuity
## Stochastic Convergence

- A random sequence, or discrete-time random process, is a sequence of random variables $\{X_1(w), X_2(w), \ldots, X_n(w), \ldots\} = \{X_n(w)\}$, $w \in \Omega$.
- For a specific $w$, $\{X_n(w)\}$ is a sequence of numbers that might or might not converge. The notion of convergence of a random sequence can be given several interpretations.
## Sure Convergence (Convergence Everywhere)

The sequence of random variables $\{X_n(w)\}$ converges surely to the random variable $X(w)$ if the sequence of functions $X_n(w)$ converges to $X(w)$ as $n \to \infty$ for all $w \in \Omega$, i.e.,

$$X_n(w) \to X(w) \text{ as } n \to \infty \text{ for all } w \in \Omega.$$
## Almost-Sure Convergence (Convergence with Probability 1)

## Mean-Square Convergence

## Convergence in Probability

## Convergence in Distribution
## Remarks

- Convergence with probability one applies to the individual realizations of the random process; convergence in probability does not.
- The weak law of large numbers is an example of convergence in probability.
- The strong law of large numbers is an example of convergence with probability 1.
- The central limit theorem is an example of convergence in distribution.
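The WLLN case above can be illustrated numerically: for iid $X_i$ with mean $\mu$, the probability that the sample mean $M_n$ deviates from $\mu$ by more than $\varepsilon$ shrinks as $n$ grows. A rough sketch, assuming Bernoulli(1/2) variables (the step distribution and tolerances are illustrative choices):

```python
import random

random.seed(3)

# Convergence in probability (WLLN): estimate P(|M_n - mu| > eps) for the
# sample mean M_n of n iid Bernoulli(1/2) variables, at small and large n.
mu, eps, trials = 0.5, 0.05, 2000

def deviation_prob(n):
    """Fraction of trials in which the sample mean misses mu by more than eps."""
    bad = 0
    for _ in range(trials):
        m = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(m - mu) > eps:
            bad += 1
    return bad / trials

p_small_n = deviation_prob(10)
p_large_n = deviation_prob(1000)
print(p_small_n, p_large_n)   # the second is much smaller
```

This estimates the deviation probability itself, which is exactly the quantity that convergence in probability says must vanish; it says nothing about any single realization, which is why it is weaker than convergence with probability 1.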
## Weak Law of Large Numbers (WLLN)

## Strong Law of Large Numbers (SLLN)

## The Central Limit Theorem
## Venn Diagram of Relations Among Types of Convergence

Note that even sure convergence may not imply mean-square convergence.
## Example
## Ergodic Theorem

## The Mean-Square Ergodic Theorem

The above theorem shows that one can expect a sample average to converge to a constant in the mean-square sense if and only if the average of the means converges and the memory dies out asymptotically, that is, if the covariance decreases as the lag increases.
## Mean-Ergodic Process

## Strong or Individual Ergodic Theorem
## Examples of Stochastic Processes

**iid random process.** A discrete-time random process $\{X(t),\, t = 1, 2, \ldots\}$ is said to be independent and identically distributed (iid) if any finite number, say $k$, of the random variables $X(t_1), X(t_2), \ldots, X(t_k)$ are mutually independent and have a common cumulative distribution function $F_X(\cdot)$.
## iid Random Stochastic Processes

The joint cdf of $X(t_1), X(t_2), \ldots, X(t_k)$ is given by

$$F_{X_1, X_2, \ldots, X_k}(x_1, x_2, \ldots, x_k) = P(X_1 \le x_1, X_2 \le x_2, \ldots, X_k \le x_k) = F_X(x_1)\, F_X(x_2) \cdots F_X(x_k)$$

It also yields

$$p_{X_1, X_2, \ldots, X_k}(x_1, x_2, \ldots, x_k) = p_X(x_1)\, p_X(x_2) \cdots p_X(x_k)$$

where $p_X(x)$ represents the common probability mass function.
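The product form of the joint pmf can be checked empirically: for an iid process, the relative frequency of a joint event should match the product of the marginal probabilities. A sketch with an assumed Bernoulli(0.3) common distribution:

```python
import random

random.seed(6)

# For an iid process the joint pmf factors: p(x1, x2) = p_X(x1) * p_X(x2).
# Check P(X1 = 1, X2 = 1) against p * p for two Bernoulli(p) samples.
p, N = 0.3, 200_000

joint_11 = 0
for _ in range(N):
    x1 = random.random() < p      # sample X(t1)
    x2 = random.random() < p      # sample X(t2), independently
    joint_11 += x1 and x2
joint_est = joint_11 / N          # empirical P(X1 = 1, X2 = 1)
product = p * p                   # p_X(1) * p_X(1) = 0.09

print(joint_est, product)
```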
## Bernoulli Random Process

## Random walk process
## Random walk process

Let $\pi_0$ denote the probability mass function of $X_0$. The joint probability of $X_0, X_1, \ldots, X_n$ is

$$
\begin{aligned}
P(X_0 = x_0, X_1 = x_1, \ldots, X_n = x_n)
&= P(X_0 = x_0,\ \Delta_1 = x_1 - x_0,\ \ldots,\ \Delta_n = x_n - x_{n-1}) \\
&= P(X_0 = x_0)\, P(\Delta_1 = x_1 - x_0) \cdots P(\Delta_n = x_n - x_{n-1}) \\
&= \pi_0(x_0)\, f(x_1 - x_0) \cdots f(x_n - x_{n-1}) \\
&= \pi_0(x_0)\, P(x_1 \mid x_0) \cdots P(x_n \mid x_{n-1})
\end{aligned}
$$

where the $\Delta_i$ are the iid steps with common pmf $f$.
## Random walk process

$$
\begin{aligned}
P(X_{n+1} = x_{n+1} \mid X_0 = x_0, X_1 = x_1, \ldots, X_n = x_n)
&= \frac{P(X_0 = x_0, X_1 = x_1, \ldots, X_n = x_n, X_{n+1} = x_{n+1})}{P(X_0 = x_0, X_1 = x_1, \ldots, X_n = x_n)} \\
&= \frac{\pi_0(x_0)\, P(x_1 \mid x_0) \cdots P(x_n \mid x_{n-1})\, P(x_{n+1} \mid x_n)}{\pi_0(x_0)\, P(x_1 \mid x_0) \cdots P(x_n \mid x_{n-1})} \\
&= P(x_{n+1} \mid x_n)
\end{aligned}
$$
## Random walk process

The property

$$P(X_{n+1} = x_{n+1} \mid X_0 = x_0, X_1 = x_1, \ldots, X_n = x_n) = P(X_{n+1} = x_{n+1} \mid X_n = x_n)$$

is known as the Markov property. A special case of the random walk is the Brownian motion.
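The construction above can be simulated directly: the next state is the current state plus an iid step, so the walk depends on its past only through the present value. A minimal sketch with symmetric $\pm 1$ steps:

```python
import random

random.seed(4)

# Simple symmetric random walk: X_0 = 0 and X_{n+1} = X_n + D_{n+1},
# where the steps D_i are iid, equal to +1 or -1 with probability 1/2 each.
# The update uses only the current state, which is why the walk is Markov.
def random_walk(n_steps):
    x, path = 0, [0]
    for _ in range(n_steps):
        x += random.choice((-1, 1))   # next state = current state + iid step
        path.append(x)
    return path

path = random_walk(1000)
print(path[-1])                       # final position after 1000 steps
```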
## Gaussian process

- A random process $\{X(t)\}$ is said to be a Gaussian random process if all finite collections of the random process, $X_1 = X(t_1)$, $X_2 = X(t_2)$, \ldots, $X_k = X(t_k)$, are jointly Gaussian random variables for all $k$ and all choices of $t_1, t_2, \ldots, t_k$.
- Joint pdf of jointly Gaussian random variables $X_1, X_2, \ldots, X_k$:
## Time series – AR random process
## The Brownian motion (one-dimensional, also known as random walk)

- Consider a particle that moves randomly on the real line.
- Suppose that at small time intervals $\tau$ the particle jumps a small distance $\Delta$, randomly and equally likely to the left or to the right.
- Let $X_\tau(t)$ be the position of the particle on the real line at time $t$.
## The Brownian motion

- Assume the initial position of the particle is at the origin, i.e. $X_\tau(0) = 0$.
- The position of the particle at time $t$ can be expressed as $X_\tau(t) = \Delta\,(Y_1 + Y_2 + \cdots + Y_{[t/\tau]})$, where $Y_1, Y_2, \ldots$ are independent random variables, each having probability 1/2 of equaling $+1$ and $-1$. ($[t/\tau]$ represents the largest integer not exceeding $t/\tau$.)
Distribution of X(t)

• Let the step length  equal  , then
X (t )   Y1  Y2    Y[t /  ] )
• For fixed t, if  is small then the distribution of X  (t )
is approximately normal with mean 0 and variance t,
i.e., X  (t ) ~ N 0, t ) .

EE571   87
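This scaling can be verified by simulation: with step length $\sqrt{\tau}$, the endpoint $X_\tau(t)$ over many paths should have mean near 0 and variance near $t$. A sketch with assumed values $\tau = 0.01$ and $t = 2$:

```python
import random

random.seed(5)

# Random-walk approximation to Brownian motion:
# X_tau(t) = sqrt(tau) * (Y_1 + ... + Y_[t/tau]), with Y_i = +/-1.
# For small tau, X_tau(t) is approximately N(0, t).
tau, t, n_paths = 0.01, 2.0, 10_000
n_steps = int(t / tau)               # [t / tau]
scale = tau ** 0.5                   # step length Delta = sqrt(tau)

endpoints = []
for _ in range(n_paths):
    s = sum(random.choice((-1, 1)) for _ in range(n_steps))
    endpoints.append(scale * s)      # X_tau(t) for one realization

mean_est = sum(endpoints) / n_paths
var_est = sum(v * v for v in endpoints) / n_paths
print(mean_est, var_est)             # mean near 0, variance near t = 2.0
```

The variance works out exactly here: each path sums $[t/\tau]$ unit-variance steps, and multiplying by $\sqrt{\tau}$ scales the variance by $\tau$, giving $\tau \cdot [t/\tau] \approx t$; the normality of the endpoint follows from the central limit theorem.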
Graphical illustration of Distribution of X(t)

PDF of X(t)
X(t)

Time, t

EE571   88
If $t$ and $h$ are fixed and $\tau$ is sufficiently small, then

$$
\begin{aligned}
X_\tau(t+h) - X_\tau(t)
&= \sqrt{\tau}\,\big(Y_1 + Y_2 + \cdots + Y_{[(t+h)/\tau]}\big) - \sqrt{\tau}\,\big(Y_1 + Y_2 + \cdots + Y_{[t/\tau]}\big) \\
&= \sqrt{\tau}\,\big(Y_{[t/\tau]+1} + Y_{[t/\tau]+2} + \cdots + Y_{[(t+h)/\tau]}\big)
\end{aligned}
$$
## Distribution of the Displacement X(t+h) − X(t)

The random variable $X_\tau(t+h) - X_\tau(t)$ is normally distributed with mean 0 and variance $h$, i.e.

$$P\big(X_\tau(t+h) - X_\tau(t) \le x\big) = \frac{1}{\sqrt{2\pi h}} \int_{-\infty}^{x} \exp\!\left(-\frac{u^2}{2h}\right) du$$
- The variance of $X_\tau(t)$ depends on $t$, while the variance of $X_\tau(t+h) - X_\tau(t)$ does not.
- If $0 \le t_1 \le t_2 \le \cdots \le t_{2m}$, then $X_\tau(t_2) - X_\tau(t_1)$, $X_\tau(t_4) - X_\tau(t_3)$, $\ldots$, $X_\tau(t_{2m}) - X_\tau(t_{2m-1})$ are independent random variables.

[Figure: a sample path of X versus t]
## Covariance and Correlation Functions of X(t)

$$
\begin{aligned}
\mathrm{Cov}\big(X_\tau(t),\, X_\tau(t+h)\big)
&= E\big[X_\tau(t)\, X_\tau(t+h)\big] \\
&= E\Big[\sqrt{\tau}\,\big(Y_1 + \cdots + Y_{[t/\tau]}\big)\cdot \sqrt{\tau}\,\Big(\big(Y_1 + \cdots + Y_{[t/\tau]}\big) + \big(Y_{[t/\tau]+1} + \cdots + Y_{[(t+h)/\tau]}\big)\Big)\Big] \\
&= E\Big[\tau\,\big(Y_1 + \cdots + Y_{[t/\tau]}\big)^2\Big] \\
&= \tau\left[\frac{t}{\tau}\right] \approx t
\end{aligned}
$$

$$
\mathrm{Correl}\big(X_\tau(t),\, X_\tau(t+h)\big) = \frac{\mathrm{Cov}\big(X_\tau(t),\, X_\tau(t+h)\big)}{\sqrt{t\,(t+h)}} = \frac{t}{\sqrt{t\,(t+h)}} = \sqrt{\frac{t}{t+h}}
$$
