# The M/G/1 Queue
Theorem 1: For an M/G/1 queue at steady-state, the distribution of the number of customers seen by an arriving customer is the same as that left behind by a departing customer.
Proof: Customers arrive one at a time and depart one at a time. Define:

- $A(t), D(t)$: number of arrivals and departures, respectively, in $(0,t)$
- $U_n(t)$: number of $(n, n+1)$ transitions in $(0,t)$ = number of arrivals that find the system in state $n$
- $V_n(t)$: number of $(n+1, n)$ transitions in $(0,t)$ = number of departures that leave the system in state $n$

$U_n(t)$ and $V_n(t)$ differ by at most 1 [when an $(n, n+1)$ transition occurs, another $(n, n+1)$ transition can occur only if the state has moved back to $n$, i.e., after an $(n+1, n)$ transition has occurred].

Stationary probability that an arriving customer finds the system in state $n$:
$$\alpha_n = \lim_{t\to\infty} P\{N(t) = n \mid \text{arrival at } t^+\}$$

$\alpha_n$ is the proportion of arrivals that find the system in state $n$:
$$\alpha_n = \lim_{t\to\infty} \frac{U_n(t)}{A(t)}$$

Similarly, the stationary probability that a departing customer leaves the system in state $n$ is:
$$\beta_n = \lim_{t\to\infty} \frac{V_n(t)}{D(t)}$$

Noting that $\lim_{t\to\infty} A(t)/t = \lim_{t\to\infty} D(t)/t = \lambda$, we have:
$$\lim_{t\to\infty} \frac{U_n(t)}{t} = \lim_{t\to\infty} \frac{V_n(t)}{t}
\;\Rightarrow\;
\lim_{t\to\infty} \frac{U_n(t)}{A(t)} \lim_{t\to\infty} \frac{A(t)}{t}
= \lim_{t\to\infty} \frac{V_n(t)}{D(t)} \lim_{t\to\infty} \frac{D(t)}{t}
\;\Rightarrow\; \alpha_n \lambda = \beta_n \lambda
\;\Rightarrow\; \alpha_n = \beta_n$$
Theorem 2: For an M/G/1 queue at steady-state, the probability that an arriving customer finds n
customers in the system is equal to the proportion of time that there are n customers in the
system. Therefore, the distribution seen by an arriving customer is identical to the stationary
distribution.
Proof: This is the PASTA (Poisson Arrivals See Time Averages) property, which holds due to:
 Poisson arrivals
 Lack of anticipation: future arrivals independent of current state N(t)

Theorem 3: For an M/G/1 queue at steady-state, the system appears statistically identical to an arriving and a departing customer. Both, at steady-state, see a system that is statistically identical to the one seen by an observer looking at the system at an arbitrary time.
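Theorems 1-3 can be checked empirically. The sketch below (an illustrative simulation, not part of the original notes; the rates `lam`, `mu` are arbitrary choices) runs a FCFS M/M/1 queue, a special case of M/G/1, and compares the empirical distribution of the number of customers seen by arrivals with the number left behind by departures:

```python
import numpy as np

# Simulate a FCFS M/M/1 queue and compare the distribution seen by
# arriving customers with the distribution left behind by departing ones.
rng = np.random.default_rng(0)
lam, mu, n = 0.5, 1.0, 200_000          # arrival rate, service rate, customers

arrivals = np.cumsum(rng.exponential(1 / lam, n))   # Poisson arrival epochs
services = rng.exponential(1 / mu, n)               # iid service times

# FCFS departure epochs: d_i = max(a_i, d_{i-1}) + s_i
departures = np.empty(n)
last = 0.0
for i in range(n):
    last = max(arrivals[i], last) + services[i]
    departures[i] = last

# Number in system just before arrival i: prior arrivals minus prior departures.
seen = np.arange(n) - np.searchsorted(departures, arrivals)
# Number left behind by departure i: arrivals up to d_i minus the i+1 departures.
left = np.searchsorted(arrivals, departures, side="right") - np.arange(1, n + 1)

p_seen_0 = np.mean(seen == 0)
p_left_0 = np.mean(left == 0)
print(p_seen_0, p_left_0)   # both should be close to 1 - rho = 0.5
```

The two empirical probabilities agree to within $O(1/n)$, as Theorem 1 predicts, and both approach $1-\rho$, as Theorems 2 and 3 predict.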

Analysis of the M/G/1 Queue:
- Consider the embedded Markov chain obtained by observing the system at departure epochs
- At steady-state, the embedded Markov chain and $\{N(t)\}$ are statistically identical
- The stationary distribution $p_n$ is equal to the stationary distribution of the embedded Markov chain
- $s_j$: time of the $j$th departure
- $L_j = N(s_j)$: number of customers left behind by the $j$th departing customer

Show that $\{L_j : j \ge 1\}$ is a Markov chain:
- If $L_{j-1} \ge 1$: customer $j$ enters service immediately at time $s_{j-1}$. Then:
$$L_j = L_{j-1} - 1 + A_j, \quad \text{if } L_{j-1} \ge 1$$
- If $L_{j-1} = 0$: customer $j$ arrives after time $s_{j-1}$ and departs at time $s_j$. Then:
$$L_j = A_j, \quad \text{if } L_{j-1} = 0$$
- Combining the above:
$$L_j = L_{j-1} + A_j - 1\{L_{j-1} > 0\}$$
- $A_j$: number of arrivals during the service time $X_j$:
$$P\{A_j = k\} = \int_0^\infty P\{A_j = k \mid X_j = t\}\, f_X(t)\,dt = \frac{1}{k!}\int_0^\infty e^{-\lambda t}(\lambda t)^k f_X(t)\,dt$$
- $A_1, A_2, \ldots$: independent, since they count arrivals in disjoint intervals

$L_j$ depends on the past only through $L_{j-1}$. Thus, $\{L_j : j \ge 1\}$ is a Markov chain.

$A_1, A_2, \ldots$ are iid. Drop the index $j$; this is equivalent to considering the system at steady state:
$$a_k = P\{A = k\} = \int_0^\infty P\{A = k \mid X = t\}\, f_X(t)\,dt = \frac{1}{k!}\int_0^\infty e^{-\lambda t}(\lambda t)^k f_X(t)\,dt, \quad k = 0, 1, \ldots$$
Find the first two moments of $A$.

Proposition: For the number of arrivals $A$ during service time $X$, we have:
$$E[A] = \lambda E[X], \qquad E[A^2] = \lambda^2 E[X^2] + \lambda E[X]$$
Proof: Given $X = t$, the number of arrivals $A$ follows the Poisson distribution with parameter $\lambda t$.
$$E[A] = \int_0^\infty E[A \mid X = t]\, f_X(t)\,dt = \int_0^\infty (\lambda t) f_X(t)\,dt = \lambda E[X]$$
$$E[A^2] = \int_0^\infty E[A^2 \mid X = t]\, f_X(t)\,dt = \int_0^\infty (\lambda^2 t^2 + \lambda t) f_X(t)\,dt = \lambda^2 E[X^2] + \lambda E[X]$$

Lemma: Let $Y$ be a RV following the Poisson distribution with parameter $\alpha > 0$. Then:
$$E[Y] = \sum_{k=1}^\infty k \frac{\alpha^k}{k!} e^{-\alpha} = \alpha, \qquad E[Y^2] = \sum_{k=1}^\infty k^2 \frac{\alpha^k}{k!} e^{-\alpha} = \alpha^2 + \alpha$$
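As a quick numerical check of the proposition (illustrative only; the rates are arbitrary choices), one can sample $A$ by first drawing the service time $X$ and then a Poisson count with parameter $\lambda X$:

```python
import numpy as np

# Monte Carlo check of E[A] = lam*E[X] and E[A^2] = lam^2*E[X^2] + lam*E[X],
# for exponential service times (an arbitrary illustrative choice).
rng = np.random.default_rng(1)
lam, mu, n = 2.0, 3.0, 1_000_000

X = rng.exponential(1 / mu, n)      # service times: E[X] = 1/3, E[X^2] = 2/9
A = rng.poisson(lam * X)            # arrivals during each service time

print(A.mean())       # ~ lam*E[X] = 2/3
print((A**2).mean())  # ~ lam^2*E[X^2] + lam*E[X] = 8/9 + 6/9 = 14/9
```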
- Transition probabilities: $P_{ij} = P\{L_{n+1} = j \mid L_n = i\}$
$$P_{0j} = a_j, \quad j = 0, 1, \ldots$$
$$P_{ij} = \begin{cases} a_{j-i+1}, & j \ge i-1 \\ 0, & j < i-1 \end{cases}, \quad i \ge 1$$
- Stationary distribution: $\pi_j = \lim_{n\to\infty} P\{L_n = j\}$
$$\pi = (\pi_0, \pi_1, \pi_2, \ldots), \quad \pi = \pi P, \quad \sum_{j=0}^\infty \pi_j = 1$$
$$\pi_0 = \pi_0 a_0 + \pi_1 a_0, \qquad \pi_j = \pi_0 a_j + \sum_{i=1}^{j+1} \pi_i\, a_{j-i+1}, \quad j \ge 1$$
Unique solution: $\pi_j$ is the fraction of departing customers that leave $j$ customers behind.
- From Theorem 3: $\pi_j$ is also the proportion of time that there are $j$ customers in the system
- Applying Little's Theorem for the server, the proportion of time that the server is busy is:
$$1 - \pi_0 = \lambda E[X] = \rho \;\Rightarrow\; \pi_0 = 1 - \rho$$
- The stationary distribution can be calculated iteratively, solving each balance equation for the highest-indexed unknown:
$$\pi_0 = \pi_0 a_0 + \pi_1 a_0$$
$$\pi_1 = \pi_0 a_1 + \pi_1 a_1 + \pi_2 a_0$$
$$\vdots$$
Iterative calculation might be prohibitively involved. Often, we want to find only the first few moments of the distribution, e.g., $E[N]$ and $E[N^2]$.
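The iterative calculation is easy to run numerically. The sketch below is illustrative: it uses the M/M/1 case, where $a_k = (1-\sigma)\sigma^k$ with $\sigma = \lambda/(\lambda+\mu)$ has a closed form and the known answer $\pi_j = (1-\rho)\rho^j$ serves as a sanity check.

```python
# Iterative computation of the stationary distribution pi_j from the a_k's,
# starting from pi_0 = 1 - rho and solving each balance equation in turn.
lam, mu = 0.5, 1.0
rho = lam / mu
s = lam / (lam + mu)
a = [(1 - s) * s**k for k in range(30)]     # M/M/1: a_k = (1-s) s^k

pi = [1 - rho]                              # pi_0 = 1 - rho
pi.append(pi[0] * (1 - a[0]) / a[0])        # from pi_0 = (pi_0 + pi_1) a_0
for j in range(1, 20):
    # pi_j = pi_0 a_j + sum_{i=1}^{j+1} pi_i a_{j-i+1}  =>  solve for pi_{j+1}
    acc = pi[0] * a[j] + sum(pi[i] * a[j - i + 1] for i in range(1, j + 1))
    pi.append((pi[j] - acc) / a[0])

print([round(p, 6) for p in pi[:5]])   # geometric: 0.5, 0.25, 0.125, ...
```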

   We will present a general methodology based on z-transforms that can be used to
1.   Find the moments of the stationary distribution without calculating the distribution itself
2.   Find the stationary distribution, in special cases
3.   Derive approximations of the stationary distribution


Definition: The moment generating function of a random variable $X$ is, for any $t \in \mathbb{R}$:
$$M_X(t) = E[e^{tX}] = \begin{cases} \displaystyle\int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx, & X \text{ continuous} \\[2mm] \displaystyle\sum_j e^{t x_j} P\{X = x_j\}, & X \text{ discrete} \end{cases}$$
Theorem 1: If the moment generating function M X (t ) exists and is finite in some neighborhood
of t=0, it determines the distribution (pdf or pmf) of X uniquely.

Theorem 2: For any positive integer $n$:
1. $\dfrac{d^n}{dt^n} M_X(t) = E[X^n e^{tX}]$
2. $\dfrac{d^n}{dt^n} M_X(0) = E[X^n]$

Theorem 3: If X and Y are independent random variables:
M X Y (t )  M X (t )M Y (t )
- For a discrete random variable, the moment generating function is a polynomial in $e^t$.
- It is more convenient to set $z = e^t$ and define the z-transform (or characteristic function):
$$G_X(z) = E[z^X] = \sum_j z^{x_j} P\{X = x_j\}$$
- Let $X$ be a discrete random variable taking values $0, 1, 2, \ldots$, and let $p_n = P\{X = n\}$. The z-transform is well-defined for $|z| \le 1$:
$$G_X(z) = p_0 + z p_1 + z^2 p_2 + z^3 p_3 + \cdots = \sum_{n=0}^\infty p_n z^n$$

   Z-transform uniquely determines the distribution of X
   If X and Y are independent random variables: GX Y ( z)  GX ( z )GY ( z )
Calculating factorial moments:
$$\lim_{z\to 1} G_X'(z) = \lim_{z\to 1} \sum_{n=1}^\infty n\, p_n z^{n-1} = \sum_{n=1}^\infty n\, p_n = E[X]$$
$$\lim_{z\to 1} G_X''(z) = \lim_{z\to 1} \sum_{n=2}^\infty n(n-1)\, p_n z^{n-2} = \sum_{n=2}^\infty n(n-1)\, p_n = E[X(X-1)]$$
   Higher factorial moments can be calculated similarly
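These limits are mechanical to evaluate with a computer algebra system. A small sketch (using sympy, with the Poisson transform $G_Y(z) = e^{\alpha(z-1)}$ as the example so it doubles as a check of the lemma above):

```python
import sympy as sp

# Factorial moments of a Poisson(alpha) RV from its z-transform
# G(z) = exp(alpha*(z-1)): G'(1) = E[Y], G''(1) = E[Y(Y-1)].
z, alpha = sp.symbols("z alpha", positive=True)
G = sp.exp(alpha * (z - 1))

EY = sp.diff(G, z).subs(z, 1)          # E[Y] = alpha
EYY1 = sp.diff(G, z, 2).subs(z, 1)     # E[Y(Y-1)] = alpha^2
EY2 = sp.simplify(EYY1 + EY)           # E[Y^2] = alpha^2 + alpha

print(EY, EYY1, EY2)
```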
We have established:
$$L_j = L_{j-1} - 1\{L_{j-1} > 0\} + A_j = (L_{j-1} - 1)^+ + A_j$$
Let $\pi_n = \lim_{j\to\infty} P\{L_j = n\}$ be the stationary distribution and $G_L(z) = \sum_{n=0}^\infty \pi_n z^n$ its z-transform.

Noting that $(L_{j-1} - 1)^+$ and $A_j$ are independent, we have:
$$E[z^{L_j}] = E[z^{(L_{j-1}-1)^+}]\, E[z^{A_j}]$$
At steady-state, $L_j$ and $L_{j-1}$ are statistically identical, with pmf $\pi_n$. Therefore:
$$E[z^{L_j}] = E[z^{L_{j-1}}] = G_L(z)$$
Moreover: $E[z^{A_j}] = G_A(z) = \sum_{n=0}^\infty a_n z^n$

Let $X$ be a discrete random variable taking values $0, 1, 2, \ldots$, and let $p_n = P\{X = n\}$. Then:
$$E[z^{(X-1)^+}] = p_0 + p_1 + z p_2 + z^2 p_3 + \cdots = p_0 + z^{-1}(E[z^X] - p_0)$$
Therefore:
$$E[z^{(L_{j-1}-1)^+}] = \pi_0 + z^{-1}(E[z^{L_{j-1}}] - \pi_0) = \pi_0 + z^{-1}(G_L(z) - \pi_0)$$
Then:
$$G_L(z) = \left[\pi_0 + z^{-1}(G_L(z) - \pi_0)\right] G_A(z) \;\Rightarrow\; G_L(z) = \frac{\pi_0 (z-1)\, G_A(z)}{z - G_A(z)}$$

Therefore, the z-transform of the pmf of $\{N(t) : t \ge 0\}$ is:
$$G_N(z) = \frac{\pi_0 (z-1)\, G_A(z)}{z - G_A(z)}$$
The probability $\pi_0$ can be calculated by requiring $\lim_{z\to 1} G_L(z) = \sum_{n=0}^\infty \pi_n = 1$. Using L'Hôpital's rule:
$$1 = \lim_{z\to 1} \frac{\pi_0 \left[G_A(z) + (z-1) G_A'(z)\right]}{1 - G_A'(z)} = \frac{\pi_0}{1 - E[A]} \;\Rightarrow\; \pi_0 = 1 - E[A]$$
Recall that $E[A] = \lambda E[X] = \rho$. For $\pi_0 > 0$, we must have $\rho = E[A] < 1$. Finally:
$$G_A(z) = \sum_{k=0}^\infty z^k a_k = \sum_{k=0}^\infty z^k \int_0^\infty e^{-\lambda x}\frac{(\lambda x)^k}{k!} f_X(x)\,dx = \int_0^\infty e^{-\lambda x}\sum_{k=0}^\infty \frac{(\lambda x z)^k}{k!} f_X(x)\,dx = \int_0^\infty e^{-\lambda(1-z)x} f_X(x)\,dx = M_X(\lambda(z-1))$$
where $M_X(t) = E[e^{tX}] = \int_0^\infty e^{tx} f_X(x)\,dx$ is the moment generating function of the service time $X$.

At steady-state, the number of customers left behind by a departing customer and the number of customers in the system are statistically identical, i.e., $\{L_j\}$ and $\{N(t)\}$ have the same pmf. Concluding:
$$G_N(z) = G_L(z) = \frac{(1-\rho)(z-1)\, G_A(z)}{z - G_A(z)} = \frac{(1-\rho)(z-1)\, M_X(\lambda(z-1))}{z - M_X(\lambda(z-1))}$$
Example 1: M/M/1 Queue. $X$ is exponentially distributed with mean $1/\mu$. Then:
$$M_X(t) = E[e^{tX}] = \frac{\mu}{\mu - t}$$
Then, the z-transform of the number of arrivals during a service time is:
$$G_A(z) = M_X(\lambda(z-1)) = \frac{\mu}{\mu - \lambda(z-1)} = \frac{\mu}{\mu + \lambda - \lambda z}$$
The P-K transform equation then gives:
$$G_N(z) = \frac{(1-\rho)(z-1)\, G_A(z)}{z - G_A(z)} = \frac{(1-\rho)(z-1)\mu}{z(\mu + \lambda - \lambda z) - \mu} = \frac{(1-\rho)(z-1)\mu}{(\mu - \lambda z)(z-1)} = \frac{1-\rho}{1 - \rho z}$$
For $|z| < 1/\rho$:
$$\frac{1}{1 - \rho z} = 1 + \rho z + \rho^2 z^2 + \cdots$$
Then:
$$G_N(z) = \sum_{n=0}^\infty (1-\rho)\rho^n z^n$$
Therefore:
$$p_n = (1-\rho)\rho^n, \quad n \ge 0$$
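The algebra leading from the P-K transform to the geometric pmf can be verified symbolically. A sketch using sympy (illustrative, not part of the original notes):

```python
import sympy as sp

# Symbolic check of the M/M/1 P-K transform algebra:
# G_N(z) = (1-rho)(z-1) G_A(z) / (z - G_A(z)) should equal (1-rho)/(1-rho*z).
z = sp.symbols("z")
lam, mu = sp.symbols("lam mu", positive=True)
rho = lam / mu

M_X = lambda t: mu / (mu - t)         # MGF of an Exp(mu) service time
G_A = M_X(lam * (z - 1))              # = mu / (mu + lam - lam*z)
G_N = (1 - rho) * (z - 1) * G_A / (z - G_A)

diff = sp.simplify(G_N - (1 - rho) / (1 - rho * z))
print(diff)   # expect 0
```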
Assume that the z-transform is of the form:
$$G(z) = \frac{U(z)}{V(z)}$$
with $U(z)$ and $V(z)$ polynomials without common roots. Let $z_1, \ldots, z_m$ be the roots of $V(z)$, assumed distinct. Then:
$$V(z) = (z - z_1)(z - z_2)\cdots(z - z_m)$$
Expansion of $G(z)$ in partial fractions:
$$G(z) = \frac{\alpha_1}{z_1 - z} + \frac{\alpha_2}{z_2 - z} + \cdots + \frac{\alpha_m}{z_m - z}$$
Given such an expansion, for $|z| < |z_k|$:
$$\frac{1}{1 - z/z_k} = 1 + \frac{z}{z_k} + \left(\frac{z}{z_k}\right)^2 + \cdots$$
Then:
$$G(z) = \sum_{k=1}^m \frac{\alpha_k}{z_k}\cdot\frac{1}{1 - z/z_k} = \sum_{k=1}^m \frac{\alpha_k}{z_k} \sum_{n=0}^\infty \left(\frac{z}{z_k}\right)^n = \sum_{n=0}^\infty \left(\sum_{k=1}^m \frac{\alpha_k}{z_k^{\,n+1}}\right) z^n$$
Therefore:
$$p_n = \sum_{k=1}^m \frac{\alpha_k}{z_k^{\,n+1}}$$
Determining the coefficients of the partial fractions:
$$\alpha_1 = \lim_{z\to z_1} (z_1 - z)\, G(z)$$
Note that:
$$\lim_{z\to z_1} (z_1 - z)\, G(z) = \frac{-U(z_1)}{(z_1 - z_2)\cdots(z_1 - z_m)} = -\frac{U(z_1)}{V'(z_1)}$$
Therefore, the coefficients can be determined as:
$$\alpha_k = \lim_{z\to z_k} (z_k - z)\, G(z) = -\frac{U(z_k)}{V'(z_k)}$$
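This inversion recipe translates directly into code. A numerical sketch (illustrative; the function name `invert` and the M/M/1 test case are my choices, and simple roots are assumed as above):

```python
import numpy as np

# Invert G(z) = U(z)/V(z) via partial fractions: p_n = sum_k alpha_k / z_k^(n+1),
# with alpha_k = -U(z_k)/V'(z_k) (simple roots assumed).
def invert(U, V, n_max):
    """U, V: polynomial coefficients, highest power first (np.polyval order)."""
    roots = np.roots(V)
    Vp = np.polyder(V)
    alphas = [-np.polyval(U, r) / np.polyval(Vp, r) for r in roots]
    return [sum(a / r**(n + 1) for a, r in zip(alphas, roots)).real
            for n in range(n_max)]

# Illustrative check with the M/M/1 transform G_N(z) = (1-rho)/(1 - rho*z),
# whose inversion is the geometric pmf p_n = (1-rho)*rho^n.
rho = 0.5
p = invert([1 - rho], [-rho, 1.0], 6)   # U(z) = 1-rho, V(z) = 1 - rho*z
print([round(x, 6) for x in p])         # 0.5, 0.25, 0.125, ...
```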
- $W_i$: waiting time of customer $i$
- $X_i$: service time of customer $i$
- $Q_i$: number of customers waiting in queue (excluding the one in service) upon arrival of customer $i$
- $R_i$: residual service time of customer $i$ = time until the customer found in service by customer $i$ completes service
- $A_i$: number of arrivals during the service time $X_i$ of customer $i$

Service Times:
- $X_1, X_2, \ldots$: independent identically distributed RVs
- Independent of the inter-arrival times
- Follow a general distribution characterized by its pdf $f_X(x)$, or cdf $F_X(x)$
- Common mean $E[X] = 1/\mu$
- Common second moment $E[X^2]$

State Representation:
- $\{N(t) : t \ge 0\}$ is not a Markov process, since the time spent at each state is not exponential
- $R(t)$: the time until the customer that is in service at time $t$ completes service
- $\{(N(t), R(t)) : t \ge 0\}$ is a continuous-time Markov process, but the state space is not a countable set
- Finding the stationary distribution can be a rather challenging task

Goals:
- Calculate the average number of customers and the average time-delay without first calculating the stationary distribution
- Pollaczek-Khinchin (P-K) formula:
$$E[W] = \frac{\lambda E[X^2]}{2(1 - \lambda E[X])}$$
- To find the stationary distribution, we use the embedded Markov chain, defined by observing $N(t)$ at departure epochs only (transform methods)


- Assume FCFS discipline
- The waiting time of customer $i$ is:
$$W_i = R_i + X_{i-1} + X_{i-2} + \cdots + X_{i-Q_i} = R_i + \sum_{j=i-Q_i}^{i-1} X_j$$
- Take the expectation on both sides and let $i \to \infty$, assuming the limits exist:
$$E[W_i] = E[R_i] + E\Big[\sum_{j=i-Q_i}^{i-1} X_j\Big] = E[R_i] + E[X]\,E[Q_i] \;\Rightarrow\; E[W] = E[R] + E[X]\,E[Q]$$
- The averages $E[Q]$, $E[R]$ in the above equation are those seen by an arriving customer.
- Poisson arrivals and lack of anticipation: averages seen by an arriving customer are equal to averages seen by an outside observer (the PASTA property)
- Little's theorem for the waiting area only:
$$E[Q] = \lambda E[W]$$
- Therefore:
$$E[W] = E[R] + E[X]\,\lambda E[W] = E[R] + \rho E[W] \;\Rightarrow\; E[W] = \frac{E[R]}{1 - \rho}$$
- $\rho = \lambda E[X] = \lambda/\mu$: utilization factor = proportion of time the server is busy:
$$\rho = \lambda E[X] = 1 - p_0$$
It remains to calculate the average residual time $E[R] = \lim_{i\to\infty} E[R_i]$.

Proposition (sum of a random number of random variables): Let
- $N$: random variable taking values $0, 1, 2, \ldots$, with mean $E[N]$, independent of the $X_j$
- $X_1, X_2, \ldots, X_N$: iid random variables with common mean $E[X]$
Then: $E[X_1 + \cdots + X_N] = E[X]\,E[N]$

Proof: Given that $N = n$, the expected value of the sum is:
$$E\Big[\sum_{j=1}^N X_j \,\Big|\, N = n\Big] = E\Big[\sum_{j=1}^n X_j\Big] = \sum_{j=1}^n E[X_j] = n E[X]$$
Then:
$$E\Big[\sum_{j=1}^N X_j\Big] = \sum_{n=1}^\infty E\Big[\sum_{j=1}^N X_j \,\Big|\, N = n\Big] P\{N = n\} = \sum_{n=1}^\infty n E[X]\, P\{N = n\} = E[X] \sum_{n=1}^\infty n P\{N = n\} = E[X]\,E[N]$$
- Assume that the steady-state averages $E[Q]$, $E[R]$, $E[W]$ exist
- Ergodic argument: steady-state averages are equal to long-term time averages
- Long-term average residual time: graphical calculation

Graphical calculation of the long-term average of the residual time:
- Time-average residual time over $[0,t]$: $\frac{1}{t}\int_0^t R(s)\,ds$
- Consider a time $t$ such that $R(t) = 0$. Let $D(t)$ be the number of departures in $[0,t]$ and assume that $R(0) = 0$. From the figure ($R(s)$ traces one triangle per service, each of area $X_i^2/2$) we see that:
$$\frac{1}{t}\int_0^t R(s)\,ds = \frac{1}{t}\sum_{i=1}^{D(t)} \frac{X_i^2}{2} = \frac{1}{2}\cdot\frac{D(t)}{t}\cdot\frac{\sum_{i=1}^{D(t)} X_i^2}{D(t)}$$
$$\lim_{t\to\infty} \frac{1}{t}\int_0^t R(s)\,ds = \frac{1}{2} \lim_{t\to\infty} \frac{D(t)}{t} \lim_{t\to\infty} \frac{\sum_{i=1}^{D(t)} X_i^2}{D(t)}$$
- Ergodicity: long-term time averages = steady-state averages (with probability 1):
$$E[R] = \lim_{i\to\infty} E[R_i] = \lim_{t\to\infty} \frac{1}{t}\int_0^t R(s)\,ds$$
- $\lim_{t\to\infty} D(t)/t$ is the long-term average departure rate. It should be equal to the long-term average arrival rate; long-term averages = steady-state averages (with probability 1):
$$\lim_{t\to\infty} \frac{D(t)}{t} = \lambda$$
- Law of large numbers:
$$\lim_{t\to\infty} \frac{\sum_{i=1}^{D(t)} X_i^2}{D(t)} = \lim_{n\to\infty} \frac{\sum_{i=1}^{n} X_i^2}{n} = E[X^2]$$
Average residual time:
$$E[R] = \frac{1}{2}\lambda E[X^2]$$
P-K Formula:
$$E[W] = \frac{E[R]}{1-\rho} = \frac{\lambda E[X^2]}{2(1-\rho)}$$
- Average time a customer spends in the system:
$$E[T] = E[X] + E[W] = \frac{1}{\mu} + \frac{\lambda E[X^2]}{2(1-\rho)}$$
- Average number of customers waiting for service:
$$E[Q] = \lambda E[W] = \frac{\lambda^2 E[X^2]}{2(1-\rho)}$$
- Average number of customers in the system (waiting or in service):
$$E[N] = \lambda E[T] = \rho + \frac{\lambda^2 E[X^2]}{2(1-\rho)}$$
The averages $E[W]$, $E[T]$, $E[Q]$, $E[N]$ depend on the service-time distribution only through its first two moments.

M/D/1 Queue: deterministic service times, all equal to $1/\mu$:
$$E[X] = \frac{1}{\mu}, \qquad E[X^2] = \frac{1}{\mu^2}$$
$$E[W] = \frac{\lambda E[X^2]}{2(1-\rho)} = \frac{\rho}{2\mu(1-\rho)}, \qquad E[Q] = \frac{\lambda^2 E[X^2]}{2(1-\rho)} = \frac{\rho^2}{2(1-\rho)}$$
$$E[T] = \frac{1}{\mu} + \frac{\rho}{2\mu(1-\rho)} = \frac{2-\rho}{2\mu(1-\rho)}, \qquad E[N] = \lambda E[T] = \frac{\rho(2-\rho)}{2(1-\rho)}$$

M/M/1 Queue: exponential service times with mean $1/\mu$:
$$E[X] = \frac{1}{\mu}, \qquad E[X^2] = \frac{2}{\mu^2}$$
$$E[W] = \frac{\lambda E[X^2]}{2(1-\rho)} = \frac{\rho}{\mu(1-\rho)}, \qquad E[Q] = \frac{\lambda^2 E[X^2]}{2(1-\rho)} = \frac{\rho^2}{1-\rho}$$
$$E[T] = \frac{1}{\mu} + \frac{\rho}{\mu(1-\rho)} = \frac{1}{\mu(1-\rho)} = \frac{1}{\mu - \lambda}, \qquad E[N] = \lambda E[T] = \frac{\rho}{1-\rho}$$
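Both sets of formulas can be checked against a simulation via the Lindley recursion $W_{i+1} = \max(0,\, W_i + X_i - \tau_{i+1})$, where $\tau_{i+1}$ is an interarrival time. A sketch (illustrative; the rates and sample size are arbitrary choices):

```python
import numpy as np

# Check the P-K formula E[W] = lam*E[X^2] / (2*(1-rho)) by simulating
# waiting times with the Lindley recursion W_{i+1} = max(0, W_i + X_i - tau).
def mean_wait(service_sampler, lam=0.5, n=400_000, seed=3):
    rng = np.random.default_rng(seed)
    tau = rng.exponential(1 / lam, n)       # Poisson interarrival times
    X = service_sampler(rng, n)
    W = np.empty(n)
    W[0] = 0.0
    for i in range(n - 1):
        W[i + 1] = max(0.0, W[i] + X[i] - tau[i + 1])
    return W.mean()

lam, mu = 0.5, 1.0
rho = lam / mu
w_md1 = mean_wait(lambda rng, n: np.full(n, 1 / mu))         # deterministic
w_mm1 = mean_wait(lambda rng, n: rng.exponential(1 / mu, n)) # exponential

print(w_md1)  # P-K: lam*(1/mu^2)/(2*(1-rho)) = 0.5
print(w_mm1)  # P-K: lam*(2/mu^2)/(2*(1-rho)) = 1.0
```

Note that the M/D/1 mean wait is half the M/M/1 value at the same load, exactly as the formulas predict.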
- M/G/1 system with arriving customers divided in $c$ priority classes
- Class 1 is the highest priority, class 2 the second highest, up to class $c$, which is the lowest priority class
- Class $k$ customers arrive according to a Poisson process with rate $\lambda_k$
- Service times of class $k$ customers are iid, following a general distribution with mean $E[X_k] = 1/\mu_k$ and second moment $E[X_k^2]$
- The arrival processes are independent of each other and independent of the service times
- $\rho_k = \lambda_k E[X_k] = \lambda_k/\mu_k$: utilization factor for class $k$
- $W_k$: average queueing time (at steady state) of class $k$ customers
- Preemptive or non-preemptive priority discipline

We develop a formula that gives the average queueing time for each priority class.

- Non-preemptive: the service of a customer completes uninterrupted, even if customers of higher priority arrive in the meantime
- A separate queue is maintained for each class; each time the server becomes free, the first customer in the highest-priority non-empty queue enters service
- Non-preemptive policy: the mean residual service time $R$ seen by an arriving customer is the same for all priority classes

Priority Class 1:
- Queueing time of a class 1 customer = residual service time + time required to serve all class 1 customers found in the queue upon arrival
- Similarly to the derivation of the P-K formula, this implies:
$$W_1 = R + \frac{1}{\mu_1} Q_1$$
- Little's formula: $Q_1 = \lambda_1 W_1$
- Combining the two:
$$W_1 = R + \rho_1 W_1 \;\Rightarrow\; W_1 = \frac{R}{1 - \rho_1}$$
Priority Class 2:
- The queueing time of a class 2 customer is the sum of the following:
1. Residual service time
2. Time to serve all class 1 customers found in queue upon arrival
3. Time to serve all class 2 customers found in queue upon arrival
4. Time to serve all class 1 customers that arrive while the customer waits in the queue
- Focusing on the averages of these times:
$$W_2 = R + \frac{1}{\mu_1} Q_1 + \frac{1}{\mu_2} Q_2 + \frac{1}{\mu_1}\lambda_1 W_2 = R + \rho_1 W_1 + \rho_2 W_2 + \rho_1 W_2$$
- Solving for $W_2$ and using the expression for $W_1$:
$$W_2 = \frac{R}{(1 - \rho_1)(1 - \rho_1 - \rho_2)}$$

Priority Class $k$: using induction:
$$W_k = \frac{R}{(1 - \rho_1 - \cdots - \rho_{k-1})(1 - \rho_1 - \cdots - \rho_k)}$$
- Mean residual service time: using the graphical method developed in the proof of the P-K formula, one can show:
$$R = \frac{1}{2}\sum_{k=1}^{c} \lambda_k E[X_k^2]$$
- Average time a class $k$ customer spends in the system:
$$T_k = \frac{1}{\mu_k} + W_k$$
- Using Little's formula we obtain the number of customers in the system for each class; averaging over all customers, the average time delay per customer is:
$$T = \frac{\lambda_1 T_1 + \cdots + \lambda_c T_c}{\lambda_1 + \cdots + \lambda_c}$$
- Preemptive/resume policy: the service of a customer is interrupted when a higher-priority customer arrives; it resumes from the point of interruption when all higher-priority customers have been served
- Priority $k$ customers are not affected by the presence of lower-priority customers
- Calculate $T_k$, the average time a class $k$ customer spends in the system. This consists of:
1. The average service time of the customer, $1/\mu_k$
2. The average time required to serve customers of priority 1 through $k$ found in the system upon arrival. This is equal to the average waiting time in an M/G/1 system where customers of priority lower than $k$ are neglected, that is:
$$\frac{R_k}{1 - \rho_1 - \cdots - \rho_k}, \qquad R_k = \frac{1}{2}\sum_{i=1}^{k} \lambda_i E[X_i^2]$$
3. The average time required to serve customers of priority higher than $k$ that arrive while the customer is in the system:
$$\sum_{i=1}^{k-1} \frac{1}{\mu_i}\lambda_i T_k = \sum_{i=1}^{k-1} \rho_i T_k, \quad k > 1$$
Combining the terms:
$$T_k = \frac{1}{\mu_k} + \frac{R_k}{1 - \rho_1 - \cdots - \rho_k} + T_k(\rho_1 + \cdots + \rho_{k-1})$$
Final solution:
$$T_k = \frac{(1/\mu_k)(1 - \rho_1 - \cdots - \rho_k) + R_k}{(1 - \rho_1 - \cdots - \rho_{k-1})(1 - \rho_1 - \cdots - \rho_k)}$$
where:
$$R_k = \frac{1}{2}\sum_{i=1}^{k} \lambda_i E[X_i^2]$$
