
TCOM 501:
Networking Theory & Fundamentals

Lecture 2
January 22, 2003
Prof. Yannis A. Korilis

2-2   Topics

      - Delay in Packet Networks
      - Introduction to Queueing Theory
      - Review of Probability Theory
      - The Poisson Process
      - Little’s Theorem
          - Proof and Intuitive Explanation
          - Applications

2-3   Sources of Network Delay

      - Processing Delay
          - Assume processing power is not a constraint
      - Queueing Delay
          - Time buffered waiting for transmission
      - Transmission Delay
      - Propagation Delay
          - Time spent on the link – propagation of the electrical signal
          - Independent of traffic carried by the link
      Focus: Queueing & Transmission Delay

2-4   Basic Queueing Model

      [Diagram: Arrivals → Buffer (queued customers) → Server(s) (in service) → Departures]

      - A queue models any service station with:
          - One or multiple servers
          - A waiting area or buffer
      - Customers arrive to receive service
      - A customer that upon arrival does not find a free server waits in the buffer

2-5   Characteristics of a Queue

      [Diagram: buffer of size b feeding m servers]

      - Number of servers m: one, multiple, infinite
      - Buffer size b
      - Service discipline (scheduling): FCFS, LCFS, Processor Sharing (PS), etc.
      - Arrival process
      - Service statistics

2-6   Arrival Process

      [Timeline: arrivals n−1, n, n+1 at times t_{n−1}, t_n, t_{n+1}; interarrival time τ_n]

      - τ_n: interarrival time between customers n and n+1
      - τ_n is a random variable
      - {τ_n, n ≥ 1} is a stochastic process
      - Interarrival times are identically distributed and have a common mean:

            $E[\tau_n] = E[\tau] = 1/\lambda$

      - λ is called the arrival rate

2-7   Service-Time Process

      [Timeline: arrivals n−1, n, n+1; service time s_n]

      - s_n: service time of customer n at the server
      - {s_n, n ≥ 1} is a stochastic process
      - Service times are identically distributed with common mean:

            $E[s_n] = E[s] = 1/\mu$

      - μ is called the service rate
      - For packets, are the service times really random?

2-8   Queue Descriptors

      - Generic descriptor: A/S/m/k
      - A denotes the arrival process
          - For Poisson arrivals we use M (for Markovian)
      - S denotes the service-time distribution
          - M: exponential distribution
          - D: deterministic service times
          - G: general distribution
      - m is the number of servers
      - k is the max number of customers allowed in the system – either in the buffer or in service
          - k is omitted when the buffer size is infinite

2-9   Queue Descriptors: Examples

      - M/M/1: Poisson arrivals, exponentially distributed service times, one server, infinite buffer
      - M/M/m: same as previous with m servers
      - M/M/m/m: Poisson arrivals, exponentially distributed service times, m servers, no buffering
      - M/G/1: Poisson arrivals, identically distributed service times following a general distribution, one server, infinite buffer
      - */D/∞: a constant-delay system

2-10   Probability Fundamentals

      - Exponential Distribution
      - Memoryless Property
      - Poisson Distribution
      - Poisson Process
          - Definition and Properties
          - Interarrival Time Distribution
          - Modeling Arrival and Service Statistics

2-11   The Exponential Distribution

      - A continuous RV X follows the exponential distribution with parameter μ if its probability density function is:

            $f_X(x) = \begin{cases} \mu e^{-\mu x} & \text{if } x \ge 0 \\ 0 & \text{if } x < 0 \end{cases}$

      - Probability distribution function:

            $F_X(x) = P\{X \le x\} = \begin{cases} 1 - e^{-\mu x} & \text{if } x \ge 0 \\ 0 & \text{if } x < 0 \end{cases}$

2-12   Exponential Distribution (cont.)

      - Mean and variance:

            $E[X] = \frac{1}{\mu}, \quad \mathrm{Var}(X) = \frac{1}{\mu^2}$

      - Proof:

            $E[X] = \int_0^\infty x f_X(x)\,dx = \int_0^\infty x \mu e^{-\mu x}\,dx
                  = \left[ -x e^{-\mu x} \right]_0^\infty + \int_0^\infty e^{-\mu x}\,dx = \frac{1}{\mu}$

            $E[X^2] = \int_0^\infty x^2 \mu e^{-\mu x}\,dx
                    = \left[ -x^2 e^{-\mu x} \right]_0^\infty + 2 \int_0^\infty x e^{-\mu x}\,dx
                    = \frac{2}{\mu} E[X] = \frac{2}{\mu^2}$

            $\mathrm{Var}(X) = E[X^2] - (E[X])^2 = \frac{2}{\mu^2} - \frac{1}{\mu^2} = \frac{1}{\mu^2}$

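A quick Monte Carlo sanity check of these two formulas (not from the original slides; the rate μ = 2.0 is an arbitrary choice), sketched in Python:

```python
# Sketch: empirical check of E[X] = 1/mu and Var(X) = 1/mu^2 for
# exponential samples. mu = 2.0 is an arbitrary illustrative value.
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0
samples = rng.exponential(scale=1.0 / mu, size=1_000_000)  # scale = 1/mu

print(f"empirical mean     {samples.mean():.4f}  vs  1/mu   = {1/mu:.4f}")
print(f"empirical variance {samples.var():.4f}  vs  1/mu^2 = {1/mu**2:.4f}")
```
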
2-13   Memoryless Property

      - Past history has no influence on the future:

            $P\{X > x + t \mid X > t\} = P\{X > x\}$

      - Proof:

            $P\{X > x + t \mid X > t\} = \frac{P\{X > x + t,\ X > t\}}{P\{X > t\}} = \frac{P\{X > x + t\}}{P\{X > t\}}
                                       = \frac{e^{-\mu (x + t)}}{e^{-\mu t}} = e^{-\mu x} = P\{X > x\}$

      - Exponential: the only continuous distribution with the memoryless property

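An empirical illustration of the memoryless property (not part of the original slides; μ, x, and t are arbitrary choices):

```python
# Sketch: check P{X > x+t | X > t} = P{X > x} for exponential X.
import numpy as np

rng = np.random.default_rng(1)
mu, x, t = 1.5, 0.4, 2.0
X = rng.exponential(scale=1.0 / mu, size=2_000_000)

cond = (X > x + t).sum() / (X > t).sum()   # P{X > x+t | X > t}
uncond = (X > x).mean()                    # P{X > x}
print(f"conditional {cond:.4f}  unconditional {uncond:.4f}  "
      f"exact {np.exp(-mu * x):.4f}")
```
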
2-14   Poisson Distribution

      - A discrete RV X follows the Poisson distribution with parameter λ if its probability mass function is:

            $P\{X = k\} = e^{-\lambda} \frac{\lambda^k}{k!}, \quad k = 0, 1, 2, \ldots$

      - Wide applicability in modeling the number of random events that occur during a given time interval – the Poisson process:
          - Customers that arrive at a post office during a day
          - Wrong phone calls received during a week
          - Students that go to the instructor’s office during office hours
          - … and packets that arrive at a network switch

2-15   Poisson Distribution (cont.)

      - Mean and variance:

            $E[X] = \lambda, \quad \mathrm{Var}(X) = \lambda$

      - Proof:

            $E[X] = \sum_{k=0}^{\infty} k\, P\{X = k\} = e^{-\lambda} \sum_{k=0}^{\infty} k \frac{\lambda^k}{k!}
                  = e^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^k}{(k-1)!}
                  = \lambda e^{-\lambda} \sum_{j=0}^{\infty} \frac{\lambda^j}{j!}
                  = \lambda e^{-\lambda} e^{\lambda} = \lambda$

            $E[X^2] = \sum_{k=0}^{\infty} k^2\, P\{X = k\} = e^{-\lambda} \sum_{k=0}^{\infty} k^2 \frac{\lambda^k}{k!}
                    = e^{-\lambda} \sum_{k=1}^{\infty} k \frac{\lambda^k}{(k-1)!}
                    = \lambda e^{-\lambda} \sum_{j=0}^{\infty} (j+1) \frac{\lambda^j}{j!}$

            $\phantom{E[X^2]} = \lambda e^{-\lambda} \left( \sum_{j=0}^{\infty} j \frac{\lambda^j}{j!} + \sum_{j=0}^{\infty} \frac{\lambda^j}{j!} \right)
                    = \lambda e^{-\lambda} \left( \lambda e^{\lambda} + e^{\lambda} \right) = \lambda^2 + \lambda$

            $\mathrm{Var}(X) = E[X^2] - (E[X])^2 = \lambda^2 + \lambda - \lambda^2 = \lambda$

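A quick numerical check that the Poisson mean and variance are both λ (not from the original slides; λ = 4.0 is arbitrary):

```python
# Sketch: for Poisson samples the empirical mean and variance should
# both be close to lambda.
import numpy as np

rng = np.random.default_rng(2)
lam = 4.0
X = rng.poisson(lam, size=1_000_000)
print(f"mean {X.mean():.4f}  variance {X.var():.4f}  lambda {lam}")
```
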
2-16   Sum of Poisson Random Variables

      - X_i, i = 1, 2, …, n, are independent RVs
      - X_i follows a Poisson distribution with parameter λ_i
      - Partial sum defined as:

            $S_n = X_1 + X_2 + \ldots + X_n$

      - S_n follows a Poisson distribution with parameter λ:

            $\lambda = \lambda_1 + \lambda_2 + \ldots + \lambda_n$

2-17   Sum of Poisson Random Variables (cont.)

      Proof: For n = 2; generalization by induction. The pmf of S = X₁ + X₂ is:

            $P\{S = m\} = \sum_{k=0}^{m} P\{X_1 = k,\ X_2 = m-k\}
                        = \sum_{k=0}^{m} P\{X_1 = k\}\, P\{X_2 = m-k\}$

            $= \sum_{k=0}^{m} e^{-\lambda_1} \frac{\lambda_1^k}{k!} \cdot e^{-\lambda_2} \frac{\lambda_2^{m-k}}{(m-k)!}
             = e^{-(\lambda_1 + \lambda_2)} \frac{1}{m!} \sum_{k=0}^{m} \frac{m!}{k!\,(m-k)!} \lambda_1^k \lambda_2^{m-k}$

            $= e^{-(\lambda_1 + \lambda_2)} \frac{(\lambda_1 + \lambda_2)^m}{m!}$

      Poisson with parameter λ = λ₁ + λ₂.

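A simulation check of this result (not from the original slides; the rates are arbitrary):

```python
# Sketch: the sum of independent Poisson(lam1) and Poisson(lam2)
# samples should match the pmf of Poisson(lam1 + lam2).
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)
lam1, lam2 = 1.2, 2.3
S = rng.poisson(lam1, 500_000) + rng.poisson(lam2, 500_000)

for m in range(6):
    print(f"P{{S={m}}}  empirical {np.mean(S == m):.4f}"
          f"  Poisson({lam1 + lam2}) {poisson.pmf(m, lam1 + lam2):.4f}")
```
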
2-18   Sampling a Poisson Variable

      - X follows a Poisson distribution with parameter λ
      - Each of the X arrivals is of type i with probability p_i, i = 1, 2, …, n, independently of other arrivals; p₁ + p₂ + … + p_n = 1
      - X_i denotes the number of type-i arrivals
      - X₁, X₂, …, X_n are independent
      - X_i follows a Poisson distribution with parameter λ_i = λ p_i

2-19   Sampling a Poisson Variable (cont.)

      Proof: For n = 2; generalize by induction. Joint pmf:

            $P\{X_1 = k_1,\ X_2 = k_2\} = P\{X_1 = k_1,\ X_2 = k_2 \mid X = k_1 + k_2\}\, P\{X = k_1 + k_2\}$

            $= \binom{k_1 + k_2}{k_1} p_1^{k_1} p_2^{k_2} \cdot e^{-\lambda} \frac{\lambda^{k_1 + k_2}}{(k_1 + k_2)!}
             = \frac{1}{k_1!\, k_2!} (\lambda p_1)^{k_1} (\lambda p_2)^{k_2}\, e^{-\lambda (p_1 + p_2)}$

            $= e^{-\lambda p_1} \frac{(\lambda p_1)^{k_1}}{k_1!} \cdot e^{-\lambda p_2} \frac{(\lambda p_2)^{k_2}}{k_2!}$

      - X₁ and X₂ are independent
      - $P\{X_1 = k_1\} = e^{-\lambda p_1} \dfrac{(\lambda p_1)^{k_1}}{k_1!}, \quad
         P\{X_2 = k_2\} = e^{-\lambda p_2} \dfrac{(\lambda p_2)^{k_2}}{k_2!}$

      X_i follows a Poisson distribution with parameter λ p_i.

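A simulation of this splitting ("thinning") result (not from the original slides; λ and p are arbitrary):

```python
# Sketch: split Poisson(lam) arrivals into type 1 with prob p and
# type 2 otherwise; each count should be Poisson with rate lam*p,
# lam*(1-p), and the two counts should be (nearly) uncorrelated.
import numpy as np

rng = np.random.default_rng(4)
lam, p, trials = 5.0, 0.3, 500_000
X = rng.poisson(lam, trials)
X1 = rng.binomial(X, p)     # type-1 arrivals among the X
X2 = X - X1                 # type-2 arrivals

print(f"E[X1] {X1.mean():.4f} vs lam*p     {lam * p:.4f}")
print(f"E[X2] {X2.mean():.4f} vs lam*(1-p) {lam * (1 - p):.4f}")
print(f"corr(X1, X2) {np.corrcoef(X1, X2)[0, 1]:.4f}  (should be ~0)")
```
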
2-20   Poisson Approximation to Binomial

      - Binomial distribution with parameters (n, p):

            $P\{X = k\} = \binom{n}{k} p^k (1-p)^{n-k}$

      - As n → ∞ and p → 0, with np = λ moderate, the binomial distribution converges to Poisson with parameter λ

      Proof:

            $P\{X = k\} = \binom{n}{k} p^k (1-p)^{n-k}
                        = \frac{(n-k+1) \cdots (n-1)\, n}{k!} \left( \frac{\lambda}{n} \right)^k \left( 1 - \frac{\lambda}{n} \right)^{n-k}$

      Using the limits

            $\frac{(n-k+1) \cdots (n-1)\, n}{n^k} \to 1, \quad
             \left( 1 - \frac{\lambda}{n} \right)^n \to e^{-\lambda}, \quad
             \left( 1 - \frac{\lambda}{n} \right)^{-k} \to 1,$

      we obtain

            $P\{X = k\} \to e^{-\lambda} \frac{\lambda^k}{k!}$

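A numeric illustration of the limit (not from the original slides; λ = 2.0 and the values of n are arbitrary):

```python
# Sketch: with n large and p = lam/n, the Binomial(n, p) pmf
# approaches the Poisson(lam) pmf, shown here for k = 0..4.
from scipy.stats import binom, poisson

lam = 2.0
for n in (10, 100, 10_000):
    p = lam / n
    row = "  ".join(f"{binom.pmf(k, n, p):.5f}" for k in range(5))
    print(f"n={n:>6}: {row}")
print("Poisson:  " + "  ".join(f"{poisson.pmf(k, lam):.5f}" for k in range(5)))
```
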
2-21   Poisson Process with Rate λ

      - {A(t): t ≥ 0} counting process
          - A(t) is the number of events (arrivals) that have occurred from time 0 – when A(0) = 0 – to time t
          - A(t) − A(s): number of arrivals in interval (s, t]
      - Numbers of arrivals in disjoint intervals are independent
      - Number of arrivals in any interval (t, t+τ] of length τ:
          - Depends only on its length τ
          - Follows a Poisson distribution with parameter λτ:

                $P\{A(t+\tau) - A(t) = n\} = e^{-\lambda \tau} \frac{(\lambda \tau)^n}{n!}, \quad n = 0, 1, \ldots$

          - Average number of arrivals is λτ; λ is the arrival rate

2-22   Interarrival-Time Statistics

      - Interarrival times for a Poisson process are independent and follow an exponential distribution with parameter λ:
          - t_n: time of the nth arrival; τ_n = t_{n+1} − t_n: nth interarrival time

                $P\{\tau_n \le s\} = 1 - e^{-\lambda s}, \quad s \ge 0$

      Proof:
      - Probability distribution function:

            $P\{\tau_n \le s\} = 1 - P\{\tau_n > s\} = 1 - P\{A(t_n + s) - A(t_n) = 0\} = 1 - e^{-\lambda s}$

      - Independence follows from the independence of the numbers of arrivals in disjoint intervals

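The correspondence also works in reverse: summing iid exponential interarrival times yields a Poisson process. A simulation sketch (not from the original slides; λ and the horizon are arbitrary):

```python
# Sketch: build a Poisson process from iid exponential interarrival
# times; counts in unit intervals should be Poisson(lam), so their
# mean and variance should both be close to lam.
import numpy as np

rng = np.random.default_rng(5)
lam, horizon = 3.0, 100_000.0
taus = rng.exponential(scale=1.0 / lam, size=int(2 * lam * horizon))
arrivals = np.cumsum(taus)
arrivals = arrivals[arrivals < horizon]

counts = np.bincount(arrivals.astype(int), minlength=int(horizon))
print(f"mean count per unit interval {counts.mean():.4f} vs lam {lam}")
print(f"variance of counts           {counts.var():.4f} (Poisson: also lam)")
```
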
2-23   Small Interval Probabilities

      - Interval (t, t+δ] of length δ:

            $P\{A(t+\delta) - A(t) = 0\} = 1 - \lambda\delta + o(\delta)$

            $P\{A(t+\delta) - A(t) = 1\} = \lambda\delta + o(\delta)$

            $P\{A(t+\delta) - A(t) \ge 2\} = o(\delta)$

      Proof:

            $P\{A(t+\delta) - A(t) = 0\} = e^{-\lambda\delta}
                = 1 - \lambda\delta + \frac{(\lambda\delta)^2}{2} - \ldots = 1 - \lambda\delta + o(\delta)$

            $P\{A(t+\delta) - A(t) = 1\} = e^{-\lambda\delta} \lambda\delta
                = \lambda\delta \left( 1 - \lambda\delta + \frac{(\lambda\delta)^2}{2} - \ldots \right) = \lambda\delta + o(\delta)$

            $P\{A(t+\delta) - A(t) \ge 2\} = 1 - \sum_{k=0}^{1} P\{A(t+\delta) - A(t) = k\}
                = 1 - (1 - \lambda\delta + o(\delta)) - (\lambda\delta + o(\delta)) = o(\delta)$

2-24   Merging & Splitting Poisson Processes

      [Diagram: merging λ₁ and λ₂ into λ₁ + λ₂; splitting λ with probability p into λp and λ(1−p)]

      Merging:
      - A₁, …, A_k independent Poisson processes with rates λ₁, …, λ_k
      - Merged into a single process A = A₁ + … + A_k
      - A is a Poisson process with rate λ = λ₁ + … + λ_k

      Splitting:
      - A: Poisson process with rate λ
      - Split into processes A₁ and A₂ independently, with probabilities p and 1−p respectively
      - A₁ is Poisson with rate λ₁ = λp
      - A₂ is Poisson with rate λ₂ = λ(1−p)

2-25   Modeling Arrival Statistics

      - Poisson process widely used to model packet arrivals in numerous networking problems
      - Justification: provides a good model for the aggregate traffic of a large number of “independent” users
          - n traffic streams, with independent identically distributed (iid) interarrival times with PDF F(s) – not necessarily exponential
          - Arrival rate of each stream: λ/n
          - As n → ∞, the combined stream can be approximated by Poisson under mild conditions on F(s) – e.g., F(0) = 0, F′(0) > 0
      - Most important reason for the Poisson assumption: analytic tractability of queueing models

2-26   Little’s Theorem

      [Diagram: arrivals at rate λ enter a system holding N customers; each customer spends time T]

      - λ: customer arrival rate
      - N: average number of customers in system
      - T: average delay per customer in system
      - Little’s Theorem: for a system in steady state,

            $N = \lambda T$

2-27   Counting Processes of a Queue

      [Figure: staircase curves α(t) and β(t) vs t, with N(t) the gap between them]

      - N(t): number of customers in system at time t
      - α(t): number of customer arrivals till time t
      - β(t): number of customer departures till time t
      - T_i: time spent in system by the ith customer

2-28   Time Averages

      - Time averages over the interval [0, t], and their steady-state limits:

            $N_t = \frac{1}{t} \int_0^t N(s)\,ds, \quad N = \lim_{t\to\infty} N_t$

            $\lambda_t = \frac{\alpha(t)}{t}, \quad \lambda = \lim_{t\to\infty} \lambda_t$

            $T_t = \frac{1}{\alpha(t)} \sum_{i=1}^{\alpha(t)} T_i, \quad T = \lim_{t\to\infty} T_t$

            $\delta_t = \frac{\beta(t)}{t}, \quad \delta = \lim_{t\to\infty} \delta_t$

      - Little’s theorem N = λT applies to any queueing system provided that:
          - the limits T, λ, and δ exist, and
          - λ = δ
      - We give a simple graphical proof under a set of more restrictive assumptions

2-29   Proof of Little’s Theorem for FCFS

      [Figure: staircase curves α(t) and β(t); horizontal bars T₁, T₂, …, T_i fill the shaded area between the curves]

      - FCFS system, N(0) = 0
      - α(t) and β(t): staircase graphs; N(t) = α(t) − β(t)
      - Shaded area between the graphs:

            $S(t) = \int_0^t N(s)\,ds$

      - Assumption: N(t) = 0 infinitely often. For any such t:

            $\int_0^t N(s)\,ds = \sum_{i=1}^{\alpha(t)} T_i
             \;\Rightarrow\;
             \frac{1}{t} \int_0^t N(s)\,ds = \frac{\alpha(t)}{t} \cdot \frac{\sum_{i=1}^{\alpha(t)} T_i}{\alpha(t)}
             \;\Rightarrow\; N_t = \lambda_t T_t$

      - If the limits N_t → N, T_t → T, λ_t → λ exist, Little’s formula follows
      - We will relax the last assumption

2-30   Proof of Little’s for FCFS (cont.)

      [Figure: same staircase curves α(t), β(t) with bars T₁, T₂, …, T_i]

      - In general – even if the queue is not empty infinitely often:

            $\sum_{i=1}^{\beta(t)} T_i \;\le\; \int_0^t N(s)\,ds \;\le\; \sum_{i=1}^{\alpha(t)} T_i$

            $\Rightarrow\;
             \frac{\beta(t)}{t} \cdot \frac{\sum_{i=1}^{\beta(t)} T_i}{\beta(t)}
             \;\le\; \frac{1}{t} \int_0^t N(s)\,ds
             \;\le\; \frac{\alpha(t)}{t} \cdot \frac{\sum_{i=1}^{\alpha(t)} T_i}{\alpha(t)}
             \;\Rightarrow\; \delta_t T_t \le N_t \le \lambda_t T_t$

      - The result follows assuming the limits T_t → T, λ_t → λ, and δ_t → δ exist, and λ = δ

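A discrete-event simulation of an M/M/1 FCFS queue makes the theorem concrete: the time-average number in system, computed exactly as the area argument above, matches λT. This is a sketch, not from the original slides; λ = 0.8 and μ = 1.0 are arbitrary (chosen with λ < μ so a steady state exists):

```python
# Sketch: simulate an M/M/1 FCFS queue and check N = lam * T.
import numpy as np

rng = np.random.default_rng(8)
lam, mu, n_customers = 0.8, 1.0, 200_000

arrivals = np.cumsum(rng.exponential(1 / lam, n_customers))
services = rng.exponential(1 / mu, n_customers)

departures = np.empty(n_customers)
departures[0] = arrivals[0] + services[0]
for i in range(1, n_customers):
    # service starts when the customer arrives or the previous one leaves
    departures[i] = max(arrivals[i], departures[i - 1]) + services[i]

T = (departures - arrivals).mean()            # average time in system
horizon = departures[-1]                      # all customers gone by now
N = (departures - arrivals).sum() / horizon   # area / time = time-average N(t)
rho = lam / mu
print(f"N {N:.4f}  lam*T {lam * T:.4f}  "
      f"(M/M/1 theory: rho/(1-rho) = {rho / (1 - rho):.2f})")
```
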
2-31   Probabilistic Form of Little’s Theorem

      - Have considered a single sample function of a stochastic process
      - Now focus on the probabilities of the various sample functions of a stochastic process
      - Probability of n customers in system at time t:

            $p_n(t) = P\{N(t) = n\}$

      - Expected number of customers in system at t:

            $E[N(t)] = \sum_{n=0}^{\infty} n\, P\{N(t) = n\} = \sum_{n=0}^{\infty} n\, p_n(t)$

2-32   Probabilistic Form of Little (cont.)

      - p_n(t), E[N(t)] depend on t and on the initial distribution at t = 0
      - We will consider systems that converge to steady state: there exist p_n, independent of the initial distribution, such that

            $\lim_{t\to\infty} p_n(t) = p_n, \quad n = 0, 1, \ldots$

      - Expected number of customers in steady state [stochastic average]:

            $EN = \sum_{n=0}^{\infty} n\, p_n = \lim_{t\to\infty} E[N(t)]$

      - For an ergodic process, the time average of a sample function is equal to the steady-state expectation, with probability 1:

            $N = \lim_{t\to\infty} N_t = \lim_{t\to\infty} E[N(t)] = EN$

2-33   Probabilistic Form of Little (cont.)

      - In principle, we can find the probability distribution of the delay T_i of customer i, and from that the expected value E[T_i], which converges to steady state:

            $ET = \lim_{i\to\infty} E[T_i]$

      - For an ergodic system:

            $T = \lim_{i\to\infty} \frac{\sum_{j=1}^{i} T_j}{i} = \lim_{i\to\infty} E[T_i] = ET$

      - Probabilistic form of Little’s formula:

            $EN = \lambda \cdot ET$

      - Arrival rate defined as:

            $\lambda = \lim_{t\to\infty} \frac{E[\alpha(t)]}{t}$

2-34   Time vs. Stochastic Averages

      - “Time averages = stochastic averages” for all systems of interest in this course
      - It holds if a single sample function of the stochastic process contains all possible realizations of the process as t → ∞
      - Can be justified on the basis of general properties of Markov chains

2-35   Moment Generating Function

      1. Definition: for any t ∈ ℝ:

            $M_X(t) = E[e^{tX}] = \begin{cases}
                \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx, & X \text{ continuous} \\
                \sum_j e^{t x_j}\, P\{X = x_j\}, & X \text{ discrete}
            \end{cases}$

      2. If the moment generating function M_X(t) of X exists and is finite in some neighborhood of t = 0, it determines the distribution of X uniquely.

      3. Fundamental properties: for any n ∈ ℕ:

            (i) $\dfrac{d^n}{dt^n} M_X(t) = E[X^n e^{tX}]$

            (ii) $\dfrac{d^n}{dt^n} M_X(0) = E[X^n]$

      4. Moment generating functions and independence:

            $X, Y \text{ independent} \;\Rightarrow\; M_{X+Y}(t) = M_X(t)\, M_Y(t)$

         The opposite is not true.

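Property (ii) can be exercised symbolically. A sketch (not from the original slides) using the Poisson MGF from the table on the next slide:

```python
# Sketch: recover moments of Poisson(lam) by differentiating its MGF
# M_X(t) = exp(lam*(e^t - 1)) at t = 0, per property (ii).
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
M = sp.exp(lam * (sp.exp(t) - 1))           # MGF of Poisson(lam)

EX = sp.diff(M, t, 1).subs(t, 0)            # first moment
EX2 = sp.diff(M, t, 2).subs(t, 0)           # second moment
print("E[X]   =", sp.simplify(EX))          # -> lam
print("E[X^2] =", sp.simplify(EX2))         # -> lam*(lam + 1)
print("Var(X) =", sp.simplify(EX2 - EX**2)) # -> lam
```
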
2-36   Discrete Random Variables

      Distribution (params) | PMF P{X = k}                        | MGF M_X(t)                 | Mean E[X] | Variance Var(X)
      ----------------------|-------------------------------------|----------------------------|-----------|----------------
      Binomial (n, p)       | C(n,k) p^k (1−p)^{n−k}, k = 0,…,n   | (p e^t + 1 − p)^n          | np        | np(1−p)
      Geometric (p)         | (1−p)^{k−1} p, k = 1, 2, …          | p e^t / (1 − (1−p) e^t)    | 1/p       | (1−p)/p²
      Negative Bin. (r, p)  | C(k−1, r−1) p^r (1−p)^{k−r},        | [p e^t / (1 − (1−p) e^t)]^r| r/p       | r(1−p)/p²
                            | k = r, r+1, …                       |                            |           |
      Poisson (λ)           | e^{−λ} λ^k / k!, k = 0, 1, …        | e^{λ(e^t − 1)}             | λ         | λ

2-37   Continuous Random Variables

      Distribution (params) | PDF f_X(x)                               | MGF M_X(t)                  | Mean E[X] | Variance Var(X)
      ----------------------|------------------------------------------|-----------------------------|-----------|----------------
      Uniform over (a, b)   | 1/(b−a), a < x < b                       | (e^{tb} − e^{ta}) / (t(b−a))| (a+b)/2   | (b−a)²/12
      Exponential (λ)       | λ e^{−λx}, x ≥ 0                         | λ/(λ − t)                   | 1/λ       | 1/λ²
      Normal (μ, σ²)        | (1/(√(2π) σ)) e^{−(x−μ)²/2σ²}, −∞ < x < ∞| e^{μt + (σt)²/2}            | μ         | σ²
