Chapter 3

Discrete Distributions
Definition of a random variable (r.v.)

Definition 3.1-1:
  Given a random experiment with an outcome
  space S0 , a function X that assigns to each element
  s in S0 one and only one real number X( s ) = x is
  called a random variable .
 The space of X is the set of real numbers
 $S = \{x : X(s) = x,\ s \in S_0\}$.
For convenience, we will think of the space S of X as
  being the outcome space.
(1) If S is a countable set, we say that X is a r.v. of discrete type.

(2) If S is an interval (possibly unbounded) or a union of intervals, we say that X is a r.v. of continuous type.
Ex3.1.1:
  A rat is selected at random from a cage and its sex is determined. The set of possible outcomes is female and male. Thus the outcome space of this experiment is
   S0 = { female, male } = { F, M }.
 Let X be a function defined on S0 such that
  X(F) = 0 and X(M) = 1.
 The space of the random variable X is S={0,1}.
Ex3.1-2

Let the random experiment be the cast of a die,
  observing the number of spots on the side facing
  up.
The outcome space S0 = {1,2,3,4,5,6}.

Let X(s)=s. (X is the identity function.)

Then the space of the random variable X is
S={1,2,3,4,5,6}.
Ex 4.1-1

Let the r.v. X be the length of time in minutes of
  waiting in line to buy tickets.
The space of the random variable X is
 $S = \{x : 0 \le x,\ x \in \mathbb{R}\}$
                      Notations

Let X be a random variable and B be an event.
The probability P(B) of event B will also be denoted by

(1)  $P(a < X < b)$   if   $B = \{s : a < X(s) < b\}$;

(2)  $P(a < X)$       if   $B = \{s : a < X(s)\}$;

(3)  $P(X < a)$       if   $B = \{s : X(s) < a\}$.
                 p.m.f. of discrete type r.v.
Def 3.1-2
The probability mass function (p.m.f.) of a discrete type random variable X with space S is a function f(x) satisfying the following conditions:

(a) $f(x) > 0$, $x \in S$;

(b) $\sum_{x \in S} f(x) = 1$;

(c) $P(X \in A) = \sum_{x \in A} f(x)$, where $A \subset S$.
             Discrete distributions

•   Uniform distribution
•   Hypergeometric distribution
•   Bernoulli distribution
•   Binomial distribution
•   Negative binomial distribution
•   Geometric distribution
•   Poisson distribution
               p.d.f. of continuous type r.v.
Def 4.1-2
 The probability density function (p.d.f.) of a continuous type random variable X with space S is an integrable function f(x) satisfying the following conditions:

(a) $f(x) > 0$, $x \in S$;

(b) $\int_{S} f(x)\,dx = 1$;

(c) the probability of the event $X \in A$ is $P(X \in A) = \int_{A} f(x)\,dx$, where $A \subset S$.
              Continuous distributions

•   Uniform distributions
•   Exponential distributions
•   Gamma distributions
•   Chi-square distributions
•   Normal distributions
•   Beta distributions
          Uniform distribution

Def:
If X is a r.v. with a constant p.m.f., we will say
   that X has a uniform distribution.


Ex:
In Ex 3.1-2, the p.m.f. of X is
 $f(x) = \frac{1}{6}$, $x = 1, 2, 3, 4, 5, 6$.
    Hypergeometric distribution

Def:
We say that a r.v. X of the discrete type has a hypergeometric distribution if its p.m.f. is
$$f(x) = \frac{\binom{N_1}{x}\binom{N_2}{n-x}}{\binom{N_1+N_2}{n}},$$
where $x \in \{x : x \text{ is an integer},\ 0 \le x \le n,\ x \le N_1,\ \text{and } n-x \le N_2\}$, and n is a positive integer.
Ex3.1-6
A lot consists of 100 fuses, 20 of which are defective. Five fuses are chosen at random and tested. Let X equal the number of defective fuses in the sample of five; then the p.m.f. of X is

$$f(x) = P(X = x) = \frac{\binom{20}{x}\binom{80}{5-x}}{\binom{100}{5}}, \quad x = 0, 1, 2, 3, 4, 5.$$
                                 
  3.2 Mathematical Expectation
Ex3.2-1
An enterprising young man who needs a little extra
money devises a game of chance in which some of
his friends might wish to participate. The game
that he proposes is to let the participant cast an
unbiased die and then receive a payment according
to the following schedule :
If the event A={1,2,3} occurs, he receives 1¢ ;
if B ={4,5} occurs, he receives 5¢; and
if C={6} occurs, he receives 35¢.
Q: How much should be charged for the opportunity
  of playing the game?

Let X be the r.v. defined by the outcome of the cast of the die. The p.m.f. of X is $f(x) = \frac{1}{6}$, $x = 1, 2, 3, 4, 5, 6$.

The payment is
$$u(x) = \begin{cases} 1, & x = 1, 2, 3 \\ 5, & x = 4, 5 \\ 35, & x = 6 \end{cases}$$

The mathematical expectation of the payment is
$$\sum_{x=1}^{6} u(x) f(x) = 1\cdot\tfrac{1}{6} + 1\cdot\tfrac{1}{6} + 1\cdot\tfrac{1}{6} + 5\cdot\tfrac{1}{6} + 5\cdot\tfrac{1}{6} + 35\cdot\tfrac{1}{6} = 8.$$
 Definition 3.2-1
If f(x) is the p.m.f. of the r.v. X of the discrete type with space S, and if $\sum_{x \in S} u(x) f(x)$ exists, then the sum is called the mathematical expectation or the expected value of the function u(X), and it is denoted by E[u(X)].

That is,
$$E[u(X)] = \sum_{x \in S} u(x) f(x).$$
                             Remark

If u(X) = X, then $E[X] = \mu_X$, the mean of X.

If $u(X) = (X - \mu_X)^2$, where $\mu_X$ is the population mean of X, then $E[(X - \mu_X)^2] = \sigma_X^2$, the variance of X.
Theorem3.2-1
   When it exists, mathematical expectation E satisfies
   the following properties :

(a) If c is a constant, E[c] = c .
(b) If c is a constant and u is a function,
    E[cu(X)] = cE[u(X)] .

(c) If c1 and c2 are constants and u1 and u2 are
     functions,
    then Ec1u1 X   c2u2  X   c1Eu1 X   c2 Eu2  X 
Ex 3.2-4
Let $u(x) = (x - b)^2$, where b is not a function of X, and suppose $E[(X - b)^2]$ exists. Find the value of b for which $E[(X - b)^2]$ is a minimum. Since $E[(X - b)^2] = E[(X - \mu)^2] + (\mu - b)^2$, the minimum occurs at $b = \mu = E[X]$.
      Mean of a hypergeometric distribution
Ex 3.2-5
Let X have a hypergeometric distribution in which n objects are selected from $N = N_1 + N_2$ objects. Then the mean of X is

$$E[X] = \sum_{x \in S} x \, \frac{\binom{N_1}{x}\binom{N_2}{n-x}}{\binom{N}{n}} = n\left(\frac{N_1}{N}\right).$$
  The variance of a hypergeometric distribution

Ex 3.2-9
Let X have a hypergeometric distribution in which n objects are selected from $N = N_1 + N_2$ objects. The variance of X is

$$\sigma^2 = E[(X-\mu)^2] = E[X^2] - \mu^2 = E[X(X-1)] + E[X] - \mu^2$$
$$= \frac{n(n-1)N_1(N_1-1)}{N(N-1)} + \frac{nN_1}{N} - \left(\frac{nN_1}{N}\right)^2 = n\left(\frac{N_1}{N}\right)\left(\frac{N_2}{N}\right)\left(\frac{N-n}{N-1}\right).$$
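A minimal sketch (not from the slides): the mean and variance formulas checked in Python against direct summation over the hypergeometric p.m.f. of Ex 3.1-6.

    from math import comb

    N1, N2, n = 20, 80, 5
    N = N1 + N2
    pmf = {x: comb(N1, x) * comb(N2, n - x) / comb(N, n) for x in range(n + 1)}
    mean = sum(x * p for x, p in pmf.items())
    var = sum((x - mean) ** 2 * p for x, p in pmf.items())
    print(mean, n * N1 / N)                                  # both 1.0
    print(var, n * (N1 / N) * (N2 / N) * (N - n) / (N - 1))  # both about 0.7676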
  Mean and Variance of a Uniform Distribution

Ex 3.2-8
The mean of X, which has a uniform distribution on the first m positive integers, is given by

$$\mu = E[X] = \frac{m+1}{2}.$$

Since
$$E[X^2] = \sum_{x=1}^{m} x^2 \left(\frac{1}{m}\right) = \frac{(m+1)(2m+1)}{6},$$
the variance is
$$\sigma^2 = \mathrm{Var}(X) = E[(X-\mu)^2] = E[X^2] - \mu^2 = \frac{m^2 - 1}{12}.$$
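A minimal sketch (not from the slides): these formulas checked in Python for m = 6 against direct summation.

    m = 6
    mean = sum(x / m for x in range(1, m + 1))
    var = sum((x - mean) ** 2 / m for x in range(1, m + 1))
    print(mean, (m + 1) / 2)          # 3.5 and 3.5
    print(var, (m**2 - 1) / 12)       # 2.9166... and 2.9166...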
                          Remark
Let X be a r.v. with mean $\mu_X$ and variance $\sigma_X^2$.

If Y = aX + b, where a and b are constants, then Y is a r.v. too.
The mean of Y is
$$\mu_Y = E[Y] = E[aX + b] = aE[X] + b = a\mu_X + b.$$

The variance of Y is
$$\sigma_Y^2 = E[(Y - \mu_Y)^2] = E[(aX + b - a\mu_X - b)^2] = a^2 E[(X - \mu_X)^2] = a^2 \sigma_X^2.$$

The standard deviation of Y is $\sigma_Y = |a|\,\sigma_X$.
3.3 Bernoulli Trials & The Binomial
Distribution

A Bernoulli experiment is a random experiment, the outcome of which can be classified in but one of two mutually exclusive and exhaustive ways, say, success or failure.

A sequence of Bernoulli trials occurs when a Bernoulli experiment is performed several times so that the probability of success remains the same from trial to trial.
          Bernoulli experiment

An experiment is called a Bernoulli experiment if it satisfies the following three properties:
(1) The outcomes can be classified into two mutually exclusive parts, one called success and the other called failure.
(2) The probability of success (call it p) remains fixed for the experiment.
(3) The trials are independent.
                 Bernoulli distribution
Def:
Let X be a r.v. associated with a Bernoulli trial by defining it as follows:
   X(success) = 1 and X(failure) = 0
   (that is, X is the number of successes in one Bernoulli trial).
The p.m.f. of X can be written as
$$f(x) = p^{x}(1-p)^{1-x}, \quad x = 0, 1,$$
and we say that X has a Bernoulli distribution ($X \sim \mathrm{Ber}(p)$).
      Mean and variance of a Bernoulli distribution
Let X be a r.v. with a Bernoulli distribution.
The mean of X is
$$\mu = E(X) = \sum_{x=0}^{1} x\, p^{x}(1-p)^{1-x} = p.$$

The variance of X is
$$\sigma^2 = \mathrm{Var}(X) = \sum_{x=0}^{1} (x-p)^2\, p^{x}(1-p)^{1-x} = p(1-p) = pq, \quad \text{where } q = 1-p.$$

The standard deviation of X is $\sigma = \sqrt{p(1-p)} = \sqrt{pq}$.
              Binomial experiment
A binomial experiment is a sequence of Bernoulli experiments that satisfies the following properties:
 1. A Bernoulli (success-failure) experiment is performed n times.
 2. The trials are independent.
 3. The probability of success on each trial is a constant p; the probability of failure is (1 - p).
 4. The random variable X equals the number of successes in the n trials.
                  Binomial distribution
Def:
Let X be the number of observed successes in n Bernoulli trials; the possible values of X are 0, 1, 2, ..., n.

Then the p.m.f. of X is
$$f(x) = \binom{n}{x} p^{x}(1-p)^{n-x}, \quad x = 0, 1, 2, \ldots, n,$$

and the r.v. X is said to have a binomial distribution ($X \sim b(n, p)$).

The constants n and p are called the parameters of the binomial distribution.
                               Remark

Let n be a positive integer. By the binomial theorem,
$$\sum_{x=0}^{n} \binom{n}{x} p^{x}(1-p)^{n-x} = [(1-p) + p]^{n} = 1,$$
so
$$f(x) = \binom{n}{x} p^{x}(1-p)^{n-x}, \quad x = 0, 1, \ldots, n,$$
is a p.m.f.
Example 3.3-5
In the instant lottery with 20% winning tickets, if X is equal
to the number of winning tickets among n = 8 that are
purchased, the probability of purchasing two winning
tickets is

                         8 
                          (0.2) 2 (0.8)6  0.2936
     f (2)  P( X  2)   
                          2
     Cumulative Distribution Function (c.d.f.)

Def:
The cumulative distribution function of a r.v. X is
  defined by

        $F(x) = P(X \le x)$
                         Remark
1.
$$F(x) = P(X \le x) = \begin{cases} \displaystyle\sum_{t \le x} f(t), & \text{if } X \text{ is of the discrete type} \\[1ex] \displaystyle\int_{-\infty}^{x} f(t)\,dt, & \text{if } X \text{ is of the continuous type} \end{cases}$$

2.  $0 \le F(x) \le 1$, for all $x \in \mathbb{R}$.

3.  F(x) is nondecreasing. That is, $x_1 < x_2 \Rightarrow F(x_1) \le F(x_2)$.
         The c.d.f. of binomial distribution

Values of the c.d.f. of a r.v. X ~b(n,p) are given in
  Table II (p647-p651).

$$F(x) = P(X \le x) = \sum_{k=0}^{x} \frac{n!}{k!\,(n-k)!}\, p^{k}(1-p)^{n-k}.$$
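A minimal sketch (not from the slides): the binomial c.d.f. computed in Python by summing the p.m.f. instead of reading Table II; the helper name is ours.

    from math import comb

    def binom_cdf(x, n, p):
        return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(x + 1))

    print(round(binom_cdf(5, 10, 0.5), 4))         # 0.623, used in Ex 3.3-8 below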
Ex 3.3-8
Let p = 0.5 be the probability of a female chick hatching. Assuming independence, let X equal the number of female chicks out of ten newly hatched chicks selected at random. Then $X \sim b(10, 0.5)$.

The probability of five or fewer females is
$$P(X \le 5) = F(5) = 0.6230 \quad \text{(from the table)}.$$

The probability of exactly six females is
$$P(X = 6) = P(X \le 6) - P(X \le 5) = 0.8281 - 0.6230 = 0.2051.$$

The probability of at least six females is
$$P(X \ge 6) = 1 - P(X \le 5) = 1 - 0.6230 = 0.3770.$$
Ex3.3-9
Suppose that 65% of the American public approves of the way the President of the U.S.A. is handling his job. Take a random sample of n = 8 Americans and let Y equal the number who approve. Then $Y \sim b(8, 0.65)$.
Find $P(Y \ge 6)$, $P(Y \le 5)$, and $P(Y = 5)$.

$$P(Y \ge 6) = P(8 - Y \le 8 - 6) = P(X \le 2) = 0.4278 \quad \text{(from the table)},$$
where $X = 8 - Y \sim b(8, 0.35)$.

$$P(Y \le 5) = P(8 - Y \ge 8 - 5) = P(X \ge 3) = 1 - P(X \le 2) = 1 - 0.4278 = 0.5722.$$

$$P(Y = 5) = P(8 - Y = 8 - 5) = P(X = 3) = P(X \le 3) - P(X \le 2).$$
                       Remark
Suppose that an urn contains $N_1$ success balls and $N_2$ failure balls, and we let $p = \frac{N_1}{N_1 + N_2}$ and let X equal the number of success balls in a random sample of size n that is taken from this urn.

If the sampling is done with replacement, the
   distribution of X is b(n,p);

and if the sampling is done without replacement, X has a
  hypergeometric distribution.
Ex:
A box contains 8 red balls and 4 white balls. Three balls are drawn at random from the box without replacement. What are the mean and the variance of the number of red balls among the three drawn?
Sol:
Let X = the number of red balls among the 3 balls.

X has a hypergeometric distribution with $N = 12$, $n = 3$.

$$E(X) = 3\left(\frac{8}{12}\right) = 2, \qquad \mathrm{Var}(X) = 3\left(\frac{8}{12}\right)\left(\frac{4}{12}\right)\left(\frac{12-3}{12-1}\right) = \frac{6}{11}.$$
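A minimal sketch (not from the slides): the urn example checked in Python by direct summation over the hypergeometric p.m.f.

    from math import comb

    N1, N2, n = 8, 4, 3                            # 8 red, 4 white, draw 3
    N = N1 + N2
    pmf = {x: comb(N1, x) * comb(N2, n - x) / comb(N, n) for x in range(n + 1)}
    mean = sum(x * p for x, p in pmf.items())
    var = sum((x - mean) ** 2 * p for x, p in pmf.items())
    print(mean, var)                               # 2.0 and 0.5454... (= 6/11)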
3.4 The Moment-Generating Function

Def:
Let X be a r.v. of the discrete type with p.m.f. f(x) and space S. If there is a positive number h such that

$$E(e^{tX}) = \sum_{x \in S} e^{tx} f(x)$$

exists and is finite for $-h < t < h$, then the function of t defined by

$$M_X(t) = E(e^{tX}) = \sum_{x \in S} e^{tx} f(x)$$

is called the moment-generating function of X. This is often abbreviated as m.g.f.
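A minimal sketch (not from the slides): the m.g.f. of the fair-die r.v. evaluated numerically in Python; a central difference at t = 0 recovers E[X], anticipating the derivative property stated later in this section. The helper name is ours.

    from math import exp

    def mgf(t, pmf):
        return sum(exp(t * x) * p for x, p in pmf.items())

    die = {x: 1 / 6 for x in range(1, 7)}
    h = 1e-5
    print((mgf(h, die) - mgf(-h, die)) / (2 * h))  # about 3.5 = E[X]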
                          Remark

(1) If r.v.'s X and Y have p.m.f.'s f(x) and g(y) respectively, the same space S = {b1, b2, b3, ...}, and the same m.g.f., i.e.

$$e^{tb_1} f(b_1) + e^{tb_2} f(b_2) + \cdots = e^{tb_1} g(b_1) + e^{tb_2} g(b_2) + \cdots$$

for all $t \in (-h, h)$, then mathematical theory requires that

$$f(b_i) = g(b_i), \quad i = 1, 2, 3, \ldots$$
                               Remark

(2) If X is a r.v. with space S = {x1, x2, x3, ...}, then

the p.m.f. of X is $f(x_i) = P(X = x_i) = p_i$, $i = 1, 2, \ldots$, and

the m.g.f. of X is $\sum_{i=1}^{\infty} p_i\, e^{t x_i}$.
Example 3.4-2:
Suppose the m.g.f. of X is
$$M(t) = \frac{e^{t}/2}{1 - e^{t}/2}, \quad t < \ln 2.$$
What is the p.m.f. of X? Find P(X > 3).

Since
$$M(t) = \left(\frac{e^{t}}{2}\right)\frac{1}{1 - e^{t}/2} = \left(\frac{e^{t}}{2}\right)\left[1 + \frac{e^{t}}{2} + \frac{e^{2t}}{2^{2}} + \frac{e^{3t}}{2^{3}} + \cdots\right]$$
$$= e^{t}\left(\frac{1}{2}\right) + e^{2t}\left(\frac{1}{2}\right)^{2} + e^{3t}\left(\frac{1}{2}\right)^{3} + \cdots, \quad \text{when } \frac{e^{t}}{2} < 1,$$
the p.m.f. of X is
$$f(x) = P(X = x) = \left(\frac{1}{2}\right)^{x}, \quad x \in \{1, 2, \ldots\}.$$

$$P(X > 3) = 1 - P(X \le 3) = 1 - \left[\frac{1}{2} + \left(\frac{1}{2}\right)^{2} + \left(\frac{1}{2}\right)^{3}\right] = \frac{1}{8}.$$
Ex:
Find the probability distribution of the r.v. X in (1) and (2).

(1) If the m.g.f. of X is given by
$$M_X(t) = \frac{1}{5}e^{t} + \frac{2}{5}e^{2t} + \frac{2}{5}e^{4t}, \quad -\infty < t < \infty.$$

(2) If the m.g.f. of X is given by
$$M_X(t) = e^{t}, \quad -\infty < t < \infty.$$

Ex:
If the r.v. X has the m.g.f. $M_X(t) = \frac{1}{3}\left(2^{t} + 3^{t} + 5^{t}\right)$, $-\infty < t < \infty$, find the c.d.f. of X.
                       Properties of m.g.f.
Thm:
Let X be a r.v. of the discrete type, and let a and c be constants with $a \ne 0$. Then
 (1) $M_a(t) = e^{at}$ (the m.g.f. of the constant a);
 (2) $M_{aX}(t) = M_X(at)$;
 (3) $M_{X+c}(t) = e^{ct} M_X(t)$;
 (4) $M_{aX+c}(t) = e^{ct} M_X(at)$;
 (5) for any positive integer r, $M_X^{(r)}(0) = E(X^{r})$;
 (6) when $M_X(t)$ exists, $M_X(t) = M_X(0) + \sum_{r=1}^{\infty} E(X^{r})\,\frac{t^{r}}{r!}$.
Example 3.4-7 Let the moments of X be defined by
$$E(X^{r}) = 0.8, \quad r = 1, 2, 3, \ldots$$
Find the m.g.f. of X.

The moment-generating function of X is
$$M(t) = M(0) + \sum_{r=1}^{\infty} 0.8\,\frac{t^{r}}{r!} = 0.2\,e^{0t} + 0.8\,e^{1t}.$$
Thm:
Let X be a r.v. with m.g.f. M(t). Then
$$\mu = M'(0), \qquad \sigma^{2} = M''(0) - [M'(0)]^{2}.$$
Pf:
$$\mu = E(X) = M'(0),$$
$$\sigma^{2} = E(X^{2}) - [E(X)]^{2} = M''(0) - [M'(0)]^{2}.$$
      Mean & Variance of Binomial Distribution

Thm:

Let $X \sim b(n, p)$. Then
 (1) $M_X(t) = (pe^{t} + 1 - p)^{n}$;
 (2) $\mu_X = np$;
 (3) $\sigma_X^{2} = np(1-p)$.
Pf:
The p.m.f. of $X \sim b(n, p)$ is
$$f(x) = \binom{n}{x} p^{x}(1-p)^{n-x}, \quad x = 0, 1, 2, \ldots, n,$$
so the m.g.f. is
$$M_X(t) = E[e^{tX}] = \sum_{x=0}^{n} e^{tx}\binom{n}{x} p^{x}(1-p)^{n-x} = \sum_{x=0}^{n} \binom{n}{x}(pe^{t})^{x}(1-p)^{n-x} = [(1-p) + pe^{t}]^{n}, \quad -\infty < t < \infty.$$

$$M'(t) = n[(1-p) + pe^{t}]^{n-1}(pe^{t}),$$
$$M''(t) = n(n-1)[(1-p) + pe^{t}]^{n-2}(pe^{t})^{2} + n[(1-p) + pe^{t}]^{n-1}(pe^{t}).$$

$$\mu_X = M'(0) = np,$$
$$\sigma_X^{2} = M''(0) - [M'(0)]^{2} = n(n-1)p^{2} + np - (np)^{2} = np(1-p).$$
Ex:
Find the probability distribution of X if the m.g.f. of X is given by
$$M_X(t) = \left(\frac{1}{4}e^{t} + \frac{3}{4}e^{2t}\right)^{20}, \quad -\infty < t < \infty.$$

pf:
$$M_X(t) = \left(\frac{1}{4}e^{t} + \frac{3}{4}e^{2t}\right)^{20} = e^{20t}\left(\frac{1}{4} + \frac{3}{4}e^{t}\right)^{20},$$
and $\left(\frac{1}{4} + \frac{3}{4}e^{t}\right)^{20}$ is the m.g.f. of $b(20, p = \tfrac{3}{4})$.

By the properties of m.g.f.'s, we know that $X = Y + 20$, where $Y \sim b(20, p = \tfrac{3}{4})$, so the p.m.f. of X is
$$f(x) = P(X = x) = P(Y = x - 20) = g(x - 20) = \binom{20}{x-20}\left(\frac{3}{4}\right)^{x-20}\left(\frac{1}{4}\right)^{40-x}, \quad x = 20, \ldots, 40.$$
Mean & variance of Bernoulli Distribution

In the special case when n = 1, X has a Bernoulli distribution and
$$M_X(t) = (1-p) + pe^{t}, \quad -\infty < t < \infty,$$
$$\mu_X = p, \qquad \sigma_X^{2} = p(1-p).$$
          Negative Binomial Distribution
Observe a sequence of Bernoulli trials until exactly r successes occur, where r is a fixed positive integer. Let X denote the trial number on which the r-th success is observed.
Then the p.m.f. of X is
$$f(x) = \binom{x-1}{r-1} p^{r}(1-p)^{x-r}, \quad x = r, r+1, \ldots,$$
and we say that X has a negative binomial distribution.
                                         Remark
Let $h(q) = (1-q)^{-r}$, $r > 0$. Then
$$(1-q)^{-r} = \sum_{k=0}^{\infty} \frac{h^{(k)}(0)}{k!}\, q^{k} = \sum_{k=0}^{\infty} \binom{r+k-1}{r-1} q^{k}, \quad \text{for } -1 < q < 1,$$
$$\text{(let } x = k + r\text{)} \;=\; \sum_{x=r}^{\infty} \binom{x-1}{r-1} q^{x-r}.$$

Hence
$$\sum_{x=r}^{\infty} f(x) = \sum_{x=r}^{\infty} \binom{x-1}{r-1} p^{r} q^{x-r} = p^{r}(1-q)^{-r} = 1, \quad \text{where } q = 1-p,$$
so
$$f(x) = \binom{x-1}{r-1} p^{r}(1-p)^{x-r}, \quad x = r, r+1, \ldots,$$
is a p.m.f.
               Geometric Distribution

When r = 1, the p.m.f. of X is
$$f(x) = p(1-p)^{x-1}, \quad x = 1, 2, \ldots,$$
and we say that X has a geometric distribution.
                        Remark
 
$$\sum_{x=1}^{\infty} p(1-p)^{x-1} = \frac{p}{1 - (1-p)} = 1,$$
so $f(x) = p(1-p)^{x-1}$, $x = 1, 2, \ldots$, is a p.m.f.
 Example 3.4-4 Some biology students were checking the eye color for a large number of fruit flies. For an individual fly, suppose that the probability of white eyes is 1/4 and the probability of red eyes is 3/4, and that we may treat these observations as independent Bernoulli trials. The probability that at least four flies have to be checked for eye color to observe a white-eyed fly is given by
$$P(X \ge 4) = P(X > 3) = q^{3} = \left(\frac{3}{4}\right)^{3} = \frac{27}{64} \approx 0.4219.$$

 The probability that at most four flies have to be checked for eye color to observe a white-eyed fly is given by
$$P(X \le 4) = 1 - q^{4} = 1 - \left(\frac{3}{4}\right)^{4} = \frac{175}{256} \approx 0.6836.$$

 The probability that the first fly with white eyes is the fourth fly considered is
$$P(X = 4) = q^{4-1} p = \left(\frac{3}{4}\right)^{3}\left(\frac{1}{4}\right) = \frac{27}{256} \approx 0.1055.$$
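A minimal sketch (not from the slides): the three probabilities of Example 3.4-4 computed in Python with p = 1/4 and q = 3/4.

    p, q = 1 / 4, 3 / 4
    print(q**3)                                        # P(X >= 4), about 0.4219
    print(1 - q**4)                                    # P(X <= 4), about 0.6836
    print(p * q ** (4 - 1))                            # P(X = 4),  about 0.1055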
   m.g.f. of the negative binomial distribution

Let X be a r.v. with a negative binomial distribution. Then the m.g.f. of X is

$$M(t) = \sum_{x=r}^{\infty} e^{tx}\binom{x-1}{r-1} p^{r}(1-p)^{x-r} = (pe^{t})^{r}\sum_{x=r}^{\infty}\binom{x-1}{r-1}[(1-p)e^{t}]^{x-r} = \frac{(pe^{t})^{r}}{[1 - (1-p)e^{t}]^{r}}, \quad \text{where } (1-p)e^{t} < 1.$$
    Mean & variance of the negative binomial
                 distribution
Thm:
Let X be a r.v. with a negative binomial distribution. Then
$$\mu_X = \frac{r}{p}, \qquad \sigma_X^{2} = \frac{r(1-p)}{p^{2}}.$$

Pf:
$$M'(t) = r(pe^{t})^{r}[1 - (1-p)e^{t}]^{-r-1},$$
and
$$M''(t) = r(pe^{t})^{r}(r+1)[1 - (1-p)e^{t}]^{-r-2}[(1-p)e^{t}] + r^{2}(pe^{t})^{r-1}(pe^{t})[1 - (1-p)e^{t}]^{-r-1}.$$

So $M'(0) = rp^{-1}$ and $M''(0) = rp^{-2}(r + 1 - p)$, and therefore
$$\mu_X = M'(0) = \frac{r}{p}, \qquad \sigma_X^{2} = M''(0) - [M'(0)]^{2} = \frac{r(1-p)}{p^{2}}.$$
Example 3.4-5
Suppose that during practice, a basketball player
can make a free throw 80% of the time.
Furthermore, assume that a sequence of free-throw
shooting can be thought of as independent Bernoulli
trials.
Let X equal the minimum number of free throws that
this player must attempt to make a total of 10 shots.
The p.m.f. of X is
$$g(x) = \binom{x-1}{10-1}(0.80)^{10}(0.20)^{x-10}, \quad x = 10, 11, 12, \ldots$$

$$\mu = 10\left(\frac{1}{0.80}\right) = 12.5, \qquad \sigma^{2} = \frac{10(0.20)}{0.80^{2}} = 3.125,$$
and $\sigma \approx 1.768$.
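A minimal sketch (not from the slides): Example 3.4-5 checked in Python by summing the negative binomial p.m.f. over a long (truncated) range of x.

    from math import comb

    r, p = 10, 0.80
    pmf = lambda x: comb(x - 1, r - 1) * p**r * (1 - p) ** (x - r)
    xs = range(r, 200)                                 # the tail beyond 200 is negligible
    mean = sum(x * pmf(x) for x in xs)
    var = sum((x - mean) ** 2 * pmf(x) for x in xs)
    print(mean, var)                                   # about 12.5 and 3.125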
 Mean & variance of the geometric distribution
Thm:
Let X be a r.v. with a geometric distribution. Then
$$M_X(t) = \frac{pe^{t}}{1 - (1-p)e^{t}}, \quad t < -\ln(1-p),$$
$$\mu_X = \frac{1}{p}, \qquad \sigma_X^{2} = \frac{1-p}{p^{2}}.$$
    3-5 THE POISSON DISTRIBUTION

Some experiments result in counting the number of
  times particular events occur in given times or on
  given physical objects. For example,
 (1) count the number of phone calls arriving at a switchboard between 9 and 10 A.M.;
 (2) count the number of flaws in 100 feet of wire;
 (3) count the number of customers that arrive at a ticket window between 12 noon and 2 P.M.;
 (4) count the number of defects in a 100-foot roll of aluminum screen that is 2 feet wide.
                Poisson experiment
Def:
Let the number N(t) of changes that occur in a given continuous interval of length t be counted. We have a Poisson experiment with parameter $\lambda > 0$ if the following are satisfied:

(1) The probability of exactly one change in a sufficiently short interval of length h is approximately $\lambda h$; that is,
$$P(N(h) = 1) = \lambda h + o(h), \quad \text{as } h \to 0,$$
where $o(h)$ is a function of h such that $\lim_{h \to 0} \frac{o(h)}{h} = 0$.

(2) The probability of two or more changes in a sufficiently short interval is essentially zero:
$$P(N(h) \ge 2) \approx 0, \quad \text{as } h \to 0.$$

(3) The numbers of changes occurring in non-overlapping intervals are independent.
           p.m.f. of Poisson distributions.

For a Poisson experiment, let N(t) denote the number of
  changes in an interval of length t.
• Divide the interval of length t into n subintervals of equal length t/n.
• If n is sufficiently large, one change occurs in each of exactly y of these n subintervals.
• Consider the occurrence or nonoccurrence of a change in each subinterval as a Bernoulli trial. By (1)-(3), we have a sequence of Bernoulli trials with
$$p = \frac{\lambda t}{n} + o\!\left(\frac{t}{n}\right).$$
                      p.m.f. of Poisson distributions
Then the p.m.f. of N(t) is
$$f(y) = P(N(t) = y) = \lim_{n \to \infty} \binom{n}{y}\left[\frac{\lambda t}{n} + o\!\left(\frac{t}{n}\right)\right]^{y}\left[1 - \frac{\lambda t}{n} - o\!\left(\frac{t}{n}\right)\right]^{n-y}$$

$$= \lim_{n \to \infty} \frac{n(n-1)\cdots(n-y+1)}{y!}\left[\frac{\lambda t}{n} + o\!\left(\frac{t}{n}\right)\right]^{y}\left[1 - \frac{\lambda t}{n} - o\!\left(\frac{t}{n}\right)\right]^{n}\left[1 - \frac{\lambda t}{n} - o\!\left(\frac{t}{n}\right)\right]^{-y}$$

$$= \frac{(\lambda t)^{y}}{y!}\lim_{n \to \infty}\left(1\right)\left(1 - \frac{1}{n}\right)\cdots\left(1 - \frac{y-1}{n}\right)\left[1 + \frac{o(t/n)}{\lambda t/n}\right]^{y}\left[1 - \frac{\lambda t}{n} - o\!\left(\frac{t}{n}\right)\right]^{n}\left[1 - \frac{\lambda t}{n} - o\!\left(\frac{t}{n}\right)\right]^{-y}$$

$$= \frac{e^{-\lambda t}(\lambda t)^{y}}{y!}, \quad y = 0, 1, 2, \ldots$$
              p.m.f. of Poisson distributions.

For a Poisson experiment, let X denote the number of
  changes in an interval of length 1.
Since X = N(1), the p.m.f. of X is
$$f(x) = \frac{\lambda^{x} e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots, \text{ where } \lambda > 0.$$
                     Poisson distribution

Def:
We say that a r.v. X has a Poisson distribution with parameter $\lambda > 0$ if its p.m.f. is of the form
$$f(x) = \frac{\lambda^{x} e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots$$
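A minimal sketch (not from the slides): the Poisson p.m.f. in Python, checked to sum (numerically) to 1; the helper name is ours.

    from math import exp, factorial

    def poisson_pmf(x, lam):
        return lam**x * exp(-lam) / factorial(x)

    lam = 5
    print(sum(poisson_pmf(x, lam) for x in range(100)))   # about 1.0
    print(round(poisson_pmf(6, lam), 4))                  # P(X = 6) = 0.1462, cf. Ex 3.5-1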
                      Remark

(1) $f(x) > 0$ for $x = 0, 1, 2, \ldots$

(2) $\displaystyle\sum_{x=0}^{\infty} f(x) = \sum_{x=0}^{\infty} \frac{\lambda^{x} e^{-\lambda}}{x!} = e^{-\lambda}\sum_{x=0}^{\infty} \frac{\lambda^{x}}{x!} = e^{-\lambda} e^{\lambda} = 1.$

So f is a p.m.f.
 Mean & variance of Poisson distribution

Thm:
Let X be a r.v. with a Poisson distribution with parameter $\lambda$. Then
$$M_X(t) = e^{\lambda(e^{t} - 1)},$$
$$\mu_X = M_X'(0) = \lambda,$$
$$\sigma_X^{2} = M_X''(0) - [M_X'(0)]^{2} = (\lambda^{2} + \lambda) - \lambda^{2} = \lambda.$$
Pf:
$$M_X(t) = E[e^{tX}] = \sum_{x=0}^{\infty} e^{tx}\,\frac{\lambda^{x} e^{-\lambda}}{x!} = e^{-\lambda}\sum_{x=0}^{\infty} \frac{(\lambda e^{t})^{x}}{x!} = e^{-\lambda} e^{\lambda e^{t}} = e^{\lambda(e^{t}-1)}.$$

$$M_X'(t) = \lambda e^{t} e^{\lambda(e^{t}-1)}, \qquad M_X''(t) = (\lambda e^{t})^{2} e^{\lambda(e^{t}-1)} + \lambda e^{t} e^{\lambda(e^{t}-1)}.$$

$$\mu_X = M_X'(0) = \lambda,$$
$$\sigma_X^{2} = M_X''(0) - [M_X'(0)]^{2} = (\lambda^{2} + \lambda) - \lambda^{2} = \lambda.$$
EX 3.5-1
Let X have a Poisson distribution with a mean of $\lambda = 5$. Then

$$P(X \le 6) = \sum_{x=0}^{6} \frac{5^{x} e^{-5}}{x!} = 0.762 \quad \text{(from the table)},$$

$$P(X > 5) = 1 - P(X \le 5) = 1 - 0.616 = 0.384 \quad \text{(from the table)},$$
and
$$P(X = 6) = P(X \le 6) - P(X \le 5) = 0.762 - 0.616 = 0.146.$$
                       Remark

• If events in a Poisson process occur at a mean rate of $\lambda$ per unit, then the expected number of occurrences in an interval of length t is $\lambda t$.

• The number of occurrences, say X, in the interval of length t has the Poisson p.m.f.
$$f(x) = \frac{(\lambda t)^{x} e^{-\lambda t}}{x!}, \quad x = 0, 1, 2, \ldots$$
Ex:
It is known that bacteria of a certain kind occur in water at the rate of three bacteria per cubic centimeter of water. Assume that this phenomenon obeys the Poisson probability law. What is the probability that a sample of two cubic centimeters of water contains
(1) at most two bacteria?
(2) at least three bacteria?
Sol:
$N(1) \sim \mathrm{Po}(\lambda = 3)$, so $N(2) \sim \mathrm{Po}(\lambda = 6)$.

$$\text{(1)}\quad P(N(2) \le 2) = \sum_{x=0}^{2} \frac{6^{x} e^{-6}}{x!} = 25e^{-6}.$$

$$\text{(2)}\quad P(N(2) \ge 3) = 1 - P(N(2) \le 2) = 1 - 25e^{-6}.$$
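A minimal sketch (not from the slides): the bacteria example in Python, with the count in 2 cubic centimeters Poisson with mean 3 × 2 = 6.

    from math import exp, factorial

    lam = 3 * 2
    p_at_most_2 = sum(lam**x * exp(-lam) / factorial(x) for x in range(3))
    print(p_at_most_2, 25 * exp(-6))                   # both about 0.0620
    print(1 - p_at_most_2)                             # at least three, about 0.9380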
Example 3.5-5
Telephone calls enter a college switchboard on the
  average of two every 3 minutes. If one assumes a
  Poisson distribution, what is the probability of five
  or more calls arriving in a 9-minute period?
   Sol:
   $N(3) \sim \mathrm{Po}(\lambda = 2)$, so $N(9) \sim \mathrm{Po}(\lambda = 6)$. Let $X = N(9)$.
   $$P(X \ge 5) = 1 - P(X \le 4) = 1 - \sum_{x=0}^{4} \frac{6^{x} e^{-6}}{x!} = 1 - 0.285 = 0.715 \quad \text{(from the table)}.$$
                            Remark
• If X has a Poisson distribution with parameter $\lambda > 0$ and n is large, then
$$P(X = x) \approx \binom{n}{x}\left(\frac{\lambda}{n}\right)^{x}\left(1 - \frac{\lambda}{n}\right)^{n-x}.$$

• That is, if $X \sim b(n, p)$ with large n and small p, then
$$\binom{n}{x} p^{x}(1-p)^{n-x} \approx \frac{(np)^{x} e^{-np}}{x!}.$$

• The approximation is quite accurate if $n \ge 20$ and $p \le 0.05$, or if $n \ge 100$ and $p \le 0.10$.
Example 3.5-6
A manufacturer of Christmas tree light bulbs knows that 2% of its bulbs are defective. To approximate the probability that a box of 100 of these bulbs contains at most three defective bulbs, we use

(1) the Poisson distribution with $\lambda = 100(0.02) = 2$, which gives
$$\sum_{x=0}^{3} \frac{2^{x} e^{-2}}{x!} = 0.857;$$

(2) the binomial distribution, which gives
$$\sum_{x=0}^{3} \binom{100}{x}(0.02)^{x}(0.98)^{100-x} = 0.859.$$


  The Poisson approximation is extremely close to
  the true value, but much easier to find.
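A minimal sketch (not from the slides): Example 3.5-6 reproduced in Python, comparing the Poisson approximation with the exact binomial value.

    from math import comb, exp, factorial

    n, p = 100, 0.02
    lam = n * p
    poisson_approx = sum(lam**x * exp(-lam) / factorial(x) for x in range(4))
    binomial_exact = sum(comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(4))
    print(round(poisson_approx, 3), round(binomial_exact, 3))   # 0.857 0.859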

				