4. Poisson Processes
4.1 Definition
4.2 Derivation of exponential distribution
4.3 Properties of exponential distribution
a. Normalized spacings
b. Campbell's Theorem
c. Minimum of several exponential random variables
d. Relation to Erlang and Gamma Distribution
e. Guarantee Time
f. Random Sums of Exponential Random Variables
4.4 Counting processes and the Poisson distribution
4.5 Superposition of Counting Processes
4.6 Splitting of Poisson Processes
4.7 Non-homogeneous Poisson Processes
4.8 Compound Poisson Processes
4 Poisson Processes
4.1 Definition
Consider a series of events occurring over time:

    0 ---X------X---X---------X---> Time

Define $T_i$ as the time between the $(i-1)$st and $i$th events. Then

$$S_n = T_1 + T_2 + \cdots + T_n = \text{time to the } n\text{th event}.$$

Define $N(t)$ = no. of events in $(0, t]$. Then

$$P\{S_n > t\} = P\{N(t) < n\}$$

If the time to the $n$th event exceeds $t$, then the number of events in $(0, t]$ must be less than $n$.
$$p_t(n) = P\{N(t) = n\} = P\{N(t) < n + 1\} - P\{N(t) < n\} = P\{S_{n+1} > t\} - P\{S_n > t\}$$

where $S_n = T_1 + T_2 + \cdots + T_n$.

Define $Q_{n+1}(t) = P\{S_{n+1} > t\}$ and $Q_n(t) = P\{S_n > t\}$. Then we can write

$$p_t(n) = Q_{n+1}(t) - Q_n(t)$$

and taking Laplace transforms,

$$p^*_s(n) = Q^*_{n+1}(s) - Q^*_n(s)$$
If $q_{n+1}(t)$ and $q_n(t)$ are the respective pdfs, then

$$Q^*_{n+1}(s) = \frac{1 - q^*_{n+1}(s)}{s}, \qquad Q^*_n(s) = \frac{1 - q^*_n(s)}{s}$$

and

$$p^*_s(n) = \frac{1 - q^*_{n+1}(s)}{s} - \frac{1 - q^*_n(s)}{s} = \frac{q^*_n(s) - q^*_{n+1}(s)}{s}$$

Recall $T_1$ is the time between $0$ and the first event, $T_2$ is the time between the first and second events, etc.
Assume $\{T_i\}$, $i = 1, 2, \ldots$ are independent and, with the exception of $i = 1$, are identically distributed with pdf $q(t)$. Also assume $T_1$ has pdf $q_1(t)$. Then

$$q^*_{n+1}(s) = q^*_1(s)[q^*(s)]^n, \qquad q^*_n(s) = q^*_1(s)[q^*(s)]^{n-1}$$

and

$$p^*_s(n) = \frac{q^*_n(s) - q^*_{n+1}(s)}{s} = q^*_1(s)\,[q^*(s)]^{n-1}\,\frac{1 - q^*(s)}{s}$$

Note that $q_1(t)$ is a forward recurrence time. Hence

$$q_1(t) = \frac{Q(t)}{m} \quad \text{and} \quad q^*_1(s) = \frac{1 - q^*(s)}{sm}$$

so that

$$p^*_s(n) = [q^*(s)]^{n-1}\,\frac{1}{m}\left[\frac{1 - q^*(s)}{s}\right]^2$$
Assume $q(t) = \lambda e^{-\lambda t}$ for $t > 0$ ($m = 1/\lambda$), $= 0$ otherwise. Then

$$q^*(s) = \frac{\lambda}{\lambda + s}, \qquad \frac{1 - q^*(s)}{s} = \frac{1}{\lambda + s}$$

and

$$p^*_s(n) = \lambda\left(\frac{\lambda}{\lambda + s}\right)^{n-1}\left(\frac{1}{\lambda + s}\right)^2 = \frac{1}{\lambda}\left(\frac{\lambda}{\lambda + s}\right)^{n+1}$$

However $[\lambda/(\lambda + s)]^{n+1}$ is the Laplace transform of a gamma distribution with parameters $(\lambda, n + 1)$; i.e.

$$f(t) = \frac{\lambda e^{-\lambda t}(\lambda t)^{(n+1)-1}}{\Gamma(n + 1)} \quad \text{for } t > 0$$
$$\therefore\ p_t(n) = \mathcal{L}^{-1}\{p^*_s(n)\} = \frac{e^{-\lambda t}(\lambda t)^n}{n!}$$

which is the Poisson distribution. Hence $N(t)$ follows a Poisson distribution and

$$P\{N(t) < n\} = \sum_{r=0}^{n-1} p_t(r) = P\{S_n > t\}$$

$$P\{S_n > t\} = \sum_{r=0}^{n-1} \frac{e^{-\lambda t}(\lambda t)^r}{r!}$$

We have shown that if the times between events are iid following an exponential distribution, then $N(t)$ is Poisson with $E[N(t)] = \lambda t$.
Alternatively, if $N(t)$ follows a Poisson distribution, then $S_n$ has a gamma distribution with pdf

$$f(t) = \frac{\lambda e^{-\lambda t}(\lambda t)^{n-1}}{\Gamma(n)} \quad \text{for } t > 0.$$

This implies the times between events are exponential.

Since $P\{S_n > t\} = P\{N(t) < n\}$ we have proved the identity

$$P\{S_n > t\} = \int_t^\infty \frac{\lambda e^{-\lambda x}(\lambda x)^{n-1}}{\Gamma(n)}\,dx = \sum_{r=0}^{n-1} \frac{e^{-\lambda t}(\lambda t)^r}{r!}$$

This identity is usually proved by using integration by parts.

When $N(t)$ follows a Poisson distribution with $E[N(t)] = \lambda t$, the set $\{N(t), t > 0\}$ is called a Poisson process.
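The identity is easy to verify numerically. A minimal sketch, assuming SciPy is available (`gammainc` is SciPy's regularized lower incomplete gamma function, so the gamma survival probability is `1 - gammainc(n, lam*t)`; the parameter values are illustrative):

```python
import math
from scipy.special import gammainc

lam, t, n = 1.5, 2.0, 4  # illustrative parameters

# Left side: P{S_n > t}, survival of a gamma(n, rate lam) variable.
lhs = 1 - gammainc(n, lam * t)

# Right side: P{N(t) < n}, a Poisson(lam*t) partial sum.
rhs = sum(math.exp(-lam * t) * (lam * t) ** r / math.factorial(r) for r in range(n))

print(lhs, rhs)  # both ~0.6472; the two sides agree
```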
4.2 Derivation of Exponential Distribution

Define $P_n(h)$ = prob. of $n$ events in a time interval $h$. Assume

$$P_0(h) = 1 - \lambda h + o(h); \quad P_1(h) = \lambda h + o(h); \quad P_n(h) = o(h) \ \text{for } n > 1$$

where $o(h)$ means a term $\psi(h)$ such that $\lim_{h \to 0} \psi(h)/h = 0$. Consider a finite time interval $(0, t)$. Divide the interval into $n$ sub-intervals of length $h$. Then $t = nh$.

    0 |--h--|--h--|--h--| ... |--h--| t        (t = nh)

The probability of no events in $(0, t)$ is equivalent to no events in each sub-interval; i.e.

$$P_n\{T > t\} = P\{\text{no events in } (0, t)\}, \qquad T = \text{time to the first event}.$$
Suppose the probabilities of events in any sub-intervals are independent of each other (assumption of independent increments). Then

$$P_n\{T > t\} = [1 - \lambda h + o(h)]^n = \left[1 - \frac{\lambda t}{n} + o(h)\right]^n = \left(1 - \frac{\lambda t}{n}\right)^n + n\,o(h)\left(1 - \frac{\lambda t}{n}\right)^{n-1} + \cdots$$

Since

$$\lim_{n \to \infty}\left(1 - \frac{\lambda t}{n}\right)^n = e^{-\lambda t}$$

and

$$\lim_{n \to \infty} n\,o(h) = \lim_{h \to 0} \frac{t}{h}\,o(h) = 0$$

we have $P\{T > t\} = \lim_{h \to 0} P_n\{T > t\} = e^{-\lambda t}$.

$\therefore$ The pdf of $T$ is $-\frac{d}{dt}P\{T > t\} = \lambda e^{-\lambda t}$. (Exponential distribution)
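A quick numerical illustration of the limit (a sketch; the values of `lam` and `t` are arbitrary):

```python
import math

lam, t = 2.0, 1.5  # arbitrary illustrative values
for n in (10, 100, 1000, 10000):
    print(n, (1 - lam * t / n) ** n)   # approaches the exponential limit
print("e^{-lam*t} =", math.exp(-lam * t))  # limit ~0.0498
```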
4.3 Properties of Exponential Distribution

$$q(t) = \lambda e^{-\lambda t}, \quad t > 0$$
$$E(T) = 1/\lambda = m, \qquad V(T) = 1/\lambda^2 = m^2$$
$$q^*(s) = \frac{\lambda}{\lambda + s}$$

Consider $r < t$. Then the conditional distribution is

$$P\{T > r + t \mid T > r\} = \frac{Q(r + t)}{Q(r)} = \frac{e^{-\lambda(r+t)}}{e^{-\lambda r}} = e^{-\lambda t}$$

i.e. $P\{T > r + t \mid T > r\} = P\{T > t\}$ for all $r$ and $t$.

Also $P\{T > r + t\} = e^{-\lambda(r+t)} = Q(r)Q(t) = Q(r + t)$.

The exponential distribution is the only distribution satisfying $Q(r + t) = Q(r)Q(t)$.
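The memoryless property is easy to check by simulation; a minimal sketch assuming NumPy (the rate and the values of `r` and `t` are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, r, t = 1.0, 0.7, 1.2  # illustrative rate and time points
T = rng.exponential(1 / lam, 1_000_000)

cond = (T > r + t).sum() / (T > r).sum()       # P{T > r+t | T > r}
print(cond, np.mean(T > t), np.exp(-lam * t))  # all ~0.301
```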
Proof:

$$Q\!\left(\tfrac{2}{n}\right) = Q\!\left(\tfrac{1}{n}\right)^2 \quad \text{and in general} \quad Q\!\left(\tfrac{m}{n}\right) = Q\!\left(\tfrac{1}{n}\right)^m$$

$$Q(1) = Q\!\left(\tfrac{1}{n} + \tfrac{1}{n} + \cdots + \tfrac{1}{n}\right) = Q\!\left(\tfrac{1}{n}\right)^n, \quad m = n$$

$$\therefore\ Q\!\left(\tfrac{m}{n}\right) = \left[Q\!\left(\tfrac{1}{n}\right)^n\right]^{m/n} = Q(1)^{m/n}.$$

If $Q(\cdot)$ is continuous, or left or right continuous, we can write

$$Q(t) = Q(1)^t.$$

Since $Q(1)^t = e^{t\log Q(1)}$, we have that $\log Q(1)$ is the negative of the rate parameter. Hence

$$Q(t) = e^{-\lambda t} \quad \text{where } \lambda = -\log Q(1).$$
a. Normalized Spacings

Let $\{T_i\}$, $i = 1, 2, \ldots, n$ be iid following an exponential distribution with $E(T_i) = 1/\lambda$.

Define $T_{(1)} \le T_{(2)} \le \cdots \le T_{(n)}$ (order statistics).

Then the joint distribution of the order statistics is

$$f(t_{(1)}, t_{(2)}, \ldots, t_{(n)})\,dt_{(1)} \cdots dt_{(n)} = P\{t_{(1)} < T_{(1)} \le t_{(1)} + dt_{(1)}, \ldots\}$$
$$= \frac{n!}{1!\,1!\cdots 1!}\,\lambda e^{-\lambda t_{(1)}}\,\lambda e^{-\lambda t_{(2)}} \cdots \lambda e^{-\lambda t_{(n)}}\,dt_{(1)} \cdots dt_{(n)}$$

$$f(t_{(1)}, \ldots, t_{(n)}) = n!\,\lambda^n e^{-\lambda\sum_1^n t_{(i)}} = n!\,\lambda^n e^{-\lambda S}$$

where $S = \sum_1^n t_{(i)} = \sum_1^n t_i$, $0 \le t_{(1)} \le \cdots \le t_{(n)}$.
$$f(t_{(1)}, \ldots, t_{(n)}) = n!\,\lambda^n e^{-\lambda S}, \quad 0 \le t_{(1)} \le \cdots \le t_{(n)}, \quad S = \sum_1^n t_{(i)}$$

Consider

$$Z_1 = nT_{(1)}, \quad Z_2 = (n-1)(T_{(2)} - T_{(1)}), \ \ldots, \ Z_i = (n-i+1)(T_{(i)} - T_{(i-1)}), \ \ldots, \ Z_n = T_{(n)} - T_{(n-1)}.$$

We shall show that the $\{Z_i\}$ are iid exponential.

$$f(z_1, z_2, \ldots, z_n) = f(t_{(1)}, \ldots, t_{(n)})\left|\frac{\partial(t_{(1)}, \ldots, t_{(n)})}{\partial(z_1, \ldots, z_n)}\right|$$

where $\partial(t_{(1)}, \ldots, t_{(n)})/\partial(z_1, \ldots, z_n)$ is the determinant of the Jacobian.
We shall find the Jacobian by making use of the relation

$$\frac{\partial(t_{(1)}, \ldots, t_{(n)})}{\partial(z_1, \ldots, z_n)} = \left[\frac{\partial(z_1, \ldots, z_n)}{\partial(t_{(1)}, \ldots, t_{(n)})}\right]^{-1}$$

Since $Z_i = (n-i+1)(T_{(i)} - T_{(i-1)})$ with $T_{(0)} = 0$,

$$\frac{\partial Z_i}{\partial T_{(j)}} = \begin{cases} n-i+1 & j = i \\ -(n-i+1) & j = i-1 \\ 0 & \text{otherwise} \end{cases}$$
$$\frac{\partial(z_1, \ldots, z_n)}{\partial(t_{(1)}, \ldots, t_{(n)})} = \begin{vmatrix} n & 0 & 0 & \cdots & 0 \\ -(n-1) & n-1 & 0 & \cdots & 0 \\ 0 & -(n-2) & n-2 & \cdots & 0 \\ \vdots & & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & -1 & 1 \end{vmatrix}$$
Note: The determinant of a triangular matrix is the product of the main diagonal terms.

$$\therefore\ \frac{\partial(z_1, \ldots, z_n)}{\partial(t_{(1)}, \ldots, t_{(n)})} = n(n-1)(n-2) \cdots 2 \cdot 1 = n!$$

and

$$f(z_1, z_2, \ldots, z_n) = n!\,\lambda^n e^{-\lambda S}\,\frac{1}{n!} = \lambda^n e^{-\lambda\sum_1^n z_i} = \lambda^n e^{-\lambda S}$$

as $S = \sum_{i=1}^n t_{(i)} = z_1 + \cdots + z_n$. Hence the $\{Z_i\}$ are iid exponential with parameter $\lambda$.

The spacings $Z_i = (n-i+1)(T_{(i)} - T_{(i-1)})$ are sometimes called normalized spacings.
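A small simulation sketch of this result, assuming NumPy (sample size and rate are illustrative): the normalized spacings computed from exponential order statistics should again look like iid exponential($\lambda$) draws.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, n, reps = 2.0, 5, 200_000  # illustrative rate and sample size

T = np.sort(rng.exponential(1 / lam, size=(reps, n)), axis=1)
prev = np.hstack([np.zeros((reps, 1)), T[:, :-1]])   # T_(0) = 0
Z = (n - np.arange(n)) * (T - prev)                  # Z_i = (n-i+1)(T_(i) - T_(i-1))

# Each column of Z should be exponential(lam): mean 1/lam, var 1/lam^2.
print(Z.mean(axis=0))  # all ~0.5
print(Z.var(axis=0))   # all ~0.25
```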
Homework:

1. Suppose there are $n$ observations which are iid exponential ($E(T_i) = 1/\lambda$). However there are $r$ non-censored observations and $(n - r)$ censored observations, all censored at $t_{(r)}$. Show $Z_i = (n-i+1)(T_{(i)} - T_{(i-1)})$ for $i = 1, 2, \ldots, r$ are iid exponential.

2. Show that

$$T_{(i)} = \frac{Z_1}{n} + \frac{Z_2}{n-1} + \cdots + \frac{Z_i}{n-i+1}$$

and prove

$$E(T_{(i)}) = \frac{1}{\lambda}\sum_{j=1}^{i} \frac{1}{n - j + 1}$$

Find the variances and covariances of $\{T_{(i)}\}$.
b. Campbell's Theorem

Let $\{N(t), t > 0\}$ be a Poisson process. Assume $n$ events occur in the interval $(0, t]$. Note that $N(t) = n$ is the realization of a random variable and has probability $P\{N(t) = n\} = e^{-\lambda t}(\lambda t)^n/n!$.

Define $W_n$ = waiting time for the $n$th event.

If $\{T_i\}$, $i = 1, 2, \ldots, n$ are the random variables representing the times between events,

$$f(t_1, \ldots, t_n) = \prod_{i=1}^n \lambda e^{-\lambda t_i} = \lambda^n e^{-\lambda\sum_1^n t_i}$$

But $\sum_1^n t_i = w_n$, hence

$$f(t_1, \ldots, t_n) = \lambda^n e^{-\lambda w_n}$$
Now consider the transformation

$$W_1 = t_1, \quad W_2 = t_1 + t_2, \ \ldots, \ W_n = t_1 + t_2 + \cdots + t_n$$

The distribution of $W = (W_1, W_2, \ldots, W_n)$ is

$$f(W) = f(t)\left|\frac{\partial(t)}{\partial(W)}\right|$$

where $\partial(t)/\partial(W)$ is the determinant of the Jacobian.
Note:

$$\frac{\partial(W)}{\partial(t)} = \begin{vmatrix} 1 & 0 & 0 & \cdots & 0 \\ 1 & 1 & 0 & \cdots & 0 \\ 1 & 1 & 1 & \cdots & 0 \\ \vdots & & & \ddots & \vdots \\ 1 & 1 & 1 & \cdots & 1 \end{vmatrix} = 1$$

and

$$\frac{\partial(t)}{\partial(W)} = \left[\frac{\partial(W)}{\partial(t)}\right]^{-1} = 1$$

$$\therefore\ f(w_1, \ldots, w_n) = \lambda^n e^{-\lambda w_n}, \quad 0 < w_1 \le \cdots \le w_n < t.$$

But there are no events in the interval $(w_n, t]$. This carries probability $e^{-\lambda(t - w_n)}$. Hence the joint distribution of the $W$ is

$$f(W) = \lambda^n e^{-\lambda w_n}\,e^{-\lambda(t - w_n)} = \lambda^n e^{-\lambda t}$$
$$f(W) = \lambda^n e^{-\lambda t}, \quad 0 \le w_1 \le w_2 \le \cdots \le w_n < t$$

Consider

$$f(W \mid N(t) = n) = \frac{\lambda^n e^{-\lambda t}}{e^{-\lambda t}(\lambda t)^n/n!} = n!/t^n.$$

This is the joint distribution of the order statistics from a uniform $(0, t)$ distribution; i.e., $f(x) = 1/t$, $0 < x \le t$.

Hence

$$E(W_i \mid N(t) = n) = \frac{it}{n + 1}, \quad i = 1, 2, \ldots, n$$

We can consider the unordered waiting times, conditional on $N(t) = n$, as following a uniform $(0, t)$ distribution.
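A brief simulation sketch of this conditional-uniformity result, assuming NumPy (rate, horizon, and `n` are illustrative): generate Poisson-process paths, keep those with exactly $n$ events in $(0, t]$, and compare the mean waiting times with $it/(n+1)$.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, t, n, reps = 1.0, 10.0, 8, 200_000   # illustrative values

gaps = rng.exponential(1 / lam, size=(reps, 40))
arrivals = np.cumsum(gaps, axis=1)
keep = (arrivals <= t).sum(axis=1) == n   # paths with exactly n events in (0, t]
W = arrivals[keep][:, :n]                 # their n waiting times

print(W.mean(axis=0))                     # ~ i*t/(n+1)
print([i * t / (n + 1) for i in range(1, n + 1)])
```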
Since $w_1 = t_1,\ w_2 = t_1 + t_2,\ \ldots,\ w_n = t_1 + \cdots + t_n$,

$$t_i = w_i - w_{i-1} \quad (w_0 = 0)$$

The differences between the waiting times are the original times $t_i$. These times follow the distribution conditional on $N(t) = n$; i.e.

$$f(t_1, \ldots, t_n \mid N(t) = n) = n!/t^n$$

Note that if $f(t_i) = 1/t$, $0 < t_i < t$, the joint distribution for $i = 1, 2, \ldots, n$ of $n$ independent uniform $(0, t)$ random variables is $f(t) = 1/t^n$. If $0 < t_{(1)} \le t_{(2)} \le \cdots \le t_{(n)} < t$, the distribution of the order statistics is $f(t_{(1)}, \ldots, t_{(n)}) = n!/t^n$, which is the same as $f(t_1, \ldots, t_n \mid N(t) = n)$.
c. Minimum of Several Exponential Random Variables

Let $T_i$ ($i = 1, \ldots, n$) be independent exponential r.v.s with parameters $\lambda_i$ and let $T = \min(T_1, \ldots, T_n)$.

$$P\{T > t\} = P\{T_1 > t, T_2 > t, \ldots, T_n > t\} = \prod_{i=1}^n P\{T_i > t\} = \prod_{i=1}^n e^{-\lambda_i t} = e^{-\lambda t}, \quad \lambda = \sum_1^n \lambda_i$$

$T$ is exponential with parameter $\lambda$:

$$P\{T > t\} = e^{-\lambda t}, \qquad \lambda = \sum_{i=1}^n \lambda_i, \qquad T = \min(T_1, \ldots, T_n)$$

If all $\lambda_i = \lambda_0$, then $\lambda = n\lambda_0$ and $P\{T > t\} = e^{-n\lambda_0 t}$.
Define $N$ as the index of the random variable which is the smallest failure time. For example, if $T_r \le T_i$ for all $i$, then $N = r$.

Consider $P\{T > t,\ T_r \le T_i \text{ for all } i\} = P\{N = r, T > t\}$:

$$P\{N = r, T > t\} = P\{T_r > t,\ T_i \ge T_r,\ i \ne r\}$$
$$= \int_t^\infty P\{T_i > t_r,\ i \ne r \mid t_r\}\,f(t_r)\,dt_r$$
$$= \int_t^\infty e^{-(\lambda - \lambda_r)t_r}\,\lambda_r e^{-\lambda_r t_r}\,dt_r$$
$$= \lambda_r \int_t^\infty e^{-\lambda t_r}\,dt_r = \frac{\lambda_r}{\lambda}\,e^{-\lambda t}$$
$$P\{N = r, T > t\} = \frac{\lambda_r}{\lambda}\,e^{-\lambda t}$$

$$P\{N = r, T > 0\} = P\{N = r\} = \frac{\lambda_r}{\lambda}, \qquad \lambda = \sum_1^n \lambda_i$$

$$P\{N = r, T > t\} = P\{N = r\}\,P\{T > t\}$$

$\Rightarrow N$ (index of smallest) and $T$ are independent.

If $\lambda_i = \lambda_0$,

$$P\{N = r\} = \frac{\lambda_0}{n\lambda_0} = \frac{1}{n}$$

(All populations have the same prob. of being the smallest.)
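These facts are easy to check by simulation; a minimal sketch assuming NumPy (the three rates are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
lams = np.array([0.5, 1.0, 2.5])                 # illustrative rates
lam = lams.sum()

T = rng.exponential(1 / lams, size=(500_000, 3)) # column i ~ exponential(lams[i])
Tmin, N = T.min(axis=1), T.argmin(axis=1)

print(Tmin.mean(), 1 / lam)                      # min is exponential(sum of rates)
print(np.bincount(N) / len(N), lams / lam)       # P{N = r} = lam_r / lam

# Independence check: P{T > t} should not change given the winning index.
t = 0.3
print(np.mean(Tmin > t), np.mean(Tmin[N == 2] > t))  # both ~exp(-lam*t)
```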
d. Relation to Erlang and Gamma Distributions

Consider $T = T_1 + \cdots + T_n$. Since $q^*_i(s) = \lambda_i/(\lambda_i + s)$,

$$q^*_T(s) = \prod_{i=1}^n \frac{\lambda_i}{\lambda_i + s}$$

which is the L.T. of the Erlang distribution. If the $\lambda_i$ are all distinct,

$$q(t) = \sum_{i=1}^n A_i\,\lambda_i e^{-\lambda_i t}, \qquad A_i = \prod_{j \ne i} \frac{\lambda_j}{\lambda_j - \lambda_i}$$

If all $\lambda_i = \lambda$,

$$q^*_T(s) = \left(\frac{\lambda}{\lambda + s}\right)^n, \qquad q(t) = \frac{\lambda(\lambda t)^{n-1}e^{-\lambda t}}{\Gamma(n)} \quad \text{(Gamma distribution)}$$
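A sketch verifying the partial-fraction density against simulation, assuming NumPy (the rates are illustrative and distinct, as the formula requires):

```python
import numpy as np

rng = np.random.default_rng(5)
lams = np.array([1.0, 2.0, 4.0])                   # distinct illustrative rates

# A_i = prod_{j != i} lam_j / (lam_j - lam_i)
A = np.array([np.prod([lj / (lj - li) for lj in lams if lj != li]) for li in lams])

def density(t):
    """q(t) = sum_i A_i * lam_i * exp(-lam_i * t)."""
    return np.sum(A * lams * np.exp(-np.outer(t, lams)), axis=1)

T = rng.exponential(1 / lams, size=(500_000, 3)).sum(axis=1)
hist, edges = np.histogram(T, bins=50, range=(0, 5), density=True)
mid = (edges[:-1] + edges[1:]) / 2
print(np.abs(hist - density(mid)).max())           # small, e.g. < 0.02
```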
e. Guarantee Time

Consider the r.v. following the distribution having pdf

$$q(t) = \lambda e^{-\lambda(t - G)} \quad \text{for } t > G; \qquad = 0 \quad \text{for } t \le G$$

The parameter $G$ is called a guarantee time.

If the transformation $Y = T - G$ is made, then $f(y) = \lambda e^{-\lambda y}$ for $y > 0$.

$$\therefore\ E(Y) = 1/\lambda, \quad V(Y) = 1/\lambda^2, \ \ldots$$

Since $T = Y + G$, $E(T) = \frac{1}{\lambda} + G$, and the central moments of $T$ and $Y$ are the same.
f. Random Sums of Exponential Random Variables

Let $\{T_i\}$, $i = 1, 2, \ldots, N$ be iid with $f(t) = \lambda e^{-\lambda t}$ and consider

$$S_N = T_1 + T_2 + \cdots + T_N \quad \text{with} \quad P\{N = n\} = p_n.$$

The Laplace transform of $S_N$ is $[\lambda/(\lambda + s)]^n$ for fixed $N = n$. Hence

$$f^*(s) = E\left[\left(\frac{\lambda}{\lambda + s}\right)^N\right]$$

resulting in a pdf which is a mixture of gamma distributions:

$$f(t) = \sum_{n=1}^\infty \frac{\lambda(\lambda t)^{n-1}e^{-\lambda t}}{\Gamma(n)}\,p_n$$
Suppose $p_n = p^{n-1}q$, $n = 1, 2, \ldots$ (geometric distribution). Then

$$f^*(s) = \sum_{n=1}^\infty \left(\frac{\lambda}{\lambda + s}\right)^n p^{n-1}q = \frac{q}{p}\sum_{n=1}^\infty \left(\frac{p\lambda}{\lambda + s}\right)^n$$

$$= \frac{q}{p}\cdot\frac{p\lambda/(\lambda + s)}{1 - \dfrac{p\lambda}{\lambda + s}} = \frac{q}{p}\cdot\frac{p\lambda}{s + \lambda(1 - p)} = \frac{\lambda q}{s + \lambda q}$$

$$f^*(s) = \frac{\lambda q}{s + \lambda q}$$

$\therefore\ S_N = T_1 + T_2 + \cdots + T_N$ with $P\{N = n\} = p^{n-1}q$ has an exponential distribution with parameter $(\lambda q)$.
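A simulation sketch of this result, assuming NumPy (the values of `lam` and `q` are illustrative). Note NumPy's `geometric` returns counts on $\{1, 2, \ldots\}$ with $P\{N = n\} = (1-q)^{n-1}q$, matching $p_n = p^{n-1}q$ for $p = 1 - q$:

```python
import numpy as np

rng = np.random.default_rng(6)
lam, q, reps = 3.0, 0.25, 100_000     # illustrative rate and success prob.

N = rng.geometric(q, reps)            # P{N = n} = (1-q)^{n-1} q, n >= 1
S = np.array([rng.exponential(1 / lam, n).sum() for n in N])

# S_N should be exponential with rate lam*q: mean 1/(lam*q), var 1/(lam*q)^2.
print(S.mean(), 1 / (lam * q))        # ~1.333
print(S.var(), 1 / (lam * q) ** 2)    # ~1.778
```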
4.4 Counting Processes and the Poisson Distribution

Definition: A stochastic process $\{N(t), t > 0\}$ is said to be a counting process when $N(t)$ denotes the number of events that have occurred in the interval $(0, t]$. It has the properties:

(i) $N(t)$ is integer valued
(ii) $N(t) \ge 0$
(iii) If $s < t$, $N(s) \le N(t)$ and $N(t) - N(s)$ = number of events occurring in $(s, t]$.
A counting process has independent increments if the numbers of events in disjoint intervals are independent; i.e. $N(s)$ and $N(t) - N(s)$ are independent random variables.

A counting process has stationary increments if the probability of the number of events in any interval depends only on the length of the interval; i.e. $N(t)$ and $N(s + t) - N(s)$ have the same probability distribution for all $s$.

A Poisson process is a counting process having independent and stationary increments.
TH. Assume $\{N(t), t \ge 0\}$ is a Poisson process. Then the distribution of $N_s(t) = N(s + t) - N(s)$ is independent of $s$ and depends only on the length of the interval, i.e.

$$P\{N(t + s) - N(s) \mid N(s)\} = P\{N(t + s) - N(s)\}$$

for all $s$. This implies that knowledge of $N(u)$ for $0 < u \le s$ is also irrelevant:

$$P\{N(t + s) - N(s) \mid N(u),\ 0 < u \le s\} = P\{N(t + s) - N(s)\}.$$

This feature defines a stationary process.
TH. A Poisson process has independent increments.

Consider $0 \le t_1 < t_2 < t_3 < t_4$:

    0 ------ t1 --X-- t2 ------ t3 --X-- t4 ---> Time

Consider events in $(t_3, t_4]$, i.e. $N(t_4) - N(t_3)$:

$$P\{N(t_4) - N(t_3) \mid N(u),\ 0 < u \le t_3\} = P\{N(t_4) - N(t_3)\}.$$

The distribution is independent of what happened prior to $t_3$. Hence if the intervals $(t_1, t_2]$ and $(t_3, t_4]$ are non-overlapping,

$$N(t_2) - N(t_1) \quad \text{and} \quad N(t_4) - N(t_3) \quad \text{are independent.}$$
TH. $\mathrm{Cov}(N(t), N(s + t)) = \lambda t$ (Poisson process)

Proof: $N(s + t) - N(t)$ is independent of $N(t)$, so

$$\mathrm{Cov}(N(s + t) - N(t),\ N(t)) = 0 \ \Rightarrow\ \mathrm{Cov}(N(s + t), N(t)) - V(N(t)) = 0$$

$$\therefore\ \mathrm{Cov}(N(s + t), N(t)) = V(N(t)) = \lambda t$$

as the variance of $N(t)$ is $\lambda t$.

An alternative statement of the theorem is

$$\mathrm{Cov}(N(s), N(t)) = \lambda\,\min(s, t)$$
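A short simulation sketch of the covariance identity, assuming NumPy (`lam`, `s`, `t` are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
lam, s, t, reps = 2.0, 1.0, 3.0, 300_000          # illustrative values

gaps = rng.exponential(1 / lam, size=(reps, 40))  # 40 gaps cover lam*t = 6 easily
arrivals = np.cumsum(gaps, axis=1)
Ns = (arrivals <= s).sum(axis=1)
Nt = (arrivals <= t).sum(axis=1)

print(np.cov(Ns, Nt)[0, 1], lam * min(s, t))      # both ~2.0
```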
TH. A counting process $\{N(t), t \ge 0\}$ is a Poisson process if and only if

(i) It has stationary and independent increments
(ii) $N(0) = 0$ and

$$P\{N(h) = 0\} = 1 - \lambda h + o(h)$$
$$P\{N(h) = 1\} = \lambda h + o(h)$$
$$P\{N(h) = j\} = o(h), \quad j > 1$$

Notes: The notation $o(h)$ ("little o of $h$") refers to some function $\psi(h)$ for which

$$\lim_{h \to 0}\frac{\psi(h)}{h} = 0$$

Divide the interval $(0, t]$ into $n$ sub-intervals of length $h$; i.e. $nh = t$. By stationarity,

$$P\{N(kh) - N((k-1)h) = j\} = P\{N(h) = j\}$$
$T$ = time to the first event beginning at $t = 0$.

$$P\{T > t\} = P\{N(t) = 0\} = P\{\text{no events in each sub-interval}\}$$

$$P\{N(t) = 0\} = P\{T > t\} = [1 - \lambda h + o(h)]^n$$
$$= (1 - \lambda h)^n + n(1 - \lambda h)^{n-1}o(h) + \cdots$$
$$= (1 - \lambda h)^n\left\{1 + \frac{n\,o(h)}{1 - \lambda h} + \cdots\right\}$$
$$= \left(1 - \frac{\lambda t}{n}\right)^n\left\{1 + \frac{t}{1 - \frac{\lambda t}{n}}\cdot\frac{o(h)}{h} + \cdots\right\}$$
$$\to e^{-\lambda t} \quad \text{as } n \to \infty,\ h \to 0$$

$$P\{T > t\} = e^{-\lambda t}$$

Hence $T$ is exponential; i.e. the time between events is exponential.

$\Rightarrow \{N(t), t \ge 0\}$ is a Poisson process.
4.5 Superposition of Counting Processes

Suppose there are $k$ counting processes which merge into a single counting process; e.g. $k = 3$:

    Process 1:      --X--------X----------------X--->
    Process 2:      ------X----------X--------------->
    Process 3:      -----------X---------X----X------>
    Merged process: --X---X----X-X---X---X----X--X-->

The merged process is called the superposition of the individual counting processes:

$$N(t) = N_1(t) + N_2(t) + \cdots + N_k(t)$$
A. Superposition of Poisson Processes

$$N(t) = N_1(t) + \cdots + N_k(t)$$

Suppose $\{N_i(t), t > 0\}$, $i = 1, 2, \ldots, k$ are independent Poisson processes with $E[N_i(t)] = \lambda_i t$.

Note that each of the counting processes has stationary and independent increments. Also $N(t)$ is Poisson with parameter

$$E(N(t)) = \sum_{i=1}^k \lambda_i t = \lambda t, \qquad \lambda = \sum_{i=1}^k \lambda_i$$

$\Rightarrow N(t)$ is a Poisson process. Hence $\{N(t), t \ge 0\}$ has stationary and independent increments.
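A simulation sketch of superposition, assuming NumPy (three illustrative rates): merging the event streams should give Poisson counts with rate $\lambda = \sum_i \lambda_i$.

```python
import numpy as np

rng = np.random.default_rng(8)
lams, t, reps = [0.5, 1.0, 1.5], 4.0, 100_000   # illustrative rates and horizon

counts = np.zeros(reps, dtype=int)
for lam_i in lams:
    gaps = rng.exponential(1 / lam_i, size=(reps, 40))
    counts += (np.cumsum(gaps, axis=1) <= t).sum(axis=1)

lam = sum(lams)
print(counts.mean(), counts.var(), lam * t)     # mean ~ var ~ 12
```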
B. General Case of Merged Processes

Consider the merged process formed from $k$ individual processes, each in equilibrium with interarrival survival function $Q(\cdot)$ and mean time between events $m$. Let $T_f(i)$ denote the forward recurrence time of process $i$ (the time from an arbitrary time origin to the next event of process $i$), and let $V_k = \min(T_f(1), \ldots, T_f(k))$ be the forward recurrence time of the merged process. Then

$$P\{V_k > v\} = G_k(v) = P\{T_f(1) > v,\ T_f(2) > v,\ \ldots,\ T_f(k) > v\} = Q_f(v)^k$$
$$G_k(v) = P\{V_k > v\} = Q_f(v)^k$$

where

$$Q_f(v) = \int_v^\infty q_f(x)\,dx, \qquad q_f(x) = \frac{Q(x)}{m}$$

Let $g_k(v)$ = pdf of the forward recurrence time of the merged process:

$$G_k(v) = \int_v^\infty g_k(x)\,dx = Q_f(v)^k$$

$$-\frac{d}{dv}G_k(v) = g_k(v) = k\,Q_f(v)^{k-1}q_f(v)$$

$$g_k(v) = k\,Q_f(v)^{k-1}\,\frac{Q(v)}{m}$$
Consider the transformation

$$z = \frac{V_k}{m/k} = \frac{kV_k}{m}, \qquad \frac{dz}{dv} = \frac{k}{m}$$

$$g_k(z) = g_k(v)\left|\frac{\partial v}{\partial z}\right| = k\,Q_f\!\left(\frac{mz}{k}\right)^{k-1}\frac{Q\!\left(\frac{mz}{k}\right)}{m}\cdot\frac{m}{k}$$

$$g_k(z) = Q\!\left(\frac{mz}{k}\right)\left[1 - \int_0^{mz/k}\frac{Q(x)}{m}\,dx\right]^{k-1}$$

as

$$Q_f\!\left(\frac{mz}{k}\right) = \int_{mz/k}^\infty \frac{Q(x)}{m}\,dx = 1 - \int_0^{mz/k}\frac{Q(x)}{m}\,dx$$

For fixed $z$, as $k \to \infty$, $\frac{mz}{k} \to 0$ and $Q\!\left(\frac{mz}{k}\right) \to 1$.
Also

$$\int_0^{mz/k}\frac{Q(x)}{m}\,dx \approx Q\!\left(\frac{mz}{k}\right)\cdot\frac{1}{m}\cdot\frac{mz}{k} = Q\!\left(\frac{mz}{k}\right)\frac{z}{k} \approx \frac{z}{k}$$

$\therefore$ as $k \to \infty$,

$$g_k(z) \approx \left(1 - \frac{z}{k}\right)^{k-1} \to e^{-z}$$

Thus as $k \to \infty$, the forward recurrence time (multiplied by $k/m$), $z = \frac{k}{m}V_k$, is distributed as a unit exponential distribution. Hence for large $k$, $V_k = \frac{m}{k}z$ has an asymptotic exponential distribution with parameter $\lambda = k/m$. Since the asymptotic forward recurrence time is exponential, the time between events (of the merged process) is asymptotically exponential.
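A simulation sketch of this asymptotic result, assuming NumPy. Here each component process is (purely for illustration) a renewal process with uniform(0, 2) interarrival times, so $m = 1$, observed at a fixed time origin after a long warm-up; the scaled forward recurrence time of the merged process should be approximately unit exponential for large $k$:

```python
import numpy as np

rng = np.random.default_rng(9)
k, m, origin = 50, 1.0, 100.0                    # k processes, mean gap m, warm-up

def forward_recurrence(reps):
    """Time from `origin` to the next event of a uniform(0,2)-renewal process."""
    gaps = rng.uniform(0, 2, size=(reps, 150))   # 150 gaps comfortably pass origin
    arrivals = np.cumsum(gaps, axis=1)
    idx = (arrivals <= origin).sum(axis=1)       # index of first arrival past origin
    return arrivals[np.arange(reps), idx] - origin

# Forward recurrence time of the merged process = min over the k processes.
V = np.min([forward_recurrence(20_000) for _ in range(k)], axis=0)
z = k * V / m
print(z.mean(), z.var())                         # both ~1 (unit exponential)
```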
Note: A forward recurrence time is exponential if and only if the time between events is exponential; i.e.

$$q_f(x) = \frac{Q(x)}{m} = \lambda e^{-\lambda x} \quad \text{if } Q(x) = e^{-\lambda x}$$

and if $q_f(x) = \lambda e^{-\lambda x}$, then $Q(x) = e^{-\lambda x}$.

Additional note: The merged process is $N(t) = \sum_{i=1}^k N_i(t)$. Suppose $E(N_i(t)) = \lambda t$. The units of $\lambda$ are no. of events per unit time; the units of $m$ are time per event. Thus $E(N(t)) = (k\lambda)t$ and $(k\lambda)$ is the mean number of events per unit time. The units of $\frac{1}{k\lambda}$ or $\frac{1}{\lambda}$ are mean time per event. Hence $m = 1/\lambda$ for an individual process, and the mean time between events of the merged process is $1/(k\lambda)$.

Ex. $\lambda = 6$ events per year $\Rightarrow m = \frac{12}{6} = 2$ months (mean time between events).
4.6 Splitting of Poisson Processes

Example: Times between births (in a family) follow an exponential distribution. The births are categorized by gender.

Example: Times between episodes of back pain follow an exponential distribution. However the degree of pain may be categorized, as the required medication depends on the degree of pain.

Consider a Poisson process $\{N(t), t \ge 0\}$ where, in addition to observing an event, the event can be classified as belonging to one of $r$ possible categories.

Define $N_i(t)$ = no. of events of type $i$ during $(0, t]$ for $i = 1, 2, \ldots, r$:

$$N(t) = N_1(t) + N_2(t) + \cdots + N_r(t)$$
This process is referred to as splitting the process.

Bernoulli Splitting Mechanism

Suppose an event takes place in the interval $(t, t + dt]$. Define the indicator random variable $Z(t) = i$ ($i = 1, 2, \ldots, r$) such that

$$P\{Z(t) = i \mid \text{event at } (t, t + dt]\} = p_i.$$

Note $p_i$ is independent of time. Then if $N(t) = \sum_{i=1}^r N_i(t)$, the counting processes $\{N_i(t), t > 0\}$ are Poisson processes with parameters $(\lambda p_i)$ for $i = 1, 2, \ldots, r$.
Proof: Suppose over the time interval $(0, t]$, $n$ events are observed, of which $s_i$ are classified as of type $i$, with $\sum_{i=1}^r s_i = n$.

$$P\{N_1(t) = s_1, N_2(t) = s_2, \ldots, N_r(t) = s_r \mid N(t) = n\} = \frac{n!}{s_1!\,s_2!\cdots s_r!}\,p_1^{s_1}p_2^{s_2}\cdots p_r^{s_r}$$

Hence

$$P\{N_i(t) = s_i,\ i = 1, \ldots, r \text{ and } N(t) = n\} = \frac{n!}{\prod_{i=1}^r s_i!}\,p_1^{s_1}p_2^{s_2}\cdots p_r^{s_r}\cdot\frac{e^{-\lambda t}(\lambda t)^n}{n!}$$

$$= \prod_{i=1}^r \frac{(p_i\lambda t)^{s_i}e^{-p_i\lambda t}}{s_i!} = \prod_{i=1}^r P\{N_i(t) = s_i\}$$

which shows that the $\{N_i(t)\}$ are independent and follow Poisson distributions with parameters $\{p_i\lambda t\}$.

$\Rightarrow \{N_i(t), t > 0\}$ are Poisson processes.
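A short simulation sketch of Bernoulli splitting, assuming NumPy (rate, horizon, and category probabilities are illustrative):

```python
import numpy as np

rng = np.random.default_rng(10)
lam, t, p, reps = 2.0, 5.0, [0.2, 0.3, 0.5], 50_000  # illustrative values

N = rng.poisson(lam * t, reps)                  # total events in (0, t]
# Split each path's events independently into one of r categories.
splits = np.array([rng.multinomial(n, p) for n in N])

for i, pi in enumerate(p):
    Ni = splits[:, i]
    print(Ni.mean(), Ni.var(), pi * lam * t)    # Poisson(p_i * lam * t)

# Independence check: correlations across categories should be ~0.
print(np.corrcoef(splits.T))
```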
Example of Non-homogeneous Splitting

Suppose a person is subject to serious migraine headaches. Some of these are so serious that medical attention is required. Define

$N(t)$ = no. of migraine headaches in $(0, t]$
$N_m(t)$ = no. of migraine headaches requiring medical attention
$p(\tau)$ = prob. of requiring medical attention if a headache occurs at $(\tau, \tau + d\tau)$.

Suppose an event occurs at $(\tau, \tau + d\tau)$; then the prob. of requiring attention is $p(\tau)$.

Note that conditional on a single event taking place in $(0, t]$, $\tau$ is uniform over $(0, t]$; i.e.

$$f(\tau \mid N(t) = 1) = 1/t, \quad 0 < \tau \le t, \quad \text{and} \quad \pi = \frac{1}{t}\int_0^t p(\tau)\,d\tau$$

where $\pi$ is the unconditional probability that a given event requires attention.
$$\pi = \frac{1}{t}\int_0^t p(\tau)\,d\tau$$

[Diagram: an event occurs at time $x$ in $(0, t]$; no event in $(\tau, t]$; time to event marked on the axis.]

$$\therefore\ P\{N_m(t) = k \mid N(t) = n\} = \binom{n}{k}\pi^k(1 - \pi)^{n-k}$$

$$P\{N_m(t) = k,\ N(t) = n\} = \binom{n}{k}\pi^k(1 - \pi)^{n-k}\,\frac{e^{-\lambda t}(\lambda t)^n}{n!}$$
$$P\{N_m(t) = k\} = \sum_{n=k}^\infty \binom{n}{k}\pi^k(1 - \pi)^{n-k}\,\frac{e^{-\lambda t}(\lambda t)^n}{n!}$$

$$= \frac{\pi^k}{k!}e^{-\lambda t}\sum_{n=k}^\infty \frac{(\lambda t)^n(1 - \pi)^{n-k}}{(n - k)!}$$

$$= \frac{\pi^k}{k!}e^{-\lambda t}(\lambda t)^k\sum_{n=k}^\infty \frac{[\lambda t(1 - \pi)]^{n-k}}{(n - k)!}$$

$$= \frac{\pi^k}{k!}(\lambda t)^k e^{-\lambda t}\,e^{\lambda t(1 - \pi)}$$

$$P\{N_m(t) = k\} = \frac{e^{-\pi\lambda t}(\pi\lambda t)^k}{k!}$$
4.7 Non-homogeneous Poisson Processes

Preliminaries

Let $N(t)$ follow a Poisson distribution; i.e. $P\{N(t) = k\} = e^{-\lambda t}(\lambda t)^k/k!$.

Holding $t$ fixed, the generating function of the distribution is

$$\phi_{N(t)}(s) = E[e^{-sN(t)}] = \sum_{k=0}^\infty \frac{e^{-\lambda t}(\lambda t)^k}{k!}e^{-sk} = e^{-\lambda t}\sum_{k=0}^\infty \frac{(e^{-s}\lambda t)^k}{k!} = e^{-\lambda t}e^{\lambda t e^{-s}}$$

$$\phi_{N(t)}(s) = e^{\lambda t[e^{-s} - 1]} = e^{\lambda t(z - 1)} \quad \text{if } z = e^{-s}$$

The mean is $E[N(t)] = \lambda t$.
Consider the counting process $\{N(t), t > 0\}$ having the transform

$$(*) \quad \phi_{N(t)}(s) = e^{\Lambda(t)[e^{-s} - 1]} = e^{\Lambda(t)[z - 1]}$$

$$E[N(t)] = \Lambda(t), \qquad P\{N(t) = k\} = e^{-\Lambda(t)}[\Lambda(t)]^k/k!$$

For the Poisson process $\Lambda(t) = \lambda t$ and the mean is proportional to $t$. However, when $E[N(t)] \ne \lambda t$ we call the process $\{N(t), t \ge 0\}$ a non-homogeneous Poisson process, and $E[N(t)] = \Lambda(t)$.

$\Lambda(t)$ can be assumed to be continuous and differentiable:

$$\frac{d}{dt}\Lambda(t) = \Lambda'(t) = \lambda(t).$$

The quantity $\lambda(t)$ is called the intensity function. $\Lambda(t)$ can be represented by

$$\Lambda(t) = \int_0^t \lambda(x)\,dx$$
If $N(t)$ has the transform given by $(*)$ then

$$P\{N(t) = k\} = e^{-\Lambda(t)}\Lambda(t)^k/k!$$

Since $P\{S_n > t\} = P\{N(t) < n\}$, we have

$$P\{S_1 > t\} = P\{N(t) = 0\} = e^{-\Lambda(t)}$$

Thus the pdf of $S_1$, the time to the first event, is

$$f(t) = \lambda(t)\,e^{-\int_0^t \lambda(x)\,dx}, \qquad \Lambda(t) = \int_0^t \lambda(x)\,dx$$

Note that if $H = \Lambda(T)$, then $H$ is a random variable following a unit exponential distribution.
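The fact that $H = \Lambda(T)$ is unit exponential gives a standard way to simulate the first event time of a non-homogeneous Poisson process: draw $H \sim \text{Exp}(1)$ and set $T = \Lambda^{-1}(H)$. A minimal sketch, assuming NumPy and taking $\lambda(t) = 2t$ (so $\Lambda(t) = t^2$) purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(11)

# Illustrative intensity lambda(t) = 2t, so Lambda(t) = t^2, Lambda^{-1}(h) = sqrt(h).
H = rng.exponential(1.0, 500_000)        # unit exponential draws
T = np.sqrt(H)                           # T = Lambda^{-1}(H), first event time

# Check P{S_1 > t} = exp(-Lambda(t)) at an illustrative point t = 0.8.
t = 0.8
print(np.mean(T > t), np.exp(-t**2))     # both ~0.527
```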
Assume independent increments; i.e. $N(t + u) - N(u)$ and $N(u)$ are independent.

With the transform $\phi(z, t) = e^{\Lambda(t)[z - 1]}$, $z = e^{-s}$, the generating function is $E[e^{-sN(t)}] = E[z^{N(t)}]$:

$$e^{\Lambda(t+u)(z-1)} = E[z^{N(t+u)}] = E[z^{N(t+u) - N(u) + N(u)}] = E[z^{N(t+u) - N(u)}]\,E[z^{N(u)}] = E[z^{N(t+u) - N(u)}]\,e^{\Lambda(u)[z-1]}$$

$$\therefore\ E[z^{N(t+u) - N(u)}] = \frac{e^{\Lambda(t+u)(z-1)}}{e^{\Lambda(u)(z-1)}} = e^{[\Lambda(t+u) - \Lambda(u)][z-1]}$$

where $\Lambda(t + u) - \Lambda(u) = \int_u^{t+u}\lambda(x)\,dx$.

$$\therefore\ P\{N(t + u) - N(u) = k\} = \frac{e^{-[\Lambda(t+u) - \Lambda(u)]}[\Lambda(t + u) - \Lambda(u)]^k}{k!}$$
Axiomatic Derivation of the Non-homogeneous Poisson Distribution

Assume a counting process $\{N(t), t \ge 0\}$ with

(i) $N(0) = 0$
(ii) $\{N(t), t \ge 0\}$ has independent increments; i.e. $N(t + s) - N(s)$ and $N(s)$ are independent
(iii) $P\{N(t + h) = k \mid N(t) = k\} = 1 - \lambda(t)h + o(h)$
  $P\{N(t + h) = k + 1 \mid N(t) = k\} = \lambda(t)h + o(h)$
  $P\{N(t + h) = k + j \mid N(t) = k\} = o(h)$, $j \ge 2$

Then

$$P\{N(t + s) - N(s) = k\} = \frac{e^{-[\Lambda(t+s) - \Lambda(s)]}[\Lambda(t + s) - \Lambda(s)]^k}{k!}$$
4.8 Compound Poisson Processes

Example: Consider a single hypodermic needle which is shared. The times between uses follow a Poisson process. However at each use, several people use it. What is the distribution of total use?

Let $\{N(t), t \ge 0\}$ be a Poisson process and $\{Z_n, n \ge 1\}$ be iid random variables which are independent of $N(t)$. Define

$$Z(t) = \sum_{n=1}^{N(t)} Z_n$$

The process $Z(t)$ is called a compound Poisson process. It will be assumed that $\{Z_n\}$ takes on integer values.
Define $A(s) = E[e^{-sZ_n}]$. Then

$$\phi(s \mid N(t) = r) = E[e^{-sZ(t)} \mid N(t) = r] = A(s)^r$$

$$\phi(s) = \sum_{r=0}^\infty \phi(s \mid N(t) = r)\,P\{N(t) = r\} = \sum_{r=0}^\infty A(s)^r\,\frac{e^{-\lambda t}(\lambda t)^r}{r!} = e^{-\lambda t}\sum_{r=0}^\infty \frac{(A(s)\lambda t)^r}{r!}$$

$$\phi(s) = e^{-\lambda t}e^{A(s)\lambda t} = e^{-\lambda t(1 - A(s))}$$

$$A(s) = E(e^{-sZ_n}) = 1 - sm_1 + \frac{s^2}{2}m_2 - \cdots$$

$$-\lambda t(1 - A(s)) = -\lambda t\left[sm_1 - \frac{s^2}{2}m_2 + \cdots\right], \qquad m_i = E(Z_n^i)$$
The cumulant function is $K(s) = \log\phi(s)$:

$$K(s) = -sm + \frac{s^2}{2}\sigma^2 - \cdots$$

where $(m, \sigma^2)$ refer to $Z(t)$. Here

$$K(s) = -\lambda t\left[sm_1 - \frac{s^2}{2}m_2 + \cdots\right]$$

Matching coefficients,

$$E[Z(t)] = \lambda t\,m_1, \qquad V[Z(t)] = \lambda t\,m_2, \qquad m_i = E(Z_n^i)$$
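A final simulation sketch of these moment formulas, assuming NumPy (the rate and the integer-valued jump distribution are illustrative):

```python
import numpy as np

rng = np.random.default_rng(12)
lam, t, reps = 2.0, 3.0, 100_000                 # illustrative rate and horizon

N = rng.poisson(lam * t, reps)
# Illustrative integer jumps Z_n uniform on {1, 2, 3}: m1 = 2, m2 = 14/3.
Z_t = np.array([rng.integers(1, 4, n).sum() for n in N])

m1, m2 = 2.0, 14 / 3
print(Z_t.mean(), lam * t * m1)                  # ~12
print(Z_t.var(), lam * t * m2)                   # ~28
```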