**Chapter 2. Poisson Processes**

**Prof. Ai-Chun Pang**

**Graduate Institute of Networking and Multimedia,**
**Department of Computer Science and Information Engineering,**
**National Taiwan University, Taiwan**

**Outline**

*• Introduction to Poisson processes*

*• Properties of Poisson processes*
  **– Inter-arrival time distribution**
  **– Waiting time distribution**
  **– Superposition and decomposition**

*• Non-homogeneous Poisson processes (relaxing stationary increments)*

*• Compound Poisson processes (relaxing single arrivals)*

*• Modulated Poisson processes (relaxing independent increments)*

*• Poisson Arrivals See Time Averages (PASTA)*

**Introduction**

*(Figure: a sample path on the time axis starting at 0 — arrivals occur at epochs $\tilde{S}_1, \tilde{S}_2, \tilde{S}_3, \tilde{S}_4, \ldots$, with inter-arrival times $\tilde{x}_1, \tilde{x}_2, \tilde{x}_3, \ldots$ between consecutive arrival epochs; $\tilde{n}(t)$ counts the arrivals in $(0, t]$.)*

*(i) The $n$th arrival epoch $\tilde{S}_n$ is*

$$\tilde{S}_n = \tilde{x}_1 + \tilde{x}_2 + \cdots + \tilde{x}_n = \sum_{i=1}^{n} \tilde{x}_i, \qquad \tilde{S}_0 = 0$$

*(ii) The number of arrivals by time $t$ is $\tilde{n}(t)$. Notice that:*

$$\{\tilde{n}(t) \ge n\} \iff \{\tilde{S}_n \le t\}, \qquad \{\tilde{n}(t) = n\} \iff \{\tilde{S}_n \le t \text{ and } \tilde{S}_{n+1} > t\}$$

**Introduction**

Arrival Process: $X = \{\tilde{x}_i, i = 1, 2, \ldots\}$; the $\tilde{x}_i$'s can be any distribution
$S = \{\tilde{S}_i, i = 0, 1, 2, \ldots\}$; the $\tilde{S}_i$'s can be any distribution
$N = \{\tilde{n}(t), t \ge 0\}$ $\longrightarrow$ called an arrival process

Renewal Process: $X = \{\tilde{x}_i, i = 1, 2, \ldots\}$; the $\tilde{x}_i$'s are i.i.d.
$S = \{\tilde{S}_i, i = 0, 1, 2, \ldots\}$; the $\tilde{S}_i$'s are generally distributed
$N = \{\tilde{n}(t), t \ge 0\}$ $\longrightarrow$ called a renewal process

Poisson Process: $X = \{\tilde{x}_i, i = 1, 2, \ldots\}$; the $\tilde{x}_i$'s are i.i.d. exponentially distributed
$S = \{\tilde{S}_i, i = 0, 1, 2, \ldots\}$; the $\tilde{S}_i$'s are Erlang distributed
$N = \{\tilde{n}(t), t \ge 0\}$ $\longrightarrow$ called a Poisson process
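The generative view above can be sketched numerically: build arrival epochs as cumulative sums of i.i.d. exponential inter-arrival times and observe that the long-run arrival rate matches $\lambda$. This is a minimal Monte-Carlo sketch; the rate and horizon values are arbitrary illustration choices.

```python
import random

def poisson_arrival_epochs(lam, t_max, rng):
    """Arrival epochs S_n = x_1 + ... + x_n up to t_max, where the
    inter-arrival times x_i are i.i.d. exponential with rate lam."""
    epochs, s = [], 0.0
    while True:
        s += rng.expovariate(lam)  # x_i ~ exponential(lam)
        if s > t_max:
            return epochs
        epochs.append(s)

rng = random.Random(42)
lam, t = 2.0, 10_000.0
epochs = poisson_arrival_epochs(lam, t, rng)
n_t = len(epochs)          # n(t): number of arrivals in (0, t]
print(n_t / t)             # long-run rate, should be close to lam = 2.0
```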

**Counting Processes**

*• A stochastic process N = {˜n(t), t ≥ 0} is said to be a counting process*
if ˜*n(t) represents the total number of “events” that have occurred up*
*to time t.*

*• From the deﬁnition we see that for a counting process ˜n(t) must*
satisfy:

1. ˜*n(t)* *≥ 0.*

2. ˜*n(t) is integer valued.*

*3. If s < t, then ˜n(s)* *≤ ˜n(t).*

*4. For s < t, ˜n(t)* *− ˜n(s) equals the number of events that have*
*occurred in the interval (s, t].*

**Definition 1: Poisson Processes**

*The counting process N =* *{˜n(t), t ≥ 0} is a Poisson process with rate λ*
*(λ > 0), if:*

1. ˜*n(0) = 0*

2. Independent increments (if relaxed *⇒ Modulated Poisson Process*):

$$P[\tilde{n}(t) - \tilde{n}(s) = k_1 \mid \tilde{n}(r) = k_2, r \le s < t] = P[\tilde{n}(t) - \tilde{n}(s) = k_1]$$

3. Stationary increments (if relaxed *⇒ Non-homogeneous Poisson Process*):

$$P[\tilde{n}(t + s) - \tilde{n}(t) = k] = P[\tilde{n}(l + s) - \tilde{n}(l) = k]$$

4. Single arrivals (if relaxed *⇒ Compound Poisson Process*):

$$P[\tilde{n}(h) = 1] = \lambda h + o(h), \qquad P[\tilde{n}(h) \ge 2] = o(h)$$

**Definition 2: Poisson Processes**

*The counting process N =* *{˜n(t), t ≥ 0} is a Poisson process with rate λ*
*(λ > 0), if:*

1. ˜*n(0) = 0*

2. Independent increments

*3. The number of events in any interval of length $t$ is Poisson distributed*
*with mean $\lambda t$. That is, for all $s, t \ge 0$,*

$$P[\tilde{n}(t + s) - \tilde{n}(s) = n] = e^{-\lambda t} \frac{(\lambda t)^n}{n!}, \qquad n = 0, 1, \ldots$$
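Definition 2, item 3 can be sanity-checked by simulation: counts over a window of length $t$ should follow a Poisson($\lambda t$) law. A minimal sketch, with arbitrary parameter choices, comparing the empirical mean and the empirical probability of zero arrivals against $\lambda t$ and $e^{-\lambda t}$:

```python
import math
import random

def count_in_window(lam, t, rng):
    """Number of arrivals in (0, t] of a rate-lam Poisson process,
    generated from exponential inter-arrival times."""
    n, s = 0, rng.expovariate(lam)
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    return n

rng = random.Random(1)
lam, t, trials = 1.5, 2.0, 20_000
counts = [count_in_window(lam, t, rng) for _ in range(trials)]

emp_mean = sum(counts) / trials      # should approach lam * t = 3.0
emp_p0 = counts.count(0) / trials    # should approach e^{-lam*t} = P[n(t) = 0]
print(emp_mean, emp_p0, math.exp(-lam * t))
```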

**Theorem: Definitions 1 and 2 are equivalent.**

**Proof.** We show that Definition 1 implies Definition 2. To start, fix $u \ge 0$ and let

$$g(t) = E[e^{-u\tilde{n}(t)}]$$

We derive a differential equation for $g(t)$ as follows:

$$\begin{aligned}
g(t + h) &= E[e^{-u\tilde{n}(t+h)}] \\
&= E\left[e^{-u\tilde{n}(t)} e^{-u[\tilde{n}(t+h)-\tilde{n}(t)]}\right] \\
&= E\left[e^{-u\tilde{n}(t)}\right] E\left[e^{-u[\tilde{n}(t+h)-\tilde{n}(t)]}\right] && \text{by independent increments} \\
&= g(t)\,E\left[e^{-u\tilde{n}(h)}\right] && \text{by stationary increments} \qquad (1)
\end{aligned}$$

**Theorem: Definitions 1 and 2 are equivalent.**

Conditioning on whether $\tilde{n}(h) = 0$, $\tilde{n}(h) = 1$, or $\tilde{n}(h) \ge 2$ yields

$$\begin{aligned}
E\left[e^{-u\tilde{n}(h)}\right] &= 1 - \lambda h + o(h) + e^{-u}(\lambda h + o(h)) + o(h) \\
&= 1 - \lambda h + e^{-u}\lambda h + o(h) \qquad (2)
\end{aligned}$$

From (1) and (2), we obtain

$$g(t + h) = g(t)(1 - \lambda h + e^{-u}\lambda h) + o(h)$$

implying that

$$\frac{g(t+h) - g(t)}{h} = g(t)\,\lambda(e^{-u} - 1) + \frac{o(h)}{h}$$

**Theorem: Definitions 1 and 2 are equivalent.**

*Letting $h \to 0$ gives*

$$g'(t) = g(t)\,\lambda(e^{-u} - 1)$$

or, equivalently,

$$\frac{g'(t)}{g(t)} = \lambda(e^{-u} - 1)$$

*Integrating, and using $g(0) = 1$, shows that*

$$\log g(t) = \lambda t (e^{-u} - 1)$$

or

$$g(t) = e^{\lambda t (e^{-u} - 1)} \quad \longrightarrow \text{ the Laplace transform of a Poisson r.v.}$$

*Since $g(t)$ is also the Laplace transform of $\tilde{n}(t)$, $\tilde{n}(t)$ is a Poisson r.v.*

**Theorem: Definitions 1 and 2 are equivalent**

Let $P_n(t) = P[\tilde{n}(t) = n]$.

We derive a differential equation for $P_0(t)$ in the following manner:

$$\begin{aligned}
P_0(t + h) &= P[\tilde{n}(t + h) = 0] \\
&= P[\tilde{n}(t) = 0, \tilde{n}(t + h) - \tilde{n}(t) = 0] \\
&= P[\tilde{n}(t) = 0]\,P[\tilde{n}(t + h) - \tilde{n}(t) = 0] \\
&= P_0(t)\,[1 - \lambda h + o(h)]
\end{aligned}$$

Hence

$$\frac{P_0(t+h) - P_0(t)}{h} = -\lambda P_0(t) + \frac{o(h)}{h}$$

Letting $h \to 0$ yields

$$P_0'(t) = -\lambda P_0(t)$$

Since $P_0(0) = 1$,

$$P_0(t) = e^{-\lambda t}$$

**Theorem: Definitions 1 and 2 are equivalent**

*Similarly, for $n \ge 1$,*

$$\begin{aligned}
P_n(t + h) &= P[\tilde{n}(t + h) = n] \\
&= P[\tilde{n}(t) = n, \tilde{n}(t + h) - \tilde{n}(t) = 0] \\
&\quad + P[\tilde{n}(t) = n - 1, \tilde{n}(t + h) - \tilde{n}(t) = 1] \\
&\quad + P[\tilde{n}(t + h) = n, \tilde{n}(t + h) - \tilde{n}(t) \ge 2] \\
&= P_n(t)\,P_0(h) + P_{n-1}(t)\,P_1(h) + o(h) \\
&= (1 - \lambda h)\,P_n(t) + \lambda h\,P_{n-1}(t) + o(h)
\end{aligned}$$

Thus

$$\frac{P_n(t+h) - P_n(t)}{h} = -\lambda P_n(t) + \lambda P_{n-1}(t) + \frac{o(h)}{h}$$

Letting $h \to 0$,

$$P_n'(t) = -\lambda P_n(t) + \lambda P_{n-1}(t)$$

Solving this family of differential equations (with $P_n(0) = 0$ for $n \ge 1$) gives $P_n(t) = e^{-\lambda t}(\lambda t)^n / n!$, as required by Definition 2.

**The Inter-Arrival Time Distribution**

**Theorem.** Poisson processes have exponentially distributed inter-arrival times; i.e., $\{\tilde{x}_n, n = 1, 2, \ldots\}$ are i.i.d. and exponentially distributed with parameter $\lambda$ (i.e., mean inter-arrival time $= 1/\lambda$).

**Proof.**

For $\tilde{x}_1$: $\;P(\tilde{x}_1 > t) = P(\tilde{n}(t) = 0) = \dfrac{e^{-\lambda t}(\lambda t)^0}{0!} = e^{-\lambda t}$

$\therefore \tilde{x}_1 \sim \exp(\lambda)$

For $\tilde{x}_2$:

$$\begin{aligned}
P(\tilde{x}_2 > t \mid \tilde{x}_1 = s) &= P\{0 \text{ arrivals in } (s, s+t] \mid \tilde{x}_1 = s\} \\
&= P\{0 \text{ arrivals in } (s, s+t]\} && \text{(by independent increments)} \\
&= P\{0 \text{ arrivals in } (0, t]\} && \text{(by stationary increments)} \\
&= e^{-\lambda t}
\end{aligned}$$

$\therefore \tilde{x}_2$ is independent of $\tilde{x}_1$ and $\tilde{x}_2 \sim \exp(\lambda)$.

$\Rightarrow$ The same argument repeats for the remaining $\tilde{x}_i$'s.

**The Arrival Time Distribution of the *n*th Event**

**Theorem.** The arrival time of the $n$th event, $\tilde{S}_n$ (also called the waiting time until the $n$th event), is Erlang distributed with parameters $(n, \lambda)$.

**Proof.** Method 1:

$$\therefore \quad P[\tilde{S}_n \le t] = P[\tilde{n}(t) \ge n] = \sum_{k=n}^{\infty} \frac{e^{-\lambda t}(\lambda t)^k}{k!}$$

$$\therefore \quad f_{\tilde{S}_n}(t) = \frac{\lambda e^{-\lambda t}(\lambda t)^{n-1}}{(n-1)!} \quad \text{(exercise)}$$

Method 2:

$$\begin{aligned}
f_{\tilde{S}_n}(t)\,dt = dF_{\tilde{S}_n}(t) &= P[t < \tilde{S}_n < t + dt] \\
&= P\{n-1 \text{ arrivals in } (0, t] \text{ and } 1 \text{ arrival in } (t, t+dt)\} + o(dt) \\
&= P[\tilde{n}(t) = n - 1 \text{ and } 1 \text{ arrival in } (t, t+dt)] + o(dt) \\
&= P[\tilde{n}(t) = n - 1]\,P[1 \text{ arrival in } (t, t+dt)] + o(dt) \quad \text{(why?)}
\end{aligned}$$

**The Arrival Time Distribution of the *n*th Event**

$$= \frac{e^{-\lambda t}(\lambda t)^{n-1}}{(n-1)!}\,\lambda\,dt + o(dt)$$

$$\therefore \quad \lim_{dt \to 0} \frac{f_{\tilde{S}_n}(t)\,dt}{dt} = f_{\tilde{S}_n}(t) = \frac{\lambda e^{-\lambda t}(\lambda t)^{n-1}}{(n-1)!}$$
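The Erlang result can be checked numerically: $\tilde{S}_n$, the sum of $n$ i.i.d. exponential($\lambda$) inter-arrival times, should have mean $n/\lambda$ and variance $n/\lambda^2$. A Monte-Carlo sketch with arbitrary parameter choices:

```python
import random

# Check the first two moments of the Erlang(n, lam) arrival time S_n.
rng = random.Random(7)
lam, n, trials = 2.0, 5, 20_000

# Each sample of S_n is a sum of n i.i.d. exponential(lam) draws.
samples = [sum(rng.expovariate(lam) for _ in range(n)) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials
print(mean, var)   # expect roughly n/lam = 2.5 and n/lam**2 = 1.25
```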

**Conditional Distribution of the Arrival Times**

**Theorem. Given that ˜***n(t) = n, the n arrival times ˜S*_{1}*, ˜S*_{2}*, . . . , ˜S** _{n}* have

*the same distribution as the order statistics corresponding to n i.i.d.*

*uniformly distributed random variables from (0, t).*

. . . .

**Order Statistics.** Let $\tilde{x}_1, \tilde{x}_2, \ldots, \tilde{x}_n$ be $n$ i.i.d. continuous random variables having common pdf $f$. Define $\tilde{x}_{(k)}$ as the $k$th smallest value among all the $\tilde{x}_i$'s, i.e.,

$$\tilde{x}_{(1)} \le \tilde{x}_{(2)} \le \tilde{x}_{(3)} \le \ldots \le \tilde{x}_{(n)}$$

Then $\tilde{x}_{(1)}, \ldots, \tilde{x}_{(n)}$ are known as the "order statistics" corresponding to the random variables $\tilde{x}_1, \ldots, \tilde{x}_n$. The joint pdf of $\tilde{x}_{(1)}, \tilde{x}_{(2)}, \ldots, \tilde{x}_{(n)}$ is

$$f_{\tilde{x}_{(1)}, \tilde{x}_{(2)}, \ldots, \tilde{x}_{(n)}}(x_1, x_2, \ldots, x_n) = n!\,f(x_1) f(x_2) \cdots f(x_n), \qquad x_1 < x_2 < \ldots < x_n$$

(check the textbook [Ross]).

. . . .

**Conditional Distribution of the Arrival Times**

**Proof.** Let $0 < t_1 < t_2 < \ldots < t_{n+1} = t$ and let the $h_i$ be small enough that $t_i + h_i < t_{i+1}$, $i = 1, \ldots, n$.

$$\therefore \quad P[t_i < \tilde{S}_i < t_i + h_i,\, i = 1, \ldots, n \mid \tilde{n}(t) = n]$$

$$= \frac{P\left(\begin{array}{c}\text{exactly one arrival in each } [t_i, t_i + h_i], \\ i = 1, 2, \ldots, n, \text{ and no arrival elsewhere in } [0, t]\end{array}\right)}{P[\tilde{n}(t) = n]}$$

$$= \frac{(e^{-\lambda h_1}\lambda h_1)(e^{-\lambda h_2}\lambda h_2)\cdots(e^{-\lambda h_n}\lambda h_n)\,e^{-\lambda(t - h_1 - h_2 - \cdots - h_n)}}{e^{-\lambda t}(\lambda t)^n / n!}$$

$$= \frac{n!\,(h_1 h_2 h_3 \cdots h_n)}{t^n}$$

$$\therefore \quad \frac{P[t_i < \tilde{S}_i < t_i + h_i,\, i = 1, \ldots, n \mid \tilde{n}(t) = n]}{h_1 h_2 \cdots h_n} = \frac{n!}{t^n}$$

**Conditional Distribution of the Arrival Times**

Taking $\lim_{h_i \to 0,\, i = 1, \ldots, n}$ of both sides, we obtain

$$f_{\tilde{S}_1, \tilde{S}_2, \ldots, \tilde{S}_n \mid \tilde{n}(t)}(t_1, t_2, \ldots, t_n \mid n) = \frac{n!}{t^n}, \qquad 0 < t_1 < t_2 < \ldots < t_n.$$
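One consequence of the theorem is easy to test numerically: given $\tilde{n}(t) = n$, the first arrival epoch $\tilde{S}_1$ behaves like the minimum of $n$ uniforms on $(0, t)$, so $E[\tilde{S}_1 \mid \tilde{n}(t) = n] = t/(n+1)$. A rejection-sampling sketch with arbitrary parameter choices:

```python
import random

# Condition on n(t) = n_target by rejection, then compare the mean of S_1
# with t/(n+1), the mean of the minimum of n_target uniforms on (0, t).
rng = random.Random(3)
lam, t, n_target = 1.0, 5.0, 5

first_epochs = []
while len(first_epochs) < 5_000:
    # generate one Poisson sample path on (0, t]
    epochs, s = [], rng.expovariate(lam)
    while s <= t:
        epochs.append(s)
        s += rng.expovariate(lam)
    if len(epochs) == n_target:          # keep only paths with n(t) = 5
        first_epochs.append(epochs[0])   # record S_1

avg_first = sum(first_epochs) / len(first_epochs)
print(avg_first)   # expect roughly t/(n+1) = 5/6
```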

**Conditional Distribution of the Arrival Times**

**Example (see Ref [Ross], Ex. 2.3(A) p.68). Suppose that travellers arrive**
*at a train depot in accordance with a Poisson process with rate λ. If*
*the train departs at time t, what is the expected sum of the*

*waiting times of travellers arriving in $(0, t)$? That is, $E[\sum_{i=1}^{\tilde{n}(t)} (t - \tilde{S}_i)] = ?$*

*(Figure: arrival epochs $\tilde{S}_1, \tilde{S}_2, \tilde{S}_3, \ldots, \tilde{S}_{\tilde{n}(t)}$ on $(0, t)$; a traveller arriving at $\tilde{S}_i$ waits $t - \tilde{S}_i$ until the train departs at time $t$.)*

**Conditional Distribution of the Arrival Times**

**Answer.** Conditioning on $\tilde{n}(t) = n$ yields

$$\begin{aligned}
E\left[\sum_{i=1}^{\tilde{n}(t)} (t - \tilde{S}_i) \,\Big|\, \tilde{n}(t) = n\right] &= nt - E\left[\sum_{i=1}^{n} \tilde{S}_i\right] \\
&= nt - E\left[\sum_{i=1}^{n} \tilde{u}_{(i)}\right] && \text{(by the theorem)} \\
&= nt - E\left[\sum_{i=1}^{n} \tilde{u}_i\right] && \left(\because \textstyle\sum_{i=1}^{n} \tilde{u}_{(i)} = \sum_{i=1}^{n} \tilde{u}_i\right) \\
&= nt - \frac{t}{2} \cdot n = \frac{nt}{2} && \left(\because E[\tilde{u}_i] = \frac{t}{2}\right)
\end{aligned}$$

To find $E[\sum_{i=1}^{\tilde{n}(t)} (t - \tilde{S}_i)]$, we take another expectation over $\tilde{n}(t)$:

$$\therefore \quad E\left[\sum_{i=1}^{\tilde{n}(t)} (t - \tilde{S}_i)\right] = \frac{t}{2} \cdot \underbrace{E[\tilde{n}(t)]}_{=\lambda t} = \frac{\lambda t^2}{2}$$
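The answer $\lambda t^2 / 2$ can be checked with a quick simulation; the rate and departure time below are arbitrary illustration values.

```python
import random

# Monte-Carlo check of the train-depot example: the expected total waiting
# time of travellers arriving in (0, t) is lam * t**2 / 2.
rng = random.Random(11)
lam, t, trials = 2.0, 4.0, 20_000

totals = []
for _ in range(trials):
    s, total = rng.expovariate(lam), 0.0
    while s <= t:
        total += t - s                 # traveller arriving at s waits t - s
        s += rng.expovariate(lam)
    totals.append(total)

avg_total = sum(totals) / trials
print(avg_total)   # expect roughly lam * t**2 / 2 = 16.0
```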

**Superposition of Independent Poisson Processes**

**Theorem.** The superposition of independent Poisson processes with rates $\lambda_i$, $i = 1, \ldots, N$, is also a Poisson process with rate $\sum_{i=1}^{N} \lambda_i$.

*(Figure: $N$ independent Poisson streams with rates $\lambda_1, \lambda_2, \ldots, \lambda_N$ merged into a single Poisson stream with rate $\sum_{i=1}^{N} \lambda_i$.)*

**<Homework>** Prove the theorem (note that a Poisson process must satisfy Definition 1 or Definition 2).
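This is not a proof, but the rate claim is easy to observe numerically: merging two independent Poisson streams yields a stream whose long-run rate is the sum of the rates. The rates and horizon below are arbitrary illustration values.

```python
import random

def epochs(lam, t_max, rng):
    """Arrival epochs of a rate-lam Poisson process on (0, t_max]."""
    out, s = [], rng.expovariate(lam)
    while s <= t_max:
        out.append(s)
        s += rng.expovariate(lam)
    return out

rng = random.Random(5)
lam1, lam2, t = 1.0, 3.0, 10_000.0
# superpose: merge the two streams into one ordered arrival sequence
merged = sorted(epochs(lam1, t, rng) + epochs(lam2, t, rng))

rate = len(merged) / t
print(rate)   # expect roughly lam1 + lam2 = 4.0
```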

**Decomposition of a Poisson Process**

**Theorem.**

*• Given a Poisson process N = {˜n(t), t ≥ 0};*

*• If ˜n*_{i}*(t) represents the number of type-i events that occur by time*
*t, i = 1, 2;*

*• Arrival occurring at time s is a type-1 arrival with probability p(s),*
and type-2 arrival with probability 1 *− p(s)*

then

*• $\tilde{n}_1$ and $\tilde{n}_2$ are independent,*

*• $\tilde{n}_1(t) \sim P(k; \lambda t p)$, and*

*• $\tilde{n}_2(t) \sim P(k; \lambda t (1 - p))$, where $p = \frac{1}{t}\int_0^t p(s)\,ds$*
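A small simulation illustrates the theorem in the constant-probability case $p(s) = p$: thinning a rate-$\lambda$ stream by independent coin flips yields sub-streams with rates $\lambda p$ and $\lambda(1-p)$. The parameter values are arbitrary illustration choices.

```python
import random

# Thinning (decomposition) of a Poisson stream with constant p(s) = p.
rng = random.Random(9)
lam, p, t = 3.0, 0.4, 10_000.0

n1 = n2 = 0
s = rng.expovariate(lam)
while s <= t:
    if rng.random() < p:   # classify each arrival independently: type 1
        n1 += 1
    else:                  # otherwise type 2
        n2 += 1
    s += rng.expovariate(lam)

print(n1 / t, n2 / t)   # expect roughly lam*p = 1.2 and lam*(1-p) = 1.8
```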

**Decomposition of a Poisson Process**

*(Figure: a Poisson stream of rate $\lambda$ split by the time-varying probability $p(s)$ into a type-1 stream $\sim P(k; \lambda \int_0^t p(s)\,ds)$ and a type-2 stream $\sim P(k; \lambda \int_0^t [1 - p(s)]\,ds)$.)*

*Special case: if $p(s) = p$ is constant, the two output streams are Poisson with rates $\lambda p$ and $\lambda(1 - p)$.*

**Decomposition of a Poisson Process**

**Proof.** We prove that, for fixed time $t$,

$$P[\tilde{n}_1(t) = n, \tilde{n}_2(t) = m] = P[\tilde{n}_1(t) = n]\,P[\tilde{n}_2(t) = m] = \frac{e^{-\lambda p t}(\lambda p t)^n}{n!} \cdot \frac{e^{-\lambda(1-p)t}[\lambda(1-p)t]^m}{m!}$$

. . . .

$$\begin{aligned}
P[\tilde{n}_1(t) = n, \tilde{n}_2(t) = m] &= \sum_{k=0}^{\infty} P[\tilde{n}_1(t) = n, \tilde{n}_2(t) = m \mid \tilde{n}_1(t) + \tilde{n}_2(t) = k] \cdot P[\tilde{n}_1(t) + \tilde{n}_2(t) = k] \\
&= P[\tilde{n}_1(t) = n, \tilde{n}_2(t) = m \mid \tilde{n}_1(t) + \tilde{n}_2(t) = n + m] \cdot P[\tilde{n}_1(t) + \tilde{n}_2(t) = n + m]
\end{aligned}$$

**Decomposition of a Poisson Process**

*• From the "conditional distribution of the arrival times," each event occurs at a time that is uniformly distributed on $(0, t)$, independently of the other events.*

*• Consider the case where only one arrival occurs in the interval $[0, t]$:*

$$\begin{aligned}
P[\text{type-1 arrival} \mid \tilde{n}(t) = 1] &= \int_0^t P[\text{type-1 arrival} \mid \text{arrival time } \tilde{S}_1 = s, \tilde{n}(t) = 1]\; f_{\tilde{S}_1 \mid \tilde{n}(t)}(s \mid \tilde{n}(t) = 1)\,ds \\
&= \int_0^t p(s) \cdot \frac{1}{t}\,ds = \frac{1}{t}\int_0^t p(s)\,ds = p
\end{aligned}$$

**Decomposition of a Poisson Process**

$$\begin{aligned}
\therefore \quad P[\tilde{n}_1(t) = n, \tilde{n}_2(t) = m] &= P[\tilde{n}_1(t) = n, \tilde{n}_2(t) = m \mid \tilde{n}_1(t) + \tilde{n}_2(t) = n + m] \cdot P[\tilde{n}_1(t) + \tilde{n}_2(t) = n + m] \\
&= \binom{n+m}{n} p^n (1-p)^m \cdot \frac{e^{-\lambda t}(\lambda t)^{n+m}}{(n+m)!} \\
&= \frac{(n+m)!}{n!\,m!}\, p^n (1-p)^m \cdot \frac{e^{-\lambda t}(\lambda t)^{n+m}}{(n+m)!} \\
&= \frac{e^{-\lambda p t}(\lambda p t)^n}{n!} \cdot \frac{e^{-\lambda(1-p)t}[\lambda(1-p)t]^m}{m!}
\end{aligned}$$

**Decomposition of a Poisson Process**

**Example (An Inﬁnite Server Queue, textbook [Ross]).**

*(Figure: an $M/G/\infty$ queue — Poisson-$\lambda$ arrivals enter a bank of infinitely many servers with service distribution $G$, then depart.)*

*• $G_{\tilde{s}}(t) = P(\tilde{S} \le t)$, where $\tilde{S}$ is the service time*

*• Service times are independent of each other and of the arrival process*

*• $\tilde{n}_1(t)$: the number of customers who have left by time $t$*

*• $\tilde{n}_2(t)$: the number of customers still in the system at time $t$*

*⇒ $\tilde{n}_1(t) \sim\ ?$ and $\tilde{n}_2(t) \sim\ ?$*

**Decomposition of a Poisson Process**

**Answer.**

$\tilde{n}_1(t)$: the number of type-1 customers; $\tilde{n}_2(t)$: the number of type-2 customers.

type-1: $\;p(s) = P(\text{finish before } t) = P(\tilde{S} \le t - s) = G_{\tilde{s}}(t - s)$

type-2: $\;1 - p(s) = \bar{G}_{\tilde{s}}(t - s)$

$$\therefore \quad \tilde{n}_1(t) \sim P\left(k;\ \lambda t \cdot \frac{1}{t}\int_0^t G_{\tilde{s}}(t - s)\,ds\right), \qquad \tilde{n}_2(t) \sim P\left(k;\ \lambda t \cdot \frac{1}{t}\int_0^t \bar{G}_{\tilde{s}}(t - s)\,ds\right)$$

**Decomposition of a Poisson Process**

$$\begin{aligned}
\therefore \quad E[\tilde{n}_1(t)] &= \lambda t \cdot \frac{1}{t}\int_0^t G(t - s)\,ds \\
&= \lambda \int_t^0 G(y)(-dy) && (y = t - s,\; s = t - y) \\
&= \lambda \int_0^t G(y)\,dy
\end{aligned}$$

As $t \to \infty$, we have

$$\lim_{t\to\infty} E[\tilde{n}_2(t)] = \lambda \int_0^\infty \bar{G}(y)\,dy = \lambda E[\tilde{S}] \qquad \text{(Little's formula)}$$
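The steady-state result $\lambda E[\tilde{S}]$ can be sketched numerically for one concrete (and arbitrary) choice of "general" service distribution, uniform on $(0, 2)$ so that $E[\tilde{S}] = 1$:

```python
import random

# Monte-Carlo sketch of the M/G/infinity example: with Poisson-lam arrivals
# and i.i.d. uniform(0, 2) service times, the steady-state number of
# customers in the system should have mean lam * E[S] = lam * 1.
rng = random.Random(13)
lam, t_obs, trials = 2.0, 50.0, 2_000

in_system = []
for _ in range(trials):
    count, s = 0, rng.expovariate(lam)
    while s <= t_obs:
        service = rng.uniform(0.0, 2.0)   # "general" service time, mean 1
        if s + service > t_obs:           # customer still present at t_obs
            count += 1
        s += rng.expovariate(lam)
    in_system.append(count)

avg_in_system = sum(in_system) / trials
print(avg_in_system)   # expect roughly lam * E[S] = 2.0
```

Note that $t_{\text{obs}} = 50$ is far past the transient (service times never exceed 2 here), so the system is effectively in steady state.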

**Non-homogeneous Poisson Processes**

*• The counting process N = {˜n(t), t ≥ 0} is said to be a non-stationary*
*or non-homogeneous Poisson Process with time-varying intensity*

*function λ(t), t* *≥ 0, if:*

1. ˜*n(0) = 0*

*2. N has independent increments*
*3. P [˜n(t + h)* *− ˜n(t) ≥ 2] = o(h)*

*4. P [˜n(t + h)* *− ˜n(t) = 1] = λ(t) · h + o(h)*

*• Define the "integrated intensity function" $m(t) = \int_0^t \lambda(t')\,dt'$.*

**Theorem.**

$$P[\tilde{n}(t + s) - \tilde{n}(t) = n] = \frac{e^{-[m(t+s)-m(t)]}\,[m(t + s) - m(t)]^n}{n!}$$

**Proof. < Homework >.**

**Non-homogeneous Poisson Processes**

**Example. The “output process” of the M/G/∞ queue is a**

non-homogeneous Poisson process having intensity function
*λ(t) = λG(t), where G is the service distribution.*

**Hint.** Let $D(s, s + r)$ denote the number of service completions in the interval $(s, s + r] \subset (0, t]$. If we can show that

*• $D(s, s + r)$ follows a Poisson distribution with mean $\lambda \int_s^{s+r} G(y)\,dy$, and*

*• the numbers of service completions in disjoint intervals are independent,*

then we are finished, by the definition of a non-homogeneous Poisson process.

**Non-homogeneous Poisson Processes**

**Answer.**

*• An arrival at time y is called a type-1 arrival if its service*
*completion occurs in (s, s + r].*

*• Consider three cases to ﬁnd the probability P (y) that an arrival at*
*time y is a type-1 arrival:*

*(Figure: the interval $(0, t]$ with marks at $s$ and $s + r$; Case 1: $y \le s$, Case 2: $s < y \le s + r$, Case 3: $s + r < y \le t$.)*

**– Case 1: y ≤ s.**

*P (y) = P{s − y < ˜S < s + r* *− y} = G(s + r − y) − G(s − y)*
**– Case 2: s < y ≤ s + r.**

*P (y) = P{ ˜S < s + r* *− y} = G(s + r − y)*

**Non-homogeneous Poisson Processes**

**– Case 3: s + r < y ≤ t.**

*P (y) = 0*

*• By the decomposition property of a Poisson process, $D(s, s + r)$ follows a Poisson distribution with mean $\lambda p t$, where $p = \frac{1}{t}\int_0^t P(y)\,dy$.*

$$\begin{aligned}
\int_0^t P(y)\,dy &= \int_0^s [G(s + r - y) - G(s - y)]\,dy + \int_s^{s+r} G(s + r - y)\,dy + \int_{s+r}^t 0\,dy \\
&= \int_0^{s+r} G(s + r - y)\,dy - \int_0^s G(s - y)\,dy \\
&= \int_0^{s+r} G(z)\,dz - \int_0^s G(z)\,dz = \int_s^{s+r} G(z)\,dz
\end{aligned}$$

**Non-homogeneous Poisson Processes**

*• Because of*

**– the independent increment assumption of the Poisson arrival**
process, and

**– the fact that there are always servers available for arrivals,**

*⇒ the departure process has independent increments*

**Compound Poisson Processes**

*• A stochastic process {˜x(t), t ≥ 0} is said to be a compound Poisson*
*process if*

**– it can be represented as**

$$\tilde{x}(t) = \sum_{i=1}^{\tilde{n}(t)} \tilde{y}_i, \qquad t \ge 0$$

**– $\{\tilde{n}(t), t \ge 0\}$ is a Poisson process**

**– $\{\tilde{y}_i, i \ge 1\}$ is a family of independent and identically distributed random variables that are also independent of $\{\tilde{n}(t), t \ge 0\}$**

*• The random variable $\tilde{x}(t)$ is said to be a compound Poisson random variable.*

*• $E[\tilde{x}(t)] = \lambda t E[\tilde{y}_i]$ and $\mathrm{Var}[\tilde{x}(t)] = \lambda t E[\tilde{y}_i^2]$.*
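These moment formulas are easy to check by simulation. The sketch below uses batch sizes $\tilde{y}_i$ uniform on $\{1, 2, 3\}$ (so $E[\tilde{y}] = 2$ and $E[\tilde{y}^2] = 14/3$); all numeric choices are arbitrary illustrations.

```python
import random

def compound_sample(lam, t, rng):
    """One draw of x(t) = sum of batch sizes y_i over the Poisson
    arrivals in (0, t], with y_i uniform on {1, 2, 3}."""
    total, s = 0, rng.expovariate(lam)
    while s <= t:
        total += rng.randint(1, 3)   # batch y_i
        s += rng.expovariate(lam)
    return total

rng = random.Random(17)
lam, t, trials = 1.0, 3.0, 20_000
xs = [compound_sample(lam, t, rng) for _ in range(trials)]

mean = sum(xs) / trials
var = sum((x - mean) ** 2 for x in xs) / trials
# E[x(t)] = lam*t*E[y] = 3*2 = 6 ; Var[x(t)] = lam*t*E[y^2] = 3*(14/3) = 14
print(mean, var)
```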

**Compound Poisson Processes**

**• Example (Batch Arrival Process). Consider a parallel-processing**

system where each job arrival consists of a possibly random number of tasks. Then we can model the arrival process as a compound

*Poisson process, which is also called a batch arrival process.*

*• Let $\tilde{y}_i$ be a random variable that denotes the number of tasks comprising a job. We derive the probability generating function $P_{\tilde{x}(t)}(z)$ as follows:*

$$\begin{aligned}
P_{\tilde{x}(t)}(z) &= E\left[z^{\tilde{x}(t)}\right] = E\left[E\left[z^{\tilde{x}(t)} \mid \tilde{n}(t)\right]\right] = E\left[E\left[z^{\tilde{y}_1 + \cdots + \tilde{y}_{\tilde{n}(t)}} \mid \tilde{n}(t)\right]\right] \\
&= E\left[E\left[z^{\tilde{y}_1 + \cdots + \tilde{y}_{\tilde{n}(t)}}\right]\right] && \text{(by independence of } \tilde{n}(t) \text{ and } \{\tilde{y}_i\}\text{)} \\
&= E\left[E\left[z^{\tilde{y}_1}\right] \cdots E\left[z^{\tilde{y}_{\tilde{n}(t)}}\right]\right] && \text{(by independence of } \tilde{y}_1, \ldots, \tilde{y}_{\tilde{n}(t)}\text{)} \\
&= E\left[(P_{\tilde{y}}(z))^{\tilde{n}(t)}\right] = P_{\tilde{n}(t)}(P_{\tilde{y}}(z))
\end{aligned}$$

**Modulated Poisson Processes**

*• Assume that there are two states, 0 and 1, for a “modulating process.”*

*(Figure: a two-state modulating chain alternating between states 0 and 1.)*

*• When the state of the modulating process is 0, the arrival rate of customers is $\lambda_0$; when it is 1, the arrival rate is $\lambda_1$.*

*• The residence time in a particular modulating state is exponentially*
*distributed with parameter μ and, after expiration of this time, the*
modulating process changes state.

*• The initial state of the modulating process is randomly selected and is*
equally likely to be state 0 or 1.

**Modulated Poisson Processes**

*• For a given period of time (0, t), let Υ be a random variable that*
indicates the total amount of time that the modulating process has
been in state 0. Let ˜*x(t) be the number of arrivals in (0, t).*

*• Then, given Υ, the value of ˜x(t) is distributed as a non-homogeneous*
Poisson process and thus

$$P[\tilde{x}(t) = n \mid \Upsilon = \tau] = \frac{(\lambda_0 \tau + \lambda_1 (t - \tau))^n\, e^{-(\lambda_0 \tau + \lambda_1 (t - \tau))}}{n!}$$

*• As μ → 0, the probability that the modulating process makes no*

transitions within t seconds converges to 1, and we expect for this case that

$$P[\tilde{x}(t) = n] = \frac{1}{2}\left[\frac{(\lambda_0 t)^n e^{-\lambda_0 t}}{n!} + \frac{(\lambda_1 t)^n e^{-\lambda_1 t}}{n!}\right]$$

**Modulated Poisson Processes**

*• As μ → ∞, then the modulating process makes an inﬁnite number of*
*transitions within t seconds, and we expect for this case that*

$$P[\tilde{x}(t) = n] = \frac{(\beta t)^n e^{-\beta t}}{n!}, \qquad \text{where } \beta = \frac{\lambda_0 + \lambda_1}{2}$$

**• Example (Modeling Voice).**

**– A basic feature of speech is that it comprises an alternation of**
silent periods and non-silent periods.

**– Packet arrivals during a talk-spurt period are Poisson with rate**
*$\lambda_1$, and silent periods produce Poisson arrivals with rate $\lambda_0 \approx 0$.*

**– The duration of times for talk and silent periods are exponentially**
*distributed with parameters μ*_{1} *and μ*_{0}, respectively.

*⇒ The model of the arrival stream of packets is given by a modulated*
Poisson process.

**Poisson Arrivals See Time Averages (PASTA)**

*• PASTA says: as t → ∞*

Fraction of arrivals who see the system in a given state

upon arrival (arrival average)

= Fraction of time the system is in a given state (time average)

= Probability that the system is in the given state at a random time, once the system has reached steady state

*• Counter-example (textbook [Kao]: Example 2.7.1)*

*(Figure: deterministic arrivals at times 0, 1, 2, 3, 4, 5, ..., each with service time 1/2 and inter-arrival time 1, so the system alternates between busy intervals of length 1/2 and idle intervals of length 1/2.)*

**Poisson Arrivals See Time Averages (PASTA)**

**– Arrival average that an arrival will see an idle system = 1**
**– Time average of system being idle = 1/2**

*• Mathematically,*

**– Let X = {˜**x(t), t*≥ 0} be a stochastic process with state space S,*
*and B* *⊂ S*

**– Define an indicator random variable**

$$\tilde{u}(t) = \begin{cases} 1, & \text{if } \tilde{x}(t) \in B \\ 0, & \text{otherwise} \end{cases}$$

**– Let $N = \{\tilde{n}(t), t \ge 0\}$ be a Poisson process with rate $\lambda$ denoting the arrival process**

then,

**Poisson Arrivals See Time Averages (PASTA)**

$$\lim_{t\to\infty} \frac{\int_0^t \tilde{u}(s)\,d\tilde{n}(s)}{\tilde{n}(t)} = \lim_{t\to\infty} \frac{\int_0^t \tilde{u}(s)\,ds}{t}$$

*(arrival average)* $\qquad\qquad$ *(time average)*

*• Condition – For PASTA to hold, we need the lack of anticipation*
*assumption (LAA): for each t* *≥ 0,*

**– the arrival process {˜**n(t + u)*− ˜n(t), u ≥ 0} is independent of*
*{˜x(s), 0 ≤ s ≤ t} and {˜n(s), 0 ≤ s ≤ t}.*

*• Application:*

**– To ﬁnd the waiting time distribution of any arriving customer**
**– Given: P[system is idle] = 1 − ρ; P[system is busy] = ρ**

**Poisson Arrivals See Time Averages (PASTA)**

*(Figure: a Poisson arrival finds the system in one of two cases — Case 1: system is idle; Case 2: system is busy.)*

$$\Rightarrow\ P(\tilde{w} \le t) = P(\tilde{w} \le t \mid \text{idle}) \cdot P(\text{idle upon arrival}) + P(\tilde{w} \le t \mid \text{busy}) \cdot P(\text{busy upon arrival})$$

**Memoryless Property of the Exponential Distribution**

*• A random variable ˜x is said to be without memory, or memoryless, if*
*P [˜x > s + t|˜x > t] = P [˜x > s] for all s, t ≥ 0* (3)

*• The condition in Equation (3) is equivalent to*

$$\frac{P[\tilde{x} > s + t, \tilde{x} > t]}{P[\tilde{x} > t]} = P[\tilde{x} > s]$$

or

$$P[\tilde{x} > s + t] = P[\tilde{x} > s]\,P[\tilde{x} > t] \qquad (4)$$

*• Since Equation (4) is satisfied when $\tilde{x}$ is exponentially distributed (for $e^{-\lambda(s+t)} = e^{-\lambda s} e^{-\lambda t}$), it follows that exponential random variables are memoryless.*

*• Not only is the exponential distribution “memoryless,” but it is the*
unique continuous distribution possessing this property.
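The memoryless property can be observed empirically: among exponential samples exceeding $t$, the fraction that also exceed $s + t$ should match the unconditional $P[\tilde{x} > s] = e^{-\lambda s}$. The parameter values below are arbitrary illustration choices.

```python
import math
import random

# Empirical check of P[x > s+t | x > t] = P[x > s] for exponential x.
rng = random.Random(21)
lam, s, t, trials = 1.0, 0.5, 1.0, 50_000

samples = [rng.expovariate(lam) for _ in range(trials)]
survivors = [x for x in samples if x > t]        # condition on x > t

cond = sum(1 for x in survivors if x > s + t) / len(survivors)
print(cond, math.exp(-lam * s))   # the two numbers should nearly agree
```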

**Comparison of Two Exponential Random Variables**

Suppose that ˜*x*_{1} and ˜*x*_{2} are independent exponential random variables
*with respective means 1/λ*_{1} *and 1/λ*_{2}*. What is P [˜x*_{1} *< ˜x*_{2}]?

$$\begin{aligned}
P[\tilde{x}_1 < \tilde{x}_2] &= \int_0^\infty P[\tilde{x}_1 < \tilde{x}_2 \mid \tilde{x}_1 = x]\, \lambda_1 e^{-\lambda_1 x}\,dx \\
&= \int_0^\infty P[x < \tilde{x}_2]\, \lambda_1 e^{-\lambda_1 x}\,dx \\
&= \int_0^\infty e^{-\lambda_2 x}\, \lambda_1 e^{-\lambda_1 x}\,dx \\
&= \int_0^\infty \lambda_1 e^{-(\lambda_1 + \lambda_2)x}\,dx \\
&= \frac{\lambda_1}{\lambda_1 + \lambda_2}
\end{aligned}$$
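A two-line simulation confirms the result; $\lambda_1$ and $\lambda_2$ below are arbitrary illustration values.

```python
import random

# Monte-Carlo check of P[x1 < x2] = lam1 / (lam1 + lam2).
rng = random.Random(23)
lam1, lam2, trials = 1.0, 3.0, 50_000

wins = sum(rng.expovariate(lam1) < rng.expovariate(lam2)
           for _ in range(trials))
frac = wins / trials
print(frac)   # expect roughly lam1 / (lam1 + lam2) = 0.25
```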

**Minimum of Exponential Random Variables**

Suppose that $\tilde{x}_1, \tilde{x}_2, \ldots, \tilde{x}_n$ are independent exponential random variables, with $\tilde{x}_i$ having rate $\mu_i$, $i = 1, \ldots, n$. It turns out that the smallest of the $\tilde{x}_i$ is exponential with a rate equal to the sum of the $\mu_i$:

$$\begin{aligned}
P[\min(\tilde{x}_1, \tilde{x}_2, \ldots, \tilde{x}_n) > x] &= P[\tilde{x}_i > x \text{ for each } i = 1, \ldots, n] \\
&= \prod_{i=1}^{n} P[\tilde{x}_i > x] && \text{(by independence)} \\
&= \prod_{i=1}^{n} e^{-\mu_i x} \\
&= \exp\left(-\sum_{i=1}^{n} \mu_i\, x\right)
\end{aligned}$$

How about $\max(\tilde{x}_1, \tilde{x}_2, \ldots, \tilde{x}_n)$? (exercise)
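The minimum result is simple to verify numerically: the mean of $\min(\tilde{x}_1, \ldots, \tilde{x}_n)$ should be $1/\sum_i \mu_i$. The rates below are arbitrary illustration values.

```python
import random

# Monte-Carlo check that min of independent exponentials with rates mu_i
# is exponential with rate sum(mu_i), i.e. has mean 1 / sum(mu_i).
rng = random.Random(29)
mus, trials = [1.0, 2.0, 3.0], 50_000

mins = [min(rng.expovariate(mu) for mu in mus) for _ in range(trials)]
avg_min = sum(mins) / trials
print(avg_min)   # expect roughly 1 / (1 + 2 + 3) = 1/6
```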