5.3 OTHER DISCRETE RANDOM VARIABLES

Technique 2: We have that

$$
\begin{aligned}
P\bigl(N_1(t)=n\bigr) &= \sum_{m=0}^{\infty}\binom{n+m}{n}p^n(1-p)^m\cdot\frac{e^{-\lambda t}(\lambda t)^{n+m}}{(n+m)!}\\
&= \sum_{m=0}^{\infty}\frac{(n+m)!}{n!\,m!}\,p^n(1-p)^m\,\frac{e^{-\lambda t p}e^{-\lambda t(1-p)}(\lambda t)^n(\lambda t)^m}{(n+m)!}\\
&= \sum_{m=0}^{\infty}\frac{e^{-\lambda t p}e^{-\lambda t(1-p)}(\lambda t p)^n\bigl[\lambda t(1-p)\bigr]^m}{n!\,m!}\\
&= \frac{e^{-\lambda t p}(\lambda t p)^n}{n!}\sum_{m=0}^{\infty}\frac{e^{-\lambda t(1-p)}\bigl[\lambda t(1-p)\bigr]^m}{m!}\\
&= \frac{e^{-\lambda t p}(\lambda t p)^n}{n!}.
\end{aligned}
$$

It can easily be argued that the other properties of a Poisson process are also satisfied for the process $\{N_1(t)\colon t \ge 0\}$. So $\{N_1(t)\colon t \ge 0\}$ is a Poisson process with rate $\lambda p$. By symmetry, $\{N_2(t)\colon t \ge 0\}$ is a Poisson process with rate $\lambda(1-p)$.
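As a numerical sanity check of this splitting result, the following minimal Python sketch simulates a Poisson process, classifies each arrival as type 1 with probability $p$, and compares the empirical distribution of $N_1(t)$ with the Poisson pmf with mean $\lambda p t$. The values of $\lambda$, $p$, $t$, and the trial count are arbitrary illustrative choices, not from the text.

    import math
    import random

    # Illustrative parameters (not from the text): overall rate, thinning
    # probability, time horizon, and number of simulated sample paths.
    lam, p, t, trials = 2.0, 0.3, 5.0, 200_000

    def poisson_count(rate, horizon):
        """Number of arrivals of a rate-`rate` Poisson process on [0, horizon]."""
        n, time = 0, random.expovariate(rate)
        while time <= horizon:
            n += 1
            time += random.expovariate(rate)
        return n

    freq = {}
    for _ in range(trials):
        n_total = poisson_count(lam, t)
        # Each arrival is of type 1 independently with probability p.
        n1 = sum(1 for _ in range(n_total) if random.random() < p)
        freq[n1] = freq.get(n1, 0) + 1

    mean1 = lam * p * t   # claimed mean of N1(t)
    for n in range(6):
        simulated = freq.get(n, 0) / trials
        exact = math.exp(-mean1) * mean1 ** n / math.factorial(n)
        print(f"P(N1(t) = {n}): simulated {simulated:.4f}, Poisson {exact:.4f}")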

25. Let $N(t)$ be the number of females entering the store between 0 and $t$. By Exercise 24, $\{N(t)\colon t \ge 0\}$ is a Poisson process with rate $1\cdot(2/3) = 2/3$. Hence the desired probability is
$$P\bigl(N(15)=15\bigr) = \frac{e^{-15(2/3)}\bigl[15(2/3)\bigr]^{15}}{15!} = 0.035.$$

26. (a) Let $A$ be the region whose points have a (positive) distance $d$ or less from the given tree. The desired probability is the probability of no trees in this region and is equal to
$$\frac{e^{-\lambda\pi d^2}(\lambda\pi d^2)^0}{0!} = e^{-\lambda\pi d^2}.$$

(b) We want to find the probability that the region $A$ has at most $n-1$ trees. The desired quantity is
$$\sum_{i=0}^{n-1}\frac{e^{-\lambda\pi d^2}(\lambda\pi d^2)^i}{i!}.$$
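With numbers plugged in, both parts reduce to Poisson probabilities with mean $\lambda\pi d^2$. A minimal sketch, using illustrative values of $\lambda$, $d$, and $n$ that are not from the exercise:

    import math

    # Illustrative values, not from the exercise: tree density per unit area,
    # distance to the nearest tree, and a count threshold n.
    lam, d, n = 0.02, 5.0, 3

    mean = lam * math.pi * d ** 2   # expected number of trees within distance d

    # (a) no tree within distance d of the given tree
    p_none = math.exp(-mean)

    # (b) at most n - 1 trees within distance d
    p_at_most = sum(math.exp(-mean) * mean ** i / math.factorial(i) for i in range(n))

    print(p_none, p_at_most)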

27. $p(i) = (\lambda/i)\,p(i-1)$ implies that for $i < \lambda$ the function $p$ is increasing and for $i > \lambda$ it is decreasing. Hence the maximum occurs at $i = [\lambda]$.

2. $S = \{ss,\ fss,\ sfs,\ sffs,\ ffss,\ fsfs,\ sfffs,\ fsffs,\ fffss,\ ffsfs,\ \ldots\}$.

3. (a) $1/(1/12) = 12$. (b) $\left(\dfrac{11}{12}\right)^{2}\dfrac{1}{12} \approx 0.07$.

4. (a) $(1-pq)^{r-1}pq$. (b) $1/(pq)$.

5. $\dbinom{7}{2}(0.2)^3(0.8)^5 \approx 0.055$.

6. (a) $(0.55)^5(0.45) \approx 0.023$. (b) $(0.55)^3(0.45)(0.55)^3(0.45) \approx 0.0056$.

7. $\dfrac{\dbinom{5}{1}\dbinom{45}{7}}{\dbinom{50}{8}} = 0.42$.

8. The probability that at least $n$ light bulbs are required is equal to the probability that the first $n-1$ light bulbs are all defective. So the answer is $p^{n-1}$.

9. We have
$$\frac{P(N=n)}{P(X=x)} = \frac{\dbinom{n-1}{x-1}p^x(1-p)^{n-x}}{\dbinom{n}{x}p^x(1-p)^{n-x}} = \frac{x}{n}.$$

10. Let $X$ be the number of words the student had to spell until spelling a word correctly. The random variable $X$ is geometric with parameter 0.70. The desired probability is given by
$$P(X\le 4) = \sum_{i=1}^{4}(0.30)^{i-1}(0.70) = 0.9919.$$
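The sum can be checked directly, or via the closed form $1-(0.30)^4$; a minimal Python sketch:

    # Geometric with success probability 0.70: P(X <= 4).
    p = 0.70
    direct = sum((1 - p) ** (i - 1) * p for i in range(1, 5))
    closed_form = 1 - (1 - p) ** 4
    print(direct, closed_form)   # both equal 0.9919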

11. The average number of digits until the fifth 3 is 5/(1/10) = 50. So the average number of digits before the fifth 3 is 49.

12. The probability that a random bridge hand has three aces is
$$p = \frac{\dbinom{4}{3}\dbinom{48}{10}}{\dbinom{52}{13}} = 0.0412.$$
Therefore, the average number of bridge hands until one has three aces is $1/p = 1/0.0412 = 24.27$.
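The hypergeometric probability and the resulting expectation are easy to check with exact integer arithmetic; a minimal sketch:

    from math import comb

    # Probability that a 13-card bridge hand contains exactly three aces.
    p = comb(4, 3) * comb(48, 10) / comb(52, 13)
    print(round(p, 4), round(1 / p, 2))   # about 0.0412 and 24.27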

13. Either the $(N+1)$st success must occur on the $(N+M-m+1)$st trial, or the $(M+1)$st failure must occur on the $(N+M-m+1)$st trial. The answer is
$$\binom{N+M-m}{N}\left(\frac{1}{2}\right)^{N+M-m+1} + \binom{N+M-m}{M}\left(\frac{1}{2}\right)^{N+M-m+1}.$$

14. We have that $X+10$ is negative binomial with parameters $(10, 0.15)$. Therefore, for all $i \ge 0$,
$$P(X=i) = P(X+10 = i+10) = \binom{i+9}{9}(0.15)^{10}(0.85)^{i}.$$

15. Let $X$ be the number of good diskettes in the sample. The desired probability is
$$P(X\ge 9) = P(X=9) + P(X=10) = \frac{\dbinom{10}{1}\dbinom{90}{9}}{\dbinom{100}{10}} + \frac{\dbinom{90}{10}\dbinom{10}{0}}{\dbinom{100}{10}} \approx 0.74.$$

16. We have that $560(0.35) = 196$ persons make contributions. So the answer is
$$1 - \frac{\dbinom{364}{15}}{\dbinom{560}{15}} - \frac{\dbinom{364}{14}\dbinom{196}{1}}{\dbinom{560}{15}} = 0.987.$$

17. The transmission of a message takes more than $t$ minutes if each of the first $[t/2]+1$ times it is sent it is garbled, where $[t/2]$ is the greatest integer less than or equal to $t/2$. The probability of this is $p^{[t/2]+1}$.

18. The probability that the sixth coin is accepted on the $n$th try is $\dbinom{n-1}{5}(0.10)^{6}(0.90)^{n-6}$. Therefore, the desired probability is
$$\sum_{n=50}^{\infty}\binom{n-1}{5}(0.10)^{6}(0.90)^{n-6} = 1 - \sum_{n=6}^{49}\binom{n-1}{5}(0.10)^{6}(0.90)^{n-6} = 0.6346.$$
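The finite sum on the right-hand side is straightforward to evaluate directly; a minimal sketch:

    from math import comb

    # P(the sixth acceptable coin requires 50 or more tries), via the complement.
    p = 0.10
    head = sum(comb(n - 1, 5) * p ** 6 * (1 - p) ** (n - 6) for n in range(6, 50))
    print(round(1 - head, 4))   # approximately 0.6346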

19. The probability that the station will successfully transmit or retransmit a message is $(1-p)^{N-1}$. This is because for the station to successfully transmit or retransmit its message, none of the other stations should transmit messages at the same instant. The number of transmissions and retransmissions of a message until success is geometric with parameter $(1-p)^{N-1}$. Therefore, on average, the number of transmissions and retransmissions is $1/(1-p)^{N-1}$.

20. If the fifth tail occurs after the 14th trial, ten or more heads have occurred. Therefore, the fifth tail occurs before the tenth head if and only if the fifth tail occurs before or on the 14th flip.

Calling tails success, $X$, the number of flips required to get the fifth tail, is negative binomial with parameters 5 and 1/2. The desired probability is given by

$$\sum_{n=5}^{14} P(X=n) = \sum_{n=5}^{14}\binom{n-1}{4}\left(\frac{1}{2}\right)^{5}\left(\frac{1}{2}\right)^{n-5} \approx 0.91.$$
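The same value can be checked either by summing the negative binomial terms or via the equivalent binomial complement $P\bigl(\text{Binomial}(14,\,1/2)\ge 5\bigr)$; a minimal sketch:

    from math import comb

    # P(fifth tail occurs on or before flip 14), counting tails as successes.
    p_negbin = sum(comb(n - 1, 4) * 0.5 ** 5 * 0.5 ** (n - 5) for n in range(5, 15))
    p_binom = sum(comb(14, k) for k in range(5, 15)) / 2 ** 14
    print(round(p_negbin, 4), round(p_binom, 4))   # both approximately 0.91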

21. The probability of a straight is
$$\frac{10\cdot 4^{5} - 40}{\dbinom{52}{5}} = 0.003924647.$$

Therefore, the expected number of poker hands required until the first straight is 1/0.003924647= 254.80.

22. (a) Since
$$\frac{P(X=n-1)}{P(X=n)} = \frac{1}{1-p} > 1,$$
$P(X=n)$ is a decreasing function of $n$; hence its maximum is at $n=1$.

(b) The probability that $X$ is even is given by
$$\sum_{k=1}^{\infty} P(X=2k) = \sum_{k=1}^{\infty} p(1-p)^{2k-1} = \frac{p(1-p)}{1-(1-p)^{2}} = \frac{1-p}{2-p}.$$

(c) We want to show the following:

Let $X$ be a discrete random variable with the set of possible values $\{1, 2, 3, \ldots\}$. If for all positive integers $n$ and $m$,
$$P(X > n+m \mid X > m) = P(X > n), \tag{17}$$
then $X$ is a geometric random variable. That is, there exists a number $p$, $0 < p < 1$, such that
$$P(X=n) = p(1-p)^{n-1}. \tag{18}$$

To prove this, note that (17) implies that for all positive integers $n$ and $m$,
$$\frac{P(X > n+m)}{P(X > m)} = P(X > n).$$
Therefore,
$$P(X > n+m) = P(X > n)P(X > m). \tag{19}$$

Let $p = P(X=1)$; using induction, we prove that (18) is valid for all positive integers $n$. To show (18) for $n=2$, note that (19) implies that
$$P(X > 2) = P(X > 1)P(X > 1).$$
Since $P(X > 1) = 1 - P(X=1) = 1 - p$, this relation gives $1 - P(X=1) - P(X=2) = (1-p)^{2}$, or
$$1 - p - P(X=2) = (1-p)^{2},$$
which yields
$$P(X=2) = p(1-p),$$
so (18) is also true for $n=2$. Now assume that (18) is valid for all positive integers $i$, $i \le n$; that is, assume that
$$P(X=i) = p(1-p)^{i-1}, \qquad i \le n. \tag{20}$$
We will show that (18) is true for $n+1$. The induction hypothesis [relation (20)] implies that
$$P(X \le n) = \sum_{i=1}^{n} P(X=i) = \sum_{i=1}^{n} p(1-p)^{i-1} = p\,\frac{1-(1-p)^{n}}{1-(1-p)} = 1 - (1-p)^{n}.$$
So $P(X > n) = (1-p)^{n}$ and, similarly, $P(X > n-1) = (1-p)^{n-1}$. Now (19) yields
$$P(X > n+1) = P(X > n)P(X > 1),$$
which implies that
$$1 - P(X \le n) - P(X = n+1) = (1-p)^{n}(1-p).$$
Substituting $P(X \le n) = 1 - (1-p)^{n}$ in this relation, we obtain
$$P(X = n+1) = p(1-p)^{n},$$
which establishes (18) for $n+1$. Therefore, we have what we wanted to show.
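As a quick numerical illustration of (17) for a geometric random variable (the value $p = 0.3$ below is an arbitrary choice), the conditional and unconditional tail probabilities agree exactly:

    # For X geometric with parameter p, P(X > k) = (1 - p) ** k, so the
    # memoryless property P(X > n + m | X > m) = P(X > n) holds exactly.
    p = 0.3

    def tail(k):
        # P(X > k) for a geometric random variable with parameter p
        return (1 - p) ** k

    for n, m in [(1, 1), (2, 3), (4, 2)]:
        lhs = tail(n + m) / tail(m)   # P(X > n + m | X > m)
        rhs = tail(n)                 # P(X > n)
        print(n, m, round(lhs, 10), round(rhs, 10))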

23. Consider a coin for which the probability of tails is $1-p$ and the probability of heads is $p$. In successive and independent flips of the coin, let $X_1$ be the number of flips until the first head, $X_2$ be the total number of flips until the second head, $X_3$ be the total number of flips until the third head, and so on. Then the length of the first character of the message and $X_1$ are identically distributed. The total number of bits forming the first two characters of the message and $X_2$ are identically distributed. The total number of bits forming the first three characters of the message and $X_3$ are identically distributed, and so on. Therefore, the total number of bits forming the message has the same distribution as $X_k$, which is negative binomial with parameters $k$ and $p$.

24. Let $X$ be the number of cartons to be opened before finding one without rotten eggs. $X$ is not a geometric random variable because the number of cartons is limited, and one carton not having rotten eggs is not independent of another carton not having rotten eggs. However, it should be obvious that a geometric random variable with parameter
$$p = \frac{\dbinom{1000}{12}}{\dbinom{1200}{12}} = 0.1109$$
is a good approximation for $X$. Therefore, we should expect approximately $1/p = 1/0.1109 = 9.015$ cartons to be opened before finding one without rotten eggs.
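The approximating parameter and the resulting expectation can be computed exactly; a minimal sketch:

    from math import comb

    # Probability that a carton of 12 eggs, drawn from 1200 eggs of which
    # 200 are rotten, contains no rotten egg.
    p = comb(1000, 12) / comb(1200, 12)
    print(p, 1 / p)   # about 0.1109 and about 9.015, matching the values above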

25. Either the $N$th success should occur on the $(2N-M)$th trial or the $N$th failure should occur on the $(2N-M)$th trial. By symmetry, the answer is
$$2\binom{2N-M-1}{N-1}\left(\frac{1}{2}\right)^{N}\left(\frac{1}{2}\right)^{N-M} = \binom{2N-M-1}{N-1}\left(\frac{1}{2}\right)^{2N-M-1}.$$

26. The desired quantity is 2 times the probability of exactly $N$ successes in $(2N-1)$ trials and failures on the $(2N)$th and $(2N+1)$st trials:
$$2\binom{2N-1}{N}\left(\frac{1}{2}\right)^{N}\left(1-\frac{1}{2}\right)^{(2N-1)-N}\cdot\left(1-\frac{1}{2}\right)^{2} = \binom{2N-1}{N}\left(\frac{1}{2}\right)^{2N}.$$

27. Let X be the number of rolls until Adam gets a six. Let Y be the number of rolls of the die until Andrew rolls an odd number. Since the events (X = i), 1 ≤ i < ∞, form a partition of the sample space, by Theorem 3.4,

$$
\begin{aligned}
P(Y > X) &= \sum_{i=1}^{\infty} P(Y > X \mid X=i)\,P(X=i) = \sum_{i=1}^{\infty} P(Y > i)\,P(X=i)\\
&= \sum_{i=1}^{\infty}\left(\frac{1}{2}\right)^{i}\cdot\left(\frac{5}{6}\right)^{i-1}\frac{1}{6} = \frac{6}{5}\cdot\frac{1}{6}\sum_{i=1}^{\infty}\left(\frac{5}{12}\right)^{i} = \frac{1}{5}\cdot\frac{\dfrac{5}{12}}{1-\dfrac{5}{12}} = \frac{1}{7},
\end{aligned}
$$

where $P(Y > i) = (1/2)^{i}$ since for $Y$ to be greater than $i$, Andrew must obtain an even number on each of the first $i$ rolls.
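A small Monte Carlo check of $P(Y > X) = 1/7 \approx 0.143$ (the number of trials below is an arbitrary choice):

    import random

    # Adam rolls until a six (success probability 1/6); Andrew rolls until an
    # odd number (success probability 1/2). Estimate P(Y > X).
    def rolls_until(success_prob):
        n = 1
        while random.random() >= success_prob:
            n += 1
        return n

    trials = 200_000
    hits = sum(rolls_until(1 / 2) > rolls_until(1 / 6) for _ in range(trials))
    print(hits / trials, 1 / 7)   # the estimate should be close to 1/7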

28. The probability of 4 tagged trout among the second 50 trout caught is
$$p_n = \frac{\dbinom{50}{4}\dbinom{n-50}{46}}{\dbinom{n}{50}}.$$
It is logical to find the value of $n$ for which $p_n$ is maximum. (In statistics this value is called the maximum likelihood estimate for the number of trout in the lake.) To do this, note that
$$\frac{p_n}{p_{n-1}} = \frac{(n-50)^{2}}{n(n-96)}.$$
Now $p_n \ge p_{n-1}$ if and only if $(n-50)^{2} \ge n(n-96)$, or $n \le 625$. Therefore, $n = 625$ makes $p_n$ maximum, and hence there are approximately 625 trout in the lake.
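The location of the maximum can also be checked by evaluating $p_n$ directly over a range of candidate population sizes; a minimal sketch (note that the ratio above equals 1 exactly at $n = 625$, so $p_{624} = p_{625}$):

    from math import comb

    def p_n(n):
        # Probability of 4 tagged trout among the second 50 caught when the
        # lake holds n trout, 50 of which are tagged.
        return comb(50, 4) * comb(n - 50, 46) / comb(n, 50)

    for n in (600, 624, 625, 626, 700):
        print(n, p_n(n))   # p_n increases up to n = 625 and decreases afterward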

29. (a) Intuitively, it should be clear that the answer is $D/N$. To prove this, let $E_j$ be the event of obtaining exactly $j$ defective items among the first $(k-1)$ draws. Let $A_k$ be the event that the $k$th item drawn is defective. We have
$$P(A_k) = \sum_{j=0}^{k-1} P(A_k \mid E_j)P(E_j) = \sum_{j=0}^{k-1} \frac{D-j}{N-k+1}\cdot\frac{\dbinom{D}{j}\dbinom{N-D}{k-1-j}}{\dbinom{N}{k-1}}.$$
Now
$$(D-j)\binom{D}{j} = D\binom{D-1}{j} \quad\text{and}\quad (N-k+1)\binom{N}{k-1} = N\binom{N-1}{k-1}.$$
Therefore,
$$P(A_k) = \sum_{j=0}^{k-1}\frac{D\dbinom{D-1}{j}\dbinom{N-D}{k-1-j}}{N\dbinom{N-1}{k-1}} = \frac{D}{N}\sum_{j=0}^{k-1}\frac{\dbinom{D-1}{j}\dbinom{N-D}{k-1-j}}{\dbinom{N-1}{k-1}} = \frac{D}{N},$$
where
$$\sum_{j=0}^{k-1}\frac{\dbinom{D-1}{j}\dbinom{N-D}{k-1-j}}{\dbinom{N-1}{k-1}} = 1$$
since $\dfrac{\dbinom{D-1}{j}\dbinom{N-D}{k-1-j}}{\dbinom{N-1}{k-1}}$ is the probability mass function of a hypergeometric random variable with parameters $N-1$, $D-1$, and $k-1$.

(b) Intuitively, it should be clear that the answer is $(D-1)/(N-1)$. To prove this, let $A_k$ be as before and let $F_j$ be the event of exactly $j$ defective items among the first $(k-2)$ draws. Let $B$ be the event that the $(k-1)$st and the $k$th items drawn are defective. We have
$$
\begin{aligned}
P(B) &= \sum_{j=0}^{k-2} P(B \mid F_j)P(F_j)\\
&= \sum_{j=0}^{k-2}\frac{(D-j)(D-j-1)}{(N-k+2)(N-k+1)}\cdot\frac{\dbinom{D}{j}\dbinom{N-D}{k-2-j}}{\dbinom{N}{k-2}}\\
&= \sum_{j=0}^{k-2}\frac{D(D-1)\dbinom{D-2}{j}\dbinom{N-D}{k-2-j}}{N(N-1)\dbinom{N-2}{k-2}}\\
&= \frac{D(D-1)}{N(N-1)}\sum_{j=0}^{k-2}\frac{\dbinom{D-2}{j}\dbinom{N-D}{k-2-j}}{\dbinom{N-2}{k-2}} = \frac{D(D-1)}{N(N-1)}.
\end{aligned}
$$
Using this, we have that the desired probability is
$$P(A_k \mid A_{k-1}) = \frac{P(A_k A_{k-1})}{P(A_{k-1})} = \frac{P(B)}{P(A_{k-1})} = \frac{\dfrac{D(D-1)}{N(N-1)}}{\dfrac{D}{N}} = \frac{D-1}{N-1}.$$
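Both identities are easy to confirm by simulation for small values of $N$, $D$, and $k$; the particular numbers below are arbitrary illustrative choices:

    import random

    # Draw k items without replacement from N items, D of which are defective,
    # and estimate P(kth item defective) and P(kth defective | (k-1)st defective).
    N, D, k, trials = 20, 7, 5, 200_000
    items = [1] * D + [0] * (N - D)   # 1 marks a defective item

    kth = prev = both = 0
    for _ in range(trials):
        sample = random.sample(items, k)   # a uniformly random ordered sample
        if sample[k - 1] == 1:
            kth += 1
        if sample[k - 2] == 1:
            prev += 1
            if sample[k - 1] == 1:
                both += 1

    print(kth / trials, D / N)               # part (a): should be close to D/N
    print(both / prev, (D - 1) / (N - 1))    # part (b): close to (D-1)/(N-1)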

REVIEW PROBLEMS FOR CHAPTER 5

1. $\displaystyle\sum_{i=12}^{20}\binom{20}{i}(0.25)^{i}(0.75)^{20-i} = 0.0009.$

2. $N(t)$, the number of customers arriving at the post office at or prior to $t$, is a Poisson process with $\lambda = 1/3$. Thus
$$P\bigl(N(30)\le 6\bigr) = \sum_{i=0}^{6} P\bigl(N(30)=i\bigr) = \sum_{i=0}^{6}\frac{e^{-(1/3)30}\bigl[(1/3)30\bigr]^{i}}{i!} = 0.130141.$$

3. $4\cdot\dfrac{8}{30} = 1.067$.

4. $\displaystyle\sum_{i=0}^{2}\binom{12}{i}(0.30)^{i}(0.70)^{12-i} = 0.253.$

5. $\dbinom{5}{2}(0.18)^{2}(0.82)^{3} = 0.179$.

6. $\displaystyle\sum_{i=2}^{1999}\binom{i-1}{2-1}\left(\frac{1}{1000}\right)^{2}\left(\frac{999}{1000}\right)^{i-2} = 0.59386.$

7. $\displaystyle\sum_{i=7}^{12}\frac{\dbinom{160}{i}\dbinom{200}{12-i}}{\dbinom{360}{12}} = 0.244.$

8. Call a train that arrives between 10:15 A.M. and 10:28 A.M. a success. Then $p$, the probability of success, is
$$p = \frac{28-15}{60} = \frac{13}{60}.$$
Therefore, the expected value and the variance of the number of trains that arrive in the given period are $10(13/60) = 2.167$ and $10(13/60)(47/60) = 1.697$, respectively.

9. The number of checks returned during the next two days is Poisson with $\lambda = 6$. The desired probability is
$$P(X\le 4) = \sum_{i=0}^{4}\frac{e^{-6}6^{i}}{i!} = 0.285.$$

10. Suppose that 5% of the items are defective. Under this hypothesis, there are $500(0.05) = 25$ defective items. The probability of two defective items among 30 items selected at random is
$$\frac{\dbinom{25}{2}\dbinom{475}{28}}{\dbinom{500}{30}} = 0.268.$$
Therefore, under the above hypothesis, having two defective items among 30 items selected at random is quite probable. The shipment should not be rejected.

11. $N$ is a geometric random variable with $p = 1/2$. So $E(N) = 1/p = 2$, and $\mathrm{Var}(N) = (1-p)/p^{2} = \bigl[1-(1/2)\bigr]/(1/4) = 2$.

12. $\left(\dfrac{5}{6}\right)^{5}\dfrac{1}{6} = 0.067$.

13. The number of times a message is transmitted or retransmitted is geometric with parameter $1-p$. Therefore, the expected value of the number of transmissions and retransmissions of a message is $1/(1-p)$. Hence the expected number of retransmissions of a message is
$$\frac{1}{1-p} - 1 = \frac{p}{1-p}.$$

14. Call a customer a “success” if he or she will make a purchase using a credit card. Let $E$ be the event that a customer entering the store will make a purchase. Let $F$ be the event that the customer will use a credit card. To find $p$, the probability of success, we use the law of multiplication:
$$p = P(EF) = P(E)P(F \mid E) = (0.30)(0.85) = 0.255.$$
The random variable $X$ is binomial with parameters 6 and 0.255. Hence
$$P(X=i) = \binom{6}{i}(0.255)^{i}(1-0.255)^{6-i}, \qquad i = 0, 1, \ldots, 6.$$
Clearly, $E(X) = np = 6(0.255) = 1.53$ and
$$\mathrm{Var}(X) = np(1-p) = 6(0.255)(1-0.255) = 1.13985.$$

15. $\displaystyle\sum_{i=3}^{5}\frac{\dbinom{18}{i}\dbinom{10}{5-i}}{\dbinom{28}{5}} = 0.772.$

16. By the formula for the expected value of a hypergeometric random variable, the desired quantity is (5× 6)/16 = 1.875.

17. We want to find the probability that at most 4 of the seeds do not germinate:

$$\sum_{i=0}^{4}\binom{40}{i}(0.06)^{i}(0.94)^{40-i} = 0.91.$$

18. $\displaystyle 1 - \sum_{i=0}^{2}\binom{20}{i}(0.06)^{i}(0.94)^{20-i} = 0.115.$

Let $X$ be the number of requests for reservations at the end of the second day. It is reasonable to assume that $X$ is Poisson with parameter $3\times 3\times 2 = 18$. Hence the desired probability is
$$P(X\ge 24) = 1 - \sum_{i=0}^{23} P(X=i) = 1 - \sum_{i=0}^{23}\frac{e^{-18}(18)^{i}}{i!} = 1 - 0.89889 = 0.10111.$$

19. Suppose that the company's claim is correct. Then the probability of 12 or fewer drivers using seat belts regularly is
$$\sum_{i=0}^{12}\binom{20}{i}(0.70)^{i}(0.30)^{20-i} \approx 0.228.$$
Therefore, under the assumption that the company's claim is true, it is quite likely that out of 20 randomly selected drivers, 12 use seat belts. This is not reasonable evidence to conclude that the insurance company's claim is false.

20. (a) $(0.999)^{999}(0.001) = 0.000368$. (b) $\dbinom{2999}{2}(0.001)^{3}(0.999)^{2997} = 0.000224$.

21. Let $X$ be the number of children having the disease. We have that the desired probability is
$$P(X=3 \mid X\ge 1) = \frac{P(X=3)}{P(X\ge 1)} = \frac{\dbinom{5}{3}(0.23)^{3}(0.77)^{2}}{1-(0.77)^{5}} = 0.0989.$$

22. (a) $\left(\dfrac{w}{w+b}\right)^{n-1}\dfrac{b}{w+b}$. (b) $\left(\dfrac{w}{w+b}\right)^{n-1}$.

23. Let $n$ be the desired number of seeds to be planted. Let $X$ be the number of seeds which will germinate. We have that $X$ is binomial with parameters $n$ and 0.75. We want to find the smallest $n$ for which
$$P(X\ge 5)\ge 0.90,$$
or, equivalently,
$$P(X<5)\le 0.10.$$
That is, we want to find the smallest $n$ for which
$$\sum_{i=0}^{4}\binom{n}{i}(0.75)^{i}(0.25)^{n-i}\le 0.10.$$
By trial and error, as the following table shows, we find that the smallest $n$ satisfying $P(X<5)\le 0.10$ is 9. So at least nine seeds are to be planted.

n     $\sum_{i=0}^{4}\binom{n}{i}(0.75)^{i}(0.25)^{n-i}$
5     0.7627
6     0.4661
7     0.2436
8     0.1139
9     0.0489
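The trial-and-error search in the table is easy to automate; a minimal sketch:

    from math import comb

    def p_fewer_than_5(n, p=0.75):
        # P(X < 5) for X binomial with parameters n and p
        return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(5))

    n = 5
    while p_fewer_than_5(n) > 0.10:
        n += 1
    print(n)   # 9, in agreement with the table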

24. Intuitively, it must be clear that the answer is $k/n$. To prove this, let $B$ be the event that the $i$th baby born is blonde. Let $A$ be the event that $k$ of the $n$ babies are blondes. We have
$$P(B \mid A) = \frac{P(AB)}{P(A)} = \frac{p\cdot\dbinom{n-1}{k-1}p^{k-1}(1-p)^{n-k}}{\dbinom{n}{k}p^{k}(1-p)^{n-k}} = \frac{\dbinom{n-1}{k-1}}{\dbinom{n}{k}} = \frac{k}{n}.$$

25. The size of a seed is a tiny fraction of the size of the area. Let us divide the area up into many small cells, each about the size of a seed. Assume that, when the seeds are distributed, each of them will land in a single cell. Accordingly, the number of seeds distributed will equal the number of nonempty cells. Suppose that each cell has an equal chance of having a seed independent of the other cells (this is only approximately true). Since $\lambda$ is the average number of seeds per unit area, the expected number of seeds in the area $A$ is $\lambda A$. Let us call a cell in $A$ a “success” if it is occupied by a seed. Let $n$ be the total number of cells in $A$ and $p$ be the probability that a cell will contain a seed. Then $X$, the number of cells in $A$ with seeds, is a binomial random variable with parameters $n$ and $p$. Using the formula for the expected number of successes in a binomial distribution ($= np$), we see that $np = \lambda A$ and $p = \lambda A/n$. As $n$ goes to infinity, $p$ approaches zero while $np$ remains finite. Hence the number of seeds that fall on the area $A$ is a Poisson random variable with parameter $\lambda A$ and

$$P(X=i) = \frac{e^{-\lambda A}(\lambda A)^{i}}{i!}.$$

26. Let $D/N \to p$; then by Remark 5.2, for all $n$,
$$\frac{\dbinom{D}{x}\dbinom{N-D}{n-x}}{\dbinom{N}{n}} \approx \binom{n}{x}p^{x}(1-p)^{n-x}.$$
Now since $n\to\infty$ and $nD/N \to\lambda$, $n$ is large and $np$ is appreciable; thus
$$\binom{n}{x}p^{x}(1-p)^{n-x} \approx \frac{e^{-\lambda}\lambda^{x}}{x!}.$$
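A numerical illustration of this two-step approximation, with illustrative values $N = 100000$, $D = 200$, $n = 1000$ (so $\lambda = nD/N = 2$) chosen here rather than taken from the text:

    from math import comb, exp, factorial

    # Hypergeometric vs. binomial vs. Poisson probabilities when N and n are
    # large, D/N is small, and nD/N is moderate. Illustrative values only.
    N, D, n = 100_000, 200, 1_000
    p = D / N
    lam = n * D / N

    for x in range(5):
        hyper = comb(D, x) * comb(N - D, n - x) / comb(N, n)
        binom = comb(n, x) * p ** x * (1 - p) ** (n - x)
        poisson = exp(-lam) * lam ** x / factorial(x)
        print(x, round(hyper, 4), round(binom, 4), round(poisson, 4))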

Chapter 6

Continuous Random Variables
