1.3 Conditional Probability and Independence

All of the probabilities that we have dealt with thus far have been unconditional probabilities. A sample space was defined and all probabilities were calculated with respect to that sample space. In many instances, however, we are in a position to update the sample space based on new information. In such cases, we want to be able to update probability calculations or to calculate conditional probabilities.

Example 1.3.1 (Four aces) Four cards are dealt from the top of a well-shuffled deck. What is the probability that they are four aces?

The probability is

$$\frac{1}{\binom{52}{4}} = \frac{1}{270{,}725}.$$

We can also calculate this probability by an “updating” argument, as follows. The probability that the first card is an ace is 4/52. Given that the first card is an ace, the probability that the second card is an ace is 3/51. Continuing this argument, we get the desired probability as

$$\frac{4}{52} \cdot \frac{3}{51} \cdot \frac{2}{50} \cdot \frac{1}{49} = \frac{1}{270{,}725}.$$

Definition 1.3.2 (Conditional Probability)

If A and B are events in S, and P(B) > 0, then the conditional probability of A given B, written P(A|B), is

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}. \tag{1}$$

Note that what happens in the conditional probability calculation is that B becomes the sample space: P(B|B) = 1. The intuition is that our original sample space, S, has been updated to B. All further occurrences are then calibrated with respect to their relation to B.

Example 1.3.3 (Continuation of Example 1.3.1) Calculate the conditional probabilities of drawing four aces given that some aces have already been drawn.

$$
\begin{aligned}
P(\text{4 aces in 4 cards} \mid i \text{ aces in } i \text{ cards})
&= \frac{P(\{\text{4 aces in 4 cards}\} \cap \{i \text{ aces in } i \text{ cards}\})}{P(i \text{ aces in } i \text{ cards})} \\
&= \frac{P(\text{4 aces in 4 cards})}{P(i \text{ aces in } i \text{ cards})} \\
&= \frac{\binom{52}{i}}{\binom{52}{4}\binom{4}{i}}
 = \frac{(4-i)!\,48!}{(52-i)!}
 = \frac{1}{\binom{52-i}{4-i}}.
\end{aligned}
$$

For i = 1, 2, and 3, the conditional probabilities are .00005, .00082, and .02041, respectively.
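These values can be checked numerically; a minimal sketch using Python's `math.comb` (the loop is just for illustration):

```python
from math import comb

# P(4 aces in 4 cards | i aces in i cards) = 1 / C(52 - i, 4 - i)
for i in range(1, 4):
    p = 1 / comb(52 - i, 4 - i)
    print(f"i = {i}: {p:.5f}")
# i = 1: 0.00005
# i = 2: 0.00082
# i = 3: 0.02041
```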

Example 1.3.4 (Three prisoners)

Three prisoners, A, B, and C, are on death row. The governor decides to pardon one of the three and chooses at random the prisoner to pardon. He informs the warden of his choice but requests that the name be kept secret for a few days. The next day, A tries to get the warden to tell him who has been pardoned. The warden refuses. A then asks which of B or C will be executed. The warden thinks for a while, then tells A that B is to be executed.

Warden’s reasoning: Each prisoner has a 1/3 chance of being pardoned. Clearly, either B or C must be executed, so I have given A no information about whether A will be pardoned.

A’s reasoning: Given that B will be executed, then either A or C will be pardoned. My chance of being pardoned has risen to 1/2.

Let A, B, and C denote the events that A, B, or C is pardoned, respectively. We know that P (A) = P (B) = P (C) = 1/3. Let W denote the event that the warden says B will die.

Using (1), A can update his probability of being pardoned to

$$P(A \mid W) = \frac{P(A \cap W)}{P(W)}.$$

What is happening can be summarized in this table:

Prisoner pardoned | Warden tells A | Probability
A                 | B dies         | 1/6
A                 | C dies         | 1/6
B                 | C dies         | 1/3
C                 | B dies         | 1/3

Using this table, we can calculate

$$
\begin{aligned}
P(W) &= P(\text{warden says B dies}) \\
&= P(\text{warden says B dies and A pardoned}) + P(\text{warden says B dies and C pardoned}) \\
&\quad + P(\text{warden says B dies and B pardoned}) \\
&= \tfrac{1}{6} + \tfrac{1}{3} + 0 = \tfrac{1}{2}.
\end{aligned}
$$

Thus, using the warden's reasoning, we have

$$P(A \mid W) = \frac{P(A \cap W)}{P(W)} = \frac{P(\text{warden says B dies and A pardoned})}{P(\text{warden says B dies})} = \frac{1/6}{1/2} = \frac{1}{3}.$$

However, A falsely interprets the event W as equal to the event B^c and calculates

$$P(A \mid B^c) = \frac{P(A \cap B^c)}{P(B^c)} = \frac{1/3}{2/3} = \frac{1}{2}.$$

We see that conditional probabilities can be quite slippery and require careful interpretation.
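The warden's answer can also be checked by simulation; a small Monte Carlo sketch (the seed and trial count are illustrative, not from the text):

```python
import random

random.seed(42)
trials = 200_000
warden_says_B = 0
pardoned_A_and_says_B = 0

for _ in range(trials):
    pardoned = random.choice("ABC")
    if pardoned == "A":
        told = random.choice("BC")  # warden picks B or C at random
    elif pardoned == "B":
        told = "C"                  # warden never names the pardoned prisoner
    else:
        told = "B"
    if told == "B":
        warden_says_B += 1
        if pardoned == "A":
            pardoned_A_and_says_B += 1

print(pardoned_A_and_says_B / warden_says_B)  # close to 1/3, not 1/2
```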

The following are several variations of (1):

$$P(A \cap B) = P(A \mid B)\,P(B), \qquad P(A \cap B) = P(B \mid A)\,P(A), \qquad P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}.$$

Theorem 1.3.5 (Bayes' Rule) Let A_1, A_2, . . . be a partition of the sample space, and let B be any set. Then, for each i = 1, 2, . . .,

$$P(A_i \mid B) = \frac{P(B \mid A_i)\,P(A_i)}{\sum_{j=1}^{\infty} P(B \mid A_j)\,P(A_j)}.$$

Example 1.3.6 (Codings)

When coded messages are sent, there are sometimes errors in transmission. In particular, Morse code uses “dots” and “dashes”, which are known to occur in the proportion of 3:4.

This means that for any given symbol, P(dot sent) = 3/7 and P(dash sent) = 4/7.

Suppose there is interference on the transmission line, and with probability 1/8 a dot is mistakenly received as a dash, and vice versa. If we receive a dot, can we be sure that a dot was sent?

Using Bayes’s Rule, we can write

$$P(\text{dot sent} \mid \text{dot received}) = \frac{P(\text{dot received} \mid \text{dot sent})\,P(\text{dot sent})}{P(\text{dot received})}.$$

The denominator is

$$
\begin{aligned}
P(\text{dot received}) &= P(\text{dot received} \cap \text{dot sent}) + P(\text{dot received} \cap \text{dash sent}) \\
&= P(\text{dot received} \mid \text{dot sent})\,P(\text{dot sent}) + P(\text{dot received} \mid \text{dash sent})\,P(\text{dash sent}) \\
&= \tfrac{7}{8} \times \tfrac{3}{7} + \tfrac{1}{8} \times \tfrac{4}{7} = \tfrac{25}{56}.
\end{aligned}
$$

So we have

$$P(\text{dot sent} \mid \text{dot received}) = \frac{(7/8) \times (3/7)}{25/56} = \frac{21}{25}.$$
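The same arithmetic can be carried out exactly with Python's `fractions` module; a minimal sketch:

```python
from fractions import Fraction as F

p_dot_sent, p_dash_sent = F(3, 7), F(4, 7)
p_flip = F(1, 8)  # probability a symbol is received as the wrong one

# Law of total probability for the denominator:
p_dot_received = (1 - p_flip) * p_dot_sent + p_flip * p_dash_sent
# Bayes' rule:
p_dot_sent_given_dot_received = (1 - p_flip) * p_dot_sent / p_dot_received

print(p_dot_received)                 # 25/56
print(p_dot_sent_given_dot_received)  # 21/25
```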


In some cases it may happen that the occurrence of a particular event, B, has no effect on the probability of another event, A. For these cases, we have the following definition.

Definition of Independence

Two events, A and B, are statistically independent if

$$P(A \cap B) = P(A)\,P(B).$$

Example 1.3.8

The gambler introduced at the start of the chapter, Chevalier de Mere, was particularly interested in the event that he could throw at least 1 six in 4 rolls of a die. We have

$$
\begin{aligned}
P(\text{at least 1 six in 4 rolls}) &= 1 - P(\text{no six in 4 rolls}) \\
&= 1 - \prod_{i=1}^{4} P(\text{no six on roll } i) \\
&= 1 - \left(\tfrac{5}{6}\right)^4 = 0.518.
\end{aligned}
$$

The equality of the second line follows by independence of the rolls.
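A quick check of de Méré's probability, both exactly and by simulation (the simulation is an illustrative addition, not part of the text):

```python
from fractions import Fraction as F
import random

# Exact: the four rolls are independent.
p_exact = 1 - F(5, 6) ** 4
print(p_exact, "~", round(float(p_exact), 3))  # 671/1296 ~ 0.518

# Monte Carlo check:
random.seed(0)
trials = 100_000
hits = sum(
    any(random.randint(1, 6) == 6 for _ in range(4)) for _ in range(trials)
)
print(hits / trials)  # close to 0.518
```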

Theorem 1.3.9

If A and B are independent events, then the following pairs are also independent:

a. A and Bc. b. Ac and B.

c. Ac and Bc.

Proof: (a) can be proved as follows.

$$
\begin{aligned}
P(A \cap B^c) &= P(A) - P(A \cap B) \\
&= P(A) - P(A)\,P(B) \qquad (A \text{ and } B \text{ are independent}) \\
&= P(A)\,(1 - P(B)) = P(A)\,P(B^c). \qquad \square
\end{aligned}
$$

Example 1.3.11 (Tossing two dice) Let an experiment consist of tossing two fair dice, and define the following events:

A = {doubles appear} = {(1, 1), (2, 2), (3, 3), (4, 4), (5, 5), (6, 6)},
B = {the sum is between 7 and 10},
C = {the sum is 2, 7, or 8}.

Thus, we have P(A) = 1/6, P(B) = 1/2, and P(C) = 1/3. Furthermore,

$$P(A \cap B \cap C) = P(\text{the sum is 8, composed of double 4s}) = \frac{1}{36} = \frac{1}{6} \times \frac{1}{2} \times \frac{1}{3} = P(A)\,P(B)\,P(C).$$

However,

$$P(B \cap C) = P(\text{sum equals 7 or 8}) = \frac{11}{36} \neq P(B)\,P(C).$$

Similarly, it can be shown that P(A ∩ B) ≠ P(A)P(B). Therefore, the requirement P(A ∩ B ∩ C) = P(A)P(B)P(C) is not a strong enough condition to guarantee pairwise independence.
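The failure of pairwise independence here is easy to confirm by enumerating the 36 equally likely outcomes; a sketch:

```python
from itertools import product
from fractions import Fraction as F

omega = list(product(range(1, 7), repeat=2))  # 36 equally likely rolls

def prob(event):
    return F(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] == w[1]         # doubles appear
B = lambda w: 7 <= sum(w) <= 10    # sum is between 7 and 10
C = lambda w: sum(w) in (2, 7, 8)  # sum is 2, 7, or 8

triple = prob(lambda w: A(w) and B(w) and C(w))
print(triple == prob(A) * prob(B) * prob(C))             # True: both 1/36
print(prob(lambda w: B(w) and C(w)), prob(B) * prob(C))  # 11/36 vs 1/6
```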

Definition 1.3.12 (Mutual Independence)

A collection of events A_1, . . . , A_n are mutually independent if for any subcollection A_{i_1}, . . . , A_{i_k}, we have

$$P\left(\bigcap_{j=1}^{k} A_{i_j}\right) = \prod_{j=1}^{k} P(A_{i_j}).$$

Example 1.3.13 (Three coin tosses)

Consider the experiment of tossing a coin three times. The sample space is {HHH, HHT, HTH, THH, TTH, THT, HTT, TTT}. Let H_i, i = 1, 2, 3, denote the event that the ith toss is a head. For example, H_1 = {HHH, HHT, HTH, HTT}. Assuming that the coin is fair and has an equal probability of landing heads or tails on each toss, the events H_1, H_2, and H_3 are mutually independent.

To verify this we note that

$$P(H_1 \cap H_2 \cap H_3) = P(\{HHH\}) = \frac{1}{8} = P(H_1)\,P(H_2)\,P(H_3).$$

To verify the condition in Definition 1.3.12, we must also check each pair. For example,

$$P(H_1 \cap H_2) = P(\{HHH, HHT\}) = \frac{2}{8} = P(H_1)\,P(H_2).$$

The equality is also true for the other two pairs. Thus, H_1, H_2, and H_3 are mutually independent.
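The full requirement of Definition 1.3.12 (every pair and the triple must factor) can be checked by enumeration; a sketch:

```python
from itertools import combinations, product
from fractions import Fraction as F

omega = list(product("HT", repeat=3))  # 8 equally likely outcomes

def prob(event):
    return F(sum(1 for w in omega if event(w)), len(omega))

H = [lambda w, i=i: w[i] == "H" for i in range(3)]  # H[i]: toss i is a head

# Every subcollection of {H1, H2, H3} must factor into a product.
for k in (2, 3):
    for sub in combinations(range(3), k):
        joint = prob(lambda w: all(H[i](w) for i in sub))
        assert joint == F(1, 2) ** k
print("mutually independent")  # all checks passed
```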
