
# Inexact Reasoning



## Why Certainty Factors

### Bayes' rule

P(D | E) = P(E | D) · P(D) / ∑j P(E | Dj) · P(Dj)

– E: evidence
– D: disease

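As a concrete check of the formula, the posterior over diseases can be computed from priors and likelihoods. A minimal sketch; the disease names and numbers are made up for illustration:

```python
def posterior(prior, likelihood):
    """Bayes' rule: P(D | E) = P(E | D) P(D) / sum_j P(E | Dj) P(Dj)."""
    p_e = sum(likelihood[d] * prior[d] for d in prior)  # denominator P(E)
    return {d: likelihood[d] * prior[d] / p_e for d in prior}

# Hypothetical priors P(D) and likelihoods P(E | D) for two diseases.
prior = {"flu": 0.9, "meningitis": 0.1}
likelihood = {"flu": 0.2, "meningitis": 0.8}
print(posterior(prior, likelihood))  # the posteriors sum to 1
```

The denominator is the same normalizing sum over all diseases Dj as in the formula above.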

## Why Certainty Factors

### Domain experts

– Experts do not treat belief and disbelief as complements: a 70% belief in a hypothesis is not the same as a 30% disbelief in it

### • Another example

– If this is the last course required for a degree:

• P(graduate | ‘A’ in this course) = 0.7
• P(not graduate | ‘A’ in this course) = 0.3

– An expert who is 70% certain of graduating would not claim to be 30% certain of not graduating


## Certainty Factors

### • Certainty Factor

– CF(H , E) = MB(H , E) – MD(H , E)

– CF is the certainty factor in the hypothesis H due to evidence E, i.e. the difference between belief and disbelief
– MB is the measure of increased belief in H due to E
– MD is the measure of increased disbelief in H due to E


## Certainty Factors

### Definitions

• Belief:

MB(H , E) = 1 if P(H) = 1
MB(H , E) = ( max[P(H | E) , P(H)] – P(H) ) / ( 1 – P(H) ) otherwise

• Disbelief:

MD(H , E) = 1 if P(H) = 0
MD(H , E) = ( min[P(H | E) , P(H)] – P(H) ) / ( 0 – P(H) ) otherwise
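These definitions translate directly into code. A small sketch; the MD expression (P(H) – min[P(H|E), P(H)]) / P(H) used below is algebraically the same as (min[…] – P(H)) / (0 – P(H)):

```python
def mb(p_h_given_e, p_h):
    """Measure of increased belief in H due to E."""
    if p_h == 1:
        return 1.0
    return (max(p_h_given_e, p_h) - p_h) / (1 - p_h)

def md(p_h_given_e, p_h):
    """Measure of increased disbelief in H due to E."""
    if p_h == 0:
        return 1.0
    return (p_h - min(p_h_given_e, p_h)) / p_h

def cf(p_h_given_e, p_h):
    """Certainty factor: CF(H, E) = MB(H, E) - MD(H, E)."""
    return mb(p_h_given_e, p_h) - md(p_h_given_e, p_h)

# Evidence raises P(H) from 0.5 to 0.75: belief rises, disbelief stays 0.
print(cf(0.75, 0.5))  # 0.5
```

Evidence that raises P(H) produces MB > 0 and MD = 0; evidence that lowers it produces MB = 0 and MD > 0, so the two measures never both increase from the same piece of evidence.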


## Characteristics

### Ranges

0 ≤ MB ≤ 1 , 0 ≤ MD ≤ 1 , -1 ≤ CF ≤ 1

### P(H | E) = 1

MB = 1 , MD = 0 , CF = 1

### P(H’ | E) = 1

MB = 0 , MD = 1 , CF = -1


## Meaning

### Only the difference MB – MD is important

e.g. CF = 0.7 may arise as 0.7 – 0 or as 0.8 – 0.1

### – e.g. CF(H , E) = 0.70 , CF(H’ , E) = -0.70

"I am 70% certain that I will graduate if I get an ‘A’ in this course"




## Combining Conclusions

• The same conclusion may be derived from different evidences

• Example:
  IF E1 THEN A CF1
  IF E2 THEN A CF2

• Combining Function:
  CF = CF1 + CF2 * (1 – CF1)   if both > 0
  CF = (CF1 + CF2) / (1 – min(|CF1| , |CF2|))   if one < 0
  CF = CF1 + CF2 * (1 + CF1)   if both < 0
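A minimal implementation of the combining function (the mixed-sign branch is MYCIN's standard rule; it is undefined only when CF1 = –CF2 = ±1):

```python
def combine(cf1, cf2):
    """Combine two certainty factors supporting the same conclusion."""
    if cf1 > 0 and cf2 > 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    # Opposite signs (or one zero): MYCIN's mixed-sign rule.
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Two independent rules both conclude A: the result is stronger than either alone.
print(combine(0.6, 0.5))
# Conflicting evidence partially cancels.
print(combine(0.6, -0.4))
```

The function is commutative and associative for same-sign inputs, so the order in which rules fire does not change the accumulated CF.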


## MYCIN Stores the Current CF

### COMBINE

Rule 1: CF1(H , e) = CF1(E , e) * CF1(H , E)
Rule 2: CF2(H , e) = CF2(E , e) * CF2(H , E)

– e is the raw evidence observed; E is a rule's premise
– Premise CFs are evaluated with NOT (–) , OR (max) , AND (min)
– The two rules' contributions to hypothesis H are then merged with COMBINE into the current CF
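Premise evaluation and rule application can be sketched as below. This is a toy interpreter, not MYCIN's actual code; the 0.2 firing threshold is the one MYCIN used to ignore weakly supported premises:

```python
def cf_not(c):
    """NOT: negate the certainty factor."""
    return -c

def cf_or(*cfs):
    """OR: take the maximum certainty factor."""
    return max(cfs)

def cf_and(*cfs):
    """AND: take the minimum certainty factor."""
    return min(cfs)

def apply_rule(cf_premise, cf_rule):
    """CF(H, e) = CF(E, e) * CF(H, E); fire only when the premise CF > 0.2."""
    if cf_premise <= 0.2:
        return 0.0
    return cf_premise * cf_rule

# Premise: E1 AND (E2 OR NOT E3), with CFs established by earlier rules.
cf_premise = cf_and(0.8, cf_or(0.3, cf_not(-0.5)))  # min(0.8, max(0.3, 0.5)) = 0.5
print(apply_rule(cf_premise, 0.9))  # 0.45
```

Attenuating the rule's CF by the premise's CF is what makes long inference chains decay: each link multiplies in another factor ≤ 1.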


### • Easy to understand


• CF values can be the opposite of conditional probabilities

### e.g.

P(H1) = 0.8 , P(H2) = 0.2
P(H1 | E) = 0.6 , P(H2 | E) = 0.4
CF(H1 , E) = (0.6 – 0.8) / 0.8 = -0.25
CF(H2 , E) = (0.4 – 0.2) / 0.8 = 0.25

H1 has the higher conditional probability but the lower certainty factor – a contradiction

• In general: P(H | e) ≠ P(H | i) * P(i | e)

• But in MYCIN: CF(H | e) = CF(H | i) * CF(i | e), which is only suitable for short inference chains


## Environment

### • The environment is exhaustive and mutually exclusive

– the universe of discourse, a set Θ = {Θ1 , Θ2 , ... , ΘN}

### • Each subset of the environment is a possible answer to a question

– e.g. "What is the military aircraft?" → {bomber , fighter}


## All Subsets

### • Example: Θ = {A , B , F}

All 2³ = 8 subsets:
∅ , {A} , {B} , {F} , {A , B} , {A , F} , {B , F} , {A , B , F}


## Mass Function

### • Example: Aircraft identification

m({F}) = 0.5 , m({B}) = 0.3 , m({B , F}) = 0.2

– the masses over all subsets sum to 1


## D – S and Probability

### Differences from probability theory, where ∑i Pi = 1:

– m(Θ) does not have to be 1
– If X ⊆ Y, it is not necessary that m(X) ≤ m(Y)
– There is no required relationship between m(X) and m(X’)


## Combining Evidences

• Orthogonal Sum: m3 = m1 ⊕ m2

• Example:
  m1({B , F}) = 0.7 , m1({A , B , F}) = 0.3
  m2({B}) = 0.9 , m2({A , B , F}) = 0.1

Each cell holds the intersection of the two focal elements and the product of their masses:

|                        | m2({B}) = 0.9 | m2({A , B , F}) = 0.1 |
|------------------------|---------------|-----------------------|
| m1({B , F}) = 0.7      | {B} 0.63      | {B , F} 0.07          |
| m1({A , B , F}) = 0.3  | {B} 0.27      | {A , B , F} 0.03      |

m3({B}) = 0.63 + 0.27 = 0.9
m3({B , F}) = 0.07
m3({A , B , F}) = 0.03 [nonbelief]
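The table above can be reproduced with a small implementation of the orthogonal sum. Focal elements are represented as frozensets; the normalization step only matters when some intersections are empty (as in the later normalization example):

```python
from itertools import product

def orthogonal_sum(m1, m2):
    """Dempster's rule: intersect focal elements, multiply masses,
    then normalize away any mass that fell on the empty set."""
    combined = {}
    conflict = 0.0
    for (x, p), (y, q) in product(m1.items(), m2.items()):
        inter = x & y
        if inter:
            combined[inter] = combined.get(inter, 0.0) + p * q
        else:
            conflict += p * q  # mass on the empty set
    if conflict:
        combined = {s: v / (1 - conflict) for s, v in combined.items()}
    return combined

m1 = {frozenset("BF"): 0.7, frozenset("ABF"): 0.3}
m2 = {frozenset("B"): 0.9, frozenset("ABF"): 0.1}
m3 = orthogonal_sum(m1, m2)
# m3 ≈ {B}: 0.9, {B, F}: 0.07, {A, B, F}: 0.03
```

Because every pair of focal elements here has a nonempty intersection, no conflict arises and no normalization is needed.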


## Evidential Interval

### • 0 ≤ Bel ≤ Pls ≤ 1

– lower bound: belief (Bel), also called support (Spt)
– upper bound: plausibility (Pls)


## Common Evidential Intervals

| Evidential Interval | Meaning |
|---------------------|---------|
| [1 , 1] | Completely true |
| [0 , 0] | Completely false |
| [0 , 1] | Completely ignorant |
| [Bel , 1] , 0 < Bel < 1 | Tends to support |
| [0 , Pls] , 0 < Pls < 1 | Tends to refute |
| [Bel , Pls] , 0 < Bel ≤ Pls < 1 | Tends to both support and refute |


## Evidential Interval

### • EI(S) = [Bel(S) , 1 – Bel(S’)]

Using the combined masses from the earlier example:
m3({B}) = 0.9 , m3({B , F}) = 0.07 , m3({A , B , F}) = 0.03

For S = {B , F} , S’ = {A}:
Bel({B , F}) = m3({B}) + m3({B , F}) = 0.9 + 0.07 = 0.97
Bel({A}) = 0
EI({B , F}) = [0.97 , 1 – 0] = [0.97 , 1]

The interval can be expressed as [total belief , plausibility]
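A sketch of Bel and EI over the same masses (focal elements again as frozensets; function names are illustrative):

```python
def bel(m, s):
    """Bel(S): total mass of focal elements contained in S."""
    return sum(v for x, v in m.items() if x <= s)

def evidential_interval(m, s, theta):
    """EI(S) = [Bel(S), 1 - Bel(S')], with S' the complement of S in theta."""
    return bel(m, s), 1 - bel(m, theta - s)

theta = frozenset("ABF")
m3 = {frozenset("B"): 0.9, frozenset("BF"): 0.07, frozenset("ABF"): 0.03}
print(evidential_interval(m3, frozenset("BF"), theta))  # approximately (0.97, 1.0)
```

Bel sums over subsets of S, while the upper bound counts everything not committed against S, which is why Bel ≤ Pls always holds.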


## Normalization of Belief

### • Example:

New evidence: m1({A}) = 0.95 , m1({A , B , F}) = 0.05
Previous result: m2({B}) = 0.9 , m2({B , F}) = 0.07 , m2({A , B , F}) = 0.03

Orthogonal sum before normalization:
m3({A}) = 0.95 * 0.03 = 0.0285
m3({B}) = 0.05 * 0.9 = 0.045
m3({B , F}) = 0.05 * 0.07 = 0.0035
m3({A , B , F}) = 0.05 * 0.03 = 0.0015
m3(∅) = 0.95 * 0.9 + 0.95 * 0.07 = 0.9215

Normalization removes the mass on ∅ by dividing every other mass by 1 – m3(∅) = 0.0785


## Problem

### • Normalization can produce counterintuitive results (Zadeh's example)

– Expert 1: m1({meningitis}) = 0.99 , m1({brain tumor}) = 0.01
– Expert 2: m2({concussion}) = 0.99 , m2({brain tumor}) = 0.01

⇒ Belief of brain tumor = 1 after combination, because all other mass is conflicting and is normalized away, even though both experts considered a tumor very unlikely
