

4 Ontologies+Rules for Privacy Protection Policy

4.2 Two Scenarios for Privacy Protection of Mail Servers

The following scenario shows how a mail server G enforces privacy protection policies for three email users (Alice, Bob, and Charlie) who request information for a specific purpose from different organization domains:

Company G is a well-known mail server portal that provides email sending, receiving, and storage management services for its registered users. To apply for an email account from this portal, each new user has to explicitly fill in his own office profile information, including name, office phone number, office address, working organization, etc. Furthermore, for the purposes of providing the user's personal email search and retrieval, or for the management of the mail server's own business services, dynamically generated digital traces of the users are extracted online, (un)disclosed, and even archived in this portal during email sending and receiving activities.

The digitally traced information that may be extracted, (not) disclosed, and archived by this mail server portal includes the IP address each time the user signs in, the sender's and receivers' email addresses for each incoming or outgoing email, the titles and contents of each thread of associated emails, etc. Of course, the mail server G does provide opt-in and opt-out mechanisms for the user to decide whether his public (or private) profile and digitally traced information may be (not) disclosed under certain circumstances to certain roles for a specific purpose.

Figure 2. A class hierarchy classification for both personal profiles and digital traces

We propose DL+log-based privacy protection policies that explicitly specify ontologies and rules satisfying the weak DL-safeness condition, so that the privacy protection objective is semantically enforced via the combination of ontologies and rules [23]. The knowledge bases of ontologies and rules for the two use case scenarios are shown below.²

A 5-tuple term (user(s), type(s), purpose(s), right(s), condition(s)) is a fact, shown in the P3P XML-based representation, derived from the data owner's specified options on data usage for data user(s), where user(s) ∈ data user ontology, type(s) ∈ data type ontology, purpose(s) ∈ purpose ontology, right(s) ∈ (read, write, display, disclose, ...), and condition(s) ∈ (date, time, counter, ...). Once this 5-tuple term is collected from the data owner, it is extracted and decomposed into several legal predicates that fit as grounding facts for the ontologies module and the rules module, which semantically enforce the privacy protection policies with respect to each data user's request.

²In the following rules and facts, each term shown in capital letters comes from ontologies, while each term shown in lowercase letters is defined as a Datalog predicate. This is a feature of the hybrid ontologies+rules combination.

Figure 3. A classification of different data usage purposes
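As a concrete sketch of this decomposition step, the following Python fragment shows one plausible way a collected 5-tuple could be split into ground predicates for the rules module. The function name, predicate names, and the sample values are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: decompose one 5-tuple privacy option from a data
# owner into ground Datalog-style facts (predicate name, argument tuple).

def decompose_option(user, dtype, purpose, right, condition):
    """Turn a (user, type, purpose, right, condition) term into facts."""
    return [
        ("data-user", (user,)),
        ("data-type", (dtype,)),
        ("purpose", (purpose,)),
        ("right", (user, dtype, right)),
        ("condition", (user, dtype, condition)),
    ]

# Example option: Charlie may "display" email-address data for auditing.
facts = decompose_option("Charlie", "email-address", "data-auditing",
                         "display", "date")
```

Each resulting fact can then be asserted into the rules module before the cando(..) rules below are evaluated.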

• Use case one scenario: Two organizations share users' public profiles and digitally traced information from this mail server portal: one is a subsidiary department SD of the mail server company, and the other is a cooperative partner CP of it. The privacy protection policies that enforce information disclosure requests from the members of these two organizations will be quite different from the service purpose or user role perspective. Now a user Alice ∈ SD is going to send a data-auditing announcement email ∈ DATA AUDIT ANNOUN. to both a user Bob ∈ SD and a user Charlie ∈ CP. Under company SD's internal regulations, when anyone sends an email to a mailing list with multiple recipients, an email recipient ∈ SD cannot disclose his/her email address to people not in the SD domain under any purpose.

Therefore, the email recipient Charlie ∈ CP cannot explicitly see the email address of the recipient Bob ∈ SD in his received email's address header (see Figure 4).

Let Γ = (Λ, ∆) be the two components of knowledge representation: the ontologies module Λ and the rules module ∆:
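The two-component structure Γ = (Λ, ∆) can be pictured as a small container holding DL axioms and facts on one side and Datalog rules on the other. The following sketch is an illustrative assumption about how such a hybrid knowledge base might be held in memory; the class and field names are not from the paper.

```python
# Minimal sketch of the hybrid knowledge base Γ = (Λ, ∆):
# Λ holds DL axioms and ABox facts, ∆ holds Datalog-style rules.
from dataclasses import dataclass, field

@dataclass
class KnowledgeBase:
    axioms: list = field(default_factory=list)   # Λ: TBox subsumption/role axioms
    dl_facts: set = field(default_factory=set)   # Λ: ABox assertions
    rules: list = field(default_factory=list)    # ∆: Datalog rules

kb = KnowledgeBase()
kb.axioms.append(("COMPANY", "subsumed_by", "PRIVATE"))
kb.dl_facts.add(("IS_STAFF_OF", "Bob", "SD"))
kb.rules.append("cando(?c, ?b-email, display) <= opt-in(?b, ?b-email, ?p), ...")
```

A reasoner would answer queries by combining subsumption over `axioms` with rule evaluation over `rules`, grounded on `dl_facts`.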

– Λ = ontology about information disclosure for this use case one scenario:

Ontologies Module’s Axioms:

COMPANY ⊑ PRIVATE
PRIVATE ⊑ ORGANIZATION
OWNER ⊑ PERSON

COMPANY ←domain− HAS COOPERATIVE −range→ COMPANY
COMPANY ←domain− HAS SUBSIDIARY −range→ COMPANY

HAS COOPERATIVE ≡ HAS COOPERATIVE⁻
PERSON ←domain− IS STAFF OF −range→ ORGANIZATION

Figure 4. A recipient Bob's email address cannot be disclosed to Charlie ∈ CP under all data usage purposes

MAIL TRACE ←domain− HAS MAIL TRACE −range→ EMAIL

EMAIL ⊑ ∃ HAS MAIL TRACE ONLINE.O EMAIL SENDER
EMAIL ⊑ ∀ HAS MAIL TRACE ONLINE.O EMAIL RECEIVER
DATA AUDIT ANNOUN. ⊑ AUDIT ANNOUN.

Ontologies Module’s Facts:

– ∆ = Rules about information disclosure for this use case one scenario:

Rules Module’s Rules:

cando(?c, ?b-email, display) ⇐
    opt-in(?b, ?b-email, ?p), data-user(?c), data-owner(?b),
    HAS EMAIL ADDRESS(?b, ?b-email).    (a1)

cando(?c, ?b-email, nill) ⇐
    opt-out(?b, ?b-email, ?p), data-user(?c), data-owner(?b),
    HAS EMAIL ADDRESS(?b, ?b-email).    (a2)

opt-in(?b, ?b-email, ?p) ⇐
    IS STAFF OF(?b, ?c1), IS STAFF OF(?c, ?c2), HAS SUBSIDIARY(?c1, ?c2).    (a3)

opt-out(?b, ?b-email, ?p) ⇐
    IS STAFF OF(?b, ?c1), IS STAFF OF(?c, ?c2), HAS COOPERATIVE(?c1, ?c2).    (a4)

From Bob's side, the mail server G first grounds rule (a4) and then derives opt-out(b, Bob@yahoo.com.tw, data-auditing) as a conclusion. This opt-out(..) becomes one of the facts in rule (a2)'s conditions once Charlie activates his email receiving action from mail server G to read this particular email from Alice@gmail.com.

The recipient email address Bob@yahoo.com.tw will then not be displayed, because rule (a2) concludes cando(Charlie, Bob@yahoo.com.tw, nill), i.e., the nill access right.
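To make the derivation chain concrete, the following Python sketch naively grounds rules (a4) and (a2) over the scenario's facts and derives the nill access right for Charlie. This is an illustrative simulation under assumed fact encodings, not the paper's DL+log reasoner; a real engine would also consult the ontologies module for the DL predicates.

```python
# Scenario facts, encoded as tuples (predicate, arg1, arg2[, arg3]).
facts = {
    ("IS_STAFF_OF", "Bob", "SD"),
    ("IS_STAFF_OF", "Charlie", "CP"),
    ("HAS_COOPERATIVE", "SD", "CP"),
    ("HAS_EMAIL_ADDRESS", "Bob", "Bob@yahoo.com.tw"),
    ("data-user", "Charlie"),
    ("data-owner", "Bob"),
}

def derive(facts):
    """One naive forward-chaining pass over rules (a4) then (a2)."""
    new = set(facts)
    staff = [f[1:] for f in facts if f[0] == "IS_STAFF_OF"]
    coop = {f[1:] for f in facts if f[0] == "HAS_COOPERATIVE"}
    email = [f[1:] for f in facts if f[0] == "HAS_EMAIL_ADDRESS"]
    # Rule (a4): owner b opts out when b's company c1 cooperates with
    # the data user c's company c2.
    for b, c1 in staff:
        for c, c2 in staff:
            if (c1, c2) in coop:
                for b2, e in email:
                    if b2 == b:
                        new.add(("opt-out", b, e, "data-auditing"))
    # Rule (a2): the opt-out fact yields the nill access right for each
    # data user c, given data-owner(b) and HAS_EMAIL_ADDRESS(b, e).
    users = [f[1] for f in facts if f[0] == "data-user"]
    owners = {f[1] for f in facts if f[0] == "data-owner"}
    for f in list(new):
        if f[0] == "opt-out" and f[1] in owners:
            for c in users:
                new.add(("cando", c, f[2], "nill"))
    return new

derived = derive(facts)
# derived now contains ("cando", "Charlie", "Bob@yahoo.com.tw", "nill"),
# so Bob's address is suppressed in Charlie's copy of the email.
```

The two-pass ordering mirrors the text: (a4) fires first to produce opt-out(..), which then satisfies a condition of (a2).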

From Charlie’s side, a G mail server does not have the constraints from Charlie to enforce as-sociated privacy protection policies so Bob is aware Charlie as one of the mailing list recip-ients with Charlie@hotmail.com in his receiving email message (see Figure 4). In the rule (a3), it satisfies weak DL-safeness but it does not

sat-variables c1 and c2 inIS STAFF OFDL predicate did not occur in any Datalog predicates.

• Use case two scenario: The auditing officer Bob serves in one of the government auditing agencies, the Internal Revenue Service (IRS), where IRS ∈ GOV AGENCY ⊑ PUBLIC. Bob is going to enforce a routine auditing check on a company M ∈ COMPANY ⊑ PRIVATE through its representative Charlie. An auditing announcement officer Alice from the IRS is going to send an email to a representative employee Charlie ∈ M and other company representatives to notify them of the account-auditing schedule. Under the government's auditing regulations, the actual acting auditor Bob, as one of the mailing list recipients serving in the IRS, cannot disclose his email address in this account-auditing notification email. Therefore, a chief privacy officer (CPO) ∈ IRS has to opt out the acting auditor recipient Bob's email address to comply with the regulations while Alice is sending an account-auditing notification message (see Figure 5).

Figure 5. The recipient Bob's email address Bob@government.gov cannot be disclosed to Charlie under auditing regulations when an auditing notification email is delivered to him

The ontologies module and the rules module for this use case two scenario are very similar to those specified in the use case one scenario, except that the conditions of rule (a3) and rule (a4) are no longer expressed with the binary ontology predicates HAS SUBSIDIARY(..) and HAS COOPERATIVE(..); instead, they are replaced by the unary ontology predicates IRS(?c1) and IRS(?c2), which ascertain that the data owner b will opt-in(..) his email address to a data user c who also serves in the IRS. Otherwise, the data owner b will opt-out(..) his email address to the data user c who is not an IRS employee.
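The use-case-two variant of the opt-in/opt-out decision can be sketched as a single check on the two organizations, replacing the binary subsidiary/cooperative test with the unary IRS membership test. The function and set names below are illustrative assumptions.

```python
# Sketch of the use-case-two condition: the binary HAS_SUBSIDIARY /
# HAS_COOPERATIVE tests of rules (a3)/(a4) are replaced by the unary
# ontology predicate IRS(..) applied to both organizations.

def opt_decision(owner_org, user_org, irs_agencies):
    """Return "opt-in" iff both organizations satisfy IRS(..)."""
    if owner_org in irs_agencies and user_org in irs_agencies:
        return "opt-in"   # variant of rule (a3): IRS(?c1), IRS(?c2)
    return "opt-out"      # variant of rule (a4): data user outside IRS

# Bob (owner, IRS) vs. Charlie (user, company M): opt-out applies,
# so Bob@government.gov is withheld from Charlie (Figure 5).
decision = opt_decision("IRS", "M", {"IRS"})
```

This matches the scenario: Bob's address is disclosed only to fellow IRS staff and suppressed for the company representative Charlie.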

5 Discussion

5.1 Which Ontologies+Rules