
A Novel Approach to Trust Management in

Unattended Wireless Sensor Networks

Yi Ren, Member, IEEE, Vladimir I. Zadorozhny, Senior Member, IEEE,

Vladimir A. Oleshchuk, Senior Member, IEEE, and Frank Y. Li, Senior Member, IEEE

Abstract—Unattended Wireless Sensor Networks (UWSNs) are characterized by long periods of disconnected operation and fixed or irregular intervals between sink visits. The absence of an online trusted third party implies that existing WSN trust management schemes are not applicable to UWSNs. In this paper, we propose a trust management scheme for UWSNs to provide efficient and robust trust data storage and trust generation. For trust data storage, we employ a geographic hash table to identify storage nodes and to significantly decrease storage cost. We use subjective logic based consensus techniques to mitigate trust fluctuations caused by environmental factors. We exploit a set of trust similarity functions to detect trust outliers and to sustain trust pollution attacks. We demonstrate, through extensive analyses and simulations, that the proposed scheme is efficient, robust and scalable.

Index Terms—Unattended wireless sensor network (UWSN), distributed trust management, subjective logic

1 INTRODUCTION

Wireless Sensor Networks (WSNs) have been used in challenging, hostile environments for various applications such as forest fire detection, battlefield surveillance, habitat monitoring, traffic management, etc. One common assumption in traditional WSNs is that a trusted third party, e.g., a sink, is always available to collect sensed data in a near-to-real-time fashion.

Although many WSNs operate in such a mode, there are WSN applications that do not fit into the real-time data collection model. Consider an example of a monitoring system deployed in a natural park to detect poaching activities. The lack of regular access routes and the size of the surveillance area would require a mobile sink to collect data periodically [1]. Another example is an underwater mobile sensor network for submarine tracking and harbor monitoring. The inaccessibility of the protected area and other technical problems make it difficult to maintain continuous connections between the sink and the sensors [2]. Fig. 1 shows an example of Unattended WSNs (UWSNs) [1], [3]–[5] with a mobile sink visiting the network at either fixed or irregular intervals to collect data.

Trust management becomes very important for detecting malicious nodes in unattended hostile environments.

• Y. Ren is with the Department of Information and Communication Technology, University of Agder, Grimstad 4898, Norway, and also with the Department of Computer Science, National Chiao Tung University, Hsinchu, Taiwan. E-mail: yi.ren@uia.no.

• V. Oleshchuk and F. Li are with the Department of Information and Communication Technology, University of Agder, Grimstad 4898, Norway. E-mail: {vladimir.oleshchuk, frank.li}@uia.no.

• V. Zadorozhny is with the School of Information Sciences, University of Pittsburgh, Pittsburgh, PA 15260 USA. E-mail: vladimir@sis.pitt.edu.
Manuscript received 30 Mar. 2012; revised 23 Dec. 2012; accepted 24 Jan. 2013. Date of publication 14 Feb. 2013; date of current version 2 July 2014. For information on obtaining reprints of this article, please send e-mail to: reprints@ieee.org, and reference the Digital Object Identifier below. Digital Object Identifier 10.1109/TMC.2013.22

It can also assist in secure routing [6], [7], secure data distribution [8], and trusted key exchange [9]. An efficient trust management system is required to handle trust related information in a secure and reliable way. It should deal with uncertainty caused by noisy communication channels and unstable sensor behavior.

We propose a trust management scheme for efficient trust generation as well as scalable and robust trust data storage in UWSNs. A central issue for trust management in UWSNs is how to store trust data without relying on a trusted third party. Initially, we consider two simple trust management schemes as a first-step attempt to address the existing trust storage problems in UWSNs. After analyzing the shortcomings of these simple schemes, we propose an advanced scheme based on a Geographic Hash Table (GHT) [10]. Our advanced scheme allows sensor nodes to put and get trust data to and from designated storage nodes based on node IDs. Sensor nodes do not need to know the IDs of storage nodes. They use a hash function to find the locations of the storage nodes, which significantly reduces the storage cost. We also propose a set of similarity threshold functions to remove outliers from trust opinions. This prevents attackers from generating false trust opinions and from polluting trustworthiness. Furthermore, we provide a detailed analysis of the proposed scheme and conduct a comprehensive simulation-based study to demonstrate that our scheme is efficient, robust, and scalable.

The rest of the paper is organized as follows. Related work is reviewed in Section 2. Section 3 defines the network scenario, security model and design goals. Section 4 presents some background material on trust management in sensor networks and on subjective logic. Section 5 introduces our solutions for efficient trust data storage. Section 6 reports a simulation-based study conducted to evaluate the efficiency and the robustness of the proposed schemes. Section 7 considers advanced approaches to reliable trust generation. Section 8 offers conclusions.



Fig. 1. Example of UWSN.

2 RELATED WORK

In this section, we review the existing trust management schemes in WSNs, ad hoc and P2P networks.

2.1 Trust Management in WSNs

Several solutions have been recently proposed for trust management in WSNs. In [11] the authors designed a protocol to diagnose and mask arbitrary node failures in an event-driven WSN. In [12], the authors proposed a Bayesian trust management framework where each node maintains reputation metrics to assess past behavior of other nodes and to predict their future behavior. The authors in [13] proposed iTrust, an integrated trust framework for WSNs. A trust aware routing protocol for WSNs was proposed in [7]. The protocol exploits prior routing patterns and link quality to determine efficient routes. In [6], the authors proposed a trust-based routing scheme that selects a forwarding path based on the trust requirement of a packet and the trust level of neighbor nodes.

2.2 Trust Management in Ad Hoc Networks

More trust management studies were conducted in the field of ad hoc networks [14]–[17]. The authors in [14] proposed a reputation system based on Bayesian estimation of misbehavior in mobile ad hoc networks. The work in [15] introduced an information theoretic framework to measure trust and to model trust evolution. A data-centric framework for trust establishment was proposed in [16]. In [17], the authors proposed a distributed trust scheme based on distributed public key certificate management for mobile ad hoc networks.

2.3 Trust Management in P2P Networks

The authors in [18] proposed a Peer-Trust model based on public key infrastructure and trust propagation. PowerTrust [19], a robust and scalable P2P reputation scheme, was proposed to leverage power-law feedback factors. In [20], the authors developed Credence, a decentralized object reputation and ranking system for P2P networks.

UWSNs are an emerging class of wireless networks [3]. The authors in [3] also defined a mobile adversary and proposed a set of schemes to neutralize attacks focusing on erasing data. Techniques providing forward secrecy and backward secrecy of data stored in sensors are explored in [8], [21], [22]. To the best of our knowledge, our earlier work [23] is the first study which proposed trust data storage and trustworthiness calculation to facilitate trust management in UWSNs. In this paper, we further propose a set of schemes to mitigate trust pollution attacks based on subjective logic and various trust similarity measures.

Most of the trust management solutions developed for traditional WSNs, however, rely on the presence of an online trusted third party, e.g., to store and distribute trust data [6], [7], [11]–[13]. They cannot be applied directly to UWSNs due to the absence of the sink (or the base station). [24] is one exception that proposed a distributed scheme establishing reputation-based trust among sensor nodes. The authors, however, did not consider significant trust attacks (as defined in Section 3.2) against the generated trust. The work in [10], [25] addressed data-centric storage in WSNs, but trust management and security attacks are not considered.

To summarize, most schemes, e.g., [14]–[20], proposed for P2P and ad hoc networks are not suitable for UWSNs for the following reasons. First, UWSNs are more constrained with respect to computation, communication and power capabilities than P2P and ad hoc networks (although today's wireless sensors provide more options for storage capacity and computational capabilities, cheap hardware cost and lightweight security solutions which lead to longer network lifetime are critical for UWSNs). Those schemes designed for P2P and ad hoc networks based on public key cryptography are therefore not suitable for UWSNs. Second, the number of nodes in ad hoc networks with typical applications like on-campus peer-to-peer communication among laptops/smart-phones is usually lower than in UWSNs, which are more often targeted at environment monitoring applications. An UWSN is likely to have thousands of sensors. P2P networks may have more nodes than UWSNs, but the nodes in P2P networks do not have the same computational and energy constraints as in UWSNs. Finally, sensor nodes provide services throughout their whole lifetime, until their energy is depleted, while P2P nodes enter and exit the networks randomly.

3 NETWORK SCENARIO, SECURITY MODEL AND DESIGN GOALS

3.1 Network Scenario

We consider an UWSN that consists of $N$ sensor nodes, denoted as $s_j \in S$, where $S = \{s_j\}_{j=1}^{N}$. Each sensor $s_j$ is located at point $p_j$ and has a transmission range $\phi$. Thus $s_j$ at point $p_j$ can communicate with $s_m$ at point $p_m$ if $D(p_j, p_m) \leq \phi$, $(j, m \in \{1, \ldots, N\})$, where $D(p_j, p_m)$ is the distance between $p_j$ and $p_m$. Each sensor $s_j \in S$ has $n_j$ neighbors. We say that $s_m$ is one of $s_j$'s neighbors if $D(p_j, p_m) \leq \phi$. The yellow (light shadow) points in the circle in Fig. 2 form the set of neighbors of $s_j$, $B(s_j) = \{s \mid s \in S \text{ and } D(p_j, p) \leq \phi\}$. We assume that $s_j$'s neighbors, $b_i \in B(s_j)$, $i \in \{1, \ldots, n_j\}$, have their own trust opinions $T_i^{j}$ regarding the trustworthiness $\Upsilon^{j}$ of $s_j$,¹ and are referred to as trust producers of $s_j$. The nodes storing trust data are the trust managers $TM_j^r$, $r \in \{1, \ldots, \alpha\}$, of $s_j$. Here $\alpha$ is the number of trust managers in the network. Those sensors that would like to know about other sensors' trustworthiness are referred to as trust consumers. The relationship between trust producer, trust manager and trust consumer is illustrated in Fig. 3. We further assume that time is split into equal time intervals and that sensors maintain loosely synchronized clocks. At time interval $t$, $s_j$'s neighbor $b_i$ generates a trust opinion $T_i^{j,t}$ regarding $s_j$. Note that trust consumers can be anywhere in the network, but trust producers are only within the transmission range of the corresponding sensor. Furthermore, a mobile sink visits the network at either fixed or irregular time intervals to collect data from sensors.

1. In the context of this paper, a trust opinion $T$ is one sensor's conclusion about the trust level of another sensor; trustworthiness $\Upsilon$ is a combination of trust opinions over time and across all involved sensors.

Fig. 2. Example of network topology.

Fig. 3. Relationship between trust producer, trust manager and trust consumer.

3.2 Security Model

The UWSNs can be attacked in many ways. In this study, we focus on an adversary ADV launching attacks against trust data.² We divide the attacks into two categories: trust eraser and trust pollution attacks.

The effect of the trust eraser attack (denoted as ADV_Del) is that the trust data stored in sensors are lost and cannot be retrieved by trust consumers. For instance, ADV could try to compromise sensors and to erase the trust data stored in them. Moreover, when sensors are nonfunctional (e.g., due to energy depletion, natural disasters, etc.), their stored trust data are lost and are considered non-recoverable in this study.

In the case of a trust pollution attack, ADV does not delete the trust data but rather pollutes them. We consider the following pollution strategies:

• Environmental effect (ADV_Noise). Since sensors' trust opinions are generated based on sensors' previous behavior, the opinions may contain some noise due to environmental effects.

• Homogeneous attack (ADV_Homo). Given a sensor $s_j$, ADV tries to increase $s_j$'s trustworthiness $\Upsilon^j$ monotonically, or it tries to decrease $\Upsilon^j$ monotonically. To do so, at each time interval ADV can compromise a subset of sensors to generate false trust opinions.

• Hybrid attack, denoted as ADV_Hbd. This attack is more severe since ADV aims to manipulate the trustworthiness, i.e., it is able to increase or decrease trustworthiness in any way.

For clarity, we assume that the number $k$ of compromised sensors in each time interval is fixed. We refer to this number as the compromising capability. The compromised sensors can occur anywhere in the network.

2. In this paper, trust data means trust-related data, such as opinions, trust measurements, etc., which are used by the trust management scheme to make trust-aware decisions.

3.3 Design Goals

Our trust management scheme is designed with the following goals in mind:

1) Robustness. The scheme remains functional even after a certain number of sensor nodes lose battery power or are physically damaged. Moreover, the trust data stored in the system should remain available to queries even if some sensor nodes fail due to the ADV_Del attack.

2) Resilience. The generated trustworthiness $\Upsilon$ should be as close as possible to its real value even though ADV tries to inject false trust opinions in order to pollute it. In other words, the system should be resilient to ADV_Noise, ADV_Homo and ADV_Hbd.

3) Scalability. The scheme should work for a very large unattended physical area with a high density of sensor nodes.

4) Efficiency. The designed trust management scheme should be efficient in terms of both communication cost and storage cost.

5) Consistency. Trust opinions generated by trust producers and trust queries sent from trust consumers should be routed correctly to the trust managers where the trust data are stored.

3.4 Performance Metrics

The following metrics are defined to evaluate the performance of our scheme.

$\Pr[\text{survival}]_t^j$ is defined as the probability that at least one trust manager of $s_j$ survives during time interval $t$.

Communication Cost (denoted as $\mathcal{C}$): The communication cost consists of two parts: the cost of sending trust opinions to trust managers, and the cost of querying and retrieving trustworthiness stored in trust managers. $\mathcal{C}^j$ is defined as the communication cost of $s_j$. Since trust value queries and answers are short messages, we assume that sending and receiving a trust value message across each hop have the same cost, and that the approximate communication cost is $O(N)$ for message broadcast and $O(\sqrt{N})$ for point-to-point routing.

Storage Cost (denoted as $\mathcal{S}$): $\mathcal{S}^j$ is the storage cost of storing the trust opinions and trustworthiness of $s_j$.

4 PRELIMINARIES

4.1 Information Collection on Sensor Behavior

The information on a sensor's prior behavior is one of the most important aspects of trust management solutions [26]. This information varies from application to application. For example, it can be a watchdog mechanism that monitors the behavior of neighboring nodes. The work in [12] uses the Bayesian approach for assessing node reputation and trust evolution. Node capture attacks [27], where nodes are removed from the network for an indefinite amount of time, can be detected by their neighbors. One-shot probing is proposed in [28] to identify misbehaving nodes. The authors in [29] consider the trust inference problem as a shortest path calculation in a weighted directed graph. They utilize the theory of semirings for trust evaluation.

In this study, we assume that sensor nodes may use the sensor trust analysis and scoring approaches above (e.g., [12], [27]–[29]) to generate trust opinions. That is, in a time interval $t$, $s_j$'s neighbors $\{b_i\}_{i=1}^{n_j}$ can generate trust opinions $T_i^{j,t}$ ($i \in \{1, \ldots, n_j\}$) regarding $s_j$ by monitoring $s_j$'s prior behavior.

4.2 Subjective Logic

Monitoring sensor behavior in UWSNs based on previous communication patterns involves considerable uncertainty. Communication channels between sensors are unstable and noisy. To deal with this uncertainty, we adopt a Subjective Logic (SL) framework [30], and use SL opinions to assess trustworthiness. SL has two advantages: 1) it is lightweight; and 2) it takes both uncertainty and belief ownership into account. The definition of SL opinion is as follows.

Definition 1. An opinion is a triplet $T = \{B, D, U\}$, where $B, D, U \in [0, 1]$, $B + D + U = 1$, and $B$, $D$, and $U$ correspond to belief, disbelief and uncertainty, respectively.

A trust level can be naturally defined using SL opinions, e.g., $T_1 = \{0.0, 0.93, 0.07\}$ for a low trust value and $T_2 = \{0.88, 0.0, 0.12\}$ for a higher one.

Definition 2. Let $s_X$, $s_Y$ and $s_Z$ be three sensors. Then $T_Y^X = \{B_Y^X, D_Y^X, U_Y^X\}$ and $T_Z^X = \{B_Z^X, D_Z^X, U_Z^X\}$ denote the opinions of $s_Y$ and $s_Z$ about the trustworthiness of $s_X$. Their combined consensus opinion is defined as $T_{Y,Z}^X = T_Y^X \oplus T_Z^X = \{B_{Y,Z}^X, D_{Y,Z}^X, U_{Y,Z}^X\}$, where $B_{Y,Z}^X = (B_Y^X U_Z^X + B_Z^X U_Y^X)/(U_Y^X + U_Z^X - U_Y^X U_Z^X)$, $D_{Y,Z}^X = (D_Y^X U_Z^X + D_Z^X U_Y^X)/(U_Y^X + U_Z^X - U_Y^X U_Z^X)$, and $U_{Y,Z}^X = (U_Y^X U_Z^X)/(U_Y^X + U_Z^X - U_Y^X U_Z^X)$.

The trust value expressed as a subjective opinion instead of one simple trust level provides a more flexible trust model of the real world. Therefore, according to Def. 2, the consensus of trust opinions generated by sensors $\{b_i\}_{i=1}^{n_j}$ in time interval $t$ about sensor $s_j$ is

$$T_1^{j,t} \oplus \cdots \oplus T_i^{j,t} \oplus \cdots \oplus T_{n_j}^{j,t} = T_{1,\cdots,i,\cdots,n_j}^{j,t}. \quad (1)$$

Definition 3. Let $s_X$ and $s_Y$ be two sensors. Then $\{T_Y^{X,t_1}, \ldots, T_Y^{X,t_n}\}$ denote the opinions of $s_Y$ about the trustworthiness of $s_X$ for the time intervals $\{t_1, \ldots, t_n\}$, respectively, where $T_Y^{X,t_n} = \{B_Y^{X,t_n}, D_Y^{X,t_n}, U_Y^{X,t_n}\}$. Then $s_Y$'s opinion about the trustworthiness of $s_X$ on $t_1 \cup \cdots \cup t_n$ is defined as

$$T_Y^{X,t_1\cup\cdots\cup t_n} = \{B_Y^{X,t_1\cup\cdots\cup t_n}, D_Y^{X,t_1\cup\cdots\cup t_n}, U_Y^{X,t_1\cup\cdots\cup t_n}\} \quad (2)$$

where $B_Y^{X,t_1\cup\cdots\cup t_n} = \frac{1}{n}\left(B_Y^{X,t_1} + \cdots + B_Y^{X,t_n}\right)$, $D_Y^{X,t_1\cup\cdots\cup t_n} = \frac{1}{n}\left(D_Y^{X,t_1} + \cdots + D_Y^{X,t_n}\right)$, and $U_Y^{X,t_1\cup\cdots\cup t_n} = \frac{1}{n}\left(U_Y^{X,t_1} + \cdots + U_Y^{X,t_n}\right)$.

According to Def. 2 and Def. 3, we define the trustworthiness $\Upsilon^j$ in terms of sensor consensus to combine trust opinions generated by sensors $\{s_{j,i}\}_{i=1}^{n_j}$ in the time intervals $\{t\}_{t=t_1}^{t_n}$ as

$$\Upsilon^j = T_{1,\cdots,i,\cdots,n_j}^{j,t_1\cup\cdots\cup t_n}. \quad (3)$$

The $\Upsilon^j$ can be calculated with respect to sensor consensus or time as follows: 1) with respect to sensor consensus: $\Upsilon^j = T_{1,\cdots,i,\cdots,n_j}^{j,t_1\cup\cdots\cup t_n} = T_1^{j,t_1\cup\cdots\cup t_n} \oplus \cdots \oplus T_i^{j,t_1\cup\cdots\cup t_n} \oplus \cdots \oplus T_{n_j}^{j,t_1\cup\cdots\cup t_n}$; and 2) with respect to time: $\Upsilon^j = T_{1,\cdots,i,\cdots,n_j}^{j,t_1\cup\cdots\cup t_n} = \{B_{1,\cdots,i,\cdots,n_j}^{j,t_1\cup\cdots\cup t_n}, D_{1,\cdots,i,\cdots,n_j}^{j,t_1\cup\cdots\cup t_n}, U_{1,\cdots,i,\cdots,n_j}^{j,t_1\cup\cdots\cup t_n}\}$, where $B_{1,\cdots,i,\cdots,n_j}^{j,t_1\cup\cdots\cup t_n} = \frac{1}{n}\left(B_{1,\cdots,i,\cdots,n_j}^{j,t_1} + \cdots + B_{1,\cdots,i,\cdots,n_j}^{j,t_n}\right)$, $D_{1,\cdots,i,\cdots,n_j}^{j,t_1\cup\cdots\cup t_n} = \frac{1}{n}\left(D_{1,\cdots,i,\cdots,n_j}^{j,t_1} + \cdots + D_{1,\cdots,i,\cdots,n_j}^{j,t_n}\right)$, and $U_{1,\cdots,i,\cdots,n_j}^{j,t_1\cup\cdots\cup t_n} = \frac{1}{n}\left(U_{1,\cdots,i,\cdots,n_j}^{j,t_1} + \cdots + U_{1,\cdots,i,\cdots,n_j}^{j,t_n}\right)$.

Remark. According to Def. 3, each trust opinion has the same impact over time. It would be more realistic to design the scheme to be time-aware, such that newer trust opinions have a higher impact on the trustworthiness while prior trust opinions are still taken into account. A straightforward solution is to use a time factor (e.g., $f \in [0, 1]$) that adds a time impact to prior trust opinions, where a greater $f$ indicates a newer opinion. More specifically, the time-aware trust opinion can be computed as $T_i^{j,t_1\cup\cdots\cup t_n} = \{B_i^{j,t_1\cup\cdots\cup t_n}, D_i^{j,t_1\cup\cdots\cup t_n}, U_i^{j,t_1\cup\cdots\cup t_n}\}$, where $B_i^{j,t_1\cup\cdots\cup t_n} = \frac{1}{n}(f^{n-1}B_i^{j,t_1} + \cdots + fB_i^{j,t_{n-1}} + B_i^{j,t_n})$, $D_i^{j,t_1\cup\cdots\cup t_n} = \frac{1}{n}(f^{n-1}D_i^{j,t_1} + \cdots + fD_i^{j,t_{n-1}} + D_i^{j,t_n})$, and $U_i^{j,t_1\cup\cdots\cup t_n} = 1 - B_i^{j,t_1\cup\cdots\cup t_n} - D_i^{j,t_1\cup\cdots\cup t_n}$. For example, given $T_i^{j,t-1} = \{0.88, 0.03, 0.09\}$, $T_i^{j,t} = \{0.8, 0.38, 0.09\}$, and $f = 0.99$, $T_i^{j,t-1\cup t}$ can be computed as $T_i^{j,t-1\cup t} = \{\frac{1}{2}(0.99 \times 0.88 + 0.8), \frac{1}{2}(0.99 \times 0.03 + 0.38), 1 - \frac{1}{2}(0.99 \times 0.88 + 0.8) - \frac{1}{2}(0.99 \times 0.03 + 0.38)\} = \{0.8712, 0.0297, 0.0991\}$. However, we leave this extension to future work, because specifying a suitable value of $f$ is not a trivial task and needs to be further investigated; for example, given $f = 0.99$ or $f = 0.88$, it is not clear which one is more reasonable or how $f$ should vary over time. Due to the page limit, we do not include any results on time-aware solutions in this paper. Indeed, this aspect is the focus of our more recent work on subjective logic based machine learning (Bayesian network) techniques for time-aware trustworthiness establishment. We refer interested readers to [30] for more details on subjective logic and to [31]–[33] for examples of the application of subjective logic in WSNs and social networks.

5 EFFICIENT AND ROBUST STORAGE OF TRUST DATA

In traditional WSNs, a trusted third party, e.g., the base station, is used to keep and calculate received trust opinions. Queries about sensors' trustworthiness are also sent to and answered by the base station. However, since UWSNs do not have a base station, the trust opinions of sensors need to be stored in sensors instead. Therefore, once sensor $b_i$ generates an opinion $T_i^{j,t}$ at time interval $t$, it either stores $T_i^{j,t}$ locally or sends $T_i^{j,t}$ to other nodes.

Next we consider three trust data storage schemes without involving the base station. First, we introduce two basic schemes and discuss their shortcomings. Then we propose an advanced scheme to improve the basic schemes.

5.1 Basic Scheme I (SI) - Trust Data Local Storage

The main idea of the SI is to keep generated trust opinions locally, i.e., $b_i$ generates $T_i^{j,t}$ and then stores $T_i^{j,t}$ in its own memory. In other words, $b_i$ is not only one of $s_j$'s trust producers but also one of $s_j$'s trust managers.

The SI includes local storage of trust opinion, and trust opinion querying and calculation:

(1) Local storage of trust opinions. At every time interval, each sensor generates trust opinions about its neighbor nodes, combines them with previous trust opinions according to Eq. (2), and stores the result locally. Note that the generated trust opinions are combined into a single combined trust opinion, resulting in a very low storage cost. For instance, $b_i$ generates $T_i^{j,t_1}$, $T_i^{j,t_2}$ and $T_i^{j,t_3}$ at $t_1$, $t_2$ and $t_3$, respectively, and stores the combined trust opinion in its memory as $T_i^j = T_i^{j,t_0\cup t_1\cup t_2\cup t_3}$.

(2) Trust opinion querying and calculation. Consider the example in Fig. 2. Assume that sensor $s_a$ wants to estimate the trustworthiness $\Upsilon^j$ of another sensor, $s_j$. It broadcasts a trust opinion request, ASK($T^j$), to ask sensors to collect the opinions of other sensors about $s_j$. Here, we assume a suitable broadcast authentication protocol, e.g., multilevel μTESLA [34], for secure and reliable transmission of such broadcast values. If there is no direct relationship between two sensors (e.g., $s_h$ and $s_j$), they have the highest-uncertainty opinion about each other's trustworthiness, i.e., $T_h^j = T_j^h = \{0, 0, 1\}$. Upon receiving ASK($T^j$), each sensor sends a feedback message, ANS($T^j$), to $s_a$ if it has a direct relationship with $s_j$. Otherwise it just drops ASK($T^j$). Next, $s_a$ combines the received sensors' opinions using the consensus operator (Eq. (1)) to compute $s_j$'s trustworthiness $\Upsilon^j$, and stores the result.

Proposition 4. In the Basic Scheme I, the probability that at least one trust manager node remains uncompromised within $t$ time intervals is

$$\Pr[\text{survival}]_t^j = \begin{cases} 1, & k \cdot t < n_j \\ 0, & k \cdot t \geq n_j, \end{cases} \quad (4)$$

where $n_j$ is the number of neighbor nodes and $k$ is the compromising capability of ADV as defined in Section 3.2.

Proof. In SI, each sensor $s_j$ has $n_j$ trust managers and $n_j$ trust producers in its transmission range. It is easy for ADV to find the trust managers in the transmission range of $s_j$. Within each time interval, ADV compromises $k$ sensors (trust managers) in $s_j$'s transmission range. By the end of the $t$-th time interval, $k \cdot t$ sensors (trust managers) are compromised. Therefore, $k \cdot t \geq n_j$ implies that all the trust managers are compromised, i.e., $\Pr[\text{survival}]_t^j = 0$; otherwise $\Pr[\text{survival}]_t^j = 1$.

5.1.1 Communication Cost

Queries ASK($T^j$) are broadcast to all nodes at the cost of $O(N)$. Responses ANS($T^j$) are sent back to the trust consumer at the cost of $O(\sqrt{N})$ each. For $t$ time intervals, the cost becomes $\mathcal{C}^j = O(tN) + O(t n_j \sqrt{N}) = O(t(N + n_j \sqrt{N}))$, where $N$ is the number of sensors in the network.

5.1.2 Storage Cost

For a sensor $s_j$, each neighbor in its transmission range generates one trust opinion per time interval. As the generated trust opinions are combined over time, the storage cost for each neighbor is very low (i.e., $O(1)$). There are $n_j$ neighbors that need to store the trust opinions regarding $s_j$, at the cost of $O(n_j)$. That is, $\mathcal{S}^j = O(n_j)$.

Discussion. According to Proposition 4, ADV_Del can compromise all the trust managers in a short time ($t \geq \frac{n_j}{k}$). After that, both pre-compromised and post-compromised trust data can be deleted by ADV_Del. Therefore, we need to hide the trust managers from ADV_Del. Next we propose Basic Scheme II, which supports distributed trust data storage.

5.2 Basic Scheme II (SII) - Distributed Trust Data Storage

In order to address the shortcomings of the SI, we should ensure that: (1) a sensor $s_j$'s trust producer and trust manager are not the same node; (2) ADV cannot easily find trust manager nodes; and (3) the scheme is resilient against node failures.

A straightforward solution would be to specify for each node a designated trust manager node that stores its trust data. The trust manager should not be one of the node’s direct neighbors. The components of the SII scheme are defined as follows:

(1) System initialization. To provide trust data redundancy, at the beginning each sensor $s_j$ is associated with $\alpha$ randomly selected trust managers $\{TM_j^r\}_{r=1}^{\alpha}$. The IDs of those trust managers are stored in the trust producers before deployment, since the trust producers need to send the generated trust opinions to the trust managers $\{TM_j^r\}_{r=1}^{\alpha}$. In addition, trust consumers store $\{s_j, \{TM_j^r\}_{r=1}^{\alpha}\}$ in their local memory so that they are able to retrieve $s_j$'s trust data from $\{TM_j^r\}_{r=1}^{\alpha}$.

(2) Trust opinion distributed storage. After generating trust opinions about $s_j$, the trust producers of $s_j$ send them to $\{TM_j^r\}_{r=1}^{\alpha}$. Note that, in every time interval, $TM_j^r$ receives $n_j$ trust opinions $\{T_i^{j,t}\}_{i=1}^{n_j}$ from $b_i \in B(s_j)$ ($i \in [1, n_j]$). After receiving $\{T_i^{j,t}\}_{i=1}^{n_j}$, $TM_j^r$ first removes outlier trust opinions as noise (we will further discuss this in Section 7). Then it combines the trustworthiness of previous time intervals with the received trust opinions according to Def. 3 and Def. 2 as follows: $\Upsilon_r^j = \Upsilon_r^{j,1\cup\cdots\cup t} = \Upsilon_r^{j,1\cup\cdots\cup t-1} \cup \Upsilon_r^{j,t} = \Upsilon_r^{j,1\cup\cdots\cup t-1} \cup (T_1^{j,t} \oplus \cdots \oplus T_{n_j}^{j,t})$, where $\Upsilon_r^{j,t}$ is the trustworthiness of $s_j$ stored in $TM_j^r$ during time interval $t$.

(3) Trustworthiness query and calculation. Trust consumers send ASK($T^j$) to $\{TM_j^r\}_{r=1}^{\alpha}$ to retrieve the trustworthiness values $\{\Upsilon_r^j\}_{r=1}^{\alpha}$ from $s_j$'s trust manager nodes. Upon receiving the $\alpha$ trustworthiness values, trust consumers remove outliers and compute the expected value of the remaining $\Upsilon^j$ as $s_j$'s trustworthiness.

Proposition 5. In the Basic Scheme II, the probability that at least one trust manager node is not compromised within $t$ time intervals is

$$\Pr[\text{survival}]_t = 1 - \left(1 - \prod_{t=1}^{t}\left(1 - \frac{k}{N - (t-1)k}\right)\right)^{\alpha}. \quad (5)$$

Proof. Let $E_t^r = 1$ denote the event that the $r$-th trust manager is compromised by ADV_Del by time interval $t$, and $E_t^r = 0$ denote the event that the $r$-th trust manager survives within the time interval $t$. The probability that no trust manager node survives up to $t$ is

$$\Pr[E_t^1 = 1 \cap \cdots \cap E_t^r = 1 \cap \cdots \cap E_t^{\alpha} = 1] = \Pr[E_t^1 = 1] \cdots \Pr[E_t^r = 1] \cdots \Pr[E_t^{\alpha} = 1] = \Pr[E_t^r = 1]^{\alpha}.$$

Thus the probability that at least one trust manager node survives up to $t$ is $\Pr[\text{survival}]_t = 1 - \Pr[E_t^r = 1]^{\alpha}$.

The probability that the $r$-th trust manager survives up to $t$ is

$$\Pr[E_t^r = 0] = \left(1 - \frac{k}{N}\right)\left(1 - \frac{k}{N-k}\right)\left(1 - \frac{k}{N-2k}\right)\cdots\left(1 - \frac{k}{N-(t-1)k}\right) = \prod_{t=1}^{t}\left(1 - \frac{k}{N-(t-1)k}\right).$$

Thus, we have

$$\Pr[\text{survival}]_t = 1 - \Pr[E_t^r = 1]^{\alpha} = 1 - (1 - \Pr[E_t^r = 0])^{\alpha} = 1 - \left(1 - \prod_{t=1}^{t}\left(1 - \frac{k}{N-(t-1)k}\right)\right)^{\alpha}.$$
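As a quick numerical check of Eq. (4) and Eq. (5), the short Python sketch below (an illustration, not the authors' MATLAB code) evaluates both survival probabilities; N, k and t follow the default setting quoted for Fig. 4, while the neighborhood size n_j = 20 is an assumed value.

```python
import math

def p_survival_si(n_j: int, k: int, t: int) -> float:
    """Eq. (4): in SI all n_j trust managers sit in s_j's neighborhood,
    so the trust data survive only while k*t < n_j."""
    return 1.0 if k * t < n_j else 0.0

def p_survival_sii(N: int, k: int, alpha: int, t: int) -> float:
    """Eq. (5): probability that at least one of the alpha hidden
    trust managers escapes compromise for t intervals."""
    p_one_survives = math.prod(1 - k / (N - (i - 1) * k) for i in range(1, t + 1))
    return 1 - (1 - p_one_survives) ** alpha

N, k, alpha, n_j = 10000, 5, 3, 20   # Fig. 4 default setting; n_j is assumed
for t in (3, 10, 50, 150):
    print(t, p_survival_si(n_j, k, t), round(p_survival_sii(N, k, alpha, t), 4))
```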

5.2.1 Communication Cost

Once a trust opinion is generated, it is sent to and stored at the corresponding trust managers. The communication cost to store one trust opinion is $O(\sqrt{N})$. Since $n_j$ trust producers need to send trust opinions to $\alpha$ trust managers in each round, the total cost is $O(t n_j \alpha \sqrt{N})$. Queries are sent to the designated node (trust manager), which also returns a response, causing a communication cost of $O(\sqrt{N})$. That is, $\mathcal{C}^j = O(t n_j \alpha \sqrt{N}) + O(2 t \alpha \sqrt{N}) = O(t n_j \alpha \sqrt{N})$.

5.2.2 Storage Cost

For a sensor $s_j$, all its trust producers (its neighbors) have to know that a designated sensor $TM_j$ is a corresponding trust manager node, i.e., all sensors $b_i \in B(s_j)$ ($i \in [1, n_j]$) have to store the IDs of $TM_j$, at the cost of $O(\alpha n_j)$. In addition, any node in the network could be $s_j$'s trust consumer; consequently, every node has to know which node is $s_j$'s trust manager. That is, every node in the network has to store the ID of $s_j$ and the ID of its corresponding trust manager $TM_j$, which causes $O(N)$ of storage cost. Thus, $\mathcal{S}^j = O(\alpha(N + n_j))$.

Discussion. When attacked by ADV_Del, the trust data stored in compromised sensors are deleted rather than polluted, so a surviving trust manager is still able to report correct trust data. In order to compare SII with SI, we obtain numerical results of Eq. (4) and Eq. (5) using MATLAB, as illustrated in Fig. 4. We observe that SII is more robust than SI for all values of $N$, $k$, $t$ and $\alpha$ under the ADV_Del attack. Increasing the number of trust managers improves $\Pr[\text{survival}]_t$ as well as trust data redundancy. However, as discussed above, the storage cost of SII is proportional to $\alpha(N + n_j)$, causing huge storage costs especially in large-scale UWSNs (see Fig. 8(a) and (b)). This limitation motivates us to design a more scalable scheme that can reduce the storage cost caused by distributed storage while keeping the same $\Pr[\text{survival}]_t$ as in SII.

5.3 Advanced Scheme (AS)

Our Advanced Scheme (AS) utilizes the hash-table-like interface of GHT [10], where nodes can put and get data based on their data type, i.e., Put(DataType, DataValue) and Get(DataType). Since a sensor ID is unique in the network, trust producers are able to put trust opinions to trust managers based on the ID, i.e., Put($s_j$, $T_i^{j,t}$). Trust consumers are able to get trustworthiness from trust managers using the same sensor ID, i.e., Get($s_j$). In other words, trust opinions are pushed to, and stored at, the same trust manager node, and trust consumers can pull trustworthiness from the trust manager nodes consistently. Neither trust producers nor trust consumers need to store the IDs of trust manager nodes, reducing storage cost significantly. Furthermore, the scheme should not be sensitive to node failures; that is, the scheme should be resilient to ADV_Del. Thus, trust opinions are pushed to $\alpha$ ($\alpha > 1$) trust manager nodes, whereas trust consumers pull trustworthiness from $\alpha$ trust managers. To do so, we modify the original basic operations of GHT from

Put(DataType, DataValue), Get(DataType)

to

Put($s_j$, $T_i^{j,t}$, $r$), Get($s_j$, $r$), $\forall r \in [1, \alpha]$.

Put($s_j$, $T_i^{j,t}$, $r$) is the function with which trust producer $b_i$ is able to put its trust opinion $T_i^{j,t}$ regarding $s_j$ to the $r$-th trust manager node of $s_j$, where $\alpha$ is the number of trust manager nodes specified by the mobile sink when the network is deployed. Using the Get($s_j$, $r$) function, trust consumers are able to get $s_j$'s trustworthiness $\Upsilon^j$ from the $r$-th trust manager node of $s_j$. That is, each node has $\alpha$ trust manager nodes that store the trust opinions generated by its neighbors, for data redundancy. Trust opinions regarding a sensor $s_j$ are hashed by the sensor ID $s_j$ to a geographical location. The node closest to the hashed geographical location is referred to as the trust manager node where data is sent to and retrieved from. Fig. 5 shows an example of a sensor ID $s_j$ hashed to $\alpha = 3$ random geographical locations in the sensor network by using a secure hash function $\mathcal{L}_j^r = h^r(s_j) = h(h^{r-1}(s_j))$³ ($\forall r \in \{1, \ldots, \alpha\}$); trust producers (e.g., $b_i$ and $b_m$) and trust consumers (e.g., $s_a$) can send trust opinions and trust query requests to $\mathcal{L}_j^r$ using Greedy Perimeter Stateless Routing (GPSR) [35]. The closest node to the location $\mathcal{L}_j^r$, namely the trust manager (see $\{TM_j^r\}_{r=1}^{3}$ in Fig. 5), can receive the trust opinions and trust query requests.

3. Note that $\mathcal{L}_j^r$ is not the location of $s_j$ but the location closest to $TM_j^r$.


Fig. 4. Comparison of SI, SII and AS using Eq. (4) and Eq. (5) with the default setting: $N = 10000$, $k = 5$ and $t = 150$. (a) $\Pr[\text{survival}]_t$ vs. $N$. (b) $\Pr[\text{survival}]_t$ vs. $k$. (c) $\Pr[\text{survival}]_t$ vs. $t$.

The AS includes the following phases:

(1) System initialization. Each node is preloaded with a secure hash function, denoted as $h(\cdot)$, and the redundancy factor $\alpha$ specified by the mobile sink depending on the application scenario. All nodes know their own locations and the locations of the nodes that are one hop away.

(2) Trust opinion storage based on GHT. During time interval $t$, after $T_i^{j,t}$ is generated, $b_i$ uses the function Put($s_j$, $T_i^{j,t}$, $r$) to put $T_i^{j,t}$ to the $\alpha$ trust managers. In other words, $b_i$ performs $h^r(s_j)$ to obtain $\mathcal{L}_j^1, \ldots, \mathcal{L}_j^{\alpha}$, and then sends $T_i^{j,t}$ to locations $\mathcal{L}_j^1, \ldots, \mathcal{L}_j^{\alpha}$ using GPSR, respectively. The closest node to location $\mathcal{L}_j^r$, denoted as $TM_j^r$, finally receives the trust opinion $T_i^{j,t}$ and is called the $r$-th trust manager node of $s_j$.

(3) Trust opinion querying and calculation. Suppose a trust consumer node, e.g., $s_h$, wants to know the trustworthiness of $s_j$. It uses the function Get($s_j$, $r$) $\forall r \in [1, \alpha]$ to get the trustworthiness values $\{\Upsilon_r^j\}_{r=1}^{\alpha}$ from $s_j$'s $\alpha$ trust manager nodes. Similar to the put process, $s_h$ performs $h^r(s_j)$ to obtain $\mathcal{L}_j^1, \ldots, \mathcal{L}_j^{\alpha}$, and sends ASK($T^j$) to locations $\mathcal{L}_j^1, \ldots, \mathcal{L}_j^{\alpha}$ using GPSR. The closest nodes to $\mathcal{L}_j^1, \ldots, \mathcal{L}_j^{\alpha}$, i.e., the trust manager nodes $\{TM_j^r\}_{r=1}^{\alpha}$, finally receive ASK($T^j$) and then send $\{\Upsilon_r^j\}_{r=1}^{\alpha}$ to $s_h$.
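The put/get mapping of the AS can be sketched as follows in Python (an illustrative model with hypothetical helper names, not the paper's implementation): the sensor ID is hashed repeatedly to α geographic locations, and the node geographically closest to each location plays the role of the r-th trust manager; routing a message to that node would be done with GPSR.

```python
import hashlib

def hashed_locations(sensor_id: str, alpha: int, width: float, height: float):
    """L_j^r = h^r(s_j): derive alpha geographic locations from the sensor ID."""
    locations, digest = [], sensor_id.encode()
    for _ in range(alpha):
        digest = hashlib.sha256(digest).digest()          # h(h^{r-1}(s_j))
        x = int.from_bytes(digest[:8], "big") / 2**64 * width
        y = int.from_bytes(digest[8:16], "big") / 2**64 * height
        locations.append((x, y))
    return locations

def closest_node(location, positions):
    """The node closest to the hashed location acts as trust manager TM_j^r."""
    lx, ly = location
    return min(positions,
               key=lambda n: (positions[n][0] - lx) ** 2 + (positions[n][1] - ly) ** 2)

# Toy deployment in a 3000 x 3000 area (positions are made up for illustration).
positions = {"s17": (120.0, 2410.0), "s58": (1540.0, 1490.0), "s93": (2860.0, 310.0)}
for r, loc in enumerate(hashed_locations("s42", alpha=3, width=3000, height=3000), start=1):
    print(f"TM^{r} of s42 is", closest_node(loc, positions))
```

Because every node derives the same locations from the same sensor ID, producers and consumers agree on the trust managers without storing any manager IDs, which is the source of the AS storage savings.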

Fig. 5. Simple example of GHT techniques on UWSNs with $\alpha = 3$.

Proposition 6. The Basic Scheme II and the Advanced Scheme have the same $\Pr[\text{survival}]_t$. That is,

$$\Pr[\text{survival}]_t = 1 - \left(1 - \prod_{t=1}^{t}\left(1 - \frac{k}{N - (t-1)k}\right)\right)^{\alpha}.$$

Proof. The same as for Proposition 5. The numerical results are shown in Fig. 4.

6 EFFICIENCY AND ROBUSTNESS EVALUATION

In this section we conduct a set of simulations in MATLAB to show that AS has the strongest performance among the three schemes in terms of both efficiency and robustness. We consider an UWSN where 10000 nodes are randomly distributed in a 3000×3000 units area. The other parameters are set as follows. Each sensor has transmission range $\phi = 150$ units. ADV_Del has compromising capability $k = 25$. The number of trust manager nodes is $\alpha = 3$. The simulation results are averaged over 20 randomly deployed networks and are explained below.

Fig. 6(a)–(c) show the performance in terms of $t$, i.e., how many intervals the network can survive, given different $\alpha$, $k$ and $\phi$. They demonstrate that SII and AS perform better than SI with respect to $t$ for all values of $\alpha$, $k$ and $\phi$. We observe in Fig. 6(a) that increasing $\alpha$ improves the performance in terms of $t$, while increasing $k$ decreases it. Fig. 6(c) shows that $\phi$ has no impact on SII and AS in terms of $t$ but slightly increases the performance of SI.

Fig. 7(a)–(c) display the performance in terms of communication cost $\mathcal{C}$ for different $\alpha$, $k$ and $\phi$. Distributed trust data storage is resilient to ADV_Del and provides higher $t$. However, it causes higher communication costs. As shown in Fig. 7(b) and (c), the communication cost is acceptable if $\alpha \leq 3$ and $\phi \leq 120$.

Fig. 7. Simulation results: $t/\alpha/\phi$ vs. $\mathcal{C}$.

Fig. 8. Simulation results: $\alpha/\phi$ vs. $\mathcal{S}$.

As our simulation results in Fig. 8(a) and (b) indicate, SII has higher storage costs than SI and AS for all values of $\alpha$ and $\phi$. Meanwhile, $\phi$ has no impact on SII in terms of $\mathcal{S}$ but slightly increases $\mathcal{S}$ for SI and AS. $\alpha$ does not have any influence on SI and AS with respect to $\mathcal{S}$ but significantly increases $\mathcal{S}$ for SII.

Discussions.

(1) Robustness. Trust opinions $\{T_i^{j,t}\}_{i=1}^{n_j}$ generated in each time interval are routed to and stored in $\alpha$ trust managers. Therefore, the trustworthiness can be retrieved even if up to $\alpha - 1$ trust managers are lost. That is, SII and AS are resilient to ADV_Del, as shown in Fig. 6(a)–(c). In addition, those figures also show that SII and AS have the same probability of at least one trust manager surviving, which is consistent with the analysis of Proposition 6.

(2) Efficiency. SII and AS have the same communication cost. In addition, AS has a much lower storage cost compared with SII. Unlike in SII, trust producers do not need to store the IDs of the trust managers of a sensor $s_j$, due to the nature of the put and get functions of GHT. For $\alpha$ trust managers, the total storage cost is $O(\alpha(N + n_j))$ for SII and $O(\alpha)$ for AS. Thus, AS is more efficient than SII.

(3) Scalability. As discussed above, the storage cost of AS does not depend on the number of sensors $N$: increasing $N$ does not increase storage costs in AS. Moreover, Section 5.2.1 demonstrates that the communication cost of AS is proportional to $\sqrt{N}$. For example, when the number of sensors $N$ increases 10 times, from 1000 to 10000, we observe only a 3 times increase in communication costs.

(4) Consistency. For a given sensor $s_j$, all generated trust opinions $\{T_i^{j,t}\}_{i=1}^{n_j}$ are routed to $\{\mathcal{L}_j^r\}_{r=1}^{\alpha}$, respectively. The nodes closest to $\{\mathcal{L}_j^r\}_{r=1}^{\alpha}$ receive $\{T_i^{j,t}\}_{i=1}^{n_j}$ and store them in their local memory, where $\mathcal{L}_j^r = h^r(s_j)$. Since the sensor ID is unique in the network for a given sensor $s_j$, the generated hash values $\{\mathcal{L}_j^r\}_{r=1}^{\alpha}$ are also unique in the network due to the one-way property of the hash function. As a consequence, all the trust producers and trust consumers are able to find the correct trust managers to store and query trust data.

7 TRUSTWORTHINESS GENERATION

Through the simulations and discussions in the previous section, we have demonstrated that AS significantly reduces the storage cost caused by distributed data storage and provides resilience to ADV_Del. In this section, we continue to investigate the performance of the proposed schemes against the trustworthiness pollution attacks (i.e., ADV_Noise, ADV_Homo and ADV_Hbd) defined in Section 3.2.

Initially, for each sensor $s_j$, the trust opinion $T_i^{j,0}$ could be set by a mobile sink based on such information as physical protection, location, or role of the nodes. For example, a node $s_j$ buried under the ground has higher $\{T_i^{j,0}\}_{i=1}^{n_j}$ than the exposed ones. After the trust opinions are generated, a trust producer $b_i$ is able to update $T_i^j$ using the conjunction operation (Eq. (2)), and trust consumers can calculate the trustworthiness $\Upsilon^j$ using the consensus operation (Eq. (1)).

In the following, we conduct a simulation-based study of the trust resilience of the proposed schemes against pollution attacks. We assume that uncompromised sensor nodes generate correct trust opinions $cT^j = \{cB^j, cD^j, cU^j\}$, while compromised sensors generate false trust opinions $fT^j = \{fB^j, fD^j, fU^j\}$, where $cB$ and $fB$ denote correct belief and false belief. Similarly, $cD$, $cU$, $fD$, and $fU$ denote correct disbelief, correct uncertainty, false disbelief, and false uncertainty, respectively. We adopt a normal distribution to generate trust opinions, i.e., $B \sim N(E(B), \sigma^2)$, where $E(B)$ is the expected value of $B$ and $\sigma$ is the standard deviation of $B$. In order to compare the impact of false trust opinions $fT$, in the simulations the $fT$ values are generated after the 20th time interval, so that we can observe whether there are any differences in the trustworthiness of a node before and after the 20th time interval.

We will use $d(B_i^j, E(B_m^j)) = \sqrt{(B_i^j - E(B_m^j))^2}$ to denote the Euclidean distance between $B_i^j$ and its expected value $E(B_m^j)$, where $j \in \{1, \ldots, N\}$ and $i, m \in \{1, \ldots, n_j\}$.
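The opinion generation used in these simulations can be mimicked with a few lines of Python (a sketch of the stated setup; clamping out-of-range draws to valid values is our assumption about how such samples are handled):

```python
import random

def draw_opinion(expected, sigma):
    """Draw an opinion around the expected (E(B), E(D), .) triplet: B and D are
    sampled from normal distributions, and U takes the remainder so B+D+U = 1."""
    b = min(max(random.gauss(expected[0], sigma), 0.0), 1.0)
    d = min(max(random.gauss(expected[1], sigma), 0.0), 1.0 - b)
    return (b, d, 1.0 - b - d)

cT, fT = (0.3, 0.3, 0.4), (0.3, 0.3, 0.4)              # Fig. 9 setting for ADV_Noise
correct = [draw_opinion(cT, 0.01) for _ in range(8)]   # sigma_c = 0.01
noisy   = [draw_opinion(fT, 0.10) for _ in range(2)]   # sigma_f = 0.1
print(correct[0], noisy[0])
```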

7.1 Trust Consensus only Approach (TC-ONLY)

In this subsection, we consider the TC-ONLY approach, which we used in Section 5, as our baseline. This approach generates trustworthiness based on trust consensus only, as in Eq. (2) and Eq. (3). Through simulation results, we first demonstrate that it is resilient to ADV_Noise, and then reveal its shortcomings with respect to ADV_Homo and ADV_Hbd.

7.1.1 Trust Resilience against ADV_Noise

To evaluate the performance of the proposed scheme with respect to the environmental effect (ADV_Noise), i.e., $d(a_i^j, E(a_m^j)) \approx 0$ ($a \in \{B, D, U\}$), we set the correct trust opinions as $cT = \{0.3, 0.3, 0.4\}$⁴ and $\sigma_c = 0.01$, where $cB, cD \sim N(0.3, 0.001)$ and $cU = 1 - cB - cD$. In order to model the environmental effect, we set a certain percentage PrC of sensors to generate trust opinions with a larger $\sigma_f = 0.1$.

Fig. 9. Consensus is enough, $cT = \{0.3, 0.3, 0.4\}$, $\sigma_c = 0.01$, $fT = \{0.3, 0.3, 0.4\}$, $\sigma_f = 0.1$.

Fig. 10. Example of when ADV tries to increase $\Upsilon^j$, $cT = \{0.1, 0.3, 0.6\}$, $fT = \{0.4, 0.1, 0.5\}$, $\sigma_c = \sigma_f = 0.01$.

Fig. 9 shows the simulation results of SI, SII and AS when different values of PrC (from 10% to 40%) are specified. These figures display the three elements $B$, $D$ and $U$ in white, green (light shadow), and red (black), respectively. The first row of Fig. 9 shows the simulation results of SI. As one can observe, after the 20th time interval the obtained trustworthiness $\Upsilon$ starts to become unstable. In addition, increasing the percentage PrC of anomalous sensors makes $\Upsilon$ more unstable. The second row of Fig. 9 shows the results for the SII and AS schemes. It is interesting to emphasize that $\Upsilon$ is very smooth for all values of PrC. The anomalous trust opinions have almost no influence on $\Upsilon$. We observe a slight increase in $\Upsilon$ when PrC = 40% in Fig. 9(h). This is because the consensus operation reduces trust uncertainty. Comparing the first row of Fig. 9 with the second row, it is easy to see that SII and AS are more resilient against ADV_Noise than SI is. Therefore, trust consensus improves resilience against ADV_Noise, i.e., $d(a_i^j, E(a_m^j)) \approx 0$ ($a \in \{B, D, U\}$).

4. Our simulations are conducted using random trust values and distribution parameters. To exhibit the impact of ADV's attacks as clearly as possible, we select suitable values (e.g., $cT = \{0.3, 0.3, 0.4\}$, $\sigma_c = 0.01$, etc.) to plot the simulation results (figures). The same simulation parameter configuration applies to the rest of the paper.

7.1.2 Trust Resilience against ADV_Homo

Since ADV_Homo tries to either increase $\Upsilon$ or decrease $\Upsilon$ monotonically, we conduct two simulations. In the first simulation, ADV_Homo is assumed to generate false trust opinions $fT$ to increase $\Upsilon$. In contrast, ADV_Homo is set to decrease $\Upsilon$ in the second simulation.

Simulation one. In order to increase $\Upsilon$, ADV_Homo increases $B$ and decreases $D$ simultaneously. That is, it generates $fT$ that satisfies $E(cB) < E(fB)$ and $E(cD) > E(fD)$. We select a special case with $cT = \{0.1, 0.3, 0.6\}$, $fT = \{0.4, 0.1, 0.5\}$ and $\sigma_c = \sigma_f = 0.01$. The simulation results are shown in Fig. 10. As in the other figures, the results for SI are plotted in the first row, while the SII and AS results are shown in the second row. It is interesting to see that the results of SI experience sharp steps and jitters after the 20th time interval. Those sharp steps and jitters indicate that SI is not resilient to ADV_Homo attacks. In contrast, the results of SII and AS are smoother compared with those of SI. The smoother results mean that trust consensus does effectively mitigate the effect of ADV_Homo. In addition, when PrC increases, $\Upsilon$ starts to increase. The reason is that more sensors generate false trust opinions, increasing the impact of the false trust opinion $fT$ on the trustworthiness $\Upsilon$.

Simulation two. To decrease $\Upsilon$ as much as possible, ADV_Homo decreases $B$ and increases $D$ simultaneously. That is, $E(cB) > E(fB)$ and $E(cD) < E(fD)$. We set $cT = \{0.4, 0.1, 0.5\}$, $fT = \{0.1, 0.3, 0.6\}$, $\sigma_c = \sigma_f = 0.01$ in the simulation. We observe in Fig. 11 that trust consensus does not mitigate ADV_Homo. After the 20th time interval, $\Upsilon$ starts to decrease sharply. Similar to simulation one, increasing PrC has a heavier influence on $\Upsilon$, and SII and AS perform better than SI does.

Fig. 11. Example of when ADV tries to decrease $\Upsilon^j$, $cT = \{0.4, 0.1, 0.5\}$, $fT = \{0.1, 0.3, 0.6\}$, $\sigma_c = \sigma_f = 0.01$.

Through the simulation results shown above, we conclude that TC-ONLY is not resilient against the ADV_Homo attack.

7.1.3 Trust Resilience against ADV_Hbd

Recall that ADV_Hbd aims to manipulate the trustworthiness $\Upsilon$; it is able to increase or decrease $\Upsilon$ in any way. However, as shown in Figs. 10 and 11, trust consensus does not perform well against either the increasing-$\Upsilon$ or the decreasing-$\Upsilon$ attacks.

Discussion. From the simulation results shown above, it is easy to conclude that TC-ONLY is not enough for trustworthiness calculation. It can only sustain ADV_Noise caused by environmental effects. The reason is that using trust consensus for trust calculation decreases the uncertainty $U$ and makes $\Upsilon$ stable. However, it does not perform well against ADV_Homo and ADV_Hbd, because both correct trust opinions $cT$ and false trust opinions $fT$ are taken into account as input in the trustworthiness calculation, resulting in a polluted $\Upsilon$. One way to solve this problem is to reduce the effect of $fT$ as much as possible. Thus we propose the next scheme, which is capable of removing false trust opinions.

7.2 Trust Consensus with One Parameter Similarity Threshold Function (ONE-PARA)

As compromised trust producers may send false trust opinions to trust managers to pollute trustworthiness, we use ONE-PARA to remove outliers. A one-parameter similarity threshold function is defined as

$$ST(T_i^j) = \frac{(B_i^j - E(B_m^j))^2}{B_i^j E(B_m^j)}, \quad (6)$$

where $B_i^j$ and $B_m^j$ are the belief values regarding $s_j$ generated by its neighbors $s_i$ and $s_m$, respectively. As a consequence, any $T_i^j$ is considered an outlier if $ST(T_i^j) > \epsilon$, where $\epsilon$ is a similarity threshold factor (e.g., $\epsilon = 0.1$). The similarity threshold function is expected to neutralize false trust opinions as much as possible. It is also desirable to reduce false positives (when trust opinions are considered false trust opinions even though they are correct) as well as false negatives (when trust opinions are considered correct trust opinions even though they are false). Therefore, we aim to increase true positives as much as possible while keeping very few false positives and false negatives. However, since the threshold function is based on how far $B_i^j$ is from its expected value $E(B_m^j)$, and $E(B_m^j)$ is the average value of both $cB$ and $fB$, the selection of $\epsilon$ may be problematic.
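A minimal sketch of the ONE-PARA test in Eq. (6) is given below (illustrative Python; using the mean of the received beliefs as the estimate of E(B), and assuming strictly positive beliefs so the denominator is nonzero, are our simplifications for this example):

```python
def one_para_flags(beliefs, eps=0.1):
    """Return True for opinions whose belief fails the Eq. (6) similarity test."""
    expected_b = sum(beliefs) / len(beliefs)            # E(B) over received opinions
    flags = []
    for b in beliefs:
        st = (b - expected_b) ** 2 / (b * expected_b)   # ST(T) of Eq. (6)
        flags.append(st > eps)                          # True => treated as an outlier
    return flags

# Four correct producers report B near 0.3; one compromised producer reports 0.6.
beliefs = [0.30, 0.31, 0.29, 0.30, 0.60]
print(one_para_flags(beliefs, eps=0.1))   # only the last opinion is flagged
```

The example also hints at the sensitivity discussed next: because the compromised report pulls the mean upward, a much smaller ε would start flagging correct opinions as well.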

As shown in Fig. 12, decreasing $\epsilon$ increases true positives. However, it also increases false positives. Fig. 12(a) shows that, if a suitable similarity threshold factor is specified, a major part of $fB$ and a small part of $cB$ are considered to be outliers, while a small part of the false trust opinions are considered to be correct trust opinions (false negatives). When $\epsilon$ is too small, as shown in Fig. 12(b), all false trust opinions are considered to be outliers; however, more than half of the correct trust opinions are also considered to be outliers. In contrast, when $\epsilon$ is too large, more than half of the false trust opinions are considered to be correct trust opinions. In addition, the larger the number of false trust opinions considered to be correct, the closer $E(B_m^j)$ is to $E(fB)$.

Fig. 12. Example of similarity threshold factor selection in terms of false positive (FP), true positive (TP) and false negative (FN). (a) Suitable similarity threshold factor. (b) Similarity threshold factor $\epsilon$ is too small. (c) Similarity threshold factor $\epsilon$ is too large.

Fig. 13. ROC curve with $cT = \{0.1, 0.3, 0.6\}$, $fT = \{0.4, 0.1, 0.5\}$, $\sigma_c = \sigma_f = 0.01$, where the similarity threshold factor $\epsilon$ is specified as 0.2, 0.4, 0.6 and 0.8, respectively.

Fig. 13 shows the Receiver Operating Characteristic (ROC) curve with $cT = \{0.1, 0.3, 0.6\}$, $fT = \{0.4, 0.1, 0.5\}$, $\sigma_c = \sigma_f = 0.01$, where the similarity threshold factor $\epsilon$ is specified as 0.2, 0.4, 0.6 and 0.8, respectively. We observe that true positives increase as false positives increase, and that the ROC curves are the same when $\epsilon = 0.6$ and 0.8. That is the reason why Fig. 14(d), (i), (e), and (j) indicate the same level of performance.

7.2.1 Trust Resilience against ADV_Noise

Since ONE-PARA is an improved version of TC-ONLY, it works well against ADV_Noise and can further filter outliers.

7.2.2 Trust Resilience against ADV_Homo

As demonstrated above, TC-ONLY cannot sustain ADV_Homo. We conduct two more sets of simulations and compare it with ONE-PARA. We use the parameters of the last two simulations of TC-ONLY with a fixed percentage of compromised sensors, PrC = 20%.⁵

5. Please note that the percentage of compromised sensors can be configured randomly in the range [0, 50%). Due to the page limit, we only illustrate the results with 20% compromised nodes.

The results are shown in Figs. 14 and 15. The first rows and the second rows are the simulation results of SI and of SII and AS, respectively. We observe that, due to trust consensus, the generated $\Upsilon$ in SII and AS is more stable than $\Upsilon$ in SI. The first column of Figs. 14 and 15 shows the results when the threshold function is not applied (TC-ONLY). The rest of the columns of those figures are the results when the threshold function (ONE-PARA) is applied. Here the similarity threshold factor $\epsilon$ varies from 0.2 to 0.8. We notice that TC-ONLY (the first column of Figs. 14 and 15) does not sustain ADV_Homo and ADV_Hbd, as expected. Consider now the simulation results of SII and AS (columns 2, 3, 4, 5 in Figs. 14 and 15):

• Proper $\epsilon$. We find that ONE-PARA works well (see Fig. 14(i) and (j) as well as Fig. 15(h) and (i)) when a suitable $\epsilon$ is selected.

• $\epsilon$ is too small. It is interesting to observe that uncertainty $U$ becomes higher while belief $B$ and disbelief $D$ decrease in the second column of Figs. 14 and 15. This is because a very small similarity threshold factor treats both false and correct trust opinions as outliers. When $\epsilon$ is too small, as illustrated in Fig. 12(b), more than half of the correct trust opinions are considered outliers, causing the bizarre behavior in the second column of Fig. 14.

• $\epsilon$ is too large. As shown in Fig. 12(c), false negatives increase with a larger $\epsilon$. That is, more false trust opinions are considered correct trust opinions, resulting in higher values of disbelief $D$ and lower values of belief $B$, as shown in Fig. 15(e) and (j).

Discussion. As shown in Figs. 14 and 15, ONE-PARA works well in some special cases (see Table 1). However, when $d(B_i^j, E(B_m^j)) \approx 0$ and $d(D_i^j, E(D_m^j))$ is high, ONE-PARA cannot identify differences in disbelief $D$, such as between the correct trust opinion $cT = \{0.1, 0.1, 0.8\}$ and the false trust opinion $fT = \{0.1, 0.8, 0.1\}$, or between $cT = \{0.1, 0.8, 0.1\}$ and $fT = \{0.1, 0.1, 0.8\}$. As shown in Fig. 16, when $cB = fB = 0.1$, $cD = 0.1$ and $fD = 0.8$, the differences in disbelief $D$ cannot be detected by ONE-PARA, which may cause severe consequences: ADV's attack pollutes the trustworthiness for all schemes. We observe in Fig. 17 that ADV's attack is mitigated by the consensus when $d(B_i^j, E(B_m^j)) \approx 0$ and $E(cB) > E(fB)$. Furthermore, ONE-PARA can only specify one parameter among the three elements $B$, $D$ and $U$, so it cannot sustain ADV_Hbd. The reason is that the most efficient way for ADV to increase $\Upsilon$ is to increase $B$ and to decrease $D$ simultaneously. A threshold function based on one parameter cannot control these two parameters, i.e., $B$ and $D$. To mitigate the pollution caused by ADV_Hbd for the case when $d(B_i^j, E(B_m^j)) \approx 0$ and $E(cB) < E(fB)$, we propose our next scheme.

7.3 Trust Consensus with Three Parameter Similarity Threshold Function (T-PARA)

Since ONE-PARA cannot recognize the difference between $cT = \{0.1, 0.1, 0.8\}$ and the false trust opinion $fT = \{0.1, 0.8, 0.1\}$, we formulate a three-parameter similarity threshold function as follows:

$$ST(T_i^j) = \frac{(B_i^j - E(B_m^j))^2 + (D_i^j - E(D_m^j))^2 + (U_i^j - E(U_m^j))^2}{B_i^j E(B_m^j) + D_i^j E(D_m^j) + U_i^j E(U_m^j)}. \quad (7)$$

7.3.1 Trust Resilience against ADV_Noise

As an improved version of TC-ONLY, T-PARA exhibits better performance. Here, we do not show the simulation results with respect to ADV_Noise due to the page limit.

7.3.2 Trust Resilience against ADV_Homo and ADV_Hbd

We use the same simulation parameters as in the ONE-PARA experiments. Figs. 18 and 19 show the simulation results.


Fig. 14. Example of when ADV tries to increase $\Upsilon^j$, $cT = \{0.1, 0.3, 0.6\}$, $fT = \{0.4, 0.1, 0.5\}$, $\sigma_c = \sigma_f = 0.01$.

Fig. 15. Example of when ADV tries to decrease $\Upsilon^j$, $cT = \{0.4, 0.1, 0.5\}$, $fT = \{0.1, 0.3, 0.6\}$, $\sigma_c = \sigma_f = 0.01$.

The results based on TC-ONLY are plotted in the first column.

As one can observe, the performance of T-PARA is much better than that of TC-ONLY when a suitable similarity threshold factor $\epsilon$ is specified (see Fig. 18(h)–(j) as well as Fig. 19(h)–(j)). In addition, we observe that T-PARA works well when $\epsilon = 0.4$, 0.6 and 0.8 (see Fig. 18(h)–(j)), while the performance of ONE-PARA is not good (Fig. 16). Moreover, Fig. 18(b) and (g) as well as Fig. 19(b) and (g) show the impact of $\epsilon$ when it is too small. Furthermore, it is worth mentioning that T-PARA performs well (see Fig. 18(h)–(j) as well as Fig. 19(h)–(j)) against both the ADV_Homo increasing-$\Upsilon$ attack and the ADV_Homo decreasing-$\Upsilon$ attack. Therefore, it is resilient to ADV_Hbd.

7.4 Three Parameters with Weighted Factors (T-PARA-WF)

In order to provide a more flexible threshold function to prevent ADV from launching pollution attacks, we further develop an improved version of Eq. (7):

$$ST(T_i^j) = \frac{x^2(B_i^j - E(B_m^j))^2 + y^2(D_i^j - E(D_m^j))^2 + z^2(U_i^j - E(U_m^j))^2}{x B_i^j E(B_m^j) + y D_i^j E(D_m^j) + z U_i^j E(U_m^j)}, \quad (8)$$

where $xB + yD + zU = 1$.

We introduce three weighted factors $x$, $y$ and $z$ into Eq. (7), enabling a T-PARA-WF method that can be adjusted depending on different scenarios. For example, to prevent ADV from increasing trustworthiness, we can define a higher weight for $B$ in Eq. (8), i.e., increase $x$. In contrast, we define a lower value for $y$ to prevent ADV from decreasing trustworthiness. To prevent ADV_Hbd (i.e., ADV generates false trust opinions in any way to manipulate sensor trustworthiness), a larger value of $z$ can be defined to put more weight on the uncertainty $U$.

TABLE 1

Fig. 16. Example where ONE-PARA does not work, when $cT = \{0.1, 0.1, 0.8\}$, $fT = \{0.1, 0.8, 0.1\}$, $\sigma_c = \sigma_f = 0.01$.

Fig. 17. Example where ONE-PARA works even though it cannot identify the difference between $cT = \{0.1, 0.8, 0.1\}$ and $fT = \{0.1, 0.1, 0.8\}$, $\sigma_c = \sigma_f = 0.01$.

Fig. 18. Example where T-PARA works while TC-ONLY and ONE-PARA do not work well, where $cT = \{0.1, 0.1, 0.8\}$, $fT = \{0.1, 0.8, 0.1\}$, $\sigma_c = \sigma_f = 0.01$.

Note that Eq. (6) and Eq. (7) are special cases of Eq. (8) when x = 1, y = z = 0 and x = y = z = 1, respectively. Finally, we have the following observations.

• If ADV_Homo intends to increase $\Upsilon$, ONE-PARA in terms of $B$ is better than T-PARA, since $B$ is the only weighted factor in it. That is, $x = 1$, $y = z = 0$, meaning that $D$ and $U$ are not taken into consideration.

• In contrast, if ADV_Homo wants to decrease $\Upsilon$, ONE-PARA in terms of $D$ is better than T-PARA, since $D$ is the only weighted factor in it.

• T-PARA is resilient to ADV_Noise, ADV_Homo and ADV_Hbd.

• T-PARA-WF is a more flexible way to prevent ADV from launching various attacks. The selection of $x$, $y$ and $z$ is scenario dependent.
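To illustrate how the weighted function in Eq. (8) behaves, the following Python sketch (the weights, the threshold, and the mean-based estimates of E(B), E(D), E(U) are assumptions made for this example, not the authors' settings) scores each received opinion against the interval average and drops those exceeding ε. With x = y = z = 1 it behaves as T-PARA and removes the disbelief-flipped opinion, while with x = 1, y = z = 0 it reduces to ONE-PARA and misses it.

```python
def st_weighted(opinion, expected, weights=(1.0, 1.0, 1.0)):
    """Weighted three-parameter similarity score ST(T) of Eq. (8).
    opinion and expected are (B, D, U) triplets; weights are (x, y, z)."""
    num = sum(w * w * (o - e) ** 2 for w, o, e in zip(weights, opinion, expected))
    den = sum(w * o * e for w, o, e in zip(weights, opinion, expected))
    return num / den

def filter_opinions(opinions, eps=0.4, weights=(1.0, 1.0, 1.0)):
    """Keep only the opinions whose weighted similarity score stays below eps."""
    n = len(opinions)
    expected = tuple(sum(o[c] for o in opinions) / n for c in range(3))  # E(B), E(D), E(U)
    return [o for o in opinions if st_weighted(o, expected, weights) <= eps]

# Correct opinions near {0.1, 0.1, 0.8}; one false opinion {0.1, 0.8, 0.1} (as in Fig. 16).
opinions = [(0.10, 0.11, 0.79), (0.09, 0.10, 0.81), (0.11, 0.09, 0.80), (0.10, 0.80, 0.10)]
print(filter_opinions(opinions, eps=0.4))                      # x = y = z = 1: false opinion removed
print(filter_opinions(opinions, eps=0.4, weights=(1, 0, 0)))   # ONE-PARA case: false opinion kept
```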

The countermeasures against ADV's pollution attack strategies are summarized in Table 2. Here, Fair and Good indicate how resilient the countermeasures are to ADV's pollution attack strategies. Good means that a countermeasure (e.g., ONE-PARA) is more resilient than TC-ONLY (i.e., Fair) when attacked by ADV_Noise.

Fig. 19. Example where TC-ONLY works well while T-PARA slightly compromises the result when $\epsilon$ is selected too small. $cT = \{0.1, 0.8, 0.1\}$, $fT = \{0.1, 0.1, 0.8\}$, $\sigma_c = \sigma_f = 0.01$.

8 CONCLUSION

In this paper, we have proposed a family of efficient and robust trust management schemes for UWSNs based on Subjective Logic. Our advanced trust storage scheme, AS, facilitates distributed trust data storage to ensure high reliability of trust data. It takes advantage of both GHT and GPSR routing to find storage nodes and to route trust data. We have also proposed several methods to mitigate trust pollution attacks based on various trust similarity measures. We demonstrated that our trust management schemes are resilient to major attack categories including ADV_Del, ADV_Noise, ADV_Homo, and ADV_Hbd. Moreover, our simulation results demonstrated that AS has much lower storage costs compared with the less sophisticated approaches. Combining AS with similarity threshold measures, we are able to significantly reduce trust storage costs and perform efficient node invalidation and mitigation of ADV's pollution attacks.

ACKNOWLEDGMENTS

The research leading to these results has received funding from the EU FP7-PEOPLE-IRSES program under grant agreement 247083, project acronym S2EuNet. V. Zadorozhny's research was supported in part by the Research Council of Norway through the L. Eiriksson mobility program, project 209237. Y. Ren's research was supported in part by the Aiming for the Top University and Elite Research Center Development Plan by Taiwan, and this work was partially done while Y. Ren was visiting the School of Information Sciences, University of Pittsburgh. Part of this paper was presented at the IEEE MDM conference, July 2012.

TABLE 2
ADV's Pollution Attack Strategies and Their Countermeasures

REFERENCES

[1] D. Ma, C. Soriente, and G. Tsudik, “New adversary and new threats: Security in unattended sensor networks,” IEEE Netw., vol. 23, no. 2, pp. 43–48, Mar. 2009.

[2] Y. Ren, V. Oleshchuk, F. Y. Li, and S. Sulistyo, “SCARKER: A sensor capture resistance and key refreshing scheme for mobile WSNs,” in Proc. IEEE LCN, Bonn, Germany, 2011.

[3] R. Di Pietro, L. V. Mancini, C. Soriente, A. Spognardi, and G. Tsudik, “Catch me (if you can): Data survival in unattended sensor networks,” in Proc. IEEE PERCOM, Hong Kong, 2008.

[4] R. Di Pietro, G. Oligeri, C. Soriente, and G. Tsudik, “United we stand: Intrusion-resilience in mobile unattended WSNs,” IEEE Trans. Mobile Comput., vol. 12, no. 7, pp. 1456–1468, Jul. 2013.

[5] R. Di Pietro and N. Verde, “Epidemic data survivability in unattended wireless sensor networks,” in Proc. ACM WiSec, Hamburg, Germany, 2011.

[6] K.-S. Hung, K.-S. Lui, and Y.-K. Kwok, “A trust-based geographical routing scheme in sensor networks,” in Proc. IEEE WCNC, Kowloon, Hong Kong, 2007.

[7] A. Rezgui and M. Eltoweissy, “TARP: A trust-aware routing protocol for sensor-actuator networks,” in Proc. IEEE MASS, Pisa, Italy, 2007.

[8] Y. Ren, V. Oleshchuk, and F. Li, “Optimized secure and reliable distributed data storage scheme and performance evaluation in unattended WSNs,” Comput. Commun., vol. 36, no. 9, pp. 1067–1077, May 2013.

[9] N. Lewis and N. Foukia, “Using trust in key distribution in wireless sensor networks,” in Proc. GLOBECOM Workshops, Washington, DC, USA, 2007.

[10] S. Ratnasamy et al., “GHT: A geographic hash table for data-centric storage,” in Proc. ACM WSNA, 2002.

[11] M. Krasniewski, P. Varadharajan, B. Rabeler, S. Bagchi, and Y. Hu, “TIBFIT: Trust index based fault tolerance for arbitrary data faults in sensor networks,” in Proc. DSN, 2005.

[12] S. Ganeriwal, L. Balzano, and M. Srivastava, “Reputation-based framework for high integrity sensor networks,” ACM Trans. Sen. Netw., vol. 4, no. 3, pp. 1–37, May 2008.

[13] K. Yadav and A. Srinivasan, “iTrust: An integrated trust framework for wireless sensor networks,” in Proc. ACM SAC, New York, NY, USA, 2010.

[14] S. Buchegger and J. Le Boudec, “A robust reputation system for mobile ad-hoc networks,” in Proc. P2PEcon, 2004.

[15] Y. L. Sun, W. Yu, Z. Han, and K. Liu, “Information theoretic framework of trust modeling and evaluation for ad hoc networks,” IEEE J. Sel. Areas Commun., vol. 24, no. 2, pp. 305–317, Feb. 2006.
