
INTERNATIONAL JOURNAL OF CRITICAL INFRASTRUCTURE PROTECTION 2 (2009) 73–83

available at www.sciencedirect.com

journal homepage: www.elsevier.com/locate/ijcip

Understanding the physical and economic consequences of attacks on control systems

Yu-Lun Huang(c,*), Alvaro A. Cárdenas(a), Saurabh Amin(b), Zong-Syun Lin(c), Hsin-Yi Tsai(c), Shankar Sastry(a)

(a) Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, California 94720, USA
(b) Department of Civil and Environmental Engineering, University of California, Berkeley, California 94720, USA
(c) Department of Electrical and Control Engineering, National Chiao Tung University, Hsinchu, 30010, Taiwan

A R T I C L E  I N F O

Article history: Received 11 June 2009; Accepted 11 June 2009
Keywords: Control systems; Integrity attacks; Denial-of-service attacks; Consequences

A B S T R A C T

This paper describes an approach for developing threat models for attacks on control systems. These models are useful for analyzing the actions taken by an attacker who gains access to control system assets and for evaluating the effects of the attacker’s actions on the physical process being controlled. The paper proposes models for integrity attacks and denial-of-service (DoS) attacks, and evaluates the physical and economic consequences of the attacks on a chemical reactor system. The analysis reveals two important points. First, a DoS attack does not have a significant effect when the reactor is in the steady state; however, combining the DoS attack with a relatively innocuous integrity attack rapidly causes the reactor to move to an unsafe state. Second, an attack that seeks to increase the operational cost of the chemical reactor involves a radically different strategy than an attack on plant safety (i.e., one that seeks to shut down the reactor or cause an explosion).

© 2009 Elsevier B.V. All rights reserved.

1. Introduction

Control systems are computer-based systems used to monitor and control physical processes. They are usually composed of a set of networked devices such as sensors, actuators, controllers, and communication devices.

Control systems and networks are essential to monitoring and controlling many critical infrastructure assets (e.g., electric power distribution, water treatment, and transportation management) and industrial plants (e.g., those used for manufacturing chemicals, pharmaceuticals, and food products). Most of these infrastructures are safety-critical – an attack can impact public health, the environment, and the economy, and can even lead to the loss of human life.

Control systems are becoming more complex and interdependent and, therefore, more vulnerable. The increased risk

∗ Corresponding author. Tel.: +886 3 5131476. E-mail address: ylhuang@cn.nctu.edu.tw (Y.-L. Huang).

of computer attacks has led to numerous investigations of control system security (see, e.g., [1–11]). Most of the technical solutions involve extensions and improvements to traditional information technology (IT) mechanisms. However, very few solutions consider the interactions between security and the physical processes being controlled. In particular, researchers have not considered how attacks affect the estimation and control algorithms that regulate physical systems, and, ultimately, how the attacks affect the physical environment.

The goal of this paper is to initiate the development of new threat models for control systems. We argue that a threat assessment must include an analysis of how attacks on control systems can affect the physical environment in order to: (i) understand the consequences of attacks, (ii) estimate the possible losses, (iii) estimate the response time required by defenders, and (iv) identify the most cost-effective defenses.

1874-5482/$ – see front matter © 2009 Elsevier B.V. All rights reserved. doi:10.1016/j.ijcip.2009.06.001


Fig. 1 – Control system abstraction.

Fig. 2 – Attacks on control systems.

The paper is organized as follows. The next section, Section 2, focuses on formal models of cyber attacks on control systems. Section 3 describes the experimental setup and analyzes the experimental results. The final section, Section 4, summarizes our conclusions and highlights areas for future research.

2. Modeling Attacks

This section defines the control system abstraction and formally models integrity and denial-of-service (DoS) attacks.

2.1. Notation

A control system is composed of sensors, controllers, actuators, and the physical system (plant). Sensors monitor the physical system and send measurements to a controller. The controller sends control signals to actuators. Upon receiving a control signal, an actuator performs a physical action (e.g., opening a valve). Fig. 1 clarifies the relationships between the physical system, sensor signals (y), the controller, and control signals (u).

The following notation is used to formally model attacks on control systems.

• Time (t): The term t denotes an instant of time. A process runs from t = 0 to t = T.

• Sensor Measurement (y_i(t)): The term y_i(t) denotes the value measured by sensor i at time t. Note that, ∀i, t, y_i(t) ∈ Y_i, where Y_i = [y_i^min, y_i^max]; y_i^min and y_i^max are the reasonable minimum and maximum values representing the plant state, respectively. Also, Y = [y_1, y_2, . . . , y_n]^T, where n is the number of sensors.

• Manipulated Variable (u_i(t)): The term u_i(t) denotes the output of controller i at time t. Note that, ∀i, t, u_i(t) ∈ U_i, where U_i = [u_i^min, u_i^max] is the allowable range of controller output values.

• Attack Duration (T_a): The term T_a denotes the duration of an attack. An attack starts at t = t_s and ends at t = t_e. Note that T_a = [t_s, t_e].
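The notation above can be made concrete with a toy closed loop. This is an illustrative sketch only: the single-state plant dynamics, the proportional gain, and the names Plant, controller, and run are assumptions, not part of the paper's reactor model.

```python
from dataclasses import dataclass

@dataclass
class Plant:
    state: float = 0.0

    def step(self, u: float) -> float:
        # toy first-order dynamics: the state relaxes toward the input u
        self.state += 0.1 * (u - self.state)
        return self.state  # sensor measurement y(t)

def controller(y: float, setpoint: float = 1.0) -> float:
    # proportional control clamped to the allowable range U_i = [u_min, u_max]
    u_min, u_max = 0.0, 100.0
    u = 10.0 * (setpoint - y)
    return min(max(u, u_min), u_max)

def run(T: int = 200) -> float:
    # the process runs from t = 0 to t = T, as in the notation above
    plant, y = Plant(), 0.0
    for _ in range(T):
        u = controller(y)   # control signal u(t)
        y = plant.step(u)   # sensor signal y(t)
    return y
```

Because the controller is purely proportional, the loop settles at y = 10/11 rather than exactly at the setpoint; the steady-state offset is a property of the toy gain, not of the paper's PI scheme.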


Fig. 4 – Plant outputs without noise.

Fig. 2 identifies several attacks on control systems. A1 and A3 correspond to integrity attacks, where the adversary sends false information ŷ ≠ y or û ≠ u from (one or more) sensors or controllers. The false information may be an incorrect measurement, an incorrect time when the measurement was observed, or an incorrect sender identifier. The adversary can launch these attacks by obtaining the secret keys used by the devices or by compromising sensors (A1) or controllers (A3). We assume that each device is uniquely authenticated. Therefore, an attacker who compromises the secret key of a device is able to impersonate only that device.

A2 and A4 correspond to DoS attacks, where the adversary prevents the controller from receiving sensor measurements or prevents actuators from receiving control commands. The adversary can launch a DoS attack by jamming communication channels, compromising devices and preventing them from sending data, attacking routing protocols, or flooding the network.

A5 corresponds to a direct attack against actuators or an external physical attack on the plant. From an algorithmic perspective, it is not possible to defend against such attacks (aside from detecting them). Therefore, significant effort must be devoted to deterring and/or preventing attacks against physical systems (e.g., by implementing physical security controls).

2.2. Modeling integrity attacks

A successful integrity attack on sensor i modifies the real sensor signal, causing the input to the control function u to be changed from y to ŷ. In an integrity attack, the adversary sends a value ŷ or û to the controller or an actuator based on the information available to the adversary.

In an effort to develop a systematic – and tractable – treatment of attack strategies, we propose the investigation of max attacks, min attacks, scaling attacks, and additive attacks. We assume that all these attacks lie within U_i and Y_i. Note that signals outside these ranges are easily detected by fault-tolerant algorithms.

The following attacks can be launched against sensors:

• Min and Max Attacks:

  ŷ_i^min(t) = { y_i(t)    for t ∉ T_a
               { y_i^min   for t ∈ T_a,

  and

  ŷ_i^max(t) = { y_i(t)    for t ∉ T_a
               { y_i^max   for t ∈ T_a.


Fig. 5 – Plant outputs with Gaussian noise.

Fig. 6 – Integrity attack y_7^max from t = 0 to t = 30.


Fig. 7 – Integrity attack y_5^min from t = 0 to t = 30.

Fig. 8 – Integrity attack u_3^min from t = 0 to t = 30.

Fig. 9 – Integrity attack u_1^max from t = 0 to t = 30.

• Scaling Attacks:

  ŷ_i^s(t) = { y_i(t)          for t ∉ T_a
             { α_i(t) y_i(t)   for t ∈ T_a and α_i(t) y_i(t) ∈ Y_i
             { y_i^min         for t ∈ T_a and α_i(t) y_i(t) < y_i^min
             { y_i^max         for t ∈ T_a and α_i(t) y_i(t) > y_i^max.

• Additive Attacks:

  ŷ_i^a(t) = { y_i(t)             for t ∉ T_a
             { y_i(t) + α_i(t)    for t ∈ T_a and y_i(t) + α_i(t) ∈ Y_i
             { y_i^min            for t ∈ T_a and y_i(t) + α_i(t) < y_i^min
             { y_i^max            for t ∈ T_a and y_i(t) + α_i(t) > y_i^max.

Similar attacks can be launched against controllers:

• Min and Max Attacks:

  û_i^min(t) = { u_i(t)    for t ∉ T_a
               { u_i^min   for t ∈ T_a,

  and

  û_i^max(t) = { u_i(t)    for t ∉ T_a
               { u_i^max   for t ∈ T_a.

• Scaling Attacks:

  û_i^s(t) = { u_i(t)          for t ∉ T_a
             { α_i(t) u_i(t)   for t ∈ T_a and α_i(t) u_i(t) ∈ U_i
             { u_i^min         for t ∈ T_a and α_i(t) u_i(t) < u_i^min
             { u_i^max         for t ∈ T_a and α_i(t) u_i(t) > u_i^max.

• Additive Attacks:

  û_i^a(t) = { u_i(t)             for t ∉ T_a
             { u_i(t) + α_i(t)    for t ∈ T_a and u_i(t) + α_i(t) ∈ U_i
             { u_i^min            for t ∈ T_a and u_i(t) + α_i(t) < u_i^min
             { u_i^max            for t ∈ T_a and u_i(t) + α_i(t) > u_i^max.
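The four integrity-attack transformations can be sketched as a single function applied to a signal stream. This is a minimal illustration: it works identically for sensor values y_i and controller outputs u_i, and the clamping mirrors the assumption that attacked values stay inside [v_min, v_max] to evade fault-tolerant range checks. The function and parameter names are illustrative.

```python
def clamp(v, v_min, v_max):
    """Restrict v to the allowable range [v_min, v_max]."""
    return min(max(v, v_min), v_max)

def attack(v, t, in_attack, v_min, v_max, kind, alpha=1.0):
    """Return the compromised value v_hat(t) for one attack type.

    in_attack(t) decides membership in the attack interval T_a;
    alpha plays the role of alpha_i(t) for scaling/additive attacks.
    """
    if not in_attack(t):          # t not in T_a: signal passes unchanged
        return v
    if kind == "min":             # min attack: report the minimum value
        return v_min
    if kind == "max":             # max attack: report the maximum value
        return v_max
    if kind == "scale":           # scaling attack, clamped to the range
        return clamp(alpha * v, v_min, v_max)
    if kind == "add":             # additive attack, clamped to the range
        return clamp(v + alpha, v_min, v_max)
    raise ValueError(kind)

# Example: attacks active during T_a = [10, 30]
in_Ta = lambda t: 10 <= t <= 30
```

Outside T_a every variant returns the true value, so a defender comparing only pre- and post-attack behavior sees nothing unusual.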

2.3. Modeling DoS attacks

In a DoS attack, we assume that a sensor signal does not reach the controller or that a control signal does not reach an actuator. Because the controller or actuator will notice the missing signal, it is necessary to implement functionality that enables the device to respond to this event.

Let û and ŷ denote the response strategies for handling DoS attacks. A conservative response strategy uses the last signal received as the current command. In other words, the controller assumes that the missing sensor measurement is the same as the measurement it last received:

  ŷ_i^past(t) = { y_i(t)     for t ∉ T_a
                { y_i(t_s)   for t ∈ T_a.

A similar assumption can be made for a DoS attack on a control signal. In particular, we assume that an actuator continues operating based on the control signal corresponding to the manipulated variable value that it last received:

  û_i^past(t) = { u_i(t)     for t ∉ T_a
                { u_i(t_s)   for t ∈ T_a.
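The conservative "use the last received value" response can be sketched as a small holder object; when a sample is missing (modeled here as None), the receiver substitutes the most recent value that arrived, i.e., y_i(t_s). The class name and the None convention are illustrative assumptions.

```python
class LastValueHold:
    """Substitute the last received value for samples lost to a DoS attack."""

    def __init__(self, initial: float):
        self.last = initial          # value before any outage

    def receive(self, sample):
        if sample is not None:       # t not in T_a: signal arrives normally
            self.last = sample       # remember y_i(t_s) for a future outage
        return self.last             # during T_a, replay the held value
```

Usage: feeding the holder 2.0, then two dropped samples, then 3.0 yields 2.0, 2.0, 2.0, 3.0 – the controller keeps acting on the stale value for the whole outage, which is exactly why a DoS on a steady-state signal has little effect on its own.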


Fig. 10 – DoS attack on y_5.

3. Experimental results

This section describes the experimental setup and analyzes the experimental results.

3.1. Chemical reactor system

A chemical reactor system with a proportional integral (PI) control algorithm [12] is investigated in this paper. The dynamical model was coded in FORTRAN and the control algorithm in Matlab. The attacks were implemented using Matlab.

Fig. 3 shows the model of the chemical reactor system. Four chemical components are involved (A, B, C, and D). The goal of the control system is to maintain the irreversible reaction A + C → D at a specified rate while keeping the pressure inside the tank below 3000 kPa. Note that B is an inert component.

The chemical reactor system has three actuators. The first actuator, which is controlled by u_1(t), operates a valve that controls feed F_1 containing the chemical components A, B, and C. The second actuator, controlled by u_2(t), is a valve that controls feed F_2 containing A. The third actuator, controlled by u_3(t), is a valve that purges the gas created by the chemical reaction. Each control signal u_i(t) has a range between 0% (the valve is completely closed) and 100% (the valve is completely open).

The control algorithm [12] uses data from three sensors that monitor the product flow (y_4), the pressure inside the tank (y_5), and the amount of component A in the purge (y_7). Note that u_1 is a function of y_5 and y_4, u_2 is a function of y_7, and u_3 is a function of y_5.

Fig. 4 shows the chemical plant outputs without any noise inputs. Fig. 5 shows the plant outputs with Gaussian noise inputs. Specifically, Gaussian process noise (disturbance) with a mean of 0 and a variance of 0.05 is introduced at each valve. Note that the disturbances cause the system not to return to the steady state.

The chemical reactor system is simulated from t = 0 to t = 40 (h). Note that all the attacks in the experiments are executed from t = 10 to t = 30 (h).
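The disturbance model described above can be sketched in a few lines. Note the unit conversion: random.gauss takes a standard deviation, so a variance of 0.05 corresponds to sigma = sqrt(0.05). The function name is an illustrative assumption.

```python
import random

VARIANCE = 0.05  # variance of the per-valve Gaussian disturbance

def noisy_valve_input(u: float) -> float:
    """Add zero-mean Gaussian process noise to a valve input signal."""
    return u + random.gauss(0.0, VARIANCE ** 0.5)
```

Injecting this noise at each valve on every simulation step reproduces the qualitative behavior in Fig. 5: the outputs hover around, but never settle exactly at, the noise-free steady state of Fig. 4.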

3.2. Integrity attacks

We assume that the goal of the attacker is to raise the pressure inside the reactor vessel to an unsafe value (greater than 3000 kPa), causing equipment damage and possibly an explosion.

The integrity attacks (scaling, additive, and constant attacks) described in Section 2.2 were implemented. Only one sensor or controller was attacked at a time. The max and min attacks were the most effective; however, not all the attacks were able to drive the pressure to an unsafe level. We summarize the results below.


Fig. 11 – DoS attack on y_5 and integrity attack on y_4.

When a sensor is attacked, the controller can be expected to output an incorrect control signal because it operates on incorrect sensor information. If an attacker does not know the plant dynamics or the control algorithm, he/she may compromise a sensor at random. We assume the attacked sensor is y_7. Fig. 6 shows the effect of a y_7^max attack, which informs the controller that there is a large amount of component A in the reactor vessel. The simulations demonstrate that the plant returns to the steady (safe) state after the attack. Furthermore, the pressure in the reactor vessel is always below 3000 kPa.

Our experiments demonstrate that the chemical reactor system is very resilient to attacks on y_7, y_4, and u_2. Constant attacks are the most damaging, but they do not move the system to an unsafe state.

An attacker with knowledge of the system dynamics and control system operation would recognize that control signals u_1 and u_3 directly influence the pressure in the reactor vessel. Furthermore, the sensor that monitors the pressure in the reactor vessel, y_5, would be an attractive target.

Fig. 7 shows the results of launching attack y_5^min. During the attack, the controller believes the pressure in the tank to be very low (0 kPa). Therefore, it shuts the purge valve with the goal of increasing the pressure. Because the sensor keeps sending the false pressure reading of 0 kPa, the controller keeps the purge valve shut for the duration of the attack. In our experiments, it took about 20 hours for the attack to increase the pressure above 3000 kPa (the unsafe state). This time period is long enough for plant operators to observe the unusual phenomenon and take the appropriate mitigation steps.

In the following, we discuss the effects of attacking control signals u_1 and u_3, which appear to be promising from an attacker's point of view.

Intuitively, it appears that shutting down the purge valve would increase the pressure. Therefore, we decided to launch attack u_3^min(t). The results are shown in Fig. 8. The original signal computed by the controller is discarded and the attack forces the purge valve to close. This causes the chemical components to accumulate in the reactor vessel. However, although the accumulation raises the pressure from 2700 kPa to 2900 kPa (y_5 curve), it does not force the chemical reactor system to an unsafe state. The reason is that the control signal u_1 is also dependent on y_5; thus, when the pressure rises, the feed rate is correspondingly reduced.

Finally, we discuss the effects of launching attack u_1^max(t) (Fig. 9). The original signal computed by the controller is discarded and the valve for Feed 1 is opened completely. In this case, large amounts of input flow to the reactor, causing the pressure to rise above 3000 kPa (y_5 curve). Note that this attack forces the system to an unsafe state in the shortest time. We conclude that in order for a plant operator to prevent an attack from moving the system to an unsafe state, he/she should prioritize the protection of the control signal u_1.


Fig. 12 – Integrity attack on the Loop 2 controller.

Because raising the pressure by attacking y_5 takes a long time, the problem may be alleviated by monitoring the system and implementing the appropriate response when an anomaly is detected.

3.3. DoS attacks

Our experiments demonstrate that launching a DoS attack on a single device and implementing û^past or ŷ^past does not have a major impact when the plant reaches a steady state. For example, note that the DoS attack on sensor y_5 in Fig. 10 does not cause the curve for y_5 to change significantly.

Similar responses are obtained for all the other sensors and actuators. We conclude that the effects of DoS attacks on individual devices are limited and that protecting against integrity attacks should be a priority.

DoS attacks, however, can be launched in combination with innocuous integrity attacks to cause significant damage. Consider, for example, a DoS attack on y_5 coupled with an integrity attack on the production rate y_4 (which introduces a small variation of y_4^s(t) with α = 0.5). After the attacks are launched, the Loop 1 controller opens the Feed 1 valve to increase the production rate. This increases the flow of reactants to the reactor vessel, but the pressure sensor y_5, which is targeted by the DoS attack, fails to observe that the pressure in the vessel is rising. The resulting accumulation of reactants causes the pressure to exceed 3000 kPa in a fairly short time. Note that the changes to y_4 and y_5 in Fig. 11 start at time t = 10 when the attacks are launched.
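The mechanism behind the combined attack can be illustrated on a deliberately simplified pressure loop (this is not the chemical-plant model; all dynamics, gains, and thresholds here are assumptions). A DoS on the pressure reading is modeled as a last-value hold, and the scaling attack makes the production rate look half its true value, so the feed controller keeps pushing reactants while the purge controller stays blind to the rising pressure.

```python
def simulate(dos: bool, scaled: bool, T: int = 200) -> float:
    """Return the final pressure of a toy loop under optional attacks."""
    pressure = 2700.0        # steady-state pressure; safety limit is 3000
    held = pressure          # last pressure reading that got through
    for t in range(T):
        blind = dos and t >= 10          # DoS on the pressure sensor
        if not blind:
            held = pressure              # reading arrives normally
        seen_p = held                    # what the purge controller sees
        # scaling attack (alpha = 0.5) on the production-rate reading
        seen_rate = 100.0 * (0.5 if (scaled and t >= 10) else 1.0)
        feed = 40.0 if seen_rate < 100.0 else 20.0   # open feed if rate low
        purge = 20.0 + 0.5 * (seen_p - 2700.0)       # purge on seen pressure
        pressure += feed - purge
    return pressure
```

Run alone, either attack leaves the toy loop safe (the DoS holds a steady-state value; the scaling attack is absorbed by the purge feedback), but together they drive the pressure past the threshold, mirroring the qualitative result in Fig. 11.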

3.4. Operating cost attack

Apart from forcing the chemical reactor system to an unsafe state, the attacker may wish to have a negative economic impact by increasing the plant's operating cost. Such an attack is not easily detected and can produce large economic losses in the long term.

Estimating the cost of an attack in a typical information technology environment is often difficult because it is necessary to produce valuations for information loss (e.g., stolen data) and opportunity cost (e.g., a DoS attack against an e-commerce website). However, estimating the cost of an attack on a control system is easier because the operating cost of a plant can be computed based on the reactants consumed and the production rate.

In our plant model, the instantaneous operating cost depends on the quantities of reactants A (y_A3) and C (y_C3) and the flows F_3 and F_4. According to Ricker [12], the operating cost of the chemical plant is given by

  cost = (F_3 / F_4)(2.206 y_A3 + 6.177 y_C3).
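The cost relation can be written as a one-line function. The 2.206 coefficient for y_A3 appears in the text; the 6.177 coefficient for y_C3 is taken from Ricker's reactor model [12] and should be treated as an assumption here, since the second term is truncated in this copy of the paper.

```python
PRICE_A = 2.206   # cost weight for reactant A lost in the purge (from text)
PRICE_C = 6.177   # cost weight for reactant C; assumed, per Ricker [12]

def operating_cost(F3: float, F4: float, y_A3: float, y_C3: float) -> float:
    """Instantaneous operating cost: purge flow relative to product flow,
    weighted by the fractions of reactants A and C lost in the purge."""
    return (F3 / F4) * (PRICE_A * y_A3 + PRICE_C * y_C3)
```

The functional form makes the two attack avenues discussed next explicit: cost grows with the purge flow F_3 and with the reactant fractions y_A3 and y_C3 in the purge.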


Fig. 13 – Integrity attack on y_4.

The operating cost is proportional to the purge flow (F_3) and the quantities of reactants A (y_A3) and C (y_C3) in the purge. Thus, an attacker may either target a controller to maximize the purge flow or target a sensor to confuse the controller and increase the quantities of the reactants A and C.

Now consider an attack on the Loop 2 controller. In this case, the purge valve is opened to increase the purge flow (larger F_3 value). Fig. 12 shows how the attack increases the operating cost of the plant (from t = 10 to t = 30).

Next, consider an integrity attack on sensor y_4 that sends an incorrect (zero) signal to the Loop 1 controller, indicating that there is an insufficient quantity of reactants in the tank. In attempting to maintain the production rate, the controller issues an incorrect control signal u_1 to increase the feed rate of A, B, and C by opening the Feed 1 valve. The increased quantity of reactants results in higher production flow (F_4) and higher reactor pressure (curve y_5 in Fig. 13). However, upon detecting the change in pressure, the Loop 2 controller turns on the purge valve to regulate the pressure. This increases the purge flow F_3, which leads to a higher operating cost, as shown in Fig. 13.

Based on the experimental results, we can conclude that targeting the purge flow valve is the most effective strategy for increasing the operating cost of the chemical reactor system.

4. Conclusions

Formal models of process systems, control systems, and attacks provide a powerful mechanism for reasoning about attacks and their consequences. The investigation of integrity and DoS attacks on a chemical reactor system reveals several important points. A DoS attack has relatively little impact on the system in steady state; however, a DoS attack launched in combination with an innocuous integrity attack can produce serious consequences. An attacker needs to identify and attack the key sensors in order to drive a system to an unsafe state; in the case of the chemical reactor, targeting the reactor pressure sensor is most effective as it rapidly causes the system to cross the safety threshold. In general, attacks on control signals are more serious than attacks on sensor signals. Finally, an attack on plant economy involves a radically different strategy than an attack on plant safety.

Our future research will attempt to develop systematic techniques for evaluating the impact of simultaneous attacks. Another area of focus is the design of automatic attack detection and response mechanisms that can enhance the resilience of control systems.

Acknowledgements

We wish to thank Adrian Perrig, Bruno Sinopoli, Gabor Karsai, and Jon Wiley for useful discussions related to control systems security. This effort was partially supported by the International Collaboration for Advancing Security Technology (iCAST) and the Taiwan Information Security Center (TWISC) Projects under Grants NSC97-2745-P-001-001, NSC97-2918-I-009-005 and NSC98-2219-E-009-003, respectively.

R E F E R E N C E S

[1] E. Byres, Designing secure networks for process control, IEEE Industry Applications 6 (5) (2000) 33–39.

[2] E. Byres, J. Lowe, The myths and facts behind cyber security risks for industrial control systems, in: VDE Congress, 2004.

[3] E. Goetz, S. Shenoi (Eds.), Critical Infrastructure Protection, Springer, Boston, Massachusetts, 2007.

[4] V. Igure, S. Laughter, R. Williams, Security issues in SCADA networks, Computers and Security 25 (7) (2006) 498–506.

[5] T. Kilpatrick, J. Gonzalez, R. Chandia, M. Papa, S. Shenoi, Forensic analysis of SCADA systems and networks, International Journal of Security and Networks 3 (2) (2008) 95–102.

[6] P. Oman, E. Schweitzer, J. Roberts, Protecting the grid from cyber attack – Part 2: Safeguarding IEDs, substations and SCADA systems, Utility Automation & Engineering T&D 7 (1) (2002) 25–32.

[7] M. Papa, S. Shenoi (Eds.), Critical Infrastructure Protection II, Springer, Boston, Massachusetts, 2008.

[8] K. Stouffer, J. Falco, K. Kent, Guide to Supervisory Control and Data Acquisition (SCADA) and industrial control systems security – initial public draft, National Institute of Standards and Technology, Gaithersburg, Maryland, 2006.

[9] P. Tsang, S. Smith, YASIR: A low-latency, high-integrity security retrofit for legacy SCADA systems, in: Proceedings of the Twenty-Third IFIP TC 11 International Information Security Conference, 2008, pp. 445–459.

[10] United States Computer Emergency Readiness Team (US-CERT), Control Systems Security Program, U.S. Department of Homeland Security, Washington, DC, www.us-cert.gov/control_systems/index.html.

[11] A. Wright, J. Kinast, J. McCarty, Low-latency cryptographic protection for SCADA communications, in: Proceedings of the Second International Conference on Applied Cryptography and Network Security, 2004, pp. 263–277.

[12] N. Ricker, Model predictive control of a continuous, nonlinear, two-phase reactor, Journal of Process Control 3 (2) (1993) 109–123.
