Research Article

A Novel Hybrid MCDM Procedure for Achieving Aspired Earned Value Project Performance

Shou-Yan Chou,1 Chien-Chou Yu,1 and Gwo-Hshiung Tzeng2,3

1Department of Industrial Management, National Taiwan University of Science and Technology, No. 43, Section 4, Keelung Road, Taipei 106, Taiwan

2Graduate Institute of Urban Planning, College of Public Affairs, National Taipei University, No. 151, University Road, San Shia, New Taipei City 23741, Taiwan

3Institute of Management of Technology, National Chiao Tung University, No. 1001, Ta-Hsueh Road, Hsinchu 300, Taiwan

Correspondence should be addressed to Gwo-Hshiung Tzeng; ghtzeng@mail.ntpu.edu.tw

Received 30 December 2015; Revised 14 April 2016; Accepted 19 April 2016

Academic Editor: Danielle Morais

Copyright © 2016 Shou-Yan Chou et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

A better-performing project gains more subsequent businesses. Many organizations worldwide apply an earned value management (EVM) system to monitor and control their projects' performance. However, a successful EVM application requires handling multiple interinfluenced criteria with feedback effects for decision-making and continuous improvements throughout the application life cycle. Conventional decision approaches assume that preferences between criteria are independent and focus on decision-making alone. This study employs a hybrid multiple criteria decision-making (HMCDM) method to devise a novel procedure that addresses these deficiencies. The proposed procedure enables us to evaluate interinfluence effects and gap indices among criteria/dimensions/alternatives and then systemize the evaluation results in the context of an influential network relation map (INRM). The INRM provides managers with visual information to find a route for making application decisions while identifying critical gaps for continuous improvements. A numerical example is presented to illustrate the applicability of the proposed procedure. The results show that, by employing the HMCDM method, the proposed procedure can provide organizations with a foundation to ensure that the aspired EVM application outcomes are achieved at different levels within an organization.

1. Introduction

A project is “a temporary endeavor undertaken to transform limited resources into a unique product, service, or result,” in order to satisfy the needs of society, users, and customers [1]. A better-performing project gains more subsequent businesses and is ultimately of strategic importance to an organization [2, 3]. To attain high performance, many organizations worldwide apply an earned value management (EVM) system to monitor and control their projects [4, 5]. A successful EVM application produces reliable performance indices at the initial stages of a project, as early as 15 to 20 percent into the project process [6, 7], thus allowing organizations to understand project health, predict future trends, and take the required control actions to minimize deviations, thereby

attaining the aspired performances throughout the project life cycle [8–10].

According to Kim et al. [11], EVM application can be formulated as a multiple criteria decision-making (MCDM) problem, which requires experts to analyze a set of interinfluenced application criteria with feedback effects throughout the application process. Some of these criteria include the following: using information systems to report project progress in an accurate and timely manner [11]; using a project management process to break down the project scope and organizational structure [7]; training stakeholders in the effective use of EVM [12]; and providing ongoing efforts to improve the application of EVM [10]. According to Fleming and Koppelman [7], a lack of accurate understanding of the above-mentioned interinfluenced criteria can lead to


a series of shortcomings in the implementation of EVM. Kwak and Anbari [5] have also noted that, without adequate analysis upfront, even after an application decision has been conceived and implemented, unanticipated efforts will be required to solve new problems as the implementation proceeds. These studies demonstrate the importance of adopting a systematic procedure to analyze the interinfluenced criteria associated with the EVM application decision. Additionally, to obtain the aspired application outcomes, continuous improvements should also be considered early in order to prevent the selected decisions from producing negative outcomes [2, 13, 14].

However, according to the literature review of this study, most traditional MCDM approaches assume that the preferences between decision variables are independent and put their emphasis on the evaluation and selection of decision alternatives without addressing practical means to implement required improvements [15–20]. Yet, as discussed previously, EVM application requires a decision approach that addresses these issues. Consequently, this study employs a hybrid multiple criteria decision-making (HMCDM) method to devise a novel procedure that fulfils the above-mentioned deficiencies. The HMCDM method combines the decision-making trial and evaluation laboratory (DEMATEL) technique [21], a DEMATEL-based analytic network process (DANP) [22], and a modified multicriteria optimization and compromise solution (Višekriterijumska Optimizacija i Kompromisno Rešenje in Serbian; VIKOR) method [23]. This combined approach was introduced by Tzeng [17] as a new trend in decision-making. Recently, it has been successfully applied in different business fields to solve and improve complex and interdependent real-world problems [18, 22, 24–27] and is thus examined in this study.

The proposed novel procedure uses experts' judgments to model interdependent EVM application problems with a decision framework that considers improvement requirements. The procedure then employs the HMCDM method to quantify gap indices with respect to the aspiration levels of EVM application based on the interinfluence effects among factors/dimensions/alternatives within the decision framework. Finally, the HMCDM method systemizes the quantitative results in the context of influential network relation maps (INRM). The INRM helps managers find a route for EVM application decisions while identifying critical gaps for prior improvements throughout the life of the decisions' implementation. A numerical example is presented to illustrate how the proposed procedure operates in practice. The results show that, by employing the HMCDM method, the proposed procedure can provide organizations with a foundation to ensure that the aspired EVM application outcomes are achieved at different levels within an organization. The remainder of this paper is organized as follows. In Section 2, the EVM literature is reviewed in relation to the proposed procedure; in Section 3, the essential concepts of the HMCDM model are presented and the proposed procedure is introduced; in Section 4, a numerical example showing the applicability of the proposed procedure is presented and the main findings are discussed; conclusions are provided in the final section.

2. The Literature Review about Earned Value Management

This section briefly reviews research literature associated with EVM application and then identifies the dimensions and factors/criteria for establishing a decision framework in formulating the proposed procedure for pursuing the aspiration levels of EVM application through better decision-making and continuous improvements.

EVM was originally developed as a technique by the United States Department of Defense (DoD) in the 1960s to manage the financial aspects of major acquisition projects. In 1967, the DoD adopted the 35 standardized EVM managerial criteria, defined by the United States Air Force as the Cost/Schedule Control Systems Criteria (C/SCSC). This regulatory system was used by the DoD and its contractors to monitor and control various projects over the next three decades [7]. In 1996, the National Defense Industrial Association reduced the EVM criteria to a total of 32, which were formally accepted by ANSI/EIA in 1998 in their publication of the ANSI/EIA 748-98 standard, known as the EVM system [6]. During the following year, the Project Management Institute (PMI) adopted EVM as a managerial tool and technique to monitor projects, as stated in its publication titled A Guide to the Project Management Body of Knowledge (PMBOK Guide) and subsequently described in a separate publication, Practice Standard for Earned Value Management. These publications and the promotion of EVM principles, including their regulation, standardization, and simplification, have led to increasing interest in the use and development of innovative applications of EVM among organizations and experts worldwide [3, 5, 6, 8, 11].

However, while EVM has been widely accepted as one of the most pragmatic systems for managing project performance in both public and private organizations, studies have also noted that the development of EVM elements and the wide acceptance of EVM do not in themselves guarantee that EVM application will be successful for projects in all organizations [2, 6]. Some of the common issues arising in projects managed through EVM in different organizations, including the U.S. government and its subsidiary agencies, include overbudgeting, schedule delays, and unsatisfactory performance [5, 11]. These phenomena indicate that even in organizations with long-term operational experience, the implementation of EVM can result in deviations from the organization's aspiration level [7]. Hence, the subject of effective EVM application requires further study to assist organizations in obtaining the intended outcomes. In particular, an organization must be able to assess the current capability of each subordinate unit to understand whether EVM application decisions could eventually help the unit to better manage project performance. What application factors should be in place for each unit to apply EVM and to avoid the need for unintended efforts during the implementation of EVM decisions? Furthermore, if EVM application is judged inappropriate, how can each unit improve its weaknesses to facilitate future benefits from EVM application?

According to the American National Standards Institute/Electronic Industries Alliance, a reliable EVM application


should consider 32 criteria belonging to five categories: (1) organization, (2) planning and budgeting, (3) accounting, (4) analysis and revision, and (5) data maintenance [28]. Fleming and Koppelman [7], who conducted research on many software projects, proposed ten “must-haves” that are required to fully grasp and apply the critical earned value concept in enhancing the management of all types of projects in an industry. These ten “must-haves” require the complete definition of a project's scope of work using a work breakdown structure (WBS) at the outset of project planning as well as the continuous management of all changes during project execution. Another study by Kwak and Anbari [5], based on the National Aeronautics and Space Administration (NASA), indicated that key success factors for the implementation of EVM included the early introduction of EVM, the full involvement of users, and consistent communication with all stakeholders. Lipke [10] argued that the elements required for executing projects and facilitating continuous improvement are necessary ingredients of EVM application to ensure successful project outcomes. These studies have provided useful information for understanding the factors influencing successful EVM application from different perspectives but lack an integrated or systematic procedure for analyzing the level of readiness of these factors when making application and improvement decisions.

Stratton [12] proposed a five-step maturity model of earned value management to enhance the quality and use of EVM within an organization. This model can be linked to the ANSI/EIA 748 standard to create assessment matrices that help users to evolve EVM within their own organizations and to assess the relative strengths of various EVM applications. That study focused on developing a systematic procedure for effectively analyzing EVM implementation while assuming the independence of the factors in the assessment matrices. This assumption conflicts with the real-world application situations discussed in many other studies [3, 5, 10].

A more comprehensive study by Kim et al. [11], based on surveys mailed to 2,500 individuals and on-site case studies conducted within six organizations, concluded that approximately 40 interactive factors in four dimensions (the EVM user, the EVM methodology, the implementation process, and the project environment) could significantly influence EVM application in four ways: (1) accepting the concept, (2) applying EVM by project managers and team members, (3) enabling projects to be completed within constraints and with satisfactory performance, and (4) bringing overall satisfaction to users of this methodology. The study concluded by proposing an implementation framework to assist both industrial and government agencies in applying EVM more effectively for different sizes and types of projects. However, the proposed model and framework were qualitative in nature and did not provide a systematic means to quantitatively analyze the interrelated effects among the dimensions/factors for application decisions and management actions.

According to the literature discussed above, the factors/criteria influencing effective EVM application can be grouped into four dimensions: the EVM user, the EVM methodology, the implementation process, and the project

environment. Each dimension contains its respective factors, as shown in Table 1. In the next section, a novel procedure based on the HMCDM method is proposed to evaluate and analyze these interdependent application dimensions/factors in relation to the selection and improvement of application decisions, with the goal of obtaining aspiration levels of EVM application.

3. The Proposed Procedures for Obtaining the Aspiration Levels of EVM Application

To explain the proposed procedure, this section first briefly introduces the essential concepts related to the HMCDM model, which combines the following elements: the DEMATEL technique, the DEMATEL-based ANP, and the modified VIKOR method; subsequently, it discusses how the model is employed to develop the proposed procedure.

The HMCDM model was proposed by Tzeng [17], who combined new concepts and techniques to handle complicated and dynamic real-world problems. First, the HMCDM model employs the DEMATEL technique to quantify the interinfluence effects among decision variables and visualize these effects on an influential network relation map (INRM). The DEMATEL technique was developed by the Battelle Geneva Institute in 1972 for assessing and solving complex groups of problems [29]. This technique uses Boolean operations and the Markov process to quantify the cause-and-effect relationships of each dimension/criterion within a system (or subsystem). The quantitative results are then systemized on a single map showing the degree and direction in which each dimension/criterion influences the others and the overall system performance [30]. The interinfluence values of DEMATEL can not only help managers gain valuable information for understanding specific societal problems, but can also be further used with other methods to obtain more precise weighting values and gap indices when dealing with real-world decision and improvement problems [21, 31]. Second, this model provides a procedure known as DANP that applies the basic concept of the analytic network process (ANP) to transform the interinfluence values of DEMATEL into influential weights (IWs) for prioritizing decision variables. ANP was proposed by Saaty [32] to address interdependence and feedback among the factors, dimensions, or alternatives associated with a decision-making problem. However, ANP assigns identical weights to each cluster in the normalized supermatrix, neglecting the differing degrees of influence among clusters. DANP uses the DEMATEL technique to adjust the equal-weighting assumption of ANP to better reflect real interdependent situations when forming improvement alternatives and decisions [22, 31]. These features avoid the assumption of traditional decision models, such as AHP, TOPSIS, path analysis, and SEM, that the value-creation criteria are independently and hierarchically structured, thereby enabling interdependent decision situations to be treated as both decision processes and outcomes [18].

Table 1: Evaluation factors and dimensions.

EVM users (𝐷1)
Experience (𝐶1): Experience in using EVMS
Training (𝐶2): Training at school and on-the-job training to understand how to use EVMS
Administrative capabilities (𝐶3): Administrative expertise of project managers
Technical capabilities (𝐶4): Technical expertise of project managers
Changes in work contents (𝐶5): Acceptance of power shift after implementing EVMS

EVM methodology (𝐷2)
WBS (𝐶6): Using a work breakdown structure (WBS) to detail project scopes
CPM (𝐶7): Using the Critical Path Method (CPM) as the scheduling tool of projects
IPT (𝐶8): Using an Integrated Project Team (IPT) to facilitate understanding among project participants
Computer system (𝐶9): Using an automated computer system as part of the EVMS process
Integrated project management (𝐶10): Using a project management system including EVMS

Implementation process (𝐷3)
Open communication (𝐶11): Open communications among project team players, including customers
Sufficient resources (𝐶12): Provision of sufficient resources in the EVMS process
Top-down approach (𝐶13): Top management perceives EVMS as a pragmatic way to manage projects effectively
Integrated change control system (𝐶14): Using a separate office to handle required changes justified by EVMS
Continuous improvement (𝐶15): Providing ongoing efforts to improve application of the EVMS

Project management environment (𝐷4)
Colleague-based work environment (𝐶16): A colleague-based project management environment as opposed to a bureaucratic culture
Ownership of EVM to lower level project managers (𝐶17): Flexibility allowed to lower level project managers
Risk free (𝐶18): Allowing project players to select their own form of EVMS use within a general framework
Culture (𝐶19): A strong trust and supportive culture in which the project is performed
Regulations (𝐶20): Complete regulations for implementing EVMS

Third, this model adopts the principle of “aspiration levels” [33] to replace the traditional max/min approach [15, 34], through a modified VIKOR method, when choosing

a relatively good solution from existing alternatives. This feature produces the size of the performance gap to the aspiration level on each criterion/dimension/alternative, thus enabling managers to use a single value for both decision-making and continuous improvements [25]. The VIKOR method was proposed by Opricovic [35] to solve problems that involve incommensurable and conflicting factors. Originally, this method focused on analyzing a set of alternatives and selecting a compromise solution closest to the ideal state [34]. The ideal state was defined as a set of maximum/minimum values relating to each benefit/cost criterion among all alternatives. However, such traditional compromises can entail “choosing the best among inferior options/alternatives,” that is, picking the best apple in a barrel of rotten apples; thus, the traditional procedure still has to “improve” the potential solutions [18]. Hence, Tzeng [17] proposed the modified VIKOR method to replace the maximum/minimum approach with an “aspired-worst” setting, taking f_j^* = 10 as the aspiration level and f_j^- = 0 as the worst level for criterion j when performance scores for each criterion are measured on a 0-to-10 questionnaire scale ranging from complete dissatisfaction/bad (0) to extreme satisfaction/good (10). Recently, this method has been used to aid decision makers in identifying critical gaps in need of further improvement [36, 37].
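For instance, a hypothetical criterion scored at 7 on this 0-to-10 scale (a value chosen purely for illustration) would carry a normalized gap to the aspiration level of

\[ r = \frac{\left| f_j^{*} - f_j \right|}{\left| f_j^{*} - f_j^{-} \right|} = \frac{|10 - 7|}{|10 - 0|} = 0.3, \]

that is, 30 percent of the distance to the aspiration level remains to be closed.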

Combining all these concepts and techniques, the HMCDM model allows managers to avoid “choosing the best among inferior options/alternatives” (i.e., avoiding “picking the best apple among a barrel of rotten apples”) [17]. More importantly, the HMCDM model extends the evaluation and selection functions of a decision to include the identification of critical gaps for continuous improvement over the life of the decision's implementation [24, 27, 37]. The detailed descriptions, notations, and computational processes can be found in [17, 19, 26, 38].

This study applies the HMCDM model to devise a novel procedure for obtaining aspiration levels of EVM application through four main stages: (1) form an expert team,

(2) develop a decision framework, (3) systemize and visualize decision information using the HMCDM model, and (4) make application decisions and determine improvement strategies based on the INRM. A graphical representation of our procedure is depicted in Figure 1.

Figure 1: A graphical representation of the proposed procedure. (The flowchart proceeds from forming the expert team and developing the decision framework, through an expert questionnaire used to compute the initial-average influence relation matrix A; the DEMATEL technique (normalize A into the initial-influence matrix D = A/s, compute the total-influence matrix T = D(I − D)^{-1}, classify the factors into T_C, and average each dimension into T_D); the DANP (normalize into T_C^α and T_D^α, transpose into W = (T_C^α)′, compute the weighted supermatrix W^α = T_D^α W, and take lim_{u→∞}(W^α)^u to obtain the influential weights w = (w_1, ..., w_j, ..., w_n)); and the modified VIKOR method (collect performance values f_lj from a user questionnaire, set the aspiration level f_j^* and the worst level f_j^-, normalize r_lj = |f_j^* − f_lj|/|f_j^* − f_j^-|, compute S_l = Σ_{j=1}^{n} w_j r_lj and the maximal regret Q_l = max_j{r_lj | j = 1, ..., n}, and compute the gap indices R_l = vS_l + (1 − v)Q_l); the results are systemized on the INRM, with which top management makes decisions and determines strategies for continuous improvements.)

As shown in Figure 1, the proposed procedure first forms an expert team (ET) through a top management committee according to predetermined qualifications. Second, the ET identifies influencing criteria to develop a novel decision framework (Figure 2) that considers both decision-making and continuous improvements associated with an interrelated decision problem. The decision framework developed in this stage differs from traditional ones, which only consider decision-making. Third, based on

the decision framework, the procedure uses the HMCDM model to evaluate, systemize, and visualize decision and improvement information, including the following: computing interinfluence effects using the DEMATEL technique; computing influential weights using the DANP; computing gap indices using the modified VIKOR method; and, lastly, systemizing the decision information obtained from the previous steps on the visualized DEMATEL INRM, showing the preference of alternatives and how much improvement is required for each criterion and dimension associated with each alternative. Finally, referring to the INRM, the ET gains valuable information to finalize application decisions with top management and stakeholders, while determining strategies for continuous improvements in achieving the

aspired EVM application outcomes in an organization. In the next section, a numerical example is presented to illustrate how the proposed procedure operates in practice.

Figure 2: The decision framework for EVM application. (The framework links the goal, obtaining aspiration levels of EVM application across an organization for z units, to the four dimensions 𝐷1–𝐷4, their criteria 𝐶1–𝐶20, the alternatives Unit 1, Unit 2, ..., Unit z (𝑈1, 𝑈2, ..., 𝑈z), and the gaps to be measured for each dimension and criterion.)

4. A Numerical Example to Illustrate the Proposed Procedure

In this section, we use an empirical example from a defense organization to illustrate the application of the proposed procedure to a real-world problem. To preserve confidentiality, all data related to the example have been transformed into equivalent units by normalization, which does not compromise the analysis or gap measurement for each factor

and dimension and overall alternatives in order to reach the desired aspiration levels.

4.1. Problem Descriptions. The Ministry of National Defense (MND) of a country has been experiencing difficulties obtaining sufficient defense funding during the economic recession and is consequently considering whether to apply EVM to its acquisition units to sustain superior defense capacities with limited resources by ensuring better regulation of the performance and progress of its projects. However, the MND has many acquisition units. As a result of the multisourcing strategy adopted by the MND to acquire its projects from manufacturers in the U.S., Europe, and


the domestic market, each unit exhibits certain differences in infrastructure for the management of the projects from different sources. These differences have made EVM application in the MND more complicated than in organizations with mature or identical project management infrastructures for their subordinates. To better manage this complicated situation, the MND required a comprehensive and systematic evaluation to analyze, select, and improve the appropriate decisions that would enable the aspired EVM application outcomes to be achieved in the different units. The MND therefore applied the proposed procedure in a pilot project to assess two units and obtain satisfactory outcomes.

4.2. Application of the Procedure. Here, we illustrate the stepwise process by which the MND applied our procedure to obtain application decisions and improvement strategies to assist subordinate units in determining how to accept and use EVM to manage project performances with aspired results.

4.2.1. Form a Team. The MND formed an ET with seven experts, one from each of the following sectors: acquisition, technology, manufacturing, logistics, end users, procurement, and finance. All experts were selected based on their proficiency in relation to EVM, as assessed by a top management MND committee according to a set of predetermined qualifications.

4.2.2. Develop a Novel Decision Framework. In this stage, the ET members identified 20 influencing factors as evaluation criteria in 4 dimensions and developed a decision framework as shown in Figure 2.

In Figure 2, the highest level of the decision framework is the goal: obtaining aspiration levels of EVM application across the MND for two acquisition units (two alternatives), denoted by 𝑈1 and 𝑈2; these two units are also the alternatives to be evaluated at the fourth level of the decision framework. The second and third levels contain the dimensions and factors (groups of interinfluenced factors) used to evaluate the alternatives. The fifth and final level includes the gaps for each dimension and factor, to be measured in terms of how to reach the aspiration levels through continuous improvements.

4.2.3. Systemize and Visualize Decision Information Using the HMCDM Model. In this stage, the ET members first employed the DEMATEL technique to evaluate the interinfluence effects among the 20 factors within the decision framework and averaged the results into an initial-average 20-by-20 matrix A = [a_ij]_{20×20} (Table 2).

The initial-average matrix was further normalized into an initial-influence matrix D (Table 3) using

\[ \mathbf{D} = \frac{\mathbf{A}}{s} = [d_{ij}]_{n \times n}, \qquad (1) \]

where \( s = \max\left( \max_{1 \le i \le n} \sum_{j=1}^{n} a_{ij},\; \max_{1 \le j \le n} \sum_{i=1}^{n} a_{ij} \right) \).

Subsequently, through the matrix operation in (2), a total-influence matrix T was obtained, as in Table 4. In Table 4, all factors in T were further classified into the corresponding dimensions as matrix T_C, and each dimension was averaged to obtain matrix T_D:

\[ \mathbf{T} = \mathbf{D}(\mathbf{I} - \mathbf{D})^{-1}, \quad \text{when } \lim_{u \to \infty} \mathbf{D}^{u} = [0]_{n \times n}, \qquad (2) \]

where I is an identity matrix, \( \mathbf{D} = [d_{ij}]_{n \times n} \), \( 0 \le d_{ij} < 1 \), \( 0 < \sum_{j=1}^{n} d_{ij} \le 1 \), and \( 0 < \sum_{i=1}^{n} d_{ij} \le 1 \). If at least one column or row sum (but not all of them) equals one, then \( \lim_{u \to \infty} \mathbf{D}^{u} = [0]_{n \times n} \) is guaranteed.

In matrix T, the inconsistency rate (IR) of the evaluation results from all experts was only 2.70%, which is less than 5%. This result implies that including an additional expert would not influence the findings and that the confidence level is 97.30%.
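To make these computations concrete, the following minimal Python/NumPy sketch (our own illustration, not code from the paper; function and variable names are ours) implements (1) and (2), together with our reading of the inconsistency-rate check described above and in the note to Table 4.

```python
import numpy as np

def dematel_total_influence(A):
    """Normalize an initial-average influence matrix A (Eq. (1)) and return the
    initial-influence matrix D together with the total-influence matrix T (Eq. (2))."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    # s is the larger of the maximal row sum and the maximal column sum of A.
    s = max(A.sum(axis=1).max(), A.sum(axis=0).max())
    D = A / s
    # T = D (I - D)^(-1); the inverse exists because the row/column sums of D
    # are at most 1 and at least one of them is strictly below 1.
    T = D @ np.linalg.inv(np.eye(n) - D)
    return D, T

def inconsistency_rate(T_p, T_p_prev):
    """Average relative difference (in percent) between the influences obtained
    from p experts and from p - 1 experts, following the note to Table 4;
    zero entries are skipped to avoid division by zero."""
    T_p, T_p_prev = np.asarray(T_p, float), np.asarray(T_p_prev, float)
    n = T_p.shape[0]
    mask = T_p != 0
    return (np.abs(T_p[mask] - T_p_prev[mask]) / T_p[mask]).sum() / n**2 * 100.0
```

Applied to the 20-by-20 matrix A of Table 2, `dematel_total_influence` should reproduce, approximately (the published values are rounded to three decimals), the matrices D and T reported in Tables 3 and 4.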

According to Table 4, the ET employed the DANP to compute the influential weights (IWs) for the dimensions and factors. During this process, the matrices T_C and T_D obtained through DEMATEL were normalized as T_C^α and T_D^α, and matrix T_C^α was then transposed into an unweighted supermatrix W = (T_C^α)′. Subsequently, T_D^α was multiplied by W to obtain a weighted supermatrix W^α = T_D^α W, as shown in Table 5, which was finally multiplied by itself repeatedly until it converged into the IWs for the factors and dimensions, as shown in Table 6.
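The DANP steps just described can be sketched as follows (again a hedged illustration of our own, assuming NumPy; `groups` holds the criterion indices of each dimension, and the block-weighting convention is chosen so that the weighted supermatrix is column stochastic):

```python
import numpy as np

def danp_influential_weights(T_C, T_D, groups, tol=1e-9, max_iter=10_000):
    """Derive influential weights (IWs) from the DEMATEL total-influence matrices.

    T_C    : (n, n) total-influence matrix over the n criteria.
    T_D    : (m, m) total-influence matrix over the m dimensions.
    groups : list of m index lists, the criteria belonging to each dimension.
    """
    T_C, T_D = np.asarray(T_C, float), np.asarray(T_D, float)
    # Normalize every dimension block of T_C by its row sums -> T_C^alpha.
    TC_a = np.zeros_like(T_C)
    for gi in groups:
        for gj in groups:
            block = T_C[np.ix_(gi, gj)]
            TC_a[np.ix_(gi, gj)] = block / block.sum(axis=1, keepdims=True)
    # Normalize T_D by its row sums -> T_D^alpha.
    TD_a = T_D / T_D.sum(axis=1, keepdims=True)
    # Unweighted supermatrix W = (T_C^alpha)'.
    W = TC_a.T
    # Weighted supermatrix W^alpha: scale block (i, j) of W by the matching entry
    # of (T_D^alpha)', so that every column of W^alpha sums to one.
    W_a = np.zeros_like(W)
    for i, gi in enumerate(groups):
        for j, gj in enumerate(groups):
            W_a[np.ix_(gi, gj)] = TD_a[j, i] * W[np.ix_(gi, gj)]
    # Raise W^alpha to the limiting power; its columns converge to the IWs.
    M = W_a.copy()
    for _ in range(max_iter):
        M_next = M @ W_a
        if np.abs(M_next - M).max() < tol:
            break
        M = M_next
    return M[:, 0]  # any column of the limiting supermatrix
```

With the T_C and T_D of Table 4 and the four five-criterion groups of Table 1 (e.g., `groups = [list(range(0, 5)), list(range(5, 10)), list(range(10, 15)), list(range(15, 20))]`), this should give factor weights close to those in Table 6; summing the factor weights within each group approximately recovers the dimension weights.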

As shown in Table 6, the ET generally agreed that, in terms of the IWs of the DANP, all dimensions and factors have a similar level of importance for effective EVM application. However, the DEMATEL results (Table 4) provide managers with additional information for assessing the level of interinfluence among factors/dimensions to achieve the aspired EVM application.

After the DANP steps, the ET administered a questionnaire to collect the opinions of users at different units regarding the outcomes that their units could achieve through EVM application based on their current operational capabilities. Typically, the main components of the questionnaire can be designed as shown in Table 7, with scores used to evaluate the respective performance outcomes on a scale from 1 to 5: “N/A (1),” “A (2),” “AU (3),” “AUP (4),” and “AUPS (5).”

In this case, 18 and 20 respondents in 𝑈1 and 𝑈2 were interviewed, respectively. The ET averaged all responses as the performance values f_lj and then set the worst value f_j^- = 1 and the aspiration level (best value) f_j^* = 5. Subsequently, the modified VIKOR method was employed to compute the gap indices using (3)–(6). The computational results are summarized in Table 8:

\[ r_{lj} = \frac{\left| f_{j}^{*} - f_{lj} \right|}{\left| f_{j}^{*} - f_{j}^{-} \right|}, \qquad (3) \]

\[ S_{l} = \sum_{j=1}^{n} w_{j} r_{lj}, \quad l = 1, 2, \ldots, m, \qquad (4) \]

where \( w_{j} \) is the IW of factor \( j \) obtained from the DANP;

\[ Q_{l} = \max_{j} \left\{ r_{lj} \mid j = 1, 2, \ldots, n \right\}, \quad l = 1, 2, \ldots, m. \qquad (5) \]


Table 2: The initial-average matrixA obtained through the DEMATEL. A 𝐶1 𝐶2 𝐶3 𝐶4 𝐶5 𝐶6 𝐶7 𝐶8 𝐶9 𝐶10 𝐶11 𝐶12 𝐶13 𝐶14 𝐶15 𝐶16 𝐶17 𝐶18 𝐶19 𝐶20 𝐶1 0.000 2.857 2.857 3.429 3.143 3.286 2.857 3.000 2.714 2.571 2.714 2.143 3.571 3.571 3.429 3.000 2.714 3.000 2.571 2.429 𝐶2 2.714 0.000 3.143 3.857 2.857 3.429 3.429 3.429 2.571 3.286 2.429 1.857 2.571 2.857 3.286 2.714 2.571 3.000 2.714 2.143 𝐶3 2.429 2.000 0.000 2.143 2.714 2.286 2.286 2.571 2.000 2.571 2.571 2.429 2.286 2.571 2.571 2.143 2.714 2.857 2.286 2.143 𝐶4 3.143 2.286 2.143 0.000 2.286 3.429 3.286 3.429 3.000 2.571 2.286 2.143 2.571 3.000 3.000 2.857 3.143 3.286 2.714 2.143 𝐶5 2.857 2.286 2.571 2.429 0.000 2.429 1.571 3.000 2.143 2.857 3.000 2.143 2.571 3.286 3.286 3.000 2.714 2.857 3.286 3.000 𝐶6 3.000 2.429 2.857 2.571 2.714 0.000 3.143 3.143 2.571 3.429 3.286 3.000 3.000 3.429 3.286 3.143 3.143 3.000 2.714 2.857 𝐶7 2.286 2.286 2.286 2.714 2.000 3.143 0.000 3.286 2.714 3.000 2.571 2.571 2.143 2.429 2.429 2.143 2.143 2.000 2.143 2.000 𝐶8 2.857 2.429 2.429 2.714 2.714 3.429 3.143 0.000 2.429 3.000 3.571 2.714 3.143 3.143 3.143 2.857 2.571 3.000 3.143 2.286 𝐶9 2.857 3.286 3.429 3.429 2.286 3.571 3.286 3.571 0.000 3.571 2.429 3.143 2.429 3.000 3.286 2.714 2.571 2.286 2.429 2.143 𝐶10 2.857 2.429 2.429 2.714 2.571 3.286 3.143 3.143 2.571 0.000 2.857 2.000 2.571 3.429 2.857 2.714 2.429 2.571 2.857 2.286 𝐶113.000 3.286 2.714 3.143 2.857 2.857 3.000 3.286 2.000 3.286 0.000 3.000 2.714 3.571 3.571 3.000 3.143 3.143 3.286 2.714 𝐶122.000 3.143 2.714 3.000 2.714 2.429 2.571 3.143 2.571 2.857 3.286 0.000 2.429 2.857 3.571 2.714 2.714 2.714 2.714 2.429 𝐶132.429 3.143 2.571 2.714 2.857 2.714 2.571 2.571 2.286 2.714 2.857 3.286 0.000 3.571 2.857 3.571 3.000 2.857 3.000 2.429 𝐶14 2.143 2.286 3.000 3.143 2.429 2.857 2.286 3.000 2.571 2.571 2.857 2.429 2.286 0.000 3.143 2.143 2.143 2.286 2.286 2.286 𝐶153.286 3.429 3.000 3.286 2.857 2.571 2.571 2.857 2.143 2.857 3.286 2.714 2.571 3.429 0.000 2.714 2.143 2.857 2.571 2.286 𝐶163.000 2.571 2.714 2.571 3.000 2.714 2.857 2.571 2.857 3.000 3.143 2.571 2.143 3.143 3.143 0.000 3.429 2.714 3.286 2.286 𝐶173.286 3.000 2.714 3.000 2.714 2.714 2.429 3.286 2.143 2.714 2.857 2.714 2.571 2.857 3.000 3.286 0.000 2.857 2.714 2.571 𝐶183.429 3.286 3.000 3.571 3.143 3.000 2.857 3.143 2.429 3.000 3.286 2.571 2.714 3.143 3.429 3.000 3.286 0.000 3.286 2.286 𝐶19 3.143 2.571 3.000 2.857 3.143 2.429 2.429 3.286 1.857 2.714 3.571 2.286 3.000 2.714 3.429 3.000 3.000 3.714 0.000 2.857 𝐶202.429 2.857 3.000 2.571 2.429 2.143 2.143 2.429 1.857 2.429 2.714 2.286 2.286 2.571 2.857 2.429 2.286 2.571 3.000 0.000

Table 3: The initial-influence matrixD obtained through the DEMATEL.

D 𝐶1 𝐶2 𝐶3 𝐶4 𝐶5 𝐶6 𝐶7 𝐶8 𝐶9 𝐶10 𝐶11 𝐶12 𝐶13 𝐶14 𝐶15 𝐶16 𝐶17 𝐶18 𝐶19 𝐶20 𝐶1 0.000 0.048 0.048 0.058 0.053 0.055 0.048 0.050 0.046 0.043 0.046 0.036 0.060 0.060 0.058 0.050 0.046 0.050 0.043 0.041 𝐶2 0.046 0.000 0.053 0.065 0.048 0.058 0.058 0.058 0.043 0.055 0.041 0.031 0.043 0.048 0.055 0.046 0.043 0.050 0.046 0.036 𝐶3 0.041 0.034 0.000 0.036 0.046 0.038 0.038 0.043 0.034 0.043 0.043 0.041 0.038 0.043 0.043 0.036 0.046 0.048 0.038 0.036 𝐶4 0.053 0.038 0.036 0.000 0.038 0.058 0.055 0.058 0.050 0.043 0.038 0.036 0.043 0.050 0.050 0.048 0.053 0.055 0.046 0.036 𝐶5 0.048 0.038 0.043 0.041 0.000 0.041 0.026 0.050 0.036 0.048 0.050 0.036 0.043 0.055 0.055 0.050 0.046 0.048 0.055 0.050 𝐶6 0.050 0.041 0.048 0.043 0.046 0.000 0.053 0.053 0.043 0.058 0.055 0.050 0.050 0.058 0.055 0.053 0.053 0.050 0.046 0.048 𝐶7 0.038 0.038 0.038 0.046 0.034 0.053 0.000 0.055 0.046 0.050 0.043 0.043 0.036 0.041 0.041 0.036 0.036 0.034 0.036 0.034 𝐶8 0.048 0.041 0.041 0.046 0.046 0.058 0.053 0.000 0.041 0.050 0.060 0.046 0.053 0.053 0.053 0.048 0.043 0.050 0.053 0.038 𝐶9 0.048 0.055 0.058 0.058 0.038 0.060 0.055 0.060 0.000 0.060 0.041 0.053 0.041 0.050 0.055 0.046 0.043 0.038 0.041 0.036 𝐶100.048 0.041 0.041 0.046 0.043 0.055 0.053 0.053 0.043 0.000 0.048 0.034 0.043 0.058 0.048 0.046 0.041 0.043 0.048 0.038 𝐶110.050 0.055 0.046 0.053 0.048 0.048 0.050 0.055 0.034 0.055 0.000 0.050 0.046 0.060 0.060 0.050 0.053 0.053 0.055 0.046 𝐶120.034 0.053 0.046 0.050 0.046 0.041 0.043 0.053 0.043 0.048 0.055 0.000 0.041 0.048 0.060 0.046 0.046 0.046 0.046 0.041 𝐶13 0.041 0.053 0.043 0.046 0.048 0.046 0.043 0.043 0.038 0.046 0.048 0.055 0.000 0.060 0.048 0.060 0.050 0.048 0.050 0.041 𝐶140.036 0.038 0.050 0.053 0.041 0.048 0.038 0.050 0.043 0.043 0.048 0.041 0.038 0.000 0.053 0.036 0.036 0.038 0.038 0.038 𝐶150.055 0.058 0.050 0.055 0.048 0.043 0.043 0.048 0.036 0.048 0.055 0.046 0.043 0.058 0.000 0.046 0.036 0.048 0.043 0.038 𝐶160.050 0.043 0.046 0.043 0.050 0.046 0.048 0.043 0.048 0.050 0.053 0.043 0.036 0.053 0.053 0.000 0.058 0.046 0.055 0.038 𝐶170.055 0.050 0.046 0.050 0.046 0.046 0.041 0.055 0.036 0.046 0.048 0.046 0.043 0.048 0.050 0.055 0.000 0.048 0.046 0.043 𝐶180.058 0.055 0.050 0.060 0.053 0.050 0.048 0.053 0.041 0.050 0.055 0.043 0.046 0.053 0.058 0.050 0.055 0.000 0.055 0.038 𝐶190.053 0.043 0.050 0.048 0.053 0.041 0.041 0.055 0.031 0.046 0.060 0.038 0.050 0.046 0.058 0.050 0.050 0.062 0.000 0.048 𝐶20 0.041 0.048 0.050 0.043 0.041 0.036 0.036 0.041 0.031 0.041 0.046 0.038 0.038 0.043 0.048 0.041 0.038 0.043 0.050 0.000


Table 4: The total-influence matrixT for factors T𝐶and for dimensionsT𝐷obtained through DEMATEL. T(T𝐶) 𝐶1 𝐶2 𝐶3 𝐶4 𝐶5 𝐶6 𝐶7 𝐶8 𝐶9 𝐶10 𝐶11 𝐶12 𝐶13 𝐶14 𝐶15 𝐶16 𝐶17 𝐶18 𝐶19 𝐶20 𝐶1 0.379 0.414 0.419 0.450 0.416 0.441 0.415 0.459 0.369 0.431 0.438 0.377 0.411 0.472 0.475 0.425 0.412 0.429 0.418 0.365 𝐶2 0.416 0.361 0.416 0.450 0.405 0.437 0.418 0.459 0.361 0.435 0.427 0.366 0.389 0.454 0.465 0.414 0.403 0.422 0.413 0.355 𝐶3 0.351 0.335 0.307 0.361 0.345 0.357 0.341 0.380 0.300 0.362 0.367 0.321 0.328 0.383 0.388 0.345 0.347 0.359 0.347 0.303 𝐶4 0.410 0.386 0.388 0.376 0.384 0.424 0.403 0.445 0.357 0.411 0.412 0.360 0.377 0.442 0.447 0.404 0.400 0.414 0.401 0.344 𝐶5 0.395 0.377 0.386 0.405 0.338 0.398 0.367 0.427 0.335 0.405 0.413 0.350 0.368 0.436 0.441 0.396 0.384 0.398 0.400 0.349 𝐶6 0.432 0.412 0.424 0.443 0.415 0.394 0.424 0.467 0.371 0.450 0.453 0.395 0.407 0.476 0.479 0.433 0.424 0.434 0.425 0.376 𝐶7 0.354 0.345 0.349 0.375 0.339 0.376 0.309 0.397 0.316 0.374 0.372 0.328 0.331 0.387 0.391 0.350 0.343 0.351 0.350 0.305 𝐶8 0.418 0.401 0.405 0.433 0.403 0.437 0.413 0.404 0.359 0.431 0.445 0.380 0.398 0.458 0.464 0.417 0.404 0.422 0.420 0.357 𝐶9 0.423 0.418 0.426 0.449 0.401 0.444 0.421 0.467 0.324 0.445 0.432 0.391 0.392 0.461 0.471 0.419 0.408 0.416 0.414 0.359 𝐶10 0.398 0.381 0.386 0.411 0.382 0.414 0.393 0.432 0.344 0.362 0.413 0.351 0.370 0.440 0.437 0.394 0.382 0.395 0.395 0.340 𝐶11 0.438 0.431 0.427 0.458 0.423 0.446 0.428 0.476 0.367 0.453 0.407 0.400 0.408 0.484 0.490 0.436 0.429 0.442 0.440 0.379 𝐶12 0.390 0.398 0.396 0.422 0.389 0.406 0.390 0.439 0.348 0.414 0.425 0.323 0.373 0.438 0.454 0.400 0.392 0.403 0.399 0.347 𝐶13 0.406 0.406 0.402 0.427 0.400 0.420 0.399 0.440 0.352 0.421 0.428 0.383 0.342 0.459 0.453 0.422 0.405 0.415 0.413 0.355 𝐶14 0.363 0.356 0.371 0.393 0.357 0.383 0.357 0.405 0.323 0.379 0.388 0.336 0.344 0.360 0.415 0.362 0.354 0.368 0.363 0.319 𝐶15 0.415 0.407 0.405 0.432 0.397 0.414 0.396 0.440 0.347 0.419 0.431 0.371 0.381 0.453 0.404 0.405 0.388 0.411 0.402 0.349 𝐶16 0.413 0.396 0.403 0.423 0.401 0.418 0.402 0.438 0.359 0.424 0.431 0.371 0.376 0.450 0.456 0.364 0.410 0.411 0.415 0.351 𝐶17 0.416 0.401 0.401 0.428 0.395 0.417 0.394 0.447 0.347 0.418 0.425 0.372 0.381 0.444 0.452 0.415 0.354 0.412 0.405 0.354 𝐶18 0.447 0.433 0.434 0.467 0.430 0.451 0.428 0.476 0.376 0.452 0.461 0.395 0.410 0.480 0.491 0.439 0.434 0.395 0.442 0.374 𝐶19 0.424 0.405 0.416 0.437 0.412 0.423 0.404 0.458 0.351 0.428 0.447 0.375 0.398 0.454 0.470 0.421 0.412 0.435 0.372 0.367 𝐶20 0.362 0.359 0.366 0.379 0.351 0.366 0.349 0.390 0.307 0.371 0.380 0.328 0.339 0.395 0.404 0.361 0.351 0.366 0.369 0.278 T𝐷 𝐷1 𝐷2 𝐷3 𝐷4 𝐷1 0.387 0.397 0.404 0.386 𝐷2 0.401 0.399 0.413 0.389 𝐷3 0.404 0.403 0.406 0.392 𝐷4 0.408 0.404 0.415 0.388

Note: t^p_ij and t^{p−1}_ij denote the average influence of factor i on factor j according to p = 7 and p − 1 = 6 experts, respectively, and n = 20 denotes the number of factors. The inconsistency rate is \( \mathrm{IR} = (1/n^{2}) \sum_{i=1}^{n} \sum_{j=1}^{n} \left( \left| t_{ij}^{p} - t_{ij}^{p-1} \right| / t_{ij}^{p} \right) \times 100\% = 2.70\% \) (0.027), and the confidence level is 1 − IR = 97.30%, which exceeds the 95% level used to test for significance.

\[ R_{l} = v S_{l} + (1 - v) Q_{l}, \quad l = 1, 2, \ldots, m, \qquad (6) \]

where v is the weight of the strategy of maximum group utility (priority improvement) and 1 − v is the weight of the individual regret.
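The gap computation in (3)–(6) can be condensed into a few NumPy lines (a sketch of our own, not the authors' implementation; the paper does not report the value of v, so v = 0.5 is used here as a placeholder, a choice that also reproduces the overall gap indices reported in Table 8):

```python
import numpy as np

def modified_vikor_gaps(F, w, f_star=5.0, f_minus=1.0, v=0.5):
    """Gap computation of the modified VIKOR method.

    F       : (m, n) matrix of performance values f_lj (m alternatives, n criteria).
    w       : (n,) influential weights w_j obtained from the DANP.
    f_star  : aspiration level f_j* (5 on the 1-to-5 scale of Table 7).
    f_minus : worst level f_j^- (1 on that scale).
    v       : weight of the maximum-group-utility strategy; 1 - v weights the regret.
    """
    F = np.asarray(F, dtype=float)
    w = np.asarray(w, dtype=float)
    r = np.abs(f_star - F) / abs(f_star - f_minus)   # Eq. (3): normalized gaps r_lj
    S = r @ w                                        # Eq. (4): weighted average gap S_l
    Q = r.max(axis=1)                                # Eq. (5): maximal regret Q_l
    R = v * S + (1.0 - v) * Q                        # Eq. (6): overall gap index R_l
    return r, S, Q, R
```

For example, feeding the 𝑈1 and 𝑈2 performance values of Table 8 together with the global IWs of Table 6 into this function gives the per-criterion gaps of Table 8 (e.g., |5 − 3.350|/|5 − 1| ≈ 0.413 for 𝐶1 of 𝑈1) and, with v = 0.5, overall gap indices of approximately 0.520 and 0.739.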

As shown in Table 8, the gap indices for alternatives 𝑈1 and 𝑈2 are 0.520 and 0.739, respectively. These values reveal the size of the gap that each unit would need to close to reach the aspiration level. They imply that EVM application with the required continuous improvements would enhance the performance of the acquisition projects in 𝑈1; however, EVM application may not help 𝑈2 enhance the performance of its projects unless the current operational capabilities of 𝑈2 are further improved.

Additionally, the ET developed the INRM using the results of the DEMATEL and the modified VIKOR method (Tables 4 and 8). During this process, using Table 4, the ET computed the degree of total influence that a factor exerts on the other factors (the sum of each row), r_i, and the degree of total influence that a factor receives from the other factors (the sum of each column), c_i. The ET also derived r_i + c_i, indicating the degree of the central role that the respective dimension/factor i plays in the system, and r_i − c_i, indicating the degree of net influence that the respective dimension/factor i contributes to the system. If r_i − c_i is positive, then dimension/factor i affects the other dimensions/factors; if r_i − c_i is negative, then dimension/factor i is influenced by the other dimensions/factors. The results are summarized in Table 9.
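The row- and column-sum quantities behind Table 9 amount to two reductions over T (a minimal sketch of our own, assuming NumPy):

```python
import numpy as np

def inrm_coordinates(T):
    """Return (r + c, r - c) for every node of a total-influence matrix T, where
    r is the row sum (influence given) and c is the column sum (influence received)."""
    T = np.asarray(T, dtype=float)
    r = T.sum(axis=1)
    c = T.sum(axis=0)
    return r + c, r - c
```

Applied to the T_C and T_D of Table 4, this should yield the (𝑟𝑖 + 𝑐𝑖, 𝑟𝑖 − 𝑐𝑖) pairs listed in Table 9 and plotted in Figure 3.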

In Table 9, the degrees of the central role (r_i + c_i) of the EVM users (𝐷1), the EVM methodology (𝐷2), the implementation process (𝐷3), and the project management environment (𝐷4) are 3.174, 3.201, 3.243, and 3.171, respectively. These values indicate that all members of the ET generally agreed that all 4 dimensions play a central role in achieving the MND's EVM application at aspiration levels. However, among the 4 dimensions, the degree of net influence (r_i − c_i) of the project management environment (𝐷4) is 0.060, and an emphasis on this dimension is the basic requirement for the MND to apply EVM in managing projects effectively. This finding also implies that if the project management environment is not well established, EVM application would be affected negatively. Table 9 also contains the interinfluence effects on the factors, showing valuable indications for better


Table 5: The weighted supermatrixW𝛼derived from DANP. W𝛼 𝐶 1 𝐶2 𝐶3 𝐶4 𝐶5 𝐶6 𝐶7 𝐶8 𝐶9 𝐶10 𝐶11 𝐶12 𝐶13 𝐶14 𝐶15 𝐶16 𝐶17 𝐶18 𝐶19 𝐶20 𝐶1 0.045 0.050 0.051 0.052 0.051 0.051 0.050 0.051 0.050 0.051 0.051 0.049 0.050 0.050 0.051 0.051 0.051 0.051 0.051 0.050 𝐶2 0.049 0.043 0.049 0.049 0.049 0.049 0.049 0.049 0.049 0.049 0.050 0.050 0.050 0.049 0.050 0.049 0.050 0.049 0.049 0.050 𝐶3 0.049 0.050 0.044 0.049 0.050 0.050 0.050 0.049 0.050 0.049 0.049 0.050 0.050 0.051 0.050 0.050 0.050 0.050 0.050 0.051 𝐶4 0.053 0.054 0.052 0.047 0.052 0.052 0.053 0.053 0.053 0.053 0.053 0.053 0.053 0.054 0.053 0.052 0.053 0.053 0.053 0.053 𝐶5 0.049 0.049 0.050 0.049 0.044 0.049 0.048 0.049 0.047 0.049 0.049 0.049 0.049 0.049 0.049 0.050 0.049 0.049 0.050 0.049 𝐶6 0.053 0.052 0.052 0.052 0.052 0.047 0.053 0.053 0.053 0.053 0.052 0.051 0.052 0.052 0.052 0.051 0.052 0.052 0.051 0.051 𝐶7 0.050 0.050 0.049 0.050 0.048 0.050 0.043 0.050 0.050 0.050 0.049 0.049 0.049 0.049 0.049 0.049 0.049 0.049 0.049 0.049 𝐶8 0.055 0.055 0.055 0.055 0.056 0.055 0.056 0.049 0.055 0.055 0.055 0.055 0.054 0.055 0.055 0.054 0.055 0.055 0.055 0.055 𝐶9 0.044 0.043 0.044 0.044 0.044 0.044 0.044 0.044 0.038 0.044 0.042 0.044 0.043 0.044 0.043 0.044 0.043 0.043 0.043 0.043 𝐶10 0.051 0.052 0.053 0.051 0.053 0.053 0.053 0.053 0.053 0.046 0.052 0.052 0.052 0.052 0.052 0.052 0.052 0.052 0.052 0.052 𝐶110.052 0.052 0.053 0.052 0.053 0.053 0.053 0.053 0.052 0.053 0.047 0.053 0.052 0.053 0.053 0.053 0.053 0.053 0.054 0.053 𝐶120.045 0.045 0.046 0.045 0.045 0.046 0.047 0.046 0.047 0.045 0.046 0.041 0.047 0.046 0.046 0.046 0.046 0.045 0.045 0.046 𝐶130.049 0.048 0.047 0.048 0.047 0.047 0.047 0.048 0.047 0.047 0.047 0.047 0.042 0.047 0.047 0.046 0.047 0.047 0.048 0.047 𝐶140.056 0.055 0.055 0.056 0.056 0.055 0.055 0.055 0.055 0.056 0.056 0.055 0.056 0.049 0.056 0.056 0.055 0.055 0.054 0.055 𝐶150.056 0.057 0.056 0.056 0.056 0.056 0.056 0.056 0.057 0.056 0.057 0.057 0.056 0.057 0.050 0.056 0.056 0.056 0.056 0.056 𝐶16 0.051 0.051 0.050 0.050 0.050 0.050 0.050 0.050 0.051 0.050 0.050 0.050 0.051 0.050 0.051 0.045 0.051 0.051 0.050 0.050 𝐶170.049 0.049 0.050 0.050 0.049 0.049 0.049 0.049 0.049 0.049 0.049 0.049 0.049 0.049 0.048 0.050 0.044 0.050 0.049 0.049 𝐶18 0.051 0.052 0.052 0.052 0.051 0.050 0.050 0.051 0.050 0.050 0.051 0.051 0.050 0.051 0.051 0.051 0.051 0.046 0.052 0.051 𝐶190.050 0.050 0.050 0.050 0.051 0.049 0.050 0.051 0.050 0.050 0.051 0.050 0.050 0.050 0.050 0.051 0.050 0.051 0.045 0.051 𝐶200.044 0.043 0.044 0.043 0.044 0.044 0.044 0.043 0.043 0.043 0.044 0.044 0.043 0.044 0.044 0.043 0.044 0.043 0.044 0.039

understanding critical elements in EVM application in different units within the MND.

Based on Tables 8 and 9, the INRM was developed as shown in Figure 3. Taking the dimensions as an example (at the top center of Figure 3), the x-coordinate is the degree of central role r_i + c_i, and the y-coordinate is the degree of net influence r_i − c_i. First, we marked the coordinates of the EVM users (𝐷1), the EVM methodology (𝐷2), the implementation process (𝐷3), and the project management environment (𝐷4), which are (3.174, −0.025), (3.204, −0.001), (3.243, −0.034), and (3.171, 0.060), respectively. The process then referred to Table 4 to determine the arrow directions based on the degree of total influence between each pair of dimensions. For instance, according to Table 4, the degree of total influence of the EVM users (𝐷1) on the project management environment (𝐷4) is 0.386; conversely, the degree of total influence of the project management environment (𝐷4) on the EVM users (𝐷1) is 0.408. The arrow is therefore drawn from the project management environment (𝐷4) to the EVM users (𝐷1) because 0.408 is greater than 0.386. Likewise, the influential directions among all the dimensions and factors are determined and depicted accordingly. Additionally, the ET marked the gap indices on the INRM for the factors/dimensions with respect to each alternative based on Table 8.
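The arrow-drawing rule just described reduces to a single comparison of the two total-influence values (an illustrative helper of our own, not code from the paper):

```python
def inrm_arrow(T, i, j):
    """Return the ordered pair (source, target) of the INRM arrow between nodes i and j:
    the arrow points from i to j when the total influence of i on j exceeds that of j on i."""
    return (i, j) if T[i][j] > T[j][i] else (j, i)
```

For the dimensions of Table 4, T_D[𝐷4][𝐷1] = 0.408 exceeds T_D[𝐷1][𝐷4] = 0.386, so the arrow runs from 𝐷4 to 𝐷1, as described above.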

As shown in Figure 3, the INRM quantified and systemized the gap indices and the degree and direction of the interinfluence effects among the 20 factors within the 4 dimensions associated with the aspired EVM application in the MND. Therefore, it helps managers easily analyze the EVM application situations that are essential for making better application decisions. For example, the visualized interinfluence effects at the dimensional level on the INRM (at the top center of Figure 3) revealed that the project management environment (𝐷4) and the EVM methodology (𝐷2) were prerequisites for qualified EVM users (𝐷1) to implement an effective process (𝐷3) to achieve the aspired application outcome. When the same approach is adopted, systematic information associated with decisions to accomplish the aspired EVM application can be realized comprehensively.

4.2.4. Make Application Decisions and Determine Improvement Strategies. In this stage, the ET arranged a series of meetings chaired by the MND's top management, including representatives from related functional divisions. All of the participants reviewed Tables 1–9 and, with reference to the INRM, discussed the application situation of each unit and which factors or dimensions should be prioritized for improvement. The participants also discussed the affordability and availability of the resources required for potential improvements. The eventual outcome of these meetings was to apply EVM at 𝑈1 and to delay its application in 𝑈2 until the dimension, factor, and/or overall gaps for that unit could be improved to a level below 0.500. Additionally, the participants determined the improvement strategies to be adopted, including the allocation of priority and responsibility for a set of improvement activities. For instance, according to the size of the gap to the aspiration level on the dimensions in Table 8, the ET ranked the respective dimensional gaps for 𝑈1 and 𝑈2 in descending order as follows: 𝑈1: {𝐷4 (0.518) ≻ 𝐷3 (0.488) ≻ 𝐷1 (0.408) ≻ 𝐷2 (0.395)}; and 𝑈2: {𝐷4 (0.753) ≻ 𝐷1 (0.653) ≻ 𝐷3 (0.633) ≻ 𝐷2 (0.600)}. These values revealed that the


Table 6: The influential weights obtained through DANP.

Influential weights for factors (𝐶𝑗)/dimensions (𝐷𝑗):
Factors: 𝐶1 = 0.050, 𝐶2 = 0.049, 𝐶3 = 0.050, 𝐶4 = 0.053, 𝐶5 = 0.049, 𝐶6 = 0.052, 𝐶7 = 0.049, 𝐶8 = 0.055, 𝐶9 = 0.043, 𝐶10 = 0.052, 𝐶11 = 0.053, 𝐶12 = 0.045, 𝐶13 = 0.047, 𝐶14 = 0.055, 𝐶15 = 0.056, 𝐶16 = 0.050, 𝐶17 = 0.049, 𝐶18 = 0.051, 𝐶19 = 0.050, 𝐶20 = 0.043
Dimensions: 𝐷1 = 0.250, 𝐷2 = 0.251, 𝐷3 = 0.256, 𝐷4 = 0.243


Table 7: Sample questionnaire responses.

Factors | State of outcome (N/A, A, AU, AUP, AUPS) | Score
Experience (𝐶1) | N/A | 1
Training (𝐶2) | A | 2
Administrative capabilities (𝐶3) | AU | 3
Technical capabilities (𝐶4) | AUP | 4
Changes in work contents (𝐶5) | AUPS | 5

Note: “N/A” not available as score 1; “A” accepted as score 2; “AU” accepted and used as score 3; “AUP” accepted, used, and enhanced performance as score 4; “AUPS” accepted, used, and enhanced performance and satisfied all users as score 5.

Table 8: Gap indices obtained through the modified VIKOR method.

Dimension/factor | Influential weights (IWs): Local, Global | Performance values: 𝑈1, 𝑈2 | Size of gap to aspiration level: 𝑈1, 𝑈2
(Dimension rows list the dimension's local IW followed by its 𝑈1 and 𝑈2 gaps; factor rows list the local IW, global IW, 𝑈1 and 𝑈2 performance values, and 𝑈1 and 𝑈2 gaps.)

EVM users (𝐷1) 0.250 0.408 0.653
Experience (𝐶1) 0.201 0.050 3.350 1.944 0.413 0.764
Training (𝐶2) 0.196 0.049 3.750 2.667 0.313 0.583
Administrative capabilities (𝐶3) 0.198 0.050 3.550 2.722 0.363 0.569
Technical capabilities (𝐶4) 0.210 0.053 3.200 2.778 0.450 0.556

Changes in work contents (𝐶5) 0.195 0.049 3.000 1.833 0.500 0.792

EVM methodology (𝐷2) 0.251 0.395 0.600

WBS (𝐶6) 0.206 0.052 4.000 2.833 0.250 0.542

CPM (𝐶7) 0.196 0.049 3.500 1.722 0.375 0.819

IPT (𝐶8) 0.218 0.055 3.300 2.889 0.425 0.528

Computer system (𝐶9) 0.173 0.043 3.100 3.056 0.475 0.486

Integrated project management (𝐶10) 0.207 0.052 3.200 2.500 0.450 0.625

Implementation process (𝐷3) 0.256 0.488 0.633

Open communication (𝐶11) 0.205 0.053 3.000 3.056 0.500 0.486

Sufficient resources (𝐶12) 0.178 0.045 2.950 2.056 0.513 0.736

Top-down approach (𝐶13) 0.184 0.047 3.300 2.444 0.425 0.639

Integrated change control system (𝐶14) 0.215 0.055 3.350 2.500 0.413 0.625

Continuous improvement (𝐶15) 0.218 0.056 2.650 2.278 0.588 0.681

Project management environment (𝐷4) 0.243 0.518 0.753

Colleague-based work environment (𝐶16) 0.206 0.050 3.000 1.722 0.500 0.819

Ownership of EVM to lower level project managers (𝐶17) 0.201 0.049 2.900 2.278 0.525 0.681

Risk free (𝐶18) 0.208 0.051 2.800 1.833 0.550 0.792

Culture (𝐶19) 0.206 0.050 2.750 2.389 0.563 0.653

Regulations (𝐶20) 0.178 0.043 3.200 1.722 0.450 0.819

Gap indices 0.520 0.739

project management environment (𝐷4) was a problem that arose for both 𝑈1 and 𝑈2. In addition, with reference to the INRM, 𝐷4 (3.171, 0.060) was located in the cause group; thus, improvements in the project management environment (𝐷4) would have the greatest effects in terms of improving the other dimensions and the selected application decisions. Furthermore, the INRM (Figure 3) showed that all five factors under the project management environment (𝐷4) also belonged to the cause group: the colleague-based work environment, 𝐶16 (16.132, 0.091); ownership of EVM by lower level project managers, 𝐶17 (15.913, 0.241); being risk free, 𝐶18 (16.817, 0.616); culture, 𝐶19 (16.310, 0.305); and regulations, 𝐶20 (14.095, 0.245). These values suggested that all factors under the project management environment (𝐷4) should be accorded top priority for improvement and that the MND should be able to achieve the strongest improvement effects there. Additionally, by cross-referencing Table 8 and the INRM, the factors needing prior improvements in the respective units were identified as follows: 𝑈1: {sufficient resources (𝐶12) and open communication (𝐶11) in the dimension of


Table 9: The total influence given and received on dimensions and factors obtained through DEMATEL.

Dimension/factor | 𝑟𝑖 | 𝑐𝑖 | 𝑟𝑖 + 𝑐𝑖 | 𝑟𝑖 − 𝑐𝑖
EVM users (𝐷1) 1.574 1.600 3.174 −0.025
Experience (𝐶1) 8.416 8.046 16.463 0.370
Training (𝐶2) 8.265 7.821 16.086 0.445
Administrative capabilities (𝐶3) 6.928 7.925 14.853 −0.997
Technical capabilities (𝐶4) 7.984 8.418 16.402 −0.434

Changes in work contents (𝐶5) 7.768 7.785 15.552 −0.017

EVM methodology (𝐷2) 1.602 1.602 3.204 −0.001

WBS (𝐶6) 8.532 8.265 16.796 0.267

CPM (𝐶7) 7.040 7.850 14.890 −0.810

IPT (𝐶8) 8.269 8.745 17.014 −0.476

Computer system (𝐶9) 8.382 6.914 15.296 1.467

Integrated project management (𝐶10) 7.819 8.287 16.106 −0.467

Implementation process(𝐷3) 1.605 1.639 3.243 −0.034

Open communication (𝐶11) 8.660 8.393 17.053 0.266

Sufficient resources (𝐶12) 7.946 7.272 15.218 0.674

Top-down procedure (𝐶13) 8.148 7.523 15.671 0.625

Integrated change control system (𝐶14) 7.298 8.826 16.124 −1.528

Continuous improvement (𝐶15) 8.067 8.948 17.015 −0.881

Project management environment (𝐷4) 1.615 1.555 3.171 0.060

Colleague-based work environment (𝐶16) 8.111 8.020 16.132 0.091

Ownership of EVM to lower level project managers (𝐶17) 8.077 7.836 15.913 0.241

Risk free (𝐶18) 8.716 8.101 16.817 0.616

Culture (𝐶19) 8.308 8.003 16.310 0.305

Regulations (𝐶20) 7.170 6.925 14.095 0.245

implementation process (𝐷3)} and 𝑈2: {experience (𝐶1) in the dimension of EVM users (𝐷1) and sufficient resources (𝐶12) in the dimension of implementation process (𝐷3)}. These factors are classified as part of the cause group, and the size of their gaps is greater than that of the other factors. In a similar fashion, the improvement strategies were determined accordingly.

4.3. Discussions and Implications. Several critical results were derived from the above-described numerical example and from the discussion with the ET members concerning the EVM application. First, according to the DEMATEL results (Tables 5 and 9 and Figure 3), the interdependent relationships among the 20 factors and 4 dimensions can influence the aspired EVM application outcomes. This finding is consistent with the arguments made by many studies that a set of interinfluenced criteria would significantly influence effective EVM application and ultimately project performance [5, 11]. However, the DEMATEL technique can analyze, systemize, and visualize these interdependencies in a single picture, thus revealing the degree and direction of the interinfluence effects that each dimension and factor exerts on the others and on the aspired EVM application outcomes. Consequently, for users to be satisfied with the use of EVM to enhance their project performance, organizations require a deep understanding of these interrelationships when making application decisions. Additionally, using the DEMATEL technique can help managers to better analyze and understand interdependent application situations in detail.

Second, according to the results of the modified VIKOR method with the IWs of the DANP (Table 8), decisions regarding the MND's application of EVM may differ for different units in terms of their capabilities in managing different projects. The results confirm that the development of EVM elements and the wide acceptance of EVM worldwide may not guarantee that EVM application will be successful for all projects in all organizations. In other words, organizations must use a systematic procedure to thoroughly analyze application situations at different levels in order to make suitable application decisions for all units within an organization. The members of the ET emphasized that the numerical results from the modified VIKOR method and the DANP were essential for the MND, which had no prior experience in applying EVM and encountered many different application situations in each subordinate unit. If the HMCDM procedure had not been used, the application decisions would have been identical for all units once top management had made the decision to apply EVM.

Third, according to the DANP results (Table 6), among the 20 factors, continuous improvement (𝐶15), an integrated change control system (𝐶14), and an integrated project team (IPT) (𝐶8) are prioritized as the top three factors, with IWs of 0.056, 0.055, and 0.055, respectively. This result echoes the findings of the previously reviewed studies, indicating that EVM application is not merely the delivery of a system in an organization [11]. Rather, there is considerable potential for improvement, which includes

(14)

EVM users (D1)

EVM methodology (D2) Implementation process (Project management environment (D3) D

4) D4(3.171, 0.060) Gaps U1:0.518; U2:0.753 −0.04 −0.02 0.00 0.02 0.04 0.06 3.16 INRM_dimensions D1(3.174, −0.025) Gaps U1:0.408; U2:0.653 D2(3.204, −0.001) Gaps U1:0.395; U2:0.600 D3(3.243, −0.034) Gaps U1: 0.488; U2: 0.633 3.17 3.18 3.19 3.20 3.21 3.22 3.23 3.24 3.25 ri −ci ri+ ci (a) INRM_factors inD1 Experience (C1) Training (C2) Administrative capabilities (C3) Technical capabilities (C4)

change work contents (C5)

−1.04 −0.74 −0.44 −0.14 0.16 0.46 14.50 15.00 15.50 16.00 16.50 17.00 C2(16.086, 0.445) Gaps U1:0.313; U2:0.583 C4(16.402, −0.434) Gaps U1:0.450; U2:0.556 C3(14.855, −0.997) Gaps U1:0343; U2:0.569 C1(16.463, 0.370) Gaps U1:0.413; U2:0.764 C5(15.552, −0.017) Gaps U1:0.500; U2:0.792 ri −ci ri+ ci (b) ri −ci ri+ ci INRM_factors inD2 WBS (C6) CPM (C7) IPT (C8) ) Computer system (C9

Integrated project management (C10)

−0.90 −0.40 0.10 0.60 1.10 1.60 14.7 15.2 15.7 16.2 16.7 17.2 C9(15.296, 1.467) Gaps U1:0.475; U2:0.468 C6(16.796, 0.267) Gaps U1:0.250; U2:0.542 C10(16.106, −0.467) Gaps U1:0.450; U2:0.625 C7(14.890, −0.810) Gaps U1:0.374; U2:0.819 C8(17.014, −0.476) Gaps U1:0.425; U2:0.528 (c) ri −ci ri+ ci 0.70 15.10 15.60 16.10 16.60 17.10 C14(16.124, −1.528) Gaps U1:0.413; U2:0625 C12(15.218, 0.674) Gaps U1:0.513; U2:0.736 C15(17.015, −0.881) Gaps U1:0.588; U2:0.681 C13(15.671, 0.625) Gaps U1:0.425; U2:0.639 C11(17.503, 0.266) Gaps U1:0.500; U2:0.486 −1.80 −1.30 −0.80 −0.30 0.20 INRM_factors inD3 Sufficient resources (C12)

Integrated change control system (C14)

Top-down approach (C13) Open communication (C11) Continuous improvement (C15) (d) ri −ci ri+ ci INRM_factors inD4 Risk free (C18) Regulations (C20) Culture (C19)

Ownership of EVM to lower level project managers (C17)

Colleague-based work environment (C16)

C18(16.817, 0.616) Gaps U1:0.550; U2:0.792 C19(16.310, 0.305) Gaps U1:0.563; U2:0.653 C15(16.132, 0.091) Gaps U1:0.500; U2:0.819 C17(15.913, 0.241) Gaps U1:0.525; U2:0.681 C20(14.095, 0.245) Gaps U1:0.450; U2:0.819 13.90 14.40 14.90 15.40 15.90 16.40 16.90 0.00 0.30 0.60 (e) Figure 3: The INRM.

(15)

continuing to identify weaknesses in EVM and regard them as opportunities for improvements [5]. Additionally, accord-ing to the results of the modified VIKOR method (Table 8), each dimension/factor can create different sizes of gaps to impact aspired EVM application in each acquisition unit (alternative). However, the proposed procedure based on the HMCDM model, combining the DEMATEL technique, the DANP, and the modified VIKOR method, enables a cross-functional team to analyze capability gaps with respect to dimensions/factors of respective application units. Analyzing these gaps is useful in developing strategies to enable each application unit to take the most influential improvement actions to facilitate the EVM application decisions and to ensure the aspired results.
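For completeness, the influential weights (IWs) reported in Table 7 are, in the DANP, derived from the DEMATEL total-influence matrices of factors and dimensions. The following minimal sketch outlines the usual steps: block-normalize the factor matrix, transpose it into an unweighted supermatrix, weight it with the row-normalized dimension matrix, and take the limit supermatrix. The function name, grouping, convergence power, and any inputs are illustrative assumptions rather than the paper's 20 × 20 survey matrices.

import numpy as np

def danp_weights(T_C, T_D, groups, power=200):
    # Minimal DANP sketch (illustrative assumptions, not the paper's data).
    # T_C    : total-influence matrix of factors from DEMATEL.
    # T_D    : total-influence matrix of dimensions from DEMATEL.
    # groups : factor indices per dimension, e.g. [[0, 1], [2, 3, 4]].
    T_C, T_D = np.asarray(T_C, float), np.asarray(T_D, float)

    # 1. Normalize each dimension-to-dimension block of T_C by its row sums.
    T_C_norm = np.zeros_like(T_C)
    for gi in groups:
        for gj in groups:
            block = T_C[np.ix_(gi, gj)]
            rows = block.sum(axis=1, keepdims=True)
            T_C_norm[np.ix_(gi, gj)] = np.divide(
                block, rows, out=np.zeros_like(block), where=rows != 0)

    # 2. Unweighted supermatrix: transpose of the normalized factor matrix.
    W = T_C_norm.T

    # 3. Weight each block of W by the row-normalized dimension influence (transposed),
    #    which keeps every column of the weighted supermatrix summing to 1.
    T_D_norm = T_D / T_D.sum(axis=1, keepdims=True)
    Ww = np.zeros_like(W)
    for i, gi in enumerate(groups):
        for j, gj in enumerate(groups):
            Ww[np.ix_(gi, gj)] = T_D_norm[j, i] * W[np.ix_(gi, gj)]

    # 4. Limit supermatrix: raise to a high power; assuming convergence, every
    #    column approaches the same vector of influential weights.
    limit = np.linalg.matrix_power(Ww, power)
    return limit[:, 0]

# Usage (hypothetical shapes): danp_weights(T_C, T_D, groups=[[0, 1, 2], [3, 4]])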

Finally, based on the above example, we argue that without the full support and participation of the various units within an organization, the proposed approach could not have been applied in the pragmatic manner described above. In particular, in the MND case, it was essential to have a small ET (five to seven members) comprising genuine experts with full authorization from top management to handle the application project on a full-time basis. "Genuine experts" refers to experts who are committed to taking the appropriate actions when rendering their opinions and judgments regarding the EVM application. In addition, the end users who apply EVM must have progressive intentions to pursue performance improvement in their projects. Overall, the EVM application is not an easy task; it involves an array of interdependent variables that influence the application processes and outcomes. This example, however, has demonstrated that the procedure based on the HMCDM model, which combines the DEMATEL technique, the DANP, and the modified VIKOR method, can not only better address application problems but also readily identify the critical factors that are most influential in solving EVM application problems and achieving the aspiration level.

5. Conclusions

Although EVM has been widely accepted and applied to manage project performance in different types of organizations worldwide, many studies have indicated that a set of interdependent application factors can influence the EVM application process and outcomes. This study proposed a novel procedure, based on the HMCDM method, that enables organizations to obtain aspired outcomes through better decision-making and continuous improvements over the life of the application process.

A numerical example was used to demonstrate the applicability of the proposed procedure. The results showed the following merits of this study: (1) it measures both the interinfluence effects and the gap indices to support decision-making and continuous improvements in pursuing aspired EVM application outcomes; (2) the traditional concept of "effective EVM application" is extended from "illustrating success factors and analysis frameworks for decision-making" to "analyzing, selecting, and improving selected decisions over the application life cycle"; and (3) managers obtain a visualized route showing decision information at different levels within a decision framework, allowing EVM application to be adapted to the different application situations existing within the organization. These merits indicate that the proposed procedure can provide a significant foundation for ensuring that aspiration levels of EVM application are achieved at different levels in an organization.

This study has several limitations. First, the dimensions and factors used to establish the decision framework for the proposed procedure were obtained from a limited review of the literature; thus, this study may have excluded other potential influences on the decision process associated with effective EVM application. Further research could use other approaches, such as interviews or case studies, to select additional factors and to explore the differences and similarities between these approaches. Second, the conclusions drawn are based on a case from a national defense organization. Future research could therefore apply our procedure to other cases, such as organizations in the private sector, to examine it across a wider range of application situations and to make comparisons that yield additional insights into its usefulness. Finally, the improvement strategies determined from our procedure are a set of strategic guidelines. Future research can identify concrete improvement activities; this work can be characterized as an MODM problem, and future research can adopt the DINOV method with changeable objective and decision spaces to obtain more valuable improvement outcomes. These limitations provide directions for future research to broaden the applicability of the proposed procedure.

Competing Interests

The authors declare that they have no competing interests.

References

[1] PMI, A Guide to the Project Management Body of Knowledge, Project Management Institute, Newtown Square, Pa, USA, 5th edition, 2013.

[2] J. R. Meredith, S. M. Shafer, S. J. Mantel, and M. M. Sutton, Project Management in Practice, John Wiley & Sons, Hoboken, NJ, USA, 5th edition, 2013.

[3] J. K. Pinto, Project Management: Achieving Competitive Advantage, Pearson/Prentice Hall, Upper Saddle River, NJ, USA, 2007.

[4] J. Batselier and M. Vanhoucke, "Evaluation of deterministic state-of-the-art forecasting approaches for project duration based on earned value management," International Journal of Project Management, vol. 33, no. 7, pp. 1588–1596, 2015.

[5] Y. H. Kwak and F. T. Anbari, "History, practices, and future of earned value management in government: perspectives from NASA," Project Management Journal, vol. 43, no. 1, pp. 77–90, 2012.

[6] F. T. Anbari, “Earned value project management method and extensions,” Project Management Journal, vol. 34, no. 4, pp. 12– 23, 2003.

[7] Q. W. Fleming and J. M. Koppelman, Earned Value Project Management, Project Management Institute, Newtown Square, Pa, USA, 2nd edition, 2000.
