
Capability Maturity Models

The capability maturity models of the SEI are a related group of strategies for improving the software process, irrespective of the actual life-cycle model used. (The term maturity is a measure of the goodness of the process itself.) The SEI has developed CMMs for software (SW–CMM), for management of human resources (P–CMM; the P stands for “people”), for systems engineering (SE–CMM), for integrated product development (IPD–CMM), and for software acquisition (SA–CMM). There are some inconsistencies between the models and an inevitable level of redundancy. Accordingly, in 1997, it was decided to develop a single integrated framework for maturity models, capability maturity model integration (CMMI), which incorporates all five existing capability maturity models. Additional disciplines may be added to CMMI in the future [SEI, 2002].

For reasons of space, only one capability maturity model, SW–CMM, is examined here, and an overview of the P–CMM is given in Section 4.8. The SW–CMM was first put forward in 1986 by Watts Humphrey [Humphrey, 1989]. Recall that a software process encompasses the activities, techniques, and tools used to produce software. It therefore incorporates both technical and managerial aspects of software production. Underlying the SW–CMM is the belief that the use of new software techniques in itself will not result in increased productivity and profitability, because our problems are caused by how we manage the software process. The strategy of the SW–CMM is to improve the management of the software process in the belief that improvements in technique are a natural consequence. The resulting improvement in the process as a whole should result in better-quality software and fewer software projects that suffer from time and cost overruns.

Bearing in mind that improvements in the software process cannot occur overnight, the SW–CMM induces change incrementally. More specifically, five levels of maturity are defined, and an organization advances slowly in a series of small evolutionary steps toward the higher levels of process maturity [Paulk, Weber, Curtis, and Chrissis, 1995]. To understand this approach, the five levels now are described.

Maturity Level 1. Initial Level

At the initial level, the lowest level, essentially no sound software engineering management practices are in place in the organization. Instead, everything is done on an ad hoc basis. A specific project that happens to be staffed by a competent manager and a good software development team may be successful. However, the usual pattern is time and cost overruns caused by a lack of sound management in general and planning in particular. As a result, most activities are responses to crises rather than preplanned tasks. In level-1 organizations, the software process is unpredictable, because it depends totally on the current staff; as the staff changes, so does the process. As a consequence, it is impossible to predict with any accuracy such important items as the time it will take to develop a product or the cost of that product.

It is unfortunate that the vast majority of software organizations all over the world are still level-1 organizations.

Maturity Level 2. Repeatable Level

At the repeatable level, basic software project management practices are in place. Planning and management techniques are based on experience with similar products; hence, the name repeatable. At level 2, measurements are taken, an essential first step in achieving an adequate process. Typical measurements include the meticulous tracking of costs and schedules. Instead of functioning in a crisis mode, as in level 1, managers identify problems as they arise and take immediate corrective action to prevent them from becoming crises.

The key point is that, without measurements, it is impossible to detect problems before they get out of hand. Also, measurements taken during one project can be used to draw up realistic duration and cost schedules for future projects.
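To make the level-2 idea of measurement concrete, the following sketch (in Python, with hypothetical project names and figures; none of this comes from the SW–CMM itself) records planned versus actual cost and schedule for completed projects, flags deviations so that managers can act before they become crises, and reuses historical productivity to draw up a realistic duration estimate for a new project.

from dataclasses import dataclass

@dataclass
class ProjectRecord:
    name: str
    size_kloc: float          # delivered size, thousands of lines of code
    planned_months: float
    actual_months: float
    planned_cost: float
    actual_cost: float

def schedule_deviation(p: ProjectRecord) -> float:
    # Fractional schedule overrun; a positive value means the project ran late.
    return (p.actual_months - p.planned_months) / p.planned_months

def estimate_duration(history: list[ProjectRecord], new_size_kloc: float) -> float:
    # Apply average historical productivity (months per KLOC) to the new size.
    months_per_kloc = sum(p.actual_months / p.size_kloc for p in history) / len(history)
    return new_size_kloc * months_per_kloc

history = [
    ProjectRecord("billing", 20.0, 10.0, 12.0, 400_000, 470_000),
    ProjectRecord("inventory", 35.0, 16.0, 17.5, 650_000, 700_000),
]

for p in history:
    if schedule_deviation(p) > 0.10:      # more than 10 percent late
        print(f"{p.name}: schedule slipping, corrective action needed")

print(f"Estimated duration for a 25 KLOC project: "
      f"{estimate_duration(history, 25.0):.1f} months")

The particular 10 percent threshold and the size-based estimate are only illustrative; the point is that, once such measurements exist, both the deviation check and the estimate become routine calculations rather than guesses.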

Maturity Level 3. Defined Level

At the defined level, the process for software production is fully documented. Both the managerial and technical aspects of the process are clearly defined, and continual efforts are made to improve the process wherever possible. Reviews (Section 6.2) are used to achieve software quality goals. At this level, it makes sense to introduce new technology, such as CASE environments (Section 5.8), to increase quality and productivity further. In contrast, “high tech” only makes the crisis-driven level-1 process even more chaotic.

Although a number of organizations have attained maturity levels 2 and 3, few have reached levels 4 or 5. The two highest levels therefore are targets for the future.

Maturity Level 4. Managed Level

A managed-level organization sets quality and productivity goals for each project. These two quantities are measured continually, and corrective action is taken when there are unacceptable deviations from the goals. Statistical quality controls ([Deming, 1986], [Juran, 1988]) are in place to enable management to distinguish a random deviation from a meaningful violation of quality or productivity standards. (A simple example of a statistical quality control measure is the number of faults detected per 1000 lines of code. A corresponding objective is to reduce this quantity over time.)
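As a concrete illustration (a minimal sketch in Python with invented fault counts, not data from the text), the fault density of each new component can be compared against control limits derived from earlier releases; only a value outside, say, three standard deviations of the historical mean is treated as a meaningful violation rather than random variation.

from statistics import mean, stdev

def fault_density(faults: int, lines_of_code: int) -> float:
    # Faults detected per 1000 lines of code (faults per KLOC).
    return faults / (lines_of_code / 1000)

# Historical fault densities (faults per KLOC) from earlier releases.
baseline = [4.1, 3.8, 4.4, 3.9, 4.2, 4.0]
centre, sigma = mean(baseline), stdev(baseline)
upper_control_limit = centre + 3 * sigma

new_components = {"parser": (9, 2000), "scheduler": (31, 3500)}
for name, (faults, loc) in new_components.items():
    density = fault_density(faults, loc)
    verdict = ("meaningful violation" if density > upper_control_limit
               else "within control limits")
    print(f"{name}: {density:.1f} faults/KLOC ({verdict})")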

Maturity Level 5. Optimizing Level

The goal of an optimizing-level organization is continuous process improvement. Statistical quality and process control techniques are used to guide the organization. The knowledge gained from each project is utilized in future projects. The process therefore incorporates a positive feedback loop, resulting in a steady improvement in productivity and quality.


These five maturity levels are summarized in Figure 3.3, which also shows the key process areas (KPAs) associated with each maturity level. To improve its software process, an organization first attempts to gain an understanding of its current process and then formulates the intended process. Next, actions to achieve this process improvement are determined and ranked in priority. Finally, a plan to accomplish this improvement is drawn up and executed. This series of steps is repeated, with the organization successively improving its software process; this progression from level to level is reflected in Figure 3.3. Experience with the capability maturity model has shown that advancing a complete maturity level usually takes from 18 months to 3 years, but moving from level 1 to level 2 can sometimes take 3 or even 5 years. This is a reflection of how difficult it is to instill a methodical approach in an organization that up to now has functioned on a purely ad hoc and reactive basis.

FIGURE 3.3 The five levels of the software capability maturity model and their key process areas (KPAs).

1. Initial level — Ad hoc process:
Not applicable

2. Repeatable level — Basic project management:
Requirements management; Software project planning; Software project tracking and oversight; Software subcontract management; Software quality assurance; Software configuration management

3. Defined level — Process definition:
Organization process focus; Organization process definition; Training program; Integrated software management; Software product engineering; Intergroup coordination; Peer reviews

4. Managed level — Process measurement:
Quantitative process management; Software quality management

5. Optimizing level — Process control:
Defect prevention; Technology change management; Process change management


For each maturity level, the SEI has highlighted a series of key process areas (KPAs) that an organization should target in its endeavor to reach the next maturity level. For example, as shown in Figure 3.3, the KPAs for level 2 (repeatable level) include configuration management (Section 5.10), software quality assurance (Section 6.1.1), project planning (Chapter 9), project tracking (Section 9.2.5), and requirements management (Chapter 11). These areas cover the basic elements of software management: Determine the client’s needs (requirements management), draw up a plan (project planning), monitor deviations from that plan (project tracking), control the various pieces that make up the software product (configuration management), and ensure that the product is fault free (quality assurance). Within each KPA is a group of between two and four related goals that, if achieved, result in that maturity level being attained. For example, one project planning goal is the development of a plan that appropriately and realistically covers the activities of software development.
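One way to see how the level-to-KPA mapping might be used is the short sketch below (Python; the data are transcribed from Figure 3.3, but the function and variable names are illustrative only, not part of the SW–CMM): given an organization’s current maturity level, it lists the KPAs to target in order to reach the next level.

KPAS = {
    2: ["Requirements management", "Software project planning",
        "Software project tracking and oversight",
        "Software subcontract management", "Software quality assurance",
        "Software configuration management"],
    3: ["Organization process focus", "Organization process definition",
        "Training program", "Integrated software management",
        "Software product engineering", "Intergroup coordination",
        "Peer reviews"],
    4: ["Quantitative process management", "Software quality management"],
    5: ["Defect prevention", "Technology change management",
        "Process change management"],
}

def target_kpas(current_level: int) -> list[str]:
    # KPAs an organization should address to advance one maturity level.
    return KPAS.get(current_level + 1, [])

print(target_kpas(1))    # a level-1 organization targets the level-2 KPAs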

At the highest level, maturity level 5, the KPAs include fault prevention, technology change management, and process change management. Comparing the KPAs of the two levels, it is clear that a level-5 organization is far in advance of one at level 2. For example, a level-2 organization is concerned with software quality assurance, that is, with detecting and correcting faults (software quality is discussed in more detail in Chapter 6). In contrast, the process of a level-5 organization incorporates fault prevention, that is, trying to ensure that no faults are in the software in the first place. To help an organization to reach the higher maturity levels, the SEI has developed a series of questionnaires that form the basis for an assessment by an SEI team. The purpose of the assessment is to highlight current shortcomings in the organization’s software process and to indicate ways in which the organization can improve its process.

The CMM program of the Software Engineering Institute was sponsored by the U.S. Department of Defense. One of the original goals of the CMM program was to raise the quality of defense software by evaluating the processes of contractors who produce software for the DoD and awarding contracts to those contractors who demonstrate a mature process. The U.S. Air Force stipulated that any software development organization that wished to be an Air Force contractor had to conform to SW–CMM level 3 by 1998, and the DoD as a whole subsequently issued a similar directive. Consequently, pressure is put on organizations to improve the maturity of their software processes. However, the SW–CMM program has moved far beyond the limited goal of improving DoD software and is being implemented by a wide variety of software organizations that wish to improve software quality and productivity.