- 6.1 Sequencing Scenarios
- 6.2 Joint Implementation Strategies
- 6.3 Considerations for Staged and Continuous CMMI Representations
- 6.4 Considerations for Joint Deployment
- 6.5 Summary
6.4 Considerations for Joint Deployment
Fundamentally, CMMI deployment focuses on establishing and improving process infrastructure. In contrast, Six Sigma involves a portfolio of improvement solutions across many domains. Both require management sponsorship and trained change agents, but that sponsorship and those resources often reside at different levels of the organization or in different parts of the enterprise. When the two initiatives intersect, as is currently happening in many organizations, significant gaps emerge. Traditional Six Sigma practitioners often lack the software, systems, or IT experience that would enable them to see how to apply their toolkit in those domains, and they often lack sufficient awareness or understanding of the CMMI and other domain best practices to build their adoption into the Six Sigma project portfolio as a critical element. Similarly, CMMI implementers often lack the depth of analytical or cross-discipline experience needed to extrapolate from traditional Six Sigma examples into their own domain and to bridge gaps in communication and application.
To address this, we recommend attention to three fundamental elements of joint deployment:
- Shared organizational roles, particularly the primary change agents of each initiative
- Training that is designed to bridge gaps
- Synchronization of improvement project portfolios
Of the three, the first requires the least amount of explanation. It suggests that engineering process improvement experts should also hold expertise in Six Sigma and vice versa. If expertise in both topics is not held by the same people, the respective experts should work in the same group or otherwise have a seamless partnership. Either way, the objective is for the CMMI people and the Six Sigma people to have a shared sense of organizational mission and goals as well as a shared sense of responsibility to establish an improvement program that achieves its objectives.
One way to bridge gaps between different roles is to conduct cross-training—minimally at the awareness-building level and, for some, at the proficiency-building level.
- Manufacturing Six Sigma Black Belts who are trained in basic software development principles and in the CMMI will more easily recognize the "software factory" and the advantages of leveraging standards and models when establishing process infrastructure.
- CMMI implementers with Six Sigma training will more easily recognize how to achieve quantitatively managed processes and statistically managed subprocesses.
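To make the second bullet concrete: a "statistically managed subprocess" typically means a subprocess measure tracked on a control chart. The following is a minimal sketch, not from the text, of computing individuals/moving-range (XmR) control limits for an illustrative subprocess measure such as inspection defect density; the function name `xmr_limits` and the data values are hypothetical.

```python
# Hypothetical sketch: XmR (individuals and moving-range) control limits
# for a subprocess measure, e.g., inspection defect density per KLOC.
# Data values are illustrative, not taken from the text.

def xmr_limits(values):
    """Return (center, lcl, ucl) for an individuals (X) chart.

    Uses the standard XmR constant 2.66 = 3 / d2, where d2 = 1.128
    for moving ranges of subgroup size 2.
    """
    n = len(values)
    center = sum(values) / n
    # Moving ranges: absolute differences between consecutive observations.
    moving_ranges = [abs(values[i] - values[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    ucl = center + 2.66 * mr_bar
    lcl = max(0.0, center - 2.66 * mr_bar)  # density cannot be negative
    return center, lcl, ucl

defect_density = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2, 5.1, 3.6, 4.5]
center, lcl, ucl = xmr_limits(defect_density)
# Points outside the limits would signal an unstable subprocess.
out_of_control = [x for x in defect_density if x < lcl or x > ucl]
print(f"center={center:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}, signals={out_of_control}")
```

A CMMI implementer with Six Sigma training would recognize this kind of chart as one concrete path from a defined, measured subprocess to a statistically managed one.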
Such cross-training may require adaptations of existing training curricula. Numerous organizations have begun specializing or supplementing their Six Sigma curricula with domain-specific training. This allows the presentation and practice of domain-specific case studies, sparing trainees the leap from the use of analytical tools in manufacturing to their own disciplines. It also allows the inclusion of awareness sessions on domain-specific topics: special analysis considerations, such as non-normal distributions, as well as introductions to improvement technologies, such as the CMMI or software measurement best practices, or to architecture practices (the latter being particularly well suited to a DFSS curriculum). Conversely, CMMI training might include an awareness session about the relationships between the CMMI and Six Sigma.
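One "special analysis consideration" mentioned above is non-normality: software effort and defect data are frequently right-skewed, so normal-theory tools (t-tests, standard capability indices) can mislead. The following is an illustrative sketch, not from the text, of a simple distribution check; the function name `sample_skewness` and the data values are hypothetical.

```python
# Hypothetical sketch: check skewness before applying normal-theory tools.
# Software effort/defect data is often right-skewed; values are illustrative.
import math
import statistics

def sample_skewness(xs):
    """Adjusted Fisher-Pearson sample skewness, as reported by common
    statistics packages. Zero indicates a symmetric sample."""
    n = len(xs)
    mean = statistics.fmean(xs)
    m2 = sum((x - mean) ** 2 for x in xs) / n  # second central moment
    m3 = sum((x - mean) ** 3 for x in xs) / n  # third central moment
    g1 = m3 / m2 ** 1.5
    return g1 * math.sqrt(n * (n - 1)) / (n - 2)

# Right-skewed task-effort data (hours), typical of software measurements.
effort = [3, 4, 4, 5, 5, 6, 7, 9, 12, 20]
skew = sample_skewness(effort)
if abs(skew) > 1.0:
    print(f"skewness={skew:.2f}: consider a transform or nonparametric methods")
else:
    print(f"skewness={skew:.2f}: normal-theory tools may be reasonable")
```

A curriculum adapted for software would cover checks like this so that manufacturing-trained Black Belts do not apply normal-theory recipes to data that violates their assumptions.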
Shared roles and cross-training enable synchronized project portfolios. One risk of not synchronizing is competition for resources, particularly funding. In the worst case, CMMI-oriented projects get shortchanged, and Six Sigma projects are launched on the false presumption that instrumented software processes stand ready for Six Sigma improvement: a mutually disabling situation. One way to bootstrap synchronization is to create a project identification and definition process that leverages both the CMMI and Six Sigma for greater efficiency. This approach might include the following features.
- The use of Six Sigma methods to transform fuzzy problem statements into quantitative improvement objectives against which specific improvement projects (including those serving CMMI goals) can be launched. An example of this type of project is included in Section 9.2.
- The recognition that enabling projects are needed to establish the processes and measures required for subsequent Six Sigma efforts, and the incorporation of CMMI and other domain-model implementation as a critical driver of the enabling project portfolio. These enabling projects might be executed via the aforementioned strategy of implementing CMMI process areas as formal Six Sigma projects.
Specific approaches for project portfolio identification and alignment with mission are discussed in Chapter 8, as part of the general case of multimodel process improvement.
6.4.1 Motorola Retrospective: Integrated Training Curriculum
At Motorola, the fourth week of the traditional Six Sigma DMAIC Black Belt program involved choosing one of four discipline-specific training options: Software, Hardware, Transaction, or Manufacturing (refer back to Table 5-2). This approach enabled cross-discipline classes during the first three weeks of DMAIC training, which both bridged the communication gap on the methods and enabled some degree of cross-training between disciplines.
The Design for Six Sigma Black Belt program provided a similar type of cross-discipline training by training entire product development teams in a single class. Table 6-1 shows the initial customized DFSS curriculum, which included a number of SEI technologies and training in the CMMI as well as software-specific topics. At times during the training, the class broke out into side rooms to cover the discipline-specific topics relevant to that point in the Design for Six Sigma methodology.
Table 6-1. Motorola DFSS Curriculum Tailored for Software
| Week 1 Topics | Week 2 Topics | Week 3 Topics |
| --- | --- | --- |
| DFSS Overview; CDOV Process; DFSS tools and project management; Voice of the customer/KJ analysis; QFD; First Principle Modeling (Monte Carlo); Pugh; DFSS Scorecards; Six Sigma and CMMI synergies; Parametric SW project forecasting; Requirements management processes; Developing SW operational profiles; SW Quality Attribute Workshops; Attribute-Driven SW Architecture; Active Reviews for Intermediate Designs; SW Architecture Tradeoff Analysis Method (ATAM); Cost/benefit analysis of architecture decisions; Software product line planning and execution | Critical parameter management; DFMEA; Basic statistics (statistics package); Hypothesis testing; Confidence intervals; ANOVA; MSA; SPC; Design and process statistical capability analysis; Design of Experiments (DOE); Full factorial designs; Fractional factorial designs; Modeling; Advanced DOE | Linear and multiple regression; RSM; Monte Carlo; Robust design; Tolerance optimization; CPM; Architecture- and design-based software reliability modeling; Software reliability growth testing and modeling; Motorola Lab's TRAMS (test planning using fuzzy logic); Taguchi noise testing; Small memory management; Throughput and timing analysis; Orthogonal Defect Classification; Advanced SW inspection; Human error analysis; Cleanroom Software Engineering; Agile/Extreme Programming; SEI Personal and Team Software Process and relationships to DFSS; Usability engineering |
6.4.2 Motorola Retrospective: Roles and Responsibilities
Table 6-2 depicts software engineering and management functional roles and job categories mapped to the recommended software DFSS curriculum, as well as recommended supplementary training in software practices. The champion training brought senior management up to speed on the concepts taught to the DFSS Black Belts so that the champions could help create demand for the use of the DFSS toolkit. The gatekeeper training brought senior and middle management up to speed on the DFSS product team scoring process and additional probing questions to ask during the MGates management reviews.
Table 6-2. Motorola Role-Specific Training Plan
| Functional Roles | Job Categories | DFSS Training for Software Engineering | Primary Responsibility in DFSS Deployment for Software Engineering |
| --- | --- | --- | --- |
| Management and leadership | Senior resource managers; Senior software managers; Senior test managers; Gate review team members; Operations managers; Software directors; Senior software architects | Attend champion training (1 day for each of the DFSS Weeks 1–3 for software engineering); Attend gatekeeper training | Serve as gatekeepers in DFSS gate reviews for software engineering; Use software scorecard measures and explanations; Require DFSS training for software engineering within the organization; Ensure sufficient SW Green, Black, and Master Black Belts exist |
| Feature team leaders and working-level architects | Test team managers and leaders; Feature team leaders; First-line software supervisors; Senior technical experts and architects | Attend 3 weeks of DFSS training for software engineering; Attend software specialty training as required | Ensure the timely use of DFSS tasks, tools, and deliverables for software engineering; Present data at DFSS gate reviews for software engineering; Identify team members for follow-on in-depth training |
| Specialists | Designers; Programmers; Testers | Attend 3 weeks of DFSS training for software engineering when selected; Attend follow-on training as requested by management | Perform tasks in a timely fashion |
Figure 6-5 shows examples of supplementary software specialty training, targeted to specific job categories. Depending on the role, this supplementary training spanned the CMMI as well as the SEI's architecture and product line practice technologies.
Figure 6-5 Motorola software specialty training to supplement the DFSS curriculum for software engineering