This book addresses many new topical areas in the development of 6 Sigma performance. The text is structured to demonstrate how 6 Sigma methods can be used as a powerful tool within system engineering and integration evaluations to help enable the process of Critical Parameter Management. The case studies and examples used throughout the book come from recent successful applications of the material developed in the text.
Foreword.
Preface.
Acknowledgments.
I. INTRODUCTION TO ORGANIZATIONAL LEADERSHIP, FINANCIAL PERFORMANCE, AND VALUE MANAGEMENT USING DESIGN FOR SIX SIGMA.
1. The Role of Executive and Management Leadership in Design for Six Sigma. Leadership Focus on Product Development as Key Business Process. The Strategic View of Top-Line Growth. Enabling Your Product Development Process to Have the Ability to Produce the Right Data, Deliverables, and Measures of Risk within the Context of Your Phase/Gate Structure. Executive Commitment to Driving Culture Change. Summary. References.
2. Measuring Financial Results from DFSS Programs and Projects. A Case Study. Deploying the Voice of the Customer. DFSS Project Execution Efficiency. Production Waste Minimization. Pricing and Business Risk.
3. Managing Value with Design for Six Sigma. Extracting Value. Value as a Formula. Measuring Value in the Marketplace. Identifying the Purposes of Design. Design Based on the Voice of the Customer. Putting Concept Engineering to Work. References.
II. INTRODUCTION TO THE MAJOR PROCESSES USED IN DESIGN FOR SIX SIGMA IN TECHNOLOGY AND PRODUCT DEVELOPMENT.
4. Management of Product Development Cycle-Time. The Product Development Process Capability Index. Product Development Process. Project Management. The Power of PERT Charts. References.
5. Technology Development Using Design for Six Sigma. The I2DOV Roadmap: Applying a Phase/Gate Approach to Technology Development. I2DOV and Critical Parameter Management during the Phases and Gates of Technology Development. I2DOV Phase 1: Invention and Innovation. I2DOV Phase 2: Develop Technology Concept Definition, Stabilization, and Functional Modeling. I2DOV Phase 3: Optimization of the Robustness of the Subsystem Technologies. I2DOV Phase 4: Certification of the Platform or Subsystem Technologies. References.
6. Product Design Using Design for Six Sigma. An Introduction to Phases and Gates. Preparing for Product Commercialization. Defining a Generic Product Commercialization Process Using CDOV Roadmap. The CDOV Process and Critical Parameter Management during the Phases and Gates of Product Commercialization. CDOV Phase 2: Subsystem Concept and Design Development. CDOV Phase 3A: Optimizing Subsystems. CDOV Phase 3B: System Integration. CDOV Phase 4A: Verification of Product Design Functionality. CDOV Phase 4B: Verification of Production. References.
7. System Architecting, Engineering, and Integration Using Design for Six Sigma. Phase 1: System Concept Development. Phase 2: Subsystem, Subassembly, Component, and Manufacturing Concept Design. Phase 3A: Subsystem Robustness Optimization. Phase 3B: System Integration. Phase 4A: Final Product Design Certification. Phase 4B: Production Verification. References.
III. INTRODUCTION TO THE USE OF CRITICAL PARAMETER MANAGEMENT IN DESIGN FOR SIX SIGMA IN TECHNOLOGY AND PRODUCT DEVELOPMENT.
8. Introduction to Critical Parameter Management. Winning Strategies. Focus on System Performance. Data-Driven Process. The Best of Best Practices. Reference.
9. The Architecture of the Critical Parameter Management Process. Who Constructs the CPM Process? Timing Structure of the CPM Process: Where and When Does CPM Begin? What Are the Uses of CPM? Reference.
10. The Process of Critical Parameter Management in Product Development. Definitions of Terms for Critical Parameter Management. Critical Parameter Management in Technology Development and Product Design: Phase 1. Phase 2 in Technology Development or Product Design. Phases 3 and 4: Stress Testing and Integration. Capability Summaries and the Capability Growth Index.
11. The Tools and Best Practices of Critical Parameter Management. The Rewards of Deploying Proven Tools and Best Practices. Critical Parameter Management Best Practices for Technology Development. Critical Parameter Management Best Practices for Product Commercialization.
12. Metrics for Engineering and Project Management Within CPM. Key CPM Metrics. Statistical Metrics of CPM. The Capability Growth Index and the Phases and Gates of Technology Development and Product Commercialization.
13. Data Acquisition and Database Architectures in CPM. Instrumentation, Data Acquisition, and Analysis in CPM. Databases: Architectures and Use in CPM. References.
IV. TOOLS AND BEST PRACTICES FOR INVENTION, INNOVATION, AND CONCEPT DEVELOPMENT.
14. Gathering and Processing the Voice of the Customer: Customer Interviewing and the KJ Method. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of this Phase of the Process? Gathering and Processing the Voice of the Customer Process Flow Diagram. Verbal Descriptions of the Application of Each Block Diagram. VOC Gathering and Processing Checklist and Scorecard. References.
15. Quality Function Deployment: The Houses of Quality. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of this Phase of the Process? QFD Process Flow Diagram. Verbal Descriptions for the Application of Each Block Diagram. QFD Checklist and Scorecards. References.
16. Concept Generation and Design for x Methods. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of this Phase of the Process? Concept Generation and DFx Process Flow Diagram. Verbal Descriptions for the Application of Each Block Diagram. Concept Generation and DFx Checklist and Scorecards. References.
17. The Pugh Concept Evaluation and Selection Process. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of this Phase of the Process? The Pugh Concept Selection Process Flow Diagram. Verbal Descriptions for the Application of Each Block Diagram. Pugh Concept Selection Process Checklist and Scorecard. References.
18. Modeling: Ideal/Transfer Functions, Robustness Additive Models, and the Variance Model. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of this Phase of the Process? Modeling Process Flow Diagram. Verbal Descriptions for the Application of Each Block Diagram. Modeling Checklist and Scorecard. References.
V. TOOLS AND BEST PRACTICES FOR DESIGN DEVELOPMENT.
19. Design Failure Modes and Effects Analysis. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of this Phase of the Process? The DFMEA Flow Diagram. Verbal Descriptions for the Application of Each Block Diagram. DFMEA Checklist and Scorecard. References.
20. Reliability Prediction. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of this Phase of the Process? The Reliability Prediction Flow Diagram. Applying Each Block Diagram Within the Reliability Prediction Process. Reliability Prediction Checklist and Scorecard. References.
21. Introduction to Descriptive Statistics. Where Am I in the Process? What Output Do I Get from Using Descriptive Statistics? What Am I Doing in the Process? Descriptive Statistics Review and Tools.
22. Introduction to Inferential Statistics. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get from Using Inferential Statistics? Inferential Statistics Review and Tools. References.
23. Measurement Systems Analysis. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of Measurement Systems Analysis? MSA Process Flow Diagram. Verbal Descriptions for the Application of Each Block Diagram. MSA Checklist and Scorecard. References.
24. Capability Studies. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of a Capability Study? Capability Study Process Flow Diagram. Verbal Descriptions of the Application of Each Block Diagram. Capability Study Checklist and Scorecard. References.
25. Multi-Vari Studies. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of this Phase of the Process? Multi-Vari Study Process Flow Diagram. Verbal Descriptions for the Application of Each Block Diagram. Multi-Vari Study Checklist and Scorecard. Reference.
26. Regression. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of this Phase of the Process? Regression Process Flow Diagram. Verbal Descriptions for the Application of Each Block Diagram. Regression Checklist and Scorecard. References.
27. Design of Experiments. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of this Phase of the Process? DOE Process Flow Diagram. Verbal Descriptions for the Application of Each Block Diagram. DOE Checklist and Scorecard. Reference.
VI. TOOLS AND BEST PRACTICES FOR OPTIMIZATION.
28. Taguchi Methods for Robust Design. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of Robust Design? The Robust Design Process Flow Diagram. Verbal Descriptions for the Application of Each Block Diagram. Robust Design Checklist and Scorecard. References.
29. Response Surface Methods. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of RSM? RSM Process Flow Diagram. Verbal Descriptions of the Application of Each Block Diagram. RSM Checklist and Scorecard. Reference.
30. Optimization Methods. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of Optimization? Optimization Process Flow Diagram. Verbal Descriptions of the Application of Each Block Diagram. Optimization Checklist and Scorecard. References.
VII. TOOLS AND BEST PRACTICES FOR VERIFYING CAPABILITY.
31. Analytical Tolerance Design. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of Analytical Tolerance Design? The Analytical Tolerance Design Flow Diagram. Verbal Descriptions for the Application of Each Block Diagram. Analytical Tolerance Design Checklist and Scorecard. References.
32. Empirical Tolerance Design. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of Empirical Tolerance Design? The Empirical Tolerance Design Flow Diagram. Verbal Descriptions for the Application of Each Block Diagram. Empirical Tolerance Design Checklist and Scorecard. Reference.
33. Reliability Evaluation. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of Reliability Evaluations? The Reliability Evaluation Flow Diagram. Detailed Descriptions for the Application of Each Block Diagram. Reliability Evaluation Checklist and Scorecard. References.
34. Statistical Process Control. Where Am I in the Process? What Am I Doing in the Process? What Output Do I Get at the Conclusion of SPC? SPC Process Flow Diagram. Verbal Descriptions of the Application of Each Block Diagram. SPC Checklist and Scorecard. References.
In its simplest sense, DFSS consists of a set of needs-gathering, engineering, and statistical methods to be used during product development. These methods are to be embedded within the organization's product development process (PDP). Engineering determines the physics and technology to be used to carry out the product's functions. DFSS ensures that those functions meet the customer's needs and that the chosen technology will perform those functions in a robust manner throughout the product's life.
DFSS does not replace current engineering methods, nor does it relieve an organization of the need to pursue excellence in engineering and product development. DFSS adds another dimension to product development, called Critical Parameter Management (CPM). CPM is the disciplined and focused attention to the design's functions, parameters, and responses that are critical to fulfilling the customer's needs. This focus is maintained by the development team throughout the product development process from needs gathering to manufacture. Manufacturing then continues CPM throughout production and support engineering. Like DFSS, CPM is conducted throughout and embedded within the PDP. DFSS provides most of the tools that enable the practice of CPM. In this light, DFSS is seen to coexist with and add to the engineering practices that have been in use all along.
DFSS is all about preventing problems and doing the right things at the right time during product development. From a management perspective, it is about designing the right cycle-time for the proper development of new products. It helps in the process of inventing, developing, optimizing, and transferring new technology into product design programs. It also enables the subsequent conceptual development, design, optimization, and verification of new products prior to their launch into their respective markets.
The DFSS methodology is built upon a balanced portfolio of tools and best practices that enable a product development team to develop the right data to achieve the following goals:
DFSS is managed through an integrated set of tools that are deployed within the phases of a product development process. It delivers qualitative and quantitative results that are summarized in scorecards in the context of managing critical parameters against a clear set of product requirements based on the "voice of the customer." In short, it develops clear requirements and measures their fulfillment in terms of 6 sigma standards.
A design with a critical functional response (for example, a desired pressure or an acoustical sound output) that can be measured and compared to upper and lower specification limits relating back to customer needs would look like the following figure if it had 6 sigma performance.
The dark black arrows between the control limits (UCL and LCL, known as natural tolerances, set at ±3 standard deviations of a distribution that is under statistical control) and the specification limits (USL and LSL, known as VOC-based performance tolerances) indicate design latitude that is representative of 6 sigma performance. That is to say, there are 3 standard deviations of latitude on each side of the control limit out to the specification limit to allow for shifts in the mean and broadening of the distribution. In this sense, the customer will not feel the variability quickly. If the product or process is adjustable, there is an opportunity to put the mean back onto the VOC-based performance target or to return the distribution to its desired width within its natural tolerances. If the latitude is representative of a function that is not serviceable or adjustable, then the latitude is suggestive of the reliability of the function if the drift off target or distribution broadening is measured over time. In this case, the Cp (short-term distribution broadening with no mean shift) and Cpk (both mean shifting and distribution broadening over long periods of time) metrics can be clear indicators of a design's robustness (insensitivity to sources of variation) over time. DFSS uses capability metrics to aid in the development of critical product functions throughout the phases and gates of a product development process.
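To make the capability language above concrete, the following short Python sketch (our illustration, not material from the book) computes Cp and Cpk for a measured critical functional response; the data and specification limits are hypothetical. A Cp of about 2 corresponds to the 6 sigma latitude just described.

import numpy as np

def capability_indices(samples, lsl, usl):
    """Short-term Cp and Cpk for a measured critical functional response."""
    mean = np.mean(samples)
    sigma = np.std(samples, ddof=1)                  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)                   # spec window versus +/- 3 sigma of the process
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)  # penalizes a mean that has drifted off target
    return cp, cpk

# Hypothetical pressure data; the units, target, and limits are illustrative only.
rng = np.random.default_rng(1)
pressure = rng.normal(loc=100.0, scale=1.0, size=200)
cp, cpk = capability_indices(pressure, lsl=94.0, usl=106.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")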
Much more will be said about the metrics of DFSS in later chapters. Let's move on to discuss the higher level business issues as they relate to deploying DFSS in a company.
At the highest level, any business that wants to excel at product development must have the following three elements in strategic alignment:
Design For Six Sigma fits within the context of a key business process, namely the product development process. DFSS encompasses many tools and best practices that can be selectively deployed during the phases of a product development process. Specifically, DFSS integrates three major tactical elements to help attain the ubiquitous business goals of low cost, high quality, and rapid cycle-time from product development:
The product development process controls the macro-timing of what to do and when to do it using a flexible structure of phases and gates. A balanced portfolio of tools and best practices defines what to do within each phase of the product development process. The disciplined application of project management, in the form of PERT charts of work breakdown structures, defines the micro-timing for the critical path of applying tools and best practices within each phase.
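As an illustration of that micro-timing idea, here is a small Python sketch (ours; the task names, durations, and dependencies are invented) of the critical-path calculation that a PERT chart of a work breakdown structure encodes.

from functools import lru_cache

# Hypothetical work breakdown structure: task -> (duration in days, predecessors)
tasks = {
    "VOC gathering":     (10, []),
    "Concept selection": (15, ["VOC gathering"]),
    "Robustness DOE":    (20, ["Concept selection"]),
    "Tolerance design":  (12, ["Concept selection"]),
    "Capability study":  (8,  ["Robustness DOE", "Tolerance design"]),
}

@lru_cache(maxsize=None)
def earliest_finish(task):
    duration, predecessors = tasks[task]
    return duration + max((earliest_finish(p) for p in predecessors), default=0)

end_task = max(tasks, key=earliest_finish)
print("Minimum phase duration:", earliest_finish(end_task), "days")

# Walk back through the pacing predecessors to recover the critical path.
path, node = [end_task], end_task
while tasks[node][1]:
    node = max(tasks[node][1], key=earliest_finish)
    path.append(node)
print("Critical path:", " -> ".join(reversed(path)))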
DFSS works equally well in technology development organizations and in product design organizations. This book will demonstrate complete approaches to applying DFSS in both a technology development process and a product design process.
The metrics of DFSS break down into three categories:
DFSS is focused on CPM. This is done to identify the few variables that dominate the development of baseline performance (Yavg.), the optimization of robust performance (S/N and δ), and the certification of capable performance (Cp and Cpk) of the integrated system of designed parameters. DFSS instills a system integration mind-set. It looks at all parameters, within the product and the processes that make it, as being important to the integrated performance of the system elements, but only a few are truly critical.
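As one concrete example of the robustness metric mentioned above, the following sketch (ours, with hypothetical replicated data) computes the Taguchi nominal-the-best signal-to-noise ratio; larger values indicate a response that is less sensitive to noise.

import numpy as np

def sn_nominal_the_best(y):
    """Taguchi nominal-the-best S/N = 10*log10(mean^2 / variance); larger is more robust."""
    y = np.asarray(y, dtype=float)
    return 10 * np.log10(y.mean() ** 2 / y.var(ddof=1))

# Hypothetical replicated measurements of a critical functional response.
print(sn_nominal_the_best([99.1, 100.4, 100.0, 99.6, 100.9]))  # roughly 43 dB for this tight cluster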
DFSS starts with a sound business strategy and its set of goals and, on that basis, flows down to the very lowest levels of the design and manufacturing process variables that deliver on those goals. To get any structured product development process headed in the right direction, DFSS must flow in the following manner:
DFSS tools are then used to create a hierarchy of requirements down from the system level to the subsystems, subassemblies, components, and manufacturing processes. Once a clear and linked set of requirements is defined, DFSS uses CPM to measure and track the capability of the evolving set of Ys and xs that comprise the critical functional parameters governing the performance of the system. At this point DFSS drives a unique synergy between engineering design principles and applied statistical analysis methods. DFSS is not about statistics; it is about product development using statistically enabled engineering methods and metrics.
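To illustrate the kind of requirement flow-down that CPM tracks, here is a minimal Python sketch (ours; the parameter names, levels, targets, and limits are invented) of a hierarchy linking a system-level response to the lower-level parameters that govern it.

from dataclasses import dataclass, field

@dataclass
class CriticalParameter:
    name: str
    level: str        # "system", "subsystem", "subassembly", "component", or "process"
    target: float
    lsl: float
    usl: float
    children: list["CriticalParameter"] = field(default_factory=list)

    def flow_down(self, indent=0):
        """Print the hierarchy from a system-level Y down to the xs that drive it."""
        print("  " * indent + f"[{self.level}] {self.name}: target {self.target} ({self.lsl} to {self.usl})")
        for child in self.children:
            child.flow_down(indent + 1)

# Hypothetical flow-down: a system response driven by a subsystem Y and a component x.
feed_rate = CriticalParameter("sheet feed rate", "system", 60, 57, 63, children=[
    CriticalParameter("nip force", "subsystem", 12, 11, 13, children=[
        CriticalParameter("spring free length", "component", 25.0, 24.8, 25.2),
    ]),
])
feed_rate.flow_down()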
DFSS does not require product development teams to measure quality and reliability to develop and attain quality and reliability. Product development teams apply DFSS to analytically model and empirically measure fundamental functions as embodied in the units of engineering scalars and vectors. It is used to build math models, called ideal or transfer functions, of the form Y = f(x) between fundamental response variables (Y) and fundamental input variables (x). When we measure fundamental response values (Y) as they respond to the settings of the input variables (x), we avoid the problems that come with the discontinuities between continuous engineering input variables and counts of attribute quality response variables.
DFSS avoids counting failures and places the engineering team's focus on measuring real functions. The resulting fundamental models can be exercised, analyzed, and verified statistically through Monte Carlo simulations and the sequential design of experiments.
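The following Python sketch (ours; the transfer function, input means, and standard deviations are purely hypothetical) shows the kind of Monte Carlo exercise described above: variation in the inputs (x) is propagated through a model Y = f(x) so that the resulting spread of the critical response can be compared with its specification limits before any hardware exists.

import numpy as np

def transfer_function(x1, x2):
    # Purely illustrative ideal function; in practice it comes from first-principles
    # analysis or from models fit through designed experiments.
    return 3.2 * x1 + 0.8 * x1 * x2

rng = np.random.default_rng(42)
n = 100_000
x1 = rng.normal(10.0, 0.15, n)   # input variable with its expected unit-to-unit variation
x2 = rng.normal(5.0, 0.10, n)

y = transfer_function(x1, x2)
print(f"Y mean = {y.mean():.2f}, Y standard deviation = {y.std(ddof=1):.3f}")
# The simulated spread of Y can then be compared with the VOC-based specification
# limits to estimate capability before any hardware is built.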
Defects and time-to-failure are not the main metrics of DFSS. DFSS uses continuous variables that are leading indicators of impending defects and failures to measure and optimize critical functional responses against assignable causes of variation in the production, delivery, and use environments. We need to prevent problems, not wait until they occur and then react to them.
If one seeks to reduce defects and improve reliability, avoiding attribute measures of quality can accelerate the time it takes to reach these goals. You must do the hard work of measuring functions. As a result of this mind-set, DFSS has a heavy focus on measurement systems analysis and computer-aided data acquisition methods. The sign of a strong presence of DFSS in a company is its improved capability to measure functional performance responses that its competitors don't know they should be measuring and couldn't measure even if they knew they should! Let your competitors count defects; your future efficiencies in product development reside in measuring functions that let you prevent defective design performance.
DFSS requires significant investment in instrumentation and data acquisition technology. It is not uncommon to see companies that are active in DFSS obtaining significant patents for their inventions and innovations in measurement systems. Counting defects is easy and cheap. Measuring functions is often difficult and expensive. If you want to prevent defects during production and use, you have to take the hard fork in the metrology road back in technology development and product design. Without this kind of data, CPM is extremely difficult.
The technical metrics of Critical Parameter Management in DFSS are as follows:
Information is represented by analog and digital logic and control signals.
What to measure is the mass, energy, and controlling signals within and across your systems. When to measure is defined by your micro-timing diagram (critical path) of tool and best practice applications within the phases of your product development process.
The underpinnings of DFSS deserve a brief review before we turn you loose on the rest of the book. DFSS, like Six Sigma for Production Operations, follows a roadmap. Six Sigma for Production Operations follows a process roadmap outlined by the MAIC acronym, which stands for Measure, Analyze, Improve, and Control. This is based, in large part, on the historic work of Walter Shewhart when he devised the underlying principles of statistical process control for production processes. Unfortunately, this has little to do with the process of product development. Many in the Six Sigma business have tried to tell the R&D community that all they need to do is put a "D" in front of the MAIC process and voilà! you get DFSS. NOT TRUE!!! Define, measure, analyze, improve, and control is not a proper process recipe for product development. We know many have started DFSS within this SPC context, but there is a better, more appropriate process context in which to conduct DFSS.
This book is written by technology development and product design engineers for readers with the same or similar backgrounds. A major part of the book's intent is to establish a proper set of roadmaps that fit the paradigms and process context of technology development and product development. These roadmaps are set up in the format of a Phase/Gate product development process structure.
As much as we love and respect the MAIC process for production and transactional processes, it simply has no rational application context for DFSS if you run your company based on a modern product development process. Leaders such as Admiral Raborn of the Polaris program, or later proponents such as Cooper, Wheelwright and Clark, or Clausing and Pugh, might be reasonable candidates to be the patron saints of modern Phase/Gate product development processes, but it surely is not and should not be Walter Shewhart! Shewhart and his process provide great historical underpinnings for production operations; however, we will not lean too heavily on his work, at least as far as running the phases and gates of a product development process, until the final steps in transitioning from product design into production. In that sense, then, the I2DOV technology development process roadmap flows into the CDOV product design process roadmap, which in turn flows into the DMAIC production process roadmap.
This book is organized in seven sections:
These sections will build on this brief introduction to the disciplined and rigorous world of DFSS for technology and product development. We hope you enjoy this overview describing what "hard stuff" your technology and product development teams need to do (and when they need to do it) in order to take your company to the next level of success in our evolving world of product development excellence.
This text on DFSS was written to serve several types of readers:
If you are an executive, R&D director, or some other form of business leader, we wrote the Introduction and Part I for you.
If you are a program manager, project manager, or a design team leader, we wrote Parts II and III primarily for you.
If you are a technical practitioner who will be applying the tools of DFSS on technology and product design programs and projects, we wrote the Tool chapters in Parts IV through VII for you.
An extensive glossary at the end of the book is intended for all readers.
Parts II through VII of this book were designed to serve as a reference to be used over and over as needed to remind and refresh the reader on what to do and when to do it during the phases and gates of technology development and product design. These parts can be used to guide your organization to improve discipline and rigor at gate reviews and to help redesign your product development process to include Six Sigma metrics and deliverables.
If you want to understand any DFSS tool and its deliverables prior to a gate review, we recommend reading the appropriate tool chapter(s) prior to the gate review.
Skip Creveling
Jeff Slutsky
Dave Antis