“Don’s book is a very good addition both to the testing literature and to the literature on quality assurance and software engineering…. [It] is likely to become a standard for test training as well as a good reference for professional testers and developers. I would also recommend this book as background material for negotiating outsourced software contracts. I often work as an expert witness in litigation for software with very poor quality, and this book might well reduce or eliminate these lawsuits….”
–Capers Jones, VP and CTO, Namcook Analytics LLC
Software and system testers repeatedly fall victim to the same pitfalls. Think of them as “anti-patterns”: mistakes that make testing far less effective and efficient than it ought to be. In Common System and Software Testing Pitfalls, Donald G. Firesmith catalogs 92 of these pitfalls. Drawing on his 35 years of software and system engineering experience, Firesmith shows testers, technical managers, and other stakeholders how to avoid falling into these pitfalls, recognize when they have already fallen in, and escape while minimizing the negative consequences.
Firesmith writes for testing professionals and other stakeholders involved in medium-sized and large projects. His anti-patterns and solutions address both “pure software” applications and “software-reliant systems,” which encompass heterogeneous subsystems, hardware, software, data, facilities, material, and personnel. For each pitfall, he identifies its applicability, characteristic symptoms, potential negative consequences, and likely causes, and he offers specific, actionable recommendations for avoiding it or limiting its consequences.
This guide will help you avoid these pitfalls, recognize when you have already fallen into them, and escape while minimizing their negative consequences.
With 92 common testing pitfalls organized into 14 categories, this taxonomy is intended to be relatively complete. Even so, additional pitfalls, and perhaps even missing categories of pitfalls, are likely to be identified over time as testers read this book and compare it with their own experience. As an enhancement to the print edition, the author maintains a companion web page where readers can find major additions and modifications to the taxonomy: http://donald.firesmith.net/home/common-testing-pitfalls
Please send any recommended changes and additions to dgf (at) sei (dot) cmu (dot) edu, and the author will consider them for publication both on the website and in future editions of this book.
Foreword xiii
Preface xvii
About the Author xxiii
Chapter 1: Overview 1
1.1 What Is Testing? 1
1.2 Testing and the V Models 2
1.3 What Is a Defect? 5
1.4 Why Is Testing Critical? 7
1.5 The Limitations of Testing 9
1.6 What Is a Testing Pitfall? 10
1.7 Categorizing Pitfalls 11
1.8 Pitfall Specifications 11
Chapter 2: Brief Overviews of the Testing Pitfalls 13
2.1 General Testing Pitfalls 13
2.1.1 Test Planning and Scheduling Pitfalls 13
2.1.2 Stakeholder Involvement and Commitment Pitfalls 14
2.1.3 Management-Related Testing Pitfalls 14
2.1.4 Staffing Pitfalls 15
2.1.5 Test Process Pitfalls 16
2.1.6 Test Tools and Environments Pitfalls 17
2.1.7 Test Communication Pitfalls 18
2.1.8 Requirements-Related Testing Pitfalls 19
2.2 Test-Type-Specific Pitfalls 20
2.2.1 Unit Testing Pitfalls 20
2.2.2 Integration Testing Pitfalls 20
2.2.3 Specialty Engineering Testing Pitfalls 21
2.2.4 System Testing Pitfalls 22
2.2.5 System of Systems (SoS) Testing Pitfalls 22
2.2.6 Regression Testing Pitfalls 23
Chapter 3: Detailed Descriptions of the Testing Pitfalls 25
3.1 Common Negative Consequences 25
3.2 General Recommendations 26
3.3 General Testing Pitfalls 28
3.3.1 Test Planning and Scheduling Pitfalls 28
3.3.2 Stakeholder Involvement and Commitment Pitfalls 44
3.3.3 Management-Related Testing Pitfalls 51
3.3.4 Staffing Pitfalls 65
3.3.5 Test Process Pitfalls 75
3.3.6 Test Tools and Environments Pitfalls 106
3.3.7 Test Communication Pitfalls 131
3.3.8 Requirements-Related Testing Pitfalls 143
3.4 Test-Type-Specific Pitfalls 164
3.4.1 Unit Testing Pitfalls 164
3.4.2 Integration Testing Pitfalls 169
3.4.3 Specialty Engineering Testing Pitfalls 177
3.4.4 System Testing Pitfalls 206
3.4.5 System of Systems (SoS) Testing Pitfalls 211
3.4.6 Regression Testing Pitfalls 225
Chapter 4: Conclusion 241
4.1 Future Work 241
4.2 Maintaining the Lists of Pitfalls 242
Appendix A: Glossary 243
Appendix B: Acronyms 253
Appendix C: Notes 255
Appendix D: References 269
Appendix E: Planning Checklist 271
Index 279