Online Sample Chapter
Peer Reviews in Software: A Little Help From Your Friends
Downloadable Sample Chapter
wiegersch02.pdf
Table of Contents
Preface.
My Objectives.
Intended Audience.
Reading Suggestions.
Acknowledgments.
1. The Quality Challenge.
Looking Over the Shoulder.
Quality Isn't Quite Free.
Justifying Peer Reviews.
Peer Reviews, Testing, and Quality Tools.
What Can Be Reviewed.
A Personal Commitment to Quality.
2. A Little Help from Your Friends.
Scratch Each Other's Back.
Reviews and Team Culture.
The Influence of Culture.
Reviews and Managers.
Why People Don't Do Reviews.
Overcoming Resistance to Reviews.
Peer Review Sophistication Scale.
Planning for Reviews.
Guiding Principles for Reviews.
3. Peer Review Formality Spectrum.
The Formality Spectrum.
Inspection.
Team Review.
Walkthrough.
Pair Programming.
Peer Deskcheck.
Passaround.
Ad Hoc Review.
Choosing a Review Approach.
4. The Inspection Process.
Inspector Roles.
The Author's Role.
To Read or Not To Read.
Inspection Team Size.
Inspection Process Stages.
Planning.
Overview.
Preparation.
Meeting.
Rework.
Follow-up.
Causal Analysis.
Variations on the Inspection Theme.
Gilb/Graham Method.
High-Impact Inspection.
Phased Inspections.
5. Planning the Inspection.
When to Hold Inspections.
The Inspection Moderator.
Selecting the Material.
Inspection Entry Criteria.
Assembling the Cast.
Inspector Perspectives.
Managers and Observers.
The Inspection Package.
Inspection Rates.
Scheduling Inspection Events.
6. Examining the Work Product.
The Overview Stage.
The Preparation Stage.
Preparation Approaches.
Defect Checklists.
Other Analysis Techniques.
7. Putting Your Heads Together.
The Moderator's Role.
Launching the Meeting.
Conducting the Meeting.
Reading the Work Product.
Raising Defects and Issues.
Recording Defects and Issues.
Watching for Problems.
Product Appraisal.
Closing the Meeting.
Improving the Inspection Process.
8. Bringing Closure.
The Rework Stage.
The Follow-Up Stage.
The Causal Analysis Stage.
Inspection Exit Criteria.
9. Analyzing Inspection Data.
Why Collect Data?
Some Measurement Caveats.
Basic Data Items and Metrics.
The Inspection Database.
Data Analysis.
Measuring the Impact of Inspections.
Effectiveness.
Efficiency.
Return on Investment.
10. Installing a Peer Review Program.
The Peer Review Process Owner.
Preparing the Organization.
Process Assets.
The Peer Review Coordinator.
Peer Review Training.
Piloting the Review Process.
11. Making Peer Reviews Work for You.
Critical Success Factors.
Review Traps to Avoid.
Troubleshooting Review Problems.
12. Special Review Challenges.
Large Work Products.
Geographical or Time Separation.
Distributed Review Meeting.
Asynchronous Review.
Generated and Nonprocedural Code.
Too Many Participants.
No Qualified Reviewers Available.
Epilogue.
Appendix A: Peer Reviews and Process Improvement Models.
Capability Maturity Model for Software.
Goals of the Peer Reviews Key Process Area.
Activities Performed.
Commitment to Perform.
Ability to Perform.
Measurement and Analysis.
Verifying Implementation.
Systems Engineering Capability Maturity Model.
CMMI-SE/SW.
Prepare for Peer Reviews.
Conduct Peer Reviews.
Analyze Peer Review Data.
ISO 9000-3.
Appendix B: Supplemental Materials.
Work Aids.
Other Peer Review Resources.
Glossary.
Index.
Preface
No matter how skilled or experienced I am as a software developer, requirements writer, project planner, tester, or book author, I'm going to make mistakes. There's nothing wrong with making mistakes; it is part of what makes me human. Because I err, it makes sense to catch the errors early, before they become difficult to find and expensive to correct.
It's often hard for me to find my own errors because I am too close to the work. Many years ago I learned the value of having some colleagues look over my work and point out my mistakes. I always feel a bit sheepish when they do, but I would rather have them find the mistakes now than have customers find them much later. Such examinations are called peer reviews. There are several types of peer reviews, including inspections and walkthroughs. However, most of the points I make in this book apply to any activity in which someone other than the creator of a work product examines it to improve its quality.
I began performing software peer reviews in 1987; today I would never consider a work product complete unless someone else has carefully examined it. You might never find all of the errors, but you will find many more with help from other people than you possibly can on your own. The manuscripts for this book and my previous books all underwent extensive peer review, which contributed immeasurably to their quality.
My Objectives
There is no "one true way" to conduct a peer review, so the principal goal of this book is to help you effectively perform appropriate reviews of deliverables that people in your organization create. I also address the cultural and practical aspects of implementing an effective peer review program in a software organization. Inspection is emphasized as the most formal and effective type of peer review, but I also describe several other methods that span a spectrum of formality and rigor. Many references point you to the extensive literature on software reviews and inspections.
Inspection is both one of the great success stories of software development and something of a failure. It's a grand success because it works! Since it was developed by Michael Fagan at IBM in the 1970s, inspection has become one of the most powerful methods available for finding software errors [Fagan, 1976]. You don't have to take just my word for it, either. Experiences cited in the software literature describe how inspections have improved the quality and productivity of many software organizations. However, only a fraction of the software development community understands the inspection process, and even fewer people practice inspections properly and effectively. To help you implement inspections and other peer reviews in your team, the book emphasizes pragmatic approaches that any organization can apply.
Several process assets that can jumpstart your peer review program are available from the website that accompanies this book, http://www.processimpact.com/pr_goodies.shtml. These resources include review forms, defect checklists, a sample peer review process description, spreadsheets for collecting inspection data, sources of training on inspections, and more, as described in Appendix B. You are welcome to download these documents and adapt them to meet your own needs. Please send your comments and suggestions to me at kwiegers@acm.org. Feedback on how well you were able to make peer reviews work in your team is also welcome.
Intended Audience
The material presented here will be useful to people performing many project functions, including:
- work product authors, including analysts, designers, programmers, maintainers, test engineers, project managers, marketing staff, product managers, technical writers, and process developers
- work product evaluators, including quality engineers, customer representatives, customer service staff, and all those listed above as authors
- process improvement leaders
- managers of any of these individuals, who need to know how to instill peer reviews into their cultures and also should have some of their own deliverables reviewed
This book will help people who realize that their software product's quality falls short of their goals and those who want to tune up their current review practices, establish and maintain good communications on their projects, or ship high-quality software on schedule. Organizations that are using the Capability Maturity Model for Software or the CMMI for Systems Engineering/Software Engineering will find the book valuable, as peer reviews are components of those process improvement frameworks (see Appendix A).
The techniques described here are not limited to the deliverables and documents created on software projects. Indeed, you can apply them to technical work products from any engineering project, including design specifications, schematics, assembly instructions, and user manuals. In addition to technical domains, any business that has documented task procedures or quality control processes will find that careful peer review discovers errors the author simply cannot find alone.
Reading Suggestions
To gain a detailed understanding of peer reviews in general and inspections in particular, you can simply read the book from front to back. The cultural and social aspects of peer reviews are discussed in Chapters 1 and 2. Chapter 3 provides an overview of several different types of reviews and suggests when each is appropriate. Chapters 4 through 8 address the nuts and bolts of inspection, while Chapter 9 describes important inspection data items and metrics. If you're attempting to implement a successful review program in an organization, focus on Chapters 10 and 11. For suggestions on ways to deal with special review challenges, such as large work products or distributed development teams, see Chapter 12. Refer to the Glossary for definitions of many terms used in the book.
Index
- Ability to Perform, 190, 191-192
- Accountability, 147
- Active design reviews, 92
- Active listening, 104
- Activities Performed, 190
- Ad hoc reviews, 2, 27, 41, 61, 161, 189
- Adobe Portable Document Format, 76, 149
- Aids, work, for peer reviews, 75, 149, 199
- Allott, Stephen, 162
- Analysis
- causal analysis stage and, 121-123
- in High-Impact(TM) Inspections, 58-59
- of early reviews, 161
- problems with, 171-172
- traceability and, 89-90
- Analysis techniques
- for code, 92-94, 122
- defect patterns, 171
- for design documents, 92
- inspection data, 125-142
- models, 91
- for requirements specifications, 90-92
- for user interface designs, 92
- work product, 87, 88-94, 175-176
- Analyst, requirements
- participation in reviews, 71
- review benefits for, 25
- Appraisals, of work products, 34, 53-54, 56, 100, 113-114, 120, 131, 168
- Architect, participation in reviews, 71
- Architecture design, review participants, 71
- ARM (Automated Requirement Measurement) Tool, 90
- Assets, process, xii, 149-151
- Asynchronous reviews, 41, 54, 178, 180-181
- AT&T Bell Laboratories, 7, 54
- Attention-getting devices, 101
- Audioconferencing, 178, 179
- Author
- asynchronous reviews and, 180
- culture and, 15, 16-17
- data privacy and, 128
- defects and, 105-106, 115, 127-128, 171
- deposition meetings and, 55
- egos, 12, 14, 15, 29, 46
- follow-up and, 56, 119-120
- inspection goals, 87, 100
- inspection moderators and, 64, 66
- inspections and, 34, 35, 46-47, 61, 62, 64, 72, 102, 105-106
- issue log and, 115
- material selection and, 66
- misuse of data and, 18-20, 73
- overview and, 52, 82
- pair programming and, 38
- participation in reviews, 71
- peer deskchecks and, 39
- performance assessments and, 18-20
- perspectives and, 72
- planning and, 52, 61
- preparation time and, 86
- questions and, 86
- respect and, 16-17
- rework and, 55, 117-119, 120, 121
- role of, 46-47
- walkthroughs and, 37, 38
- Automated Requirement Measurement (ARM) Tool, 90
- Bad fixes, 66, 120, 172
- Beizer, Boris, 109
- Bell Northern Research, 7
- Benefits
- inspections, 7, 34, 37, 54-55
- metrics, 126-127, 192
- peer reviews, 3-8, 23-25, 185-186
- Best practices, inspection, 161-162
- Body language, 105, 179
- Boeing Company, The, 49
- Buddy check. See Peer deskcheck
- Bugs. See Defects
- Bull HN Information Systems, 49
- Caliber-RM, 90
- Capability Maturity Model, Integrated (CMMI-SE/SW), xiii, 187, 194-197
- Capability Maturity Model, Systems Engineering (SE-CMM), 187, 193-194
- Capability Maturity Model for Software (SW-CMM), xiii, 187-193
- Causal analysis, defect, 52, 56, 57, 121-123, 126
- Certification of moderators, 155
- Challenges, special review, 175-183
- Champion, review, 161, 164
- Change
- pain as motivation for, 146
- preparing organization for, 144-149
- Checklists
- defect, xii, 33, 39, 75, 84, 87-88, 89, 170, 171, 196, 199
- design documents and, 92
- Gilb/Graham method and, 57
- inspection moderator's, 33, 96-98, 150
- phased inspections and, 59
- role in inspections, 69
- use in preparation, 53, 87-88
- Checkpoints
- inspection, 62
- review, 10-11, 167
- Choosing a review method, 31, 32, 41-43
- Closed cultural paradigm, 22-23
- CMMI-SE/SW (Capability Maturity Model, Integrated), xiii, 187, 194-197
- Coaching through reviews, 40
- Code
- analysis techniques for, 92-94, 122, 129
- error-prone, 41, 57
- generated, 181
- inspecting, 63, 83, 181
- inspection package for, 75
- nonprocedural, 181
- review participants, 71
- static analysis of, 63
- Code-counting conventions, 129
- Coding standards, 164
- Collaborative Software Review System (CSRS), 180-181
- Collaborative usability inspection, 92
- Colleagues, review process and, 13-30, 37, 40
- Commitment, management, 17-18, 160
- Commitment to Perform, 190, 191
- Communication, 16-17, 144, 145, 177, 178
- Comparisons, review characteristics, 36
- Compilation, as code inspection entry criterion, 122
- Complexity, 77, 78
- Constantine, Larry L., 22
- Control charts, 136-137
- Coordinator, peer review, 20, 22, 115, 121, 151-152, 164, 166, 174, 191
- Costs
- choosing review approach and, 42
- early stage and, 12, 141
- inspection effectiveness and, 139
- inspection efficiency and, 140
- inspection team size and, 49, 182
- of inspections, 11, 100, 120, 140, 141, 142
- metrics and, 132-133
- quality and, 3-6
- requirements specifications and, 91-92
- return on investment and, 140-142
- review programs and, 3-4, 148-149, 191, 192
- sampling and, 67
- team reviews and, 35
- vs. benefits of reviews, 8
- Coverage, test measurement, 9
- Critical success factors for reviews, 159-161
- Cross-training, as review benefit, 8, 24, 141, 159, 183
- CSRS (Collaborative Software Review System), 180-181
- Cultural paradigms, 22
- Culture, 12, 13, 15-25
- authors and, 15, 16-17
- data and, 127
- guiding review principles and, 29
- management and, 191
- pair programming and, 38, 39
- peer deskchecks and, 40
- preparation and, 102
- problems and, 164, 165-167, 177-178
- selecting review process and, 32
- signing inspection summary reports and, 114
- successful program and, 143
- Customers, 4, 5, 16, 23, 138, 139, 141, 146, 159, 163
- as review participants, 71, 72, 163-164
- Data
- analyzing, 125-142, 150
- authors and, 128
- benefits of, 126-127, 192
- CMMI-SE/SW (Capability Maturity Model, Integrated) and, 196
- collection of, 125-127, 129, 148, 150, 169, 174
- CMM key practices and, 190
- inspection summary report and, 54, 98-100, 114-115, 121, 150, 151
- management and, 173
- metrics and, 126-129
- misuse of, 18-20, 73, 127-128, 165, 173-174
- overanalyzing, 129
- privacy of, 128
- problems with, 173-174
- spreadsheet, xii, 129, 131, 135, 151, 199
- storing, 150
- time and, 29
- verification and, 192
- Data items, 100, 129, 130-131, 150
- Database, inspection, 129, 131, 135, 151
- Debugging, 24
- Decision rules, 113-114
- Defect causal analysis, 52, 56, 57, 121-123, 126
- Defect checklists, xii, 33, 39, 75, 84, 87-88, 89, 170, 171, 196, 199
- Defect containment, 138
- Defect density, 58, 67, 123, 124, 125, 126, 128, 132, 135, 136, 138, 142
- Defect list/log. See Issue log
- Defect reviews, 194
- Defect-tracking systems, 118
- Defects
- authors and, 105-106, 115, 127-128, 171
- bad fixes, 66, 120, 172
- classification of, 108-110, 122, 170
- corrected, 107, 121, 131, 132, 134
- cost of, 3-4, 139, 140, 141
- counting, 106
- data items and, 131
- definition of, 2
- error-prone modules, 41
- found, 107, 121, 130-131, 132, 133
- inspection rates and, 76-77
- major or minor, 41, 109-110, 117-118, 121, 123, 129, 131, 133, 168, 171-172
- measurement dysfunction and, 19, 128
- metrics and, 132
- origin of, 108
- pointing out, 105-107
- prevention, 4-5, 122, 141, 159, 189
- problems with, 170-172
- recording, 107-110
- secondary, 120
- severity of, 109, 112
- synergy and, 34
- types of, 108-110, 122
- typos vs., 85
- verification and, 192
- Deliverables. See Work products
- Deposition meetings, 55
- Description, peer reviews, xi, 2-3
- Designs
- analysis techniques for, 92
- inspecting, 92
- review participants, 71, 92
- Deskchecks, 15, 39, 86
- Detail design, review participants, 71
- Developer, review benefits for, 24
- Development manager, review benefits for, 24
- Direct analysis, 58
- Disposition. See Appraisals, of work products
- Distributed reviews, 178-180
- Documentation, reviewing system technical, 71
- Documented processes, 147, 149-151, 162, 190, 196
- Dynamic analysis, 11
- Dysfunction, measurement, 19, 128
- Effectiveness, inspection
- data analysis and, 135, 138-140
- early reviews and, 161
- entry criteria and, 68
- Gilb/Graham method and, 57
- inspection rates and, 77
- inspectors and, 33
- metrics and, 121, 126, 132
- preparation and, 101
- problems, 169-172
- quality and, 100
- team size and, 49
- vs. informal reviews, 34
- Efficiency, inspection
- culture and, 19
- data analysis and, 135, 138, 138n, 140
- entry criteria and, 68
- experience and, 146
- inspection rates and, 77
- metrics and, 121, 132
- team size and, 49
- Effort
- inspection, 132
- meeting, 130
- overview, 130
- planning, 130
- preparation, 130
- rework, 130
- Egoless programming, 14
- Egos, 12, 14, 15, 29, 46, 185
- 80/20 rule, 121-122
- Electronic collection or distribution, 40-41, 75-76, 78, 108, 149, 150
- Entry criteria, inspection, 67-69, 149, 192
- Error-prone modules, 41, 57
- Estimating time, 28
- Excuses for not doing reviews, 20, 21
- Exit criteria, inspection, 27, 36, 123-124, 150, 192, 196
- Extreme Programming, 38
- Face-to-face meetings, 41, 178, 180
- Facial expressions, 105, 179
- Fagan, Michael, xii, 34, 45, 47, 51-52, 55, 57
- Faults. See Defects
- Federal Systems Division, IBM, 35
- Finding problems vs. solving problems, 29, 101, 112, 164, 170
- Follow-up, 56, 113, 119-121, 129, 167, 172
- Ford Motor Company, 37
- Formal, Technical, Asynchronous Review Method (FTArm), 180
- Formal reviews, 12, 27, 29, 32, 61, 125-126, 175
- Formality spectrum, peer review, 31-41
- Freedman, Daniel, 67, 110
- Frequency and types of reviews, 161
- Fritsch, Jim, 190
- FTArm (Formal, Technical, Asynchronous Review Method), 180
- Gelperin, David, 58, 115
- Generated code, 181
- Geographical concerns, 41, 177, 192
- Gilb, Tom, and Dorothy Graham, 57
- Software Inspection, 88
- Gilb/Graham inspection method, 57-58, 77
- Goal-Question-Metric (GQM), 126
- Goals, peer review program, 161
- Goddard Space Flight Center, NASA, 90
- GQM (Goal-Question-Metric), 126
- Grady, Robert, 141
- Guide to Classification for Software Anomalies (Institute of Electrical and Electronics Engineers), 109
- Guiding principles for peer reviews, 29-30
- Habits of effective inspection teams, 162
- Hardlook(TM) Analysis Matrix, 58
- Heuristic evaluations, 92
- Hewlett-Packard Company, 7, 47, 109, 141
- Hierarchical defect classification, 109
- High-Impact(TM) Inspections, 58
- HTML, peer review documentation and, 149
- IBM, xii, 7, 9, 35, 45, 109
- Identification of individual reviews, 100
- IEEE (Institute of Electrical and Electronics Engineers)
- Standard 1044.1-1995, Guide to Classification for Software Anomalies, 109
- Standard 1028-1997, Standard for Software Reviews, 11
- Imperial Chemical Industries, 7
- Improvement models, process, 187-197
- Indirect analysis, 58
- Informal reviews, 22, 26, 27, 30, 32, 34, 35-42, 161, 162, 165, 170, 172, 175
- Infosys Technologies, Ltd., 77
- Infrastructure for peer reviews, 143, 196
- Inspection database, 129, 131, 135, 151
- Inspection Lessons Learned questionnaire, 115, 116, 150, 161, 199
- Inspection moderator. See Moderator, inspection
- Inspection package, 74-76, 78
- contents of, 52, 53, 75
- distributing to inspectors, 52, 74, 82
- Inspection summary report, 54, 98-100, 114-115, 121, 150, 151, 199
- Inspections, xii, 3, 22, 31, 32, 33-34, 35
- authors and, 34, 35, 46-47, 61, 62, 64, 72, 102, 105-106
- benefits of, 7, 34, 37, 54-55
- best practices, 161-162
- Capability Maturity Model, Integrated (CMMI-SE/SW) and, 196
- Capability Maturity Model for Software (SW-CMM) and, 190
- characteristics of, 36
- costs of, 11, 100, 120, 140, 141, 142
- data analysis, 125-142
- data items, 129, 130-131
- database for, 129, 131, 135, 151
- documents, 51, 63, 75
- effectiveness and, 33, 34, 49, 57, 68, 77, 100, 101, 121, 126, 132, 135, 138-140, 161, 169-172
- efficiency and, 19, 34, 49, 68, 77, 121, 132, 135, 138, 138n, 140, 146
- entry criteria, 67-69, 149, 192
- exit criteria, 27, 123-124, 150, 192, 196
- Gilb/Graham method, 57-58
- High-Impact(TM), 58
- improving, 115
- management and managers' participation in, 54, 66, 73-74
- measuring impact of, 138-142
- meetings, 53-55, 57, 58, 78, 95-116
- metrics, 34, 54, 55, 57, 77, 100, 106, 116, 119, 121, 126, 127-129, 150
- N-fold, 49
- number of participants, 29, 48-49
- participants, 69-74, 79, 131, 169, 182
- phased, 59
- planning, 61-79
- preparation and, 30
- procedures for, 149
- process, 45-59
- rates, 30, 52, 57, 76-78, 123, 126, 133-134, 136
- ROI from, 6-8, 12, 58, 138, 140-142, 159, 175
- roles of team members in, 46-48
- stages, 50-56
- steps, 51-52
- team size, 29, 48-49
- timing of, 32, 33-34, 62-64, 161, 172, 175
- training, 126
- traps, 162-164
- variations of, 56-59
- vs. informal reviews, 40, 53, 170
- when to hold, 32, 33-34, 62-64, 161, 172, 175
- Inspectors
- choosing, 69-73, 162
- finding, 183
- number of, 29, 48-49
- perspectives, 52, 70-73
- preparation and, 84, 85
- project roles represented, 70-72
- scheduling events and, 79
- source code analysis and, 92
- training, 33, 146, 183
- Installing a peer review program, 143-157
- Institute of Electrical and Electronics Engineers (IEEE). See IEEE (Institute of Electrical and Electronics Engineers)
- Integrated Capability Maturity Model (CMMI-SE/SW), xiii, 187, 194-197
- Internet-based collaboration tools, 179
- Intranets, 78, 149
- ISO 9000-3, 187, 197
- Issue log
- closure and, 117
- confidentiality of, 115
- follow-up and, 121
- inspection package and, 75
- inspectors and, 113
- online, 179
- personal, 84, 101, 102
- process description and, 150
- recorders and, 107-108
- style issues and, 164
- Web site for, 199
- Issues, 107-110, 119
- pointing out, 105-107
- J-curve. See Learning curve
- Jalote, Pankaj, 77
- Johnson, Philip, 180
- Jones, C. L., 122
- Jones, Capers, 138n
- Justifying peer reviews, 6-8
- Key practices of peer reviews, 190
- Key process areas (KPAs), 188-190
- Knight, John C., 59
- Knuth, Donald, 8
- KPAs (key process areas), 188-190
- Large size
- of review teams, 29, 182
- of work products, 58, 63, 175-176
- Lawrence, Brian, 102
- Leakage, of defects, 115, 127-128, 173
- Learning curve, 146, 147, 148, 157, 161
- Lessons Learned questionnaire, inspection, 115, 116, 150
- Line numbers, 69, 75-76
- Lint, 63
- Listening, active, 104
- Litton Data Systems, 7
- Location, of defects, 107-108
- Locations, reviewers in different, 178
- Lockheed Martin Western Development Labs, 12
- Lockwood, Lucy A. D., 92
- Long-distance reviews, 178-179
- Maintainer
- participation in reviews, 71
- review benefits for, 24
- Major defects, 41, 109-110, 117-118, 121, 123, 129, 131, 133, 168, 171-172
- Management and managers
- as authors, 74
- commitment to reviews, 13, 17-18, 22, 147-148, 160, 191
- culture and, 17-20
- issue logs and, 115
- misuse of data, 18-20, 73, 127-128, 173-174
- participation in reviews, 54, 66, 73-74, 163, 173, 182
- problems and, 165, 172-174
- as process owners, 143
- respect between authors and, 73
- review benefits for, 24
- review policies and, 147, 163, 173
- successful peer reviews and, 164
- training, 155-156
- work product appraisals and, 173
- Material, selecting for inspection, 66-67
- Maturity models, 187-197
- Measurement and Analysis, 190, 192
- Measurement dysfunction, 19, 128
- Measurements. See Metrics
- Measuring impact of inspections, 138-142
- Meeting time, 130
- estimating, 78
- Meetings
- challenges to holding, 177-181
- closing, 114-115
- duration, 29-30
- late arrivals to, 182
- need for, 54-55
- problems with, 110-113
- suspending, 112
- synergy during, 34, 54-55, 180
- terminating, 101, 170, 173
- Meetings, types of
- asynchronous review, 180-181
- deposition, 55
- distributed review, 178-180
- face-to-face, 41, 178, 180
- inspection, 53-55, 57, 58, 78, 95-116
- overview, 52-53, 74, 78
- Methods, review, 41-43
- Metrics
- data analysis and, 127-129, 135-136, 174
- electronic collection of, 78
- follow-up and, 119, 121
- Goal-Question-Metric (GQM) and, 126
- improving development process and, 126
- improving inspection process and, 115
- inspection summary report and, 54
- inspection, 34, 54, 55, 57, 77, 100, 106, 116, 119, 121, 126, 127-129, 132-134, 150
- inspectors and, 106
- misuse of, 20, 173
- peer deskchecks and, 39
- peer review coordinators and, 151
- process description and, 150
- return on investment and, 55
- scatter charts, 135, 136
- tracking, 77, 115
- walkthroughs and, 36
- Microsoft, 40, 131
- Milestones vs. tasks, 28
- Minor defects, 41, 109-110, 118, 121, 123, 129, 131, 133, 171-172
- Missing requirements, 91
- Models, analysis, 91
- Moderator, inspection
- analysis strategies and, 87, 169
- certification of, 155
- characteristics of, 65
- checklist, 33, 96-98, 150, 199
- cultural issues and, 167
- deposition meetings and, 55
- distributed reviews and, 179
- follow-up and, 56
- inspection meetings and, 52, 95-103, 110-113
- large groups and, 49, 182
- overview and, 52, 82
- peer deskchecks and, 39
- planning and, 52, 61, 62
- preparation and, 85-86, 169
- problem-solving and, 164
- problems and, 110-113
- rework and, 55
- role of, 29, 34, 35, 46, 49, 52-54, 64-66, 95-103, 110-113
- scheduling events and, 78
- selecting, 66, 167
- summary report and, 98-100
- training, 155
- Myers, E. Ann, 59
- N-fold inspections, 49, 182
- NAH (not applicable here) syndrome, 21
- NASA (National Aeronautics and Space Administration), 90
- National Software Quality Experiment, 86, 142
- Negative and positive aspects of reviews, 15-16, 19-20
- Nielsen, Jakob, 92
- Nonprocedural code, inspecting, 181
- Nortel Networks, 67, 139
- Not applicable here (NAH) syndrome, 21
- Number of inspectors, 29, 48-49
- Numbering lines, 75-76
- Observers, 74, 163
- O'Neill, Don, 140, 146
- Open cultural paradigm, 22, 23
- Origin classifications, defect, 108
- Orthogonal defect classification, 109
- Overview, 52-53, 57, 58, 78, 81-83, 149, 150, 182
- distributing inspection package at, 82
- meeting, 52-53, 74, 78, 81-83
- Owner, peer review process, 27, 143-144, 145, 149, 157, 170, 174, 191
- Package, inspection. See Inspection package
- Pain, as motivation for change, 146
- Pair programming, 38-39, 183
- Pair reviewing. See Peer deskcheck
- Paradigms, cultural, 22
- Paraphrasing, 104
- Pareto analysis, 121
- Parnas, D. L., 92
- Participants, review, 69-74, 110-111, 162, 163, 169, 182
- Passaround, 40-41, 42, 53, 55, 149, 161, 165, 175, 176, 180, 182
- PBR (Perspective-based reading), 90
- PDF (Portable Document Format), 76, 149
- Peer deskcheck, 22, 35, 39-40, 66, 117, 149, 165, 171, 175, 183
- Peer review
- coordinator, 20, 22, 115, 121, 151-152, 164, 166, 174, 191
- formality spectrum, 31-41
- policy, 147, 165, 173, 191
- process description, 143, 149-151
- process owner, 27, 143-144, 145, 149, 157, 170, 174, 191
- program, installing, 143-157
- sophistication scale, 26-27
- Perfectionism, 15, 172
- Personal Software Process, 39
- Personality conflicts, 69, 167
- Perspective-based reading (PBR), 90
- Perspectives, inspector, 52, 70-73
- Phantom Inspector, 34
- Phased inspections, 59
- Philips Semiconductors, 177
- Piloting, of review process, 144, 156-157
- Planning, inspection, 27-29, 51-52, 61-79, 167-169
- Policy, peer review, 147, 165, 173, 191
- Portable Document Format (PDF), 76, 149
- Porter, Adam A., 91
- Practitioner support, 148
- Predecessor documents, 72, 75, 83, 87
- Preliminary reviews, 19, 172
- Preparation, 83-94
- approaches, 86-94
- authors and, 86
- Capability Maturity Model, Integrated (CMMI-SE/SW) and, 196
- code and, 92-94
- defect checklists and, 53, 87-88
- designs and, 92
- culture and, 102
- effectiveness and, 101
- formal reviews and, 30
- Gilb/Graham method and, 57
- High-Impact(TM) Inspections and, 58
- inspection package and, 52
- inspection process and, 51, 53
- inspections and, 34, 79, 94
- inspectors and, 84, 85
- judging adequacy of, 53, 101-102, 152
- lack of, 101, 152
- moderators and, 85-86, 169
- problems, 152, 169-170
- rates, 86, 123, 126, 134, 135, 136, 137, 172
- requirements specifications and, 87
- rules and, 87, 196
- times, 83-84, 85-86, 101, 128
- typo lists, 85-86
- user interfaces and, 92
- verification and, 87
- Prerequisites for successful reviews, 16
- Primark Investment Management Services, 7
- Principles, review, 29-30
- Privacy of data, 128
- Problems
- finding vs. solving, 29, 101, 112, 164, 170
- troubleshooting review, 164-174
- Problems, types of
- communication, 177, 178
- cultural, 164, 165-167, 177-178
- data, 173-174
- defects, 170-172
- effectiveness, 169-172
- geographical, 177
- inspection meeting, 110-113
- management, 172-174
- planning, 167-169
- preparation, 152, 169-170
- review, 164, 165-174
- rework, 172
- time, 178, 180, 182
- Procedures, review, 149-150
- Process, review, 13-30, 115, 149-151, 156-157, 162-163, 199
- Process areas, CMMI-SE/SW, 194, 195
- Process assets, xii, 149-151
- Process brainstorming stage, 56
- Process documentation
- Capability Maturity Model for Software (SW-CMM) and, 190
- contents for peer reviews, 147, 149-151
- review participants, 71
- review traps and, 162
- Process improvement
- based on inspection results, 4, 121-123, 162
- of inspections, 115
- models, 187-197
- Process owner, peer review, 27, 143-144, 145, 149, 157, 170, 174, 191
- responsibilities, 145
- Process stages, inspection, 51-56
- Programmer, participation in reviews, 71
- Project manager
- participation in reviews, 71
- review benefits for, 24
- Project plans
- incorporating reviews into, 27-29, 160, 167
- review participants, 71
- Project reviews, types of, 2
- Project roles, 70-72, 163, 182
- Psychology of Computer Programming, The (Weinberg), 14
- Qualified reviewers, lack of, 183
- Quality, 1-12, 15-16, 185, 186
- combining types of reviews and, 42
- cost of, 3-6, 159
- “good enough,” 5-6
- guiding review principles and, 29
- inspections and, 62
- “is free,” 3-4
- National Software Quality Experiment, 142
- pair programming and, 39
- quantifying, 57
- tools for, 8-11, 171
- Quality assurance, 61
- Quality assurance manager
- participation in reviews, 71
- review benefits for, 24
- Quality attributes, 91, 93
- Quantifying product quality, 57
- Questions
- at beginning of inspection meeting, 101
- checklists and, 88
- Goal-Question-Metric (GQM) and, 126
- prior to inspection meeting, 86
- scenarios and, 90
- Random cultural paradigm, 22, 23
- Rates
- inspection, 30, 52, 57, 76-78, 123, 126, 133-134, 136
- preparation, 86, 123, 126, 134, 135, 136, 137, 172
- Raytheon Electronic Systems, 4
- Reader, 34, 35, 39, 46, 47-48, 49, 53, 57, 84, 102, 103-105
- Reading, during inspection meeting, 47-48
- Recorder, 34, 35, 46, 49, 53, 84, 102, 107-110
- Records, review, 27-28, 36
- Reinspections, 56, 68, 113, 120-121, 123, 172, 196
- Requirements analyst
- participation in reviews, 71
- review benefits for, 25
- Requirements specifications
- analysis techniques for, 90
- customer participation in reviews of, 71, 72, 163-164
- inspecting, 63, 72, 87, 126
- missing requirements, 91
- preparation for, 87
- reinspections and, 120-121
- review participants, 71, 112, 163-164
- usefulness of, 163-164
- Resistance, to reviews, 20-25, 165-166
- overcoming, 22-25
- Resources, 148
- Respect
- between authors and managers, 73
- between authors and reviewers, 16-17
- Return on investment (ROI), 6-8, 12, 58, 138, 140-142, 159, 175
- Review champion, 161, 164
- Review coordinator. See Coordinator, peer review
- Review policies, 147, 165, 191
- Reviewers
- availability of, in choosing review type, 61
- lack of qualified, 183
- respect between authors and, 16-17
- ReviewPro, 78, 180
- Reviews
- benefits from, 3-8, 23-25, 185-186
- testing and, 8-11
- types of project, 2
- Rework
- authors and, 55, 117-119, 120, 121
- closure and, 117-119
- cost and, 4, 134
- data analysis and, 129
- Gilb/Graham method and, 57
- inspection process and, 51, 53, 54, 55-56
- inspection summary report and, 100
- issue log and, 84
- metrics and, 134
- preparing organization for change and, 144
- percentage of project effort, 4
- planning and, 28, 63, 64
- problems with, 172
- quality and, 4
- scheduling, 167
- selecting material based on risk of, 66
- verification of, 34, 46, 114
- work product appraisal and, 113
- Risk assessment, 5, 89
- Risk
- in inspection meeting delays, 54
- as review selection criterion, 41-42
- of using inspection data to evaluate individuals, 20
- as work product selection criterion, 66-67
- ROI (Return on investment), 6-8, 12, 58, 138, 140-142, 159, 175
- Rothman, Johanna, 179
- Round-robin method, 105, 179
- Rules
- decision, 113-114
- Gilb/Graham method and, 57
- preparation and, 87, 170, 196
- for work products, 75, 88
- Rules of conduct, 110
- Sampling, 67, 168, 176
- Santayana, George, 121
- Scatter charts, 135, 136
- Scenarios, use during preparation, 90-91
- Scenes of Software Inspections: Video Dramatizations for the Classroom, 155
- Scheduling, 41, 78-79, 167-168, 169, 196
- SE-CMM (Capability Maturity Model, Systems Engineering), 187, 193-194
- Secondary defects, 120
- SEI (Software Engineering Institute). See Software Engineering Institute (SEI)
- Selected aspect reviews, 59, 148
- Selecting
- inspection moderators, 66, 167
- inspectors, 69-73, 162
- portions of work products, 66-67, 86, 87, 168-169, 176
- review methods, 31, 32, 41-43
- wording, 16-17, 105-106
- Severity of defects, 109, 112
- Signatures on inspection summary reports, 114-115
- Size
- data items and, 130
- metrics and, 130, 133, 136
- of review team, 29, 48-49, 176, 182
- of work products, 175-176
- Software, Capability Maturity Model for (SW-CMM), xiii, 187-193
- Software development, review checkpoints for, 10-11
- Software Development Technologies, 78, 180
- Software Engineering Institute (SEI), 22, 187
- Scenes of Software Inspections: Video Dramatizations for the Classroom, 155
- Software Engineering Process Group, 51, 144
- Software Inspection (Gilb and Graham), 88
- Software quality assurance plan, 61
- Software Quality Engineering, 58
- Solving problems vs. finding problems, 29, 101, 112, 164, 170
- Sophistication scale, peer review, 26-27
- Source code. See Code
- Source documents, 57, 68
- Space Shuttle Onboard Software, 5, 122
- SPC (statistical process control), 123, 126, 135-136
- Spectrum, peer review formality, 31-32
- Spreadsheet, inspection data, xii, 129, 131, 135, 151, 199
- Stages, inspection process, 50-56
- Standard for Software Reviews (Institute of Electrical and Electronics Engineers), 11
- Standards, coding, 164
- Standards checkers, 164
- Starbase Corporation, 90
- Start times, meeting, 182
- Static analysis of code, 63, 122
- Statistical process control (SPC), 123, 126, 135-136
- Structured walkthroughs, 35
- Study hall approach, 86
- Style issues, 106, 164, 172
- Success factors, critical, 159-161
- Successful peer reviews, 16-17, 22, 148, 159-174, 191
- Summary report, inspection, 54, 98-100, 114-115, 121, 150, 151, 199
- Suspending meetings, 112
- SW-CMM (Capability Maturity Model for Software), xiii, 187-193
- Synchronous cultural paradigm, 22, 23
- Synergy, 34, 38, 54-55, 180
- System technical documentation, review participants, 71
- Systems Engineering Capability Maturity Model (SE-CMM), 187, 193-194
- Tasks vs. milestones, 28
- Taxonomy of bugs, 109
- Team reviews, 15-25, 35, 36, 40, 161, 175
- Team size, inspection, 29, 48-49, 176, 182
- Teleconference reviews, 179
- Templates, 68, 87, 91, 164
- Test documentation, 75, 93
- review participants, 71
- Testability, 87, 90
- Test engineer
- participation in reviews, 71
- requirements specification inspection and, 110
- review benefits for, 25
- Testing, 20-21, 63-64, 122
- reviews and, 8-11
- Third-hour discussions, 122
- Time
- allocation of, 61, 78-79, 160, 191
- data items and, 130-131
- estimating, 28-29, 78
- follow-up, 121
- inspection meetings and, 112
- limiting discussion, 29-30, 79, 112
- metrics and, 132-133, 135, 136
- overviews and, 82
- peer review coordinators and, 151
- planning and, 27
- preparation, 83-84, 85-86, 101, 128
- problems, 178, 180, 182
- return on investment (ROI) and, 141, 142
- rework and, 121
- scheduling and, 78-79, 170
- selecting review process and, 32
- successful reviews and, 16
- Time to market, 5-6
- Tools
- for asynchronous reviews, 41, 180
- audio- and videoconferencing, 178, 179
- Automated Requirement Measurement (ARM), 90
- benefits of reviews vs., 9
- for code, 8-9, 63, 93, 164, 200
- for counting code, 129
- Internet-based collaboration, 179
- quality, 8-11, 171
- for recording, 108
- Web links for, 200
- Traceability, 89
- Training, 148, 152-156
- authors, 171
- Capability Maturity Model, Integrated (CMMI-SE/SW) and, 196
- Capability Maturity Model for Software (SW-CMM) and, 191, 192
- cost of, 142
- course outline for, 153-154
- cultural issues and, 165, 166
- data and, 126
- inspectors, 146, 155, 183
- managers, 155-156
- moderators, 155, 167
- planning issues and, 169
- qualified reviewers, 183
- resistance to reviews and, 145
- sources of, 200
- successful peer reviews and, 160, 162
- Traps, review, 162-164
- Troubleshooting review problems, 164-174
- Types
- of defects, 108-110, 122
- of peer reviews, 11-12, 31-43, 161
- of project reviews, 2
- Typo list, 75, 84-86, 100, 101, 117, 150, 199
- Underutilization of reviews, 20-22
- Unit testing, 15, 63-64
- Users, as review participants, 71, 72, 163-164
- User interface, analysis techniques for, 92
- User interface design, review participants, 71
- User manuals, review participants, 71
- Validation, 8
- Value of reviews, 20
- Verification
- Capability Maturity Model, Integrated (CMMI-SE/SW) and, 194, 195, 196, 197
- Capability Maturity Model, Systems Engineering (SE-CMM) and, 194
- Capability Maturity Model for Software (SW-CMM) and, 190, 192-193
- follow-up and, 34, 56, 114, 119-121
- inspection summary report and, 100
- inspections and, 72
- issue log and, 108
- preparation and, 87
- procedures and, 149
- quality control and, 8
- of rework, 56, 119-120
- successful programs and, 148
- work product appraisal and, 113
- Verifier, 46, 56, 100, 115, 119, 121, 172
- Verifying Implementation, 190, 192-193
- Version identifiers, 68
- Videoconferencing, 178, 179
- Videotape, Scenes of Software Inspections: Video Dramatizations for the Classroom, 155
- “Virtual” reviewing, 41
- Votta, Lawrence G., Jr., 91
- Walkthroughs, 3, 31, 35, 36-38, 46, 47, 53, 149, 151, 161, 182
- Web sites, xii, 199
- Weinberg, Gerald, 110
- The Psychology of Computer Programming, 14
- Weiss, D. M., 92
- Western Development Labs, Lockheed Martin, 12
- Wording, choosing, 16-17, 105-106
- Work aids for peer reviews, 75, 149, 199
- Work products
- appraisals of, 34, 53-54, 56, 100, 113-114, 120, 131, 168
- examining, 81-94
- large, 58, 63, 175-176
- of managers, 74
- reading, 103-105
- rules for, 75, 88
- selecting portions to inspect, 66-67, 86, 87, 168-169, 176
- size of, in choosing review type, 61
- types to review, 11-12
- version identifiers for, 68
- Yield. See Effectiveness, inspection
- Yourdon, Edward, 35