Support Activities
Testing Is a Team Effort
Because software testing is done at the team level, it requires tools that support and enhance communication among team members and that present an integrated interface giving them a shared view of testing process activities and artifacts. One of the predominant problems at every stage of the testing process is artifact control and storage, and one of the areas that provides the greatest payback is automated configuration management of testing deliverables. Many documents and executables are created that must be available to all test team members, who frequently work on these artifacts in parallel. The artifacts must be protected from concurrent updates that could overwrite one another when more than one team member is working on the same deliverable, and there must be a central repository where the artifacts are stored for shared use.
We have worked on many testing projects where there was no central storage and everyone on the team created and updated documents on their local desktops. We created rules defining directories for specific deliverables and requiring everyone to place their work in these shared public directories. This solution was better than no solution, but it still did not provide versioning with check-out and check-in control of the documents.
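To make the problem concrete, here is a minimal sketch, in Python, of the check-out/check-in discipline such tools enforce. The shared directory path, lock-file convention, and function names are illustrative assumptions, not the behavior of any particular product:

# Illustrative sketch only: a crude check-out/check-in protocol using lock
# files in a shared directory. Real configuration management tools add
# versioning, history, and access control on top of this basic idea.
import shutil
from pathlib import Path

REPO = Path("//server/test-artifacts")   # hypothetical shared repository

def check_out(name: str, user: str, workdir: Path) -> Path:
    """Copy an artifact to a private work area, refusing if it is locked."""
    lock = REPO / f"{name}.lock"
    if lock.exists():
        raise RuntimeError(f"{name} is checked out by {lock.read_text()}")
    lock.write_text(user)                # take the lock before copying
    local = workdir / name
    shutil.copy2(REPO / name, local)
    return local

def check_in(name: str, user: str, workdir: Path) -> None:
    """Copy the edited artifact back and release the lock."""
    lock = REPO / f"{name}.lock"
    if not lock.exists() or lock.read_text() != user:
        raise RuntimeError(f"{name} is not checked out by {user}")
    shutil.copy2(workdir / name, REPO / name)
    lock.unlink()                        # release the lock

Because the lock is taken before the copy, a second team member attempting to check out the same deliverable fails immediately instead of silently overwriting the first member's changes at check-in time.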
First and foremost are testing management and requirements management. Testing management can be implemented with a tool such as MS Project, which supports task identification, resource management, and progress assessment through the setting of testing milestones. A requirements management tool is also essential: software requirements must be documented and updated throughout the development process, and test requirements must be documented and updated in parallel with development activities and during testing itself.
Our tool of choice for requirements management has been RequisitePro because it integrates software requirements gathering with test requirements specification. Furthermore, its test requirements grids can be exported to MS Project and then used to guide and monitor the test process. There are other requirements management tools available, some of which are integrated with testing tool suites. While this book is not about tool evaluation, there are two essential considerations when assessing these products. First, is the product already integrated with a testing tool suite? Second, if it is not, does it have an open application programming interface (API) that can be used to create your own integration code?
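As an illustration of what such integration code might look like, the following Python sketch pulls test requirements through a stand-in for a vendor API and writes a CSV file that a project-management tool such as MS Project could import as tasks. The get_test_requirements() function and the field layout are hypothetical, not any vendor's actual API:

# Illustrative bridge code: pull test requirements from a requirements
# management tool through a (hypothetical) open API and write a CSV that
# a project-management tool can import as tasks.
import csv

def get_test_requirements():
    """Stand-in for a vendor API call; returns (id, title, priority) rows."""
    return [
        ("TR-001", "Verify login with valid credentials", "High"),
        ("TR-002", "Verify lockout after failed logins", "Medium"),
    ]

def export_to_project(path: str) -> None:
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Task Name", "Notes", "Priority"])
        for req_id, title, priority in get_test_requirements():
            writer.writerow([f"{req_id}: {title}", "test requirement", priority])

export_to_project("test_requirements.csv")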
Software configuration management is next. Products are available that can implement configuration management of testing artifacts; they include MS Visual SourceSafe, Rational ClearCase, and Merant's PVCS, to name a few. It is imperative that all testing artifacts be stored in an automated configuration management database. It is just as important that the configuration management tool you choose communicate with the other tools you are using to support test process activities at all levels of testing. If it does not, it must offer an open API with which to build the software bridges you need.
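One possible shape for such a bridge, sketched in Python: after each check-in, record the artifact's new version in a manifest that the other test tools can read. The checkin() function below is a hypothetical stand-in for whatever API your configuration management tool actually exposes:

# Illustrative "software bridge": after each check-in, record the artifact
# and its new version in a shared manifest for other test tools to consume.
import json
from datetime import date

def checkin(artifact: str) -> int:
    """Stand-in for the CM tool's check-in API; returns the new version."""
    return 7

def bridge_checkin(artifact: str, manifest_path: str = "manifest.json") -> None:
    version = checkin(artifact)
    try:
        with open(manifest_path) as f:
            manifest = json.load(f)
    except FileNotFoundError:
        manifest = {}
    manifest[artifact] = {"version": version, "date": date.today().isoformat()}
    with open(manifest_path, "w") as f:
        json.dump(manifest, f, indent=2)

bridge_checkin("system_test_plan.doc")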
Software testing metrics are a necessary component of test evaluation reporting. The metrics should include defect metrics, coverage metrics, and quality metrics. There are many useful defect tracking measures. Defect metrics can be broadly categorized as defect density metrics and defect aging metrics: the number of daily/weekly opens/closes, the number of defects associated with specific software/test requirements, the number of defects per application object/class, the number of defects associated with specific types of tests, and so on. Defect reporting should be automated, at a minimum with an Excel workbook, because Excel can summarize spreadsheet data in charts and graphs. Defect reporting can also be automated through tools such as Rational ClearQuest, among others.
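As one way to automate this kind of reporting, the following Python sketch tallies weekly defect opens and closes and writes them to an Excel workbook with a chart, using the third-party openpyxl library. The in-memory defect list stands in for your actual defect-tracking data source:

# Minimal sketch: tally weekly defect opens/closes and write them to an
# Excel workbook with a chart (third-party openpyxl library required).
from collections import Counter
from openpyxl import Workbook
from openpyxl.chart import BarChart, Reference

defects = [  # (defect id, week opened, week closed or None)
    ("D-101", 1, 2), ("D-102", 1, None), ("D-103", 2, 3), ("D-104", 2, 2),
]

opened = Counter(week for _, week, _ in defects)
closed = Counter(week for _, _, week in defects if week is not None)

wb = Workbook()
ws = wb.active
ws.append(["Week", "Opened", "Closed"])
for week in sorted(opened | closed):
    ws.append([week, opened[week], closed[week]])

chart = BarChart()
chart.title = "Weekly defect opens and closes"
data = Reference(ws, min_col=2, min_row=1, max_col=3, max_row=ws.max_row)
chart.add_data(data, titles_from_data=True)
chart.set_categories(Reference(ws, min_col=1, min_row=2, max_row=ws.max_row))
ws.add_chart(chart, "E2")
wb.save("defect_metrics.xlsx")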
Testing quality metrics are special types of defect metrics. They include (8):
Current state of the defect (open, being fixed, closed, etc.)
Priority of the defect (the importance of resolving it)
Severity of the defect (the impact on end users, the organization, third parties, etc.)
Defect source (the originating fault that results in the defect; that is, the component that needs to be fixed)
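A minimal sketch of a defect record carrying these attributes might look like the following Python data structure. The field names and enumeration values are assumptions for illustration, not a standard schema:

# Illustrative defect record for the quality metrics listed above.
from dataclasses import dataclass
from enum import Enum

class State(Enum):
    OPEN = "open"
    BEING_FIXED = "being fixed"
    CLOSED = "closed"

class Severity(Enum):
    LOW = 1      # cosmetic impact only
    MEDIUM = 2   # degraded function, workaround exists
    HIGH = 3     # end user cannot complete a task

@dataclass
class Defect:
    defect_id: str
    state: State
    priority: int        # importance of resolving it (1 = highest)
    severity: Severity   # impact on end users, the organization, third parties
    source: str          # originating fault: the component that needs fixing

bug = Defect("D-105", State.OPEN, priority=1, severity=Severity.HIGH,
             source="login module")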
Coverage metrics indicate the completeness of the testing that has been implemented. They should include both requirements-based and code-based coverage measures. For examples of these metrics, see Chapter 9 in reference (6) and the Concepts section under "Key Measures of Testing" in reference (10).
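For instance, a requirements-based coverage measure can be as simple as the fraction of test requirements that have at least one implemented test, as in this Python sketch (the requirement-to-test mapping is illustrative data, not tool output):

# Minimal sketch of a requirements-based coverage measure: the fraction of
# test requirements that have at least one implemented test case.
tests_per_requirement = {
    "TR-001": ["tc_login_ok"],
    "TR-002": ["tc_lockout", "tc_lockout_reset"],
    "TR-003": [],   # no test implemented yet
}

covered = sum(1 for teststs in [tests_per_requirement] for tests in tests_per_requirement.values() if tests)
coverage = covered / len(tests_per_requirement)
print(f"Requirements coverage: {coverage:.0%}")   # Requirements coverage: 67%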