- No New Models, Please!
- How Much Automation Is Enough?
- Testing Process Spheres
- Support Activities
- A Test Automation Group's Scope and Objectives
- Test Automation Framework Deliverables
- Categories of Testing Tools
- Conclusion
- References
A Test Automation Group's Scope and Objectives
The Scope
A test automation group's purpose should be to develop automated support for testing efforts. The group should be responsible for the design and implementation of a data-driven automated testing framework, and it should design and construct suites of automated tests for regression testing. Figure 1.3 illustrates an automated testing infrastructure that CSST Technologies, Inc. designed for a well-known company.
FIGURE 1.3 A Sample Automated Test Infrastructure
The test automation framework should be deployed specifically to support the development and maintenance of automated test scripts at all levels of testing, from unit and integration testing through system/regression testing. Areas outside this scope can still take advantage of the framework and tool suites, but other departments interested in using the test automation scaffolding and the tool suite should fund and coordinate their own deployments with the automation team. The automation effort itself should stay focused on the identified areas of deployment.
The chosen approach should cover the test automation activities that the automated tools group will perform. Manual testing activities can serve as precursors to test automation: the goal of manual test efforts should be to exercise all application features and, in the process, to develop test conditions and data that can later be implemented through the automation framework for regression testing.
As an example, the data-driven approach could be implemented through structured test scripts that use functions and procedures stored in library files. The primary goal is to separate the test data from the test scripts; the secondary goal is to build a reusable test script component architecture. Meeting these goals substantially reduces the maintenance burden for automated test scripts.
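To make the separation concrete, the sketch below keeps the test data in a CSV-style source and drives it through a small library of reusable functions. The file layout, column names, and the FakeApp stand-in are hypothetical and serve only to keep the example self-contained; a real script would call the GUI automation tool's own API instead.

```python
# data_driven_login_example.py -- illustrative sketch only; the data layout,
# column names, and the FakeApp stand-in are hypothetical.
import csv
import io

# --- reusable library functions (in practice, kept in a shared library file) ---
def login(app, user, password):
    """Drive the AUT's login function and return the message it displays."""
    return app.submit_login(user, password)

def verify(test_id, actual, expected):
    """Compare actual and expected results and report pass/fail."""
    status = "PASS" if actual == expected else "FAIL"
    print(f"{test_id}: {status} (expected '{expected}', got '{actual}')")

# --- stand-in for the application under test so the sketch runs on its own ---
class FakeApp:
    def submit_login(self, user, password):
        return "Welcome" if (user, password) == ("admin", "secret") else "Access denied"

# --- the test data: in practice a separate file maintained by test designers ---
TEST_DATA = """test_id,user,password,expected
TC01,admin,secret,Welcome
TC02,admin,wrong,Access denied
"""

# --- driver: one structured script executes every data row ---
def run_login_tests(app, data_source):
    for row in csv.DictReader(data_source):
        actual = login(app, row["user"], row["password"])
        verify(row["test_id"], actual, row["expected"])

if __name__ == "__main__":
    run_login_tests(FakeApp(), io.StringIO(TEST_DATA))
```

Because the data rows, not the script, define the individual tests, new regression cases can be added by editing the data source without touching the script logic.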
Assumptions, Constraints, and Critical Success Factors for an Automated Testing Framework
Assumptions
The following assumptions form the basis of the test automation strategy and should be applied throughout the effort.
- An integrated tool suite must be the primary test management, planning, development, and implementation vehicle.
- The tool suite must be used to direct and control test execution, to store and retrieve test artifacts, and to capture, analyze, and report test results.
- The tool suite must include a tool of choice for defect tracking and resolution.
- The tool suite must include a component for test requirements management.
- The tool suite must include a configuration management tool of choice.
- The configuration management tool of choice must be used to place manual and automated test artifacts under configuration management.
- All of the tools described above must be integrated with desktop tools such as MS Office.
- The proper automated testing workspaces must be created on test servers that are separate from development servers.
- The required test engineer desktop configuration for script development must be defined and implemented.
- Testing standards must be documented and followed.
Constraints
These constraints limit the success of the automation effort if they are not heeded.
- The automated tools group's resources must remain independent of any manual testing group.
- There may not be enough available staff on the automation team.
- The software development group and its management may not cooperate sufficiently with respect to automated tool use.
- There may be a lack of cooperation and information exchange with developers in creating testable applications.
- The release schedules for major versions of the AUT, and for customer-specific releases, may be too tight.
- There is uncertainty associated with GUI updates in the AUT.
- Corporate mandates may dictate which tools must be used.
Critical Success Factors
We based the following critical success factors on a set of test automation guidelines developed by Nagle (7).
- Test automation must be implemented as a full-time effort, not a sideline.
- The test design process and the test automation framework must be developed as separate entities.
- The test framework must be application independent.
- The test framework must be easy to expand, maintain, and enhance.
- The test strategy/design vocabulary must be framework independent.
- The test strategy/design must hide the complexities of the test framework from testers (a minimal dispatcher sketch follows this list).
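One possible reading of the last two factors, using hypothetical keyword, function, and module names, is a thin dispatcher that maps a framework-independent design vocabulary onto framework-level library functions, so that test designers work only with the vocabulary:

```python
# keyword_dispatcher_sketch.py -- hypothetical names throughout; shows how a
# design vocabulary can stay independent of the framework that executes it.

# Framework-level library functions (testers never call these directly).
def _open_window(target, value):
    print(f"[framework] opening window '{target}'")

def _enter_text(target, value):
    print(f"[framework] typing '{value}' into field '{target}'")

def _check_value(target, value):
    print(f"[framework] verifying that '{target}' shows '{value}'")

# The design vocabulary: the only names test designers need to know.
VOCABULARY = {
    "OpenWindow":  _open_window,
    "EnterText":   _enter_text,
    "VerifyValue": _check_value,
}

def execute(step):
    """Run one test step expressed as (keyword, target, value)."""
    keyword, target, value = step
    VOCABULARY[keyword](target, value)   # framework details stay hidden behind the keyword

if __name__ == "__main__":
    # A test designer writes steps in the vocabulary, not in framework code.
    for step in [("OpenWindow", "Order Entry", ""),
                 ("EnterText", "Customer Name", "Acme Corp"),
                 ("VerifyValue", "Status", "Ready")]:
        execute(step)
```

If the underlying automation tool changes, only the library functions behind the vocabulary need to be rewritten; the test designs expressed in the vocabulary remain untouched.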
Strategic Objectives
These objectives are based on the critical success factors listed above.
- Implement a strategy that allows tests to be developed and executed both manually (initial test cycle) and via an automation framework (regression test cycles).
- Separate test design and test implementation so that test designers can concentrate on test requirements, test planning, and test case design while test implementers build and execute test scripts.
- Implement a testing framework that both technical and nontechnical testers can use.
- Employ a test strategy that ensures each test case includes the navigation and execution steps to perform, the input data to use, and the expected results, all in one row or record of the input data source (one possible record layout is shown after this list).
- Realize an integrated approach that applies the best features of keyword-driven testing, data-driven testing, and functional decomposition testing.
- Implement an application-independent test automation framework.
- Document and publish the framework.
- Develop automated build validation (smoke) tests for each release of the application.
- Develop automated environmental setup utility scripts for each release of the application.
- Develop automated regression tests for:
  - GUI objects and events
  - Application functions
  - Application special features
  - Application performance and scalability
  - Application reliability
  - Application compatibility
  - Database verification
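As an illustration of keeping each test case in a single record, one possible layout of the input data source (the column names and keywords are hypothetical and consistent with the dispatcher sketch above) is:

```
test_id  keyword      window       field          input_data  expected_result
TC_101   OpenWindow   Order Entry                             Order Entry window displayed
TC_102   EnterText    Order Entry  Customer Name  Acme Corp   Acme Corp accepted
TC_103   VerifyValue  Order Entry  Order Status               Ready
```

Each row carries everything the framework needs to navigate, execute, and verify one step, which is what allows the keyword-driven and data-driven techniques to share a single input source.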