Implementing the Framework
We started by recording a simple performance test script and parameterizing the data that was entered into the application. This was not as trivial as we had imagined. When reading the raw HTTP in the recorded script, we had a difficult time figuring out which data was going where. We eventually worked it out and documented as much as we could along the way in case we had to run through the process again (which we did).
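The sketch below captures the flavor of that parameterization. We can't reproduce the tool's generated script here, so this is a minimal stand-in in Python using the `requests` library; the URL, form field names, and data file are all hypothetical, and the real recorded request carried many more fields than this.

```python
import csv
import requests

# Hypothetical endpoint, a stand-in for the recorded HTTP traffic.
SUBMIT_URL = "http://testserver/app/transactions"

def submit_transaction(row):
    """Replay the recorded POST, substituting the captured data with test data.

    Mapping each data column to the right form field was the hard part:
    the recorded request carried dozens of fields, only some of which
    actually varied from one transaction to the next.
    """
    payload = {
        "acct_no": row["account"],   # parameterized per transaction
        "txn_amt": row["amount"],    # parameterized per transaction
        "txn_type": "DEPOSIT",       # constant in every recording we made
    }
    return requests.post(SUBMIT_URL, data=payload, timeout=30)

with open("transactions.csv", newline="") as f:
    for row in csv.DictReader(f):
        response = submit_transaction(row)
        response.raise_for_status()
```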
We then wrote a function to convert the production data into a format friendlier to the tool we were using. We initially struggled with this step, but because the tool was built on an actual development language (not a proprietary scripting language), we were able to recruit developer help for some of the trickier code. Our first attempt had the script read straight from the flat file, but we had trouble coordinating data access across all of the virtual users.
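In rough terms, the solution looked like the sketch below: convert the production extract once up front, then feed the converted rows to the virtual users through a single shared, thread-safe queue so that no two users ever consume the same transaction. The pipe-delimited layout, file names, and user count are assumptions for illustration, not the actual production format.

```python
import csv
import queue
import threading

# One shared queue replaces per-user reads of the flat file; queue.Queue
# is thread-safe, so each row is handed to exactly one virtual user.
work = queue.Queue()

def convert_production_extract(src_path, dest_path):
    """Convert the production extract (assumed pipe-delimited here)
    into the comma-separated layout the tool expected."""
    with open(src_path) as src, open(dest_path, "w", newline="") as dest:
        writer = csv.writer(dest)
        writer.writerow(["account", "amount", "posted_date"])
        for line in src:
            account, amount, posted = line.rstrip("\n").split("|")[:3]
            writer.writerow([account, amount.strip(), posted])

def submit_transaction(row):
    # Stand-in for the HTTP replay shown in the earlier sketch.
    print("submitting", row["account"], row["amount"])

def virtual_user():
    """Each simulated user drains rows until the queue is empty."""
    while True:
        try:
            row = work.get_nowait()
        except queue.Empty:
            return
        submit_transaction(row)
        work.task_done()

convert_production_extract("prod_extract.txt", "transactions.csv")
with open("transactions.csv", newline="") as f:
    for row in csv.DictReader(f):
        work.put(row)

threads = [threading.Thread(target=virtual_user) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```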
When we got the initial prototype working, not only were we able to enter the transactions, but we also uncovered a load limitation in the web application. Otherwise, we wouldn't have found this limit until much later in the project, since actual performance testing was not scheduled until the next iteration. When all was said and done, we were able to automate the highest-risk aspect of our parallel testing: verification of the calculations performed on the transaction data. The less risky aspects of the parallel testing (such as GUI rules) were then executed manually. We processed about 400 transactions (with only a couple of errors in our data translation) in about 20 minutes, compared to an estimated minimum of 8 hours without the performance testing tool.
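The verification step itself amounted to recomputing each calculation independently and comparing it against what the application stored. The sketch below shows the shape of that check; the fee rule, the results export, and its column names are hypothetical examples, since the real rules came from the legacy system's specification.

```python
import csv
from decimal import Decimal

def expected_fee(amount):
    """Independent reimplementation of one calculation under test
    (a made-up 1.5% fee rule, standing in for the real business rules)."""
    return (Decimal(amount) * Decimal("0.015")).quantize(Decimal("0.01"))

mismatches = []
with open("app_results.csv", newline="") as f:  # hypothetical export from the web app
    for row in csv.DictReader(f):
        if Decimal(row["fee"]) != expected_fee(row["amount"]):
            mismatches.append(row["txn_id"])

print(f"{len(mismatches)} mismatched transactions: {mismatches[:10]}")
```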