HP Software customer perspective: using HP TestDirector for Quality Center software to report and resolve software defects

White paper
Table of contents

About the author
Introduction
Learning from experience
Customized software quality defect-reporting and resolution process
  Defect reporting
  Defect-resolution meeting
  Defect fix
  Retest defects
  Defect-status definitions
  Defect-status modifications—status modifications by user
Configuring new processes into test automation tools
About the author
During Punky McLemore's tenure as a quality assurance (QA) testing manager at HP, she worked with HP Software's legacy Mercury solutions prior to the Mercury acquisition. Before accepting her position at HP, McLemore was a QA testing manager at a large U.S. financial institution, where she was responsible for testing all new applications for this multi-billion-dollar company. The company’s QA staff employed up to 80 testing specialists and 120 HP TestDirector for Quality Center software users. McLemore has 16 years of experience in QA testing and has used HP Software (legacy Mercury) solutions for seven years.
Introduction
In this white paper, I will share my unique viewpoint and my experience using HP TestDirector for Quality Center to support a variety of large-scale projects. I will discuss the setup and management of the processes, and how easily the processes can be changed and maintained using this HP software tool. I will also provide you with some of the best practices that I accumulated as the project lead for testing my company's new automated foreign exchange system: a three-year project during which we relied extensively upon HP TestDirector for Quality Center.
Learning from experience
HP TestDirector for Quality Center is an extremely versatile product in all areas—especially for tracking issues and discrepancies, and for managing defect-resolution processes. I will share a set of best practices that you can use to design or streamline your own defect-tracking and resolution processes.
One of my projects included creating and running a complex defect-tracking and resolution process for the company’s newest application—an automated foreign currency-exchange system. Previously, the company’s financial managers had conducted currency exchanges by phone. With the new system, brokers would be able to trade currency electronically in real time. Because we had to electronically mimic a manual process, we studied the manual system to gather the majority of our requirements. We interviewed brokers, videotaped traders in action, and held many months of requirements-gathering meetings to examine and identify the business's needs.

The system had to be very accurate to avoid misaligned deals, especially since there was the potential to lose hundreds of millions of dollars in a few keystrokes. My team did not have a current electronic system on which to base its expectations, so quality targets were vague. Communication among project group members was key to our success. The project was complex—involving people from all over the world—and testing was expected to take place 24 hours a day, seven days a week.

The IT group also sought to automate the manual processes used in the development cycle to meet the project's challenges. My team needed a way to communicate issues, defects, resolutions and scheduling, as well as to track activity. Likewise, the project team needed a system that would be highly reliable and easily customizable, and it needed to be able to support users all over the world. The system would have to be easy to use, since the users were accustomed to manual processes and were not familiar with automated solutions. The brokers were also not used to completing transactions via computer.
After a few false starts with other automation tools, we discovered HP software. The QA department decided to try HP TestDirector for Quality Center as the foundation for all of its testing processes and found that it offered:
• High availability: HP TestDirector for Quality Center did not crash.
• Good performance: It was fast enough for long-distance users from Asia, Europe and Australia.
• Usability: It was easy to customize and use.
After a favorable experience with HP WinRunner software, I decided to adopt HP TestDirector for Quality Center for full functional and regression testing. The company used HP TestDirector for Quality Center to model all of its processes for defect resolution and release scheduling. We defined everything in HP TestDirector for Quality Center along application lines—all of the enterprise’s applications were referred to as “projects.” The currency-exchange project was launched; it would ultimately take approximately three years to complete, and it would deliver outstanding results.
We recorded, tracked and resolved all discrepancies, deficiencies or other abnormal system behavior during the test phases of the software development lifecycle (SDLC) according to the company’s newly defined problem-reporting procedures. In HP TestDirector for Quality Center, a discrepancy is defined as “an unexpected event that occurs while testing or when the expected result does not equal the actual result.” A discrepancy could be a defect, a request for an enhancement or new functionality, or any issue that needed to be corrected. We used the HP TestDirector for Quality Center “Defects” tab to report defects. The defect-resolution system tracked every issue and discrepancy in the project—from system, hardware and software issues related to the application to process issues such as who was responsible for each step, how to obtain equipment, etc. I defined detailed steps for the company’s new defect-reporting and resolution process (Figure 1).

Customized software quality defect-reporting and resolution process
It is important to first define all processes. Only with thorough planning can a defect-reporting and resolution process provide valuable information and streamline the processes for IT. In the following sections, I will describe how you can create reporting processes, define defect security levels, record defect status and track all modifications.
Figure 1. Software Quality Assessment (SQA) defect process flowchart

[Flowchart summary: a discrepancy is identified and logged in HP TestDirector using the defect-documentation standard (STATUS = NEW). The defect-resolution meeting reviews new, open and reopened defects, adjusts severity if needed, decides whether the behavior is accepted or a requirement must change, and assigns the defect to the environment, development or test group (STATUS = OPEN). Developers fix the code and unit test it (STATUS = FIXED), or the fix is scheduled for a later release; fixed defects are deployed in a build, smoke tested, and set to STATUS = READY TO RETEST. A passing retest closes the defect; a failing retest sets STATUS = REOPEN and returns the defect to development, with any new defects logged and regression tests performed.]
Defect reporting
During each phase of the SDLC, we reported all discrepancies and issues relating to a project in HP TestDirector for Quality Center. When a defect was opened, my team would follow the process described below.
1. A release team member would log a defect report in HP TestDirector for Quality Center as soon as enough information was gathered to recreate or explain the situation. The default status was set to “New.” I encouraged everyone on the team to log defects as soon as possible to allow optimum time for investigation and repair.
2. The person who logged the defect assigned an initial severity to the defect based on how it affected the application and/or the test process itself. Listed below are the criteria we used for defining the defect's severity (also modeled in the sketch that follows this list).
   a. Show stopper: This level of defect prevented the team from proceeding in an area that would affect deployment schedules. No workaround was available, and the team required an immediate solution.
   b. High: This level of severity was the same as “show stopper,” except that a workaround existed and the team could wait for the defect to be fixed until the next scheduled build.
   c. Medium: The team needed the defect to be fixed as soon as possible prior to deployment.
   d. Low: The team could wait for the defect to be fixed in a subsequent release (a later project).
   e. Enhancement: A defect that could be fixed at any time.
3. The team used a defect-writing standard to include all pertinent information. A team member then incorporated the specifics of the discrepancy, including print screens, associated test cases, selection criteria and any other critical information to help resolve the issue.
4. The team used the “Assign To” field to indicate who would be the likely choice to conduct an initial investigation, and the appropriate group leader was assigned to the defect. This fast-track assignment allowed defects to be reviewed and possibly resolved prior to the defect-resolution meeting described in the next section. If the team could not decide on the resolving group, the defect was assigned to the project manager.
5. If someone other than a release team member identified a discrepancy, that individual forwarded the specifics to the test lead so the data could be logged into HP TestDirector for Quality Center.
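To make the structure of this process concrete, the following minimal Python sketch models the defect record, the severity scale and the initial logging step described above. It is a hypothetical illustration, not the HP TestDirector for Quality Center data model or API; all names (Severity, Defect, log_defect) are invented for this example.

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum

class Severity(Enum):
    """Severity levels as defined in step 2 above."""
    SHOW_STOPPER = 1   # no workaround; blocks deployment schedules
    HIGH = 2           # workaround exists; fix by the next scheduled build
    MEDIUM = 3         # fix as soon as possible, prior to deployment
    LOW = 4            # can wait for a subsequent release
    ENHANCEMENT = 5    # can be fixed at any time

@dataclass
class Defect:
    """Hypothetical defect record mirroring the fields the process relied on."""
    summary: str
    description: str
    severity: Severity
    assigned_to: str                  # "Assign To": likely initial investigator
    status: str = "New"               # default status when the defect is logged
    comments: list[str] = field(default_factory=list)
    logged_at: datetime = field(default_factory=datetime.now)

def log_defect(summary: str, description: str,
               severity: Severity, group_leader: str) -> Defect:
    """Log a defect as soon as there is enough detail to recreate the issue."""
    return Defect(summary, description, severity, assigned_to=group_leader)

# Example: a show stopper found during testing, assigned to a group leader.
defect = log_defect("Spot trade rejected", "Trade entry fails on submit",
                    Severity.SHOW_STOPPER, "development lead")
```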
Defect-resolution meeting
We had wasted time during the testing phase when defects went undetected or were not acted upon in an efficient manner, so we decided to discuss defects in a periodic meeting with the test lead and representatives from the development, project management, product management (user acceptance) and business groups. During these defect-resolution meetings, we reviewed and resolved defects efficiently. The highest number of defects in the system approached 5,000, and we handled 25 to 50 defects per defect meeting.

The test lead updated HP TestDirector for Quality Center during the meeting, which centered on a review of each defect report. Each report was prepared in HP TestDirector for Quality Center and contained all information needed to determine the severity of the defect and to whom it should be assigned. The defect reports were sorted by severity and status so that high-profile and new bugs would be addressed first. “New” defects were first reviewed for the criteria listed below.
• Summary and description: If a defect lacked the information needed to adequately investigate the discrepancy, we assigned it back to the individual who found the defect to provide more information. The defect maintained “New” status. The individual who found the defect then provided the necessary information by updating the defect in HP TestDirector for Quality Center. Next, the defect was assigned back to the project manager via the test lead so it could follow the process for “New” defects.
• Severity: We reviewed a defect's severity for accuracy and required the team's consensus.
• Assign to: We assigned the defect to the appropriate team (environment, developer or business analyst) for investigation and resolution. Note: Whenever an assignment was changed, HP TestDirector for Quality Center was set up to send an e-mail to the assignee, so the assignee immediately knew if a defect required attention.
• Status: We changed the defect's status from “New” to “Open.”

We set up HP TestDirector for Quality Center to send e-mail to the assignee each time a defect was assigned, and to send an e-mail to the test lead each time a defect changed status. The project manager and test lead also received an e-mail each time a new defect was entered. The test manager received an e-mail each time status was changed to “User Error” or “Works as designed”; the test manager then used this information for mentoring. The development manager received an e-mail for each status changed to “Reopen.”
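The e-mail rules above amounted to a small routing table. The sketch below restates them as data in Python, for illustration only; the event and role names are hypothetical, and the actual notifications were configured inside HP TestDirector for Quality Center rather than in external code.

```python
# A declarative restatement of the e-mail rules described above. The event and
# role names are invented for this sketch.
NOTIFICATION_RULES = [
    {"event": "assignment_changed",           "notify": ["assignee"]},
    {"event": "status_changed",               "notify": ["test_lead"]},
    {"event": "defect_created",               "notify": ["project_manager", "test_lead"]},
    {"event": "status_set:User Error",        "notify": ["test_manager"]},   # used for mentoring
    {"event": "status_set:Works as designed", "notify": ["test_manager"]},
    {"event": "status_set:Reopen",            "notify": ["development_manager"]},
]

def recipients_for(event: str) -> list[str]:
    """Return every role that should receive an e-mail for a given event."""
    return [role for rule in NOTIFICATION_RULES if rule["event"] == event
            for role in rule["notify"]]

# Example: who gets mail when a defect is reopened.
print(recipients_for("status_set:Reopen"))   # ['development_manager']
```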
You can easily customize HP TestDirector for Quality Center so that these changes can be made by a non-programmer; my team often made changes to support process-improvement initiatives. One tester used 50 percent of his time to administer HP TestDirector for Quality Center and make all of the experimental and real changes. This is one of the most powerful reasons to use this solution.

Next, my team would review any “Open” defects, and this is where the “Comments” became useful. At each stage of defect resolution from “New” to “Closed,” we entered comments to document the defect’s journey. The “Comments” revealed the why, how and when a defect would be resolved. I would then make appropriate assignments to move the defect’s resolution along. If the system’s behavior was acceptable and the systems functioned as expected, we updated the defect with the acceptance information and assigned it to the test analyst, who set the defect status to “Closed.”

My team then reviewed what was entered in a user-defined field called “Cause.” This enabled us to examine the root cause of the defect, as well as to pull reports at a later time for process-improvement purposes. We also had the option to review any disputed “Closed” defects.

Defect fix
Because we held defect-resolution meetings, we assigned defects to the appropriate lead from one of the resolving groups: development, environment support or the business group (the routing is sketched after this list).
1. We assigned the defect to the business group if there were any questions regarding a defect's requirements. At this point, we made any necessary requirement changes.
2. We assigned the defect to the development lead if we suspected there was a bug in the code, if any code changes were needed, or if it was identified as an enhancement for a later release of the application. This person then assigned the defect to a developer. When the developer fixed the code, the developer changed the defect status to “Fixed.” The fixed defects made their way into build packages and were handed off to the test group. HP TestDirector for Quality Center produced a report that reflected all defects fixed in a build. When we handed off a build, the test lead set the status of the “Fixed” defects to “Ready to Retest” and assigned the defect to a tester for retesting.
   Note: A strength of HP TestDirector for Quality Center is its ability to define attributes for defects so as to support various processes. One of these is the “Code Fix” process. When configuring HP TestDirector for Quality Center, the code fix team took into account the attributes needed to track the defect as it went through the fixing process: setting developer priorities, keeping fix-time statistics, etc. The same procedure was applied by all the other groups. The team controlled the quality of the information for each defect by a system of attribute ownership.
3. We assigned the defect to the environment support group when configuration and setup errors were identified. These discrepancies usually surfaced when we ran a “smoke test.” (Smoke testing is non-exhaustive software testing that determines whether the most crucial functions of a program work, without disturbing the finer details of the program. The term was derived from hardware testing, in which the device passed the test if it didn’t catch fire the first time it was turned on.) The environment support group fixed the problem, set the status of the defect to “Ready to Retest” and then assigned the defect back to the test lead for retest and subsequent closure.
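For illustration, the three routing rules above can be collapsed into a small dispatch function. This is a hypothetical sketch of the decision, not functionality of HP TestDirector for Quality Center; the cause labels are invented for this example.

```python
def resolving_group(cause: str) -> str:
    """Route a defect to a resolving group following rules 1-3 above.

    The cause labels here are hypothetical; in practice the team recorded the
    cause in a user-defined "Cause" field in HP TestDirector for Quality Center.
    """
    if cause in ("requirements question", "requirement change"):
        return "business group"
    if cause in ("code bug", "code change needed", "enhancement"):
        return "development group"          # the development lead then assigns a developer
    if cause in ("configuration error", "setup error"):
        return "environment support group"  # typically surfaced by the smoke test
    return "project manager"                # undecided defects went to the project manager
```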
Figure 2. Defect R&D comments field

Defect Status | Goals of the ITIL process
New | By default, the status of the new defect was “New” when added by a test analyst.
Open | The test lead changed the status from “New” to “Open” and updated the R&D comments field within an acceptable timeframe (dependent upon severity), and the defects were assigned to the appropriate teams.
Fixed | Developers updated the status from “Open” to “Fixed” when the defect was fixed and unit tested. The turnaround time for defect fixes was driven by the defect severity.
Ready to Retest | Project/product/release managers updated the status to “Ready to Retest” when the defect had been fixed, system tested and identified as being included in a build.
On Hold | Test analysts updated the status from “Ready to Retest” to “On Hold” when the defect could not be tested in the build because another defect prevented the retest execution.
Closed | Test analysts updated the status from “Ready to Retest” to “Closed” after retesting on the new build of the application.
Reopen | Test analysts updated the status from “Ready to Retest” to “Reopen” when the defect retest failed. Test analysts updated the status from “Closed” to “Reopen” when a previously closed defect reappeared in the course of testing.
Retest defects
We retested and immediately closed any defects we fixed without changing code. We included defect fixes requiring code changes in a build package, retested them, and either closed or reopened them depending on the outcome of the test. All of this information was handled very efficiently in HP TestDirector for Quality Center (the close-or-reopen outcome is sketched after this list):
1. Our test analyst reviewed any setup information and the “Ready for Retest” defects for retesting in the build. We also identified any “Test Pending” defects and reviewed them for retest consideration in the cycle.
2. Our test analyst then updated the status to “Ready for UAT” (User Acceptance Testing) when a defect detected in UAT/production had been fixed and tested by the integration test analysts.
3. Next, our test analyst retested the defect and updated its status appropriately.
4. If the defect was not resolved, the test analyst updated the status to “Reopen,” and the process resumed at Step 2.
5. Our test leads provided us with the status of all unresolved defects on a daily basis.
6. After the conclusion of each release, during the post-release phase, our test lead scanned the defect statuses for defects that were not yet resolved, to address them during the next release.
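Assuming the hypothetical Defect record from the earlier sketch, the close-or-reopen decision at the heart of steps 3 and 4 can be expressed as follows. This is only an illustration of our process rule, not of how HP TestDirector for Quality Center implements retesting.

```python
def retest(defect, retest_passed: bool, tester: str):
    """Close or reopen a 'Ready to Retest' defect based on the retest outcome."""
    if defect.status != "Ready to Retest":
        raise ValueError("only defects marked 'Ready to Retest' are retested")
    defect.assigned_to = tester
    if retest_passed:
        defect.status = "Closed"
        defect.comments.append("Retest passed on the new build; defect closed.")
    else:
        defect.status = "Reopen"
        defect.assigned_to = "development lead"   # failed retests went back to development
        defect.comments.append("Retest failed; defect reopened and reassigned.")
    return defect
```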
Defect-status definitions
Each defect had a status that we updated throughout the test cycle (Figure 2). Every time we updated the defect status, we added comments into the defect R&D comments field.

Defect-status modifications—status modifications by user
The table below (Figure 3) is a simplified version of the status changes allowed by HP TestDirector for Quality Center, which provides an easy learning curve and enforces a preferred workflow. Listed below are some of the rules that applied to status changes, also shown in Figure 3.
• Any changes to the system were made according to appropriate change control procedures.
• As necessary, the failed test or sections of the test were re-executed after being corrected.
• At any time during the process, our project managers, system analysts, developers and test staff could use HP TestDirector for Quality Center to retrieve the status of all defects stored in the HP TestDirector for Quality Center repository.
• The reporting capabilities of HP TestDirector for Quality Center provided a way to monitor the overall progress of defect-resolution.

Figure 3. Status changes allowed by HP TestDirector for Quality Center
User | Change from | Status to
Test analyst | Initial Entry | New
Test analyst | New | Open
Test analyst | New | Closed
Test analyst | Open | Closed
Test analyst | Ready to Retest | Test Pending
Test analyst | Ready to Retest | Reopen
Test analyst | Ready to Retest | Closed
Test analyst | Ready to UAT | Ready to UAT
Project/Product/Release managers | New | Open
Project/Product/Release managers | Open | CR Created
Project/Product/Release managers | Fixed | Ready to Retest
User acceptance analyst | Initial Entry | New
User acceptance analyst | Ready to UAT | Reopen
User acceptance analyst | Ready to UAT | Closed
Developer | Open | Fixed
Developer | Reopen | Fixed
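Assuming the reconstruction of the table above is accurate, the allowed transitions can be expressed as a simple role-based lookup. The sketch below is illustrative only; in practice HP TestDirector for Quality Center enforced the workflow through its own customization, not through external code.

```python
# Allowed status transitions per user role, transcribed from the table above.
ALLOWED_TRANSITIONS = {
    "Test analyst": {
        ("Initial Entry", "New"), ("New", "Open"), ("New", "Closed"),
        ("Open", "Closed"), ("Ready to Retest", "Test Pending"),
        ("Ready to Retest", "Reopen"), ("Ready to Retest", "Closed"),
        ("Ready to UAT", "Ready to UAT"),   # listed as-is in the source table
    },
    "Project/Product/Release managers": {
        ("New", "Open"), ("Open", "CR Created"), ("Fixed", "Ready to Retest"),
    },
    "User acceptance analyst": {
        ("Initial Entry", "New"), ("Ready to UAT", "Reopen"),
        ("Ready to UAT", "Closed"),
    },
    "Developer": {
        ("Open", "Fixed"), ("Reopen", "Fixed"),
    },
}

def can_change(role: str, current: str, new: str) -> bool:
    """True if the given role may move a defect from `current` to `new`."""
    return (current, new) in ALLOWED_TRANSITIONS.get(role, set())

# Example: a developer may mark an open defect as fixed, but may not close it.
assert can_change("Developer", "Open", "Fixed")
assert not can_change("Developer", "Open", "Closed")
```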
Configuring new processes into test automation tools
All our processes evolved over time. It took us more than one year to transition from a chaotic process to a high-performance team that had:
• Documentation
• Repeatable, measurable results
• A program for improvement
HP TestDirector for Quality Center was pivotal in our success because it enabled us to model and remodel as we defined and redefined our processes.

When we realized that HP TestDirector for Quality Center was successfully helping us to build the defect-resolution process, we decided to tackle automation. To prepare for this next step, we decided to use the “Test Plan” in HP TestDirector for Quality Center as a repository for our manual tests, as well as the tool for managing and tracking all of our efforts. We used the concepts we learned in HP TestDirector for Quality Center to build robust test plans, as well as test sets that included manual tests and automated scripts that we originally built in HP WinRunner.
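As a rough illustration of how manual tests and automated HP WinRunner scripts can coexist in one test set, the following hypothetical Python sketch models that structure. It is a conceptual model only, not the Test Plan module's actual data model; every name and path in it is invented.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TestCase:
    """A hypothetical test entry; automated tests point at a script asset
    (in our case, scripts originally built in HP WinRunner)."""
    name: str
    automated: bool = False
    script_path: Optional[str] = None     # only meaningful for automated tests

@dataclass
class TestSet:
    """A mixed test set like the ones we managed alongside the test plan."""
    name: str
    tests: list[TestCase] = field(default_factory=list)

    def automated_tests(self) -> list[TestCase]:
        return [t for t in self.tests if t.automated]

# Example: one test set mixing a manual case with an automated script.
regression = TestSet("Currency-exchange regression", [
    TestCase("Broker enters a spot trade"),                        # manual
    TestCase("Rate feed reconnects", True, "scripts/rate_feed"),   # automated (hypothetical path)
])
```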
To learn more, visit: www.hp.com/go/software

© Copyright 2007 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice. The only warranties for HP products and services are set forth in the express warranty statements accompanying such products and services. Nothing herein should be construed as constituting an additional warranty. HP shall not be liable for technical or editorial errors or omissions contained herein.

4AA1-4226ENW, August 2007
Contact information To find an HP Software sales office or reseller near you, visit www.managementsoftware.hp.com/buy.