Software Test Automation Success - Michael Snyman, Nedbank, South Africa WWW.EUROSTARCONFERENCES.COM
Europe’s Premier Software Testing Event, Stockholmsmässan, Sweden. “Testing For Real, Testing For Now”. Software Test Automation Success. Michael Snyman, Nedbank, South Africa. WWW.EUROSTARCONFERENCES.COM
Software test automation success: a case study. Mike Snyman
Case study Who: Large financial institution Where: Johannesburg When: 2006 to 2008 What: Adoption of test automation with the aim of reducing cost and increasing productivity
Contents 1. The challenges facing the organisation 2. The approach to solving the problem 3. How we look at and classify automation tools, and why 4. Our automation process 5. Our automation framework 6. Contemporary automation issues 7. Automation benefit calculations 8. What we have learned
Introduction “On the island of testers they were forever doomed to search for what should not and could not exist, knowing to succeed would bring misery to the gods.” (Lessons Learned in Software Testing, Cem Kaner, James Bach, Bret Pettichord) “A fool with a tool is still a fool.”
Challenges 1. Lack of an end-to-end defined automation process. 2. Insufficient re-use of tools, scenarios and data. 3. Automation involved late in the process. 4. No living documentation. 5. Slow uptake of test automation and shelfware. 6. No means of proving automation benefits and cost justification. 7. Dependency on key resources.
Solving the problem 1. Formal organisational project launched. 2. Project had to be based on a sound financial footing. 3. Funding for the project approved at the highest level. 4. Failure not an option. 5. Resourced with capable and skilled people. 6. Driven by a passionate person.
Project testing 1. Formal project launched in May 2006. 2. Project had a significant budget. 3. A detailed business case was documented to facilitate return-on-investment tracking. 4. Steering committee tracked progress on a monthly basis. 5. Macro project broken down into workgroups. 6. Dedicated workgroup responsible for communication and change management.
Project testing workstreams • Test process workstream. • Test governance workstream. • Test mechanisation and tools workstream. • Test repository workstream. • Test environment workstream. • Test change management, communication and training workstream.
Test automation workstream • Automation process. • Automation process integrated with SDLC. • Tool selection, procurement and training. • Automation framework. • Tool classification. • Quantification of automation benefits.
Automation process: Analysis and design → Scripting and configuration → Parameter definition → Parameter management → Scenario collection → Validation → Testing of scripts → Script execution → Review of results → Result communication.
Analysis and design phase • Understand the client’s test requirements. • An automated solution is proposed. • Prototyping.
Scripting/configuration phase • User requirements implemented. • Activities in this phase could include recording, scripting and building of special utilities. • Internal testing of the automated test solution.
Parameter identification • Scripts assessed against user-defined scenarios. • Components identified for parameterisation. • Components categorised based on nature and function.
Parameter management • Large amounts of data created in the previous step are managed. • Customised spreadsheets are created. • A requirement exists that data in these sheets can be maintained by non-technical personnel. • All scenarios described in both technical and non-technical terms.
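To make the idea of spreadsheet-maintained parameters concrete, here is a minimal sketch (not the case study's actual implementation) of a loader that reads scenario parameters from a CSV export of such a sheet. The file name and column headings are illustrative assumptions.

```python
# Minimal sketch of a parameter-sheet loader, assuming scenarios are
# exported from the custom spreadsheet as a CSV file with one row per
# scenario. Column names ("scenario_id", "description", ...) are
# illustrative, not the actual sheet layout used in the case study.
import csv
from typing import Iterator

def load_scenarios(path: str) -> Iterator[dict]:
    """Yield one parameter dictionary per scenario row."""
    with open(path, newline="", encoding="utf-8") as sheet:
        for row in csv.DictReader(sheet):
            # Keep both the technical values and the plain-language
            # description so non-technical staff can maintain the sheet.
            yield {
                "id": row["scenario_id"],
                "description": row["description"],
                "inputs": {k: v for k, v in row.items()
                           if k.startswith("input_")},
                "expected": row["expected_result"],
            }

if __name__ == "__main__":
    for scenario in load_scenarios("scenarios.csv"):
        print(scenario["id"], "-", scenario["description"])
```

Keeping the plain-language description alongside the technical values is what allows non-technical staff to own and maintain the sheet.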
Scenario collection • Spreadsheet populated with scenarios provided by stakeholders of the system. • Manual test cases could provide input. • Actual production outages used as input.
Validation • Automated validation of results important due to volume. • Spreadsheet updated with expected results. • Validation done by script using information from the spreadsheet. • Clear “Pass” or “Fail” indicators provided. • Summary of results provided.
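A hedged sketch of the validation step described above: compare each actual result with the expected result captured in the spreadsheet, attach a clear Pass or Fail verdict, and summarise. The data structures are assumptions, not the real tool's format.

```python
# Illustrative validation step, assuming each executed scenario produces
# an "actual" value that can be compared to the "expected" value recorded
# in the scenario spreadsheet. The pass/fail wording mirrors the slide;
# the data structures are assumptions.
def validate(results: list[dict]) -> dict:
    """Mark each scenario Pass/Fail and return a summary."""
    summary = {"Pass": 0, "Fail": 0}
    for result in results:
        verdict = "Pass" if result["actual"] == result["expected"] else "Fail"
        result["verdict"] = verdict
        summary[verdict] += 1
    return summary

if __name__ == "__main__":
    demo = [
        {"id": "S1", "expected": "00", "actual": "00"},
        {"id": "S2", "expected": "00", "actual": "91"},
    ]
    print(validate(demo))          # {'Pass': 1, 'Fail': 1}
    for r in demo:
        print(r["id"], r["verdict"])
```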
Testing of scripts • Like all new software, the scripts must be tested. • It is important that when defects are reported during the actual test cycle, the tool is above reproach. • Scripts must be able to react to anomalies in a predictable fashion. • Must still be able to report effectively on operational scenarios.
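Because the scripts are themselves software, they get their own tests. A small pytest sketch along these lines (names are illustrative, and the validate() helper is the one assumed above) checks that the tooling fails loudly on anomalies rather than masking them.

```python
# A short pytest sketch showing the automation code being tested like any
# other software. The validate() helper under test mirrors the sketch
# above; names are illustrative.
import pytest

def validate(results):
    summary = {"Pass": 0, "Fail": 0}
    for result in results:
        verdict = "Pass" if result["actual"] == result["expected"] else "Fail"
        result["verdict"] = verdict
        summary[verdict] += 1
    return summary

def test_mismatch_is_reported_as_fail():
    summary = validate([{"expected": "00", "actual": "91"}])
    assert summary == {"Pass": 0, "Fail": 1}

def test_missing_actual_value_raises_a_clear_error():
    # An anomaly (no actual result captured) must fail loudly,
    # not silently count as a pass.
    with pytest.raises(KeyError):
        validate([{"expected": "00"}])
```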
Script execution • Scripts are run against the target application. • Stability of the system determines the support required. • Strive to provide as much meaningful data as possible in the shortest possible time.
Review of results • Results are reviewed internally. • Focus on abnormal failures due to environmental or other conditions. • Checks are done to ensure data integrity and scope coverage.
Result and benefit communication • Because of size, results are saved in repositories. • Screen dumps of all defects are kept. • Communicate benefit in terms of automation lifecycle stage.
[Chart: investment versus benefits in rand (0 to 5,000,000) plotted over 15 regression cycles, annotated with automation stages 1 to 4.]
Categorising our toolset Why? The strength in categorising tools is the ability to provide uniquely customised, automated testing solutions which, in conjunction with manual testing, aim to mitigate both the product and project risks associated with solution deployment.
Categorising our toolset (continued) [Diagram: the four tool categories (front-end user interface, functional and non-functional; back-end or network interaction, functional and non-functional) mapped against the front-end, middleware and back-end layers of the application.]
Categorised toolset • Front-end user interface (Functional) • Back-end or network interaction (Functional) • Front-end user interface (Non-functional) • Back-end or network interaction (Non-functional) • General automated testing utilities
Front-end user interface (Functional) • Record and playback. • Simulate user interaction with applications. • Provide support for unit, integration, system and acceptance testing. • Regression testing particularly suits the nature of these tools.
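The case study used commercial record-and-playback tooling; as an open-source stand-in, a minimal Selenium WebDriver script in Python illustrates the same category: driving the application through its front end the way a user would. The URL and element IDs are hypothetical.

```python
# Hedged stand-in for a record-and-playback script: drive the front end
# as a user would and return an observed value for later validation.
# URL and element IDs are invented for illustration.
from selenium import webdriver
from selenium.webdriver.common.by import By

def run_login_scenario(username: str, password: str) -> str:
    driver = webdriver.Chrome()          # assumes a local Chrome/driver setup
    try:
        driver.get("https://testenv.example.com/login")
        driver.find_element(By.ID, "username").send_keys(username)
        driver.find_element(By.ID, "password").send_keys(password)
        driver.find_element(By.ID, "submit").click()
        # Return the landing-page heading so the validation step can
        # compare it with the expected result from the spreadsheet.
        return driver.find_element(By.TAG_NAME, "h1").text
    finally:
        driver.quit()
```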
Back-end or network interaction (Functional) • Ability to simulate user interaction in the absence of a front-end. • Supports bottom-up integration. • Provides the ability to find defects much sooner in the SDLC. • We found it valuable in testing SOA implementations.
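A minimal sketch of this back-end category, assuming a service that can be invoked over HTTP. The endpoint, payload and response fields are invented for illustration; the point is that the front end is bypassed entirely, so defects in the service can be found much earlier.

```python
# Hedged sketch of exercising a service directly, bypassing the front end.
# The endpoint, payload and response fields are hypothetical; the case
# study's SOA services may well have been SOAP, but the principle is the same.
import requests

def check_balance_enquiry(account_number: str) -> None:
    response = requests.post(
        "https://testenv.example.com/services/balance-enquiry",
        json={"accountNumber": account_number},
        timeout=30,
    )
    # Defects surface here long before a front end exists.
    assert response.status_code == 200, response.text
    body = response.json()
    assert body["responseCode"] == "00", f"unexpected code: {body}"

if __name__ == "__main__":
    check_balance_enquiry("1234567890")
```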
Front-end user interface (Non-functional) • Ability to generate the required load for performance, load and stress testing. • These types of tests can typically not be performed manually. • Tool is dependent on the functional quality of the application under test. • Test environment comparison with production.
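As a hedged illustration of front-end load generation, the open-source Locust tool can stand in for the commercial load tool used in the case study. The paths and user behaviour below are assumptions.

```python
# Illustrative load-generation script using Locust as an open-source
# stand-in for a commercial load tool. Paths and weights are assumptions.
# Run with, for example: locust -f loadtest.py --host https://testenv.example.com
from locust import HttpUser, task, between

class OnlineBankingUser(HttpUser):
    wait_time = between(1, 3)   # think time between user actions

    @task(3)
    def view_accounts(self):
        self.client.get("/accounts")

    @task(1)
    def view_statement(self):
        self.client.get("/accounts/1234567890/statement")
```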
Back-end or network interaction (Non-functional) • Due to the nature of the implementation, it is not practical to use the front-end to generate the required load (ATM, point-of-sale devices). • Typically these kinds of tests cannot be performed manually. • Test environment comparison with production. • We found it valuable in testing the performance of services in SOA implementations.
General automated testing utilities • Provides general support for manual and automated testing activities. • Compares large amounts of data. • Generates unique data sets from production data. • Generates vast amounts of test data. • Provides support to all previously-mentioned tool categories.
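One example of such a utility, sketched under assumptions about the file layout: reconcile two large CSV extracts keyed on a record identifier and report missing, added and changed records.

```python
# Illustrative comparison utility of the kind described above: reconcile
# two large extracts keyed on a record identifier. File names and the
# key column are assumptions.
import csv

def load_keyed(path: str, key: str) -> dict[str, dict]:
    with open(path, newline="", encoding="utf-8") as f:
        return {row[key]: row for row in csv.DictReader(f)}

def compare_extracts(before_path: str, after_path: str, key: str = "account_id"):
    before = load_keyed(before_path, key)
    after = load_keyed(after_path, key)
    missing = before.keys() - after.keys()
    added = after.keys() - before.keys()
    changed = [k for k in before.keys() & after.keys() if before[k] != after[k]]
    return {"missing": sorted(missing), "added": sorted(added),
            "changed": sorted(changed)}

if __name__ == "__main__":
    print(compare_extracts("before_run.csv", "after_run.csv"))
```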
Contemporary automation issues Automation lag • Refers to the time lost between the implementation of the application and the point at which automated functional scripts can be used during testing.
Automation lag [Timeline diagram: after design and build and implementation, the application becomes available for testing; manual test execution can start immediately, while automated testing only begins after the automation lag, during which analysis, scripting, data preparation, parameters, spreadsheet population, validation, testing and feedback precede script execution.]
Contemporary automation issues (continued) User acceptance testing and automation • To perform successful user acceptance testing, the involvement of the user is critical in defining the tests, executing them and validating the results. • Users responsible for acceptance are asked to complete the scenario spreadsheet. • We ask the user to monitor a few initial script executions.
Automation benefit calculations 1. Historically problematic to get accurate measures. 2. Focus initially on quality. 3. Difficult to quantify quality in monetary terms. 4. Decision taken to focus on productivity improvement and associated cost reduction.
Automation benefit calculation process 1. Identify the smallest common element. 2. In most cases this would be the parameters identified in the scripting process. 3. Compare the effort (duration) of manual testing with that of automated testing per parameter. 4. Describe the improvement in monetary terms by using an average resource cost.
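A worked sketch of this calculation follows. All figures are illustrative, not the case study's actual numbers: the saving per parameter is the manual effort minus the automated effort, converted to money with an average hourly resource cost.

```python
# A worked sketch of the benefit calculation described above. All figures
# (minutes per parameter, hourly rate, parameter count) are illustrative,
# not the actual case-study numbers.
def cost_saving(parameters: int,
                manual_minutes_per_param: float,
                automated_minutes_per_param: float,
                hourly_rate: float) -> float:
    """Monetary saving per regression cycle for one script."""
    saved_hours = parameters * (manual_minutes_per_param
                                - automated_minutes_per_param) / 60
    return saved_hours * hourly_rate

if __name__ == "__main__":
    # 2,000 parameters, 6 min manual vs 0.5 min automated, R450 per hour:
    saving = cost_saving(2000, 6.0, 0.5, 450.0)
    print(f"Saving per regression cycle: R{saving:,.2f}")  # R82,500.00
```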
Automation benefit 1. Total cost saving for the period 2005 to mid-2008: R86 183 171.00 (approximately US$8.6 million). 2. Benefit confirmed by system owners and users. 3. Calculation method reviewed by Finance. 4. A return of this nature justified the significant investment made in both project testing and automation tools.
What have we learned? 1. Having a test tool is not a strategy. 2. Automation does not test in the same way that a manual tester does. 3. Record and playback is only the start. 4. Automated test scripts are software programs and must be treated as such. 5. The value lies in maintenance. 6. Categorise your toolset and design integrated solutions.
Thank you Mikes@Nedbank.co.za