Driving efficiency with DevOps agility: Avoiding the speed traps on the road to innovation
Table of contents
1. Centralized test management for decentralized teams
2. Challenge: Making automated testing part of a larger, strategic test plan
   a. Best practice
   b. Case study: Nordea Bank
   c. Case study: IHG
3. Challenge: Scaling test automation across an organization
   a. Best practice
   b. Case study: Dell
   c. Case study: LVMH
4. Challenge: Building test automation for fast and efficient feedback loops in DevOps pipelines
   a. Best practice
   b. Case study: Extreme Networks
   c. Case study: Specsavers
5. Challenge: Measuring, tracking, and improving your test automation
   a. Best practice
   b. Case study: Guardian Life
   c. Case study: Marvell Technology Group
6. Conclusion
CENTRALIZED TEST MANAGEMENT FOR DECENTRALIZED TEAMS

Testing has always been critical to shipping high-quality software. The rise of Agile and DevOps has pushed organizations to automate more while shifting testing left, earlier in the software development and delivery process. This translates to more testing being performed than ever before.

A recent TechWell survey revealed that as an organization matures its DevOps practice, it becomes more likely to treat quality as a shared responsibility across the organization. This has resulted in higher test automation rates and a proliferation of test automation tools in the marketplace. Individual teams often choose best-in-breed solutions for their own specific testing requirements. While this autonomy can lead to innovation at the team level, enterprises looking to scale effective and efficient testing across the entire organization are finding it difficult to integrate multiple tools and gain a clear, accurate assessment of overall release readiness. Best practices, traceability, reporting, and knowledge sharing remain siloed.

There are hundreds (even thousands) of test automation tools available to DevOps teams: open-source, low-code/no-code, free, and proprietary. Finding the right tool from the available options isn't just difficult; it sometimes feels impossible. It is rare that a single tool can accomplish all your testing needs. Instead, teams leverage different tools for specific purposes or cobble together a best-in-breed approach. Most organizations use a combination of 10+ test automation tools to fit their needs, in addition to the countless other development tools that fit into the larger picture of the software development lifecycle.

As more teams within the organization get involved in testing and the cumulative toolset grows, different challenges are likely to arise:
• Siloed knowledge and expertise
• Misaligned test management
• Testing bottlenecks
• Gaps in end-to-end traceability
• Ineffective reporting for impact analysis

This paper discusses the top findings from a recent Tricentis survey on the major challenges enterprise testing organizations face as they look to scale. We look to industry best practices and solutions, and provide examples of their application from real-world customer experience.
CHALLENGE: MAKING AUTOMATED TESTING PART OF A LARGER, STRATEGIC TEST PLAN

More automation, more frequent (and faster) releases, and more teams testing have translated to more testing data in more tools and systems across the organization. Siloed tools, tests, and teams make it difficult to standardize best practices and scale specialized expertise across the business.

While the past few years have seen a tremendous increase in test automation rates across all industries, no organization is 100% automated. Exploratory and manual testing continue to play a vital role in almost every release. However, some organizations treat these two types of testing as discrete functions, even though they remain critical to achieving quality outcomes. A common misstep in growing testing organizations is managing test plans, requirements, automated tests, manual tests, and exploratory tests separately. This approach lowers accuracy and confidence in whether code is ready to release, and it creates testing inefficiencies that drain resources and introduce bottlenecks.

Best practice

The world's top testing organizations implement standards and processes to centralize test assets, managing automated test plans and cases alongside manual and exploratory testing. Integrating systems of record with other functional areas, including development and build/release engineering, is also critical to maintaining full traceability for each release planning cycle. Examples of these systems include source code management tools such as GitHub and developer planning tools like Atlassian Jira. For the organizations we interviewed, this has resulted in higher-quality releases, better collaboration across teams, and more efficient use of shared resources.
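As a rough illustration of what such an integration can look like in practice, the sketch below records an automated test run in a hypothetical central test management API and adds a traceability comment to the Jira issue that tracks the requirement. The test management endpoint, field names, and credentials are placeholders invented for this example; only the Jira comment endpoint reflects a real API (Jira REST API v2).

    # Hypothetical example: record an automated test run in a central test
    # management API and link it to the Jira issue tracking the requirement.
    # The test management endpoint, field names, and credentials are
    # placeholders, not a specific vendor's API.
    import requests

    TEST_MGMT_URL = "https://testmgmt.example.com/api/v1/test-runs"   # placeholder
    JIRA_URL = "https://yourcompany.atlassian.net/rest/api/2/issue"   # Jira Cloud REST API v2
    AUTH = ("ci-bot@example.com", "api-token")                        # use a secrets store in practice

    def report_run(test_name: str, passed: bool, jira_key: str) -> None:
        # 1. Store the result centrally so automated, manual, and exploratory
        #    outcomes live in one system of record.
        run = {"name": test_name,
               "status": "passed" if passed else "failed",
               "source": "ci-pipeline",
               "requirement": jira_key}
        requests.post(TEST_MGMT_URL, json=run, auth=AUTH, timeout=30).raise_for_status()

        # 2. Add a traceability comment on the Jira issue so planners see test
        #    status next to the requirement it covers.
        comment = {"body": f"Automated test '{test_name}' "
                           f"{'passed' if passed else 'FAILED'} in CI."}
        requests.post(f"{JIRA_URL}/{jira_key}/comment",
                      json=comment, auth=AUTH, timeout=30).raise_for_status()

    report_run("checkout_smoke_test", passed=True, jira_key="SHOP-123")

Run as a post-test step in CI, a call like this keeps every execution, automated or not, flowing into the same system of record that manual and exploratory testing already use.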
Case study: Nordea Bank

Nordea Bank was maintaining 40+ different test tools across the QA group. Multiple automation frameworks, tools, and approaches using different programming languages were in use, creating huge maintenance overhead and inefficient delivery across the bank. The lack of a centralized testing strategy and tooling to support it led to inefficient test case management — including duplication of effort across teams for both manual and automated test cases, wasted effort, and inconsistencies in testing from team to team.

Nordea implemented a standardized test management tool to help streamline processes and provide greater visibility for business-critical deliverables. Functional and non-functional testing are planned together across different test cycles, which include both manual and automated tests. Automated test frameworks interface with the system for both test planning and result tracking, and are monitored from development to delivery through the data tracked in test templates created by Nordea's testing teams. This process has allowed Nordea to reduce maintenance effort, capture data for critical business reports to stakeholders, and consolidate legacy tools into a single source of truth for QA, all while still letting each team use the automation tool or framework it requires.

"We realized we needed a strong foundation to base our test strategy on. We needed automated delivery of testing and automated execution of QA processes, our traceability, and our QA reporting."
Andrew Armstrong, Group Head of QA & Testing at Nordea
Case study: IHG

IHG's centralized testing strategy covers all test management activities across the enterprise, including end-to-end tests, functional tests, regression tests, smoke tests, system integration tests, user acceptance tests, business process tests, exploratory tests, and all associated metrics. It provides the entire organization with a "single source of truth" for quality — from the team level up to the line of business.

The strategy integrates with many industry-leading Agile, automation, and pipeline tools to give IHG an enterprise view into all things testing. This enables different stakeholders to get fast, role-based access into what quality activities are planned, how they are progressing, and what defects are discovered. Through this process, IHG has been able to identify issues much earlier and address them when it's significantly easier, faster, and less expensive to do so. The end result for IHG: a 50% reduction in effort on non-test-execution tasks, a 30% productivity improvement, and 25% cost savings for QA.

"We've had testers move from traditional testing models into a continuous testing model — shifting left, ensuring that we focus on speed by getting the right feedback to the stakeholders at the right time, focusing on quality by improving our test coverage, and focusing on cost by increasing our productivity and efficiency."
Tammie Davis, Director Global Quality Engineering at IHG
CHALLENGE: SCALING TEST AUTOMATION ACROSS AN ORGANIZATION

As noted, there are thousands of test automation tools on the market, covering everything from security scanning to load testing to compliance and accessibility testing. No single tool or framework can cover all the needs of an enterprise organization. Teams have evolved over time to follow more "self-governing" principles that encourage a best-in-breed tooling philosophy, particularly in test automation. For Agile and DevOps, this is considered a best practice — use the right tool for the job at hand, not just the tool you have.

Automation engineers become experts in the tools their teams use, which can be of great benefit to a team or organization. On the other hand, specialized knowledge of scripts, documentation, and business alignment can vanish should an engineer-expert leave. As an organization scales to include more teams and a greater variety of tools, this approach can create silos of information that are difficult for outside teams to access when they want to collaborate or improve their own processes. If automation is poorly managed, inconsistent standards and processes can lead to inconsistent quality.

Like any other tests, automated tests require oversight and management (including traceability, history, and analytics). Because there are so many types of automated testing, organizations taking a best-in-breed approach with multiple test automation tools benefit from having one place to centrally manage test automation. This helps prevent inconsistencies in execution, creates traceability, and reduces the time and effort needed to verify release readiness and troubleshoot issues.

Larger enterprise organizations have started to adopt a centralized team, often called a testing center of excellence (TCoE), composed of a representative from each group, to help with cross-collaboration and knowledge sharing and to standardize best practices and testing efforts. Smaller or growing testing organizations may not have a standing centralized body to evaluate how their tools fit into the broader needs of the organization, even though they, too, understand how the knowledge held within individual teams can benefit the organization as it scales.

Best practice

Successful enterprise organizations centralize test automation to manage, streamline, and scale it over time. Repeatable best practices are critical to drive consistency and scale specialized knowledge across the organization while still allowing individual product teams or launches the flexibility to shape their own testing strategy. This process also makes it possible to trace test schedules and determine progress across previously disconnected tools.
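To make the idea concrete, here is a minimal, hypothetical sketch of the kind of consolidation layer such a strategy implies: results from different frameworks are normalized into one common record before being stored or reported on centrally. The record schema and function names are illustrative rather than any specific product's API; the only assumption about the tools themselves is that they can emit JUnit-style XML, which most common runners (Selenium test runners, Robot Framework, pytest) can do.

    # Hypothetical adapter layer: normalize results from different automation
    # frameworks into one common record so they can be managed, traced, and
    # reported on centrally. Field names are illustrative only.
    from dataclasses import dataclass
    import xml.etree.ElementTree as ET

    @dataclass
    class TestResult:
        suite: str
        name: str
        status: str      # "passed" | "failed" | "skipped"
        duration_s: float
        framework: str   # which tool produced the result

    def from_junit_xml(path: str, framework: str) -> list[TestResult]:
        """Parse JUnit-style XML, a format most common runners can emit,
        into normalized TestResult records."""
        results = []
        for case in ET.parse(path).getroot().iter("testcase"):
            if case.find("failure") is not None or case.find("error") is not None:
                status = "failed"
            elif case.find("skipped") is not None:
                status = "skipped"
            else:
                status = "passed"
            results.append(TestResult(
                suite=case.get("classname", ""),
                name=case.get("name", ""),
                status=status,
                duration_s=float(case.get("time", "0")),
                framework=framework,
            ))
        return results

    # Teams keep their own tools; the central system only sees TestResult records.
    all_results = (from_junit_xml("selenium-results.xml", "selenium")
                   + from_junit_xml("robot-xunit.xml", "robot"))

The point is not the parsing itself but the boundary it creates: individual teams stay free to pick their frameworks, while traceability, history, and analytics are built on one shared representation.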
Case study: Dell

At Dell Technologies, 30+ product teams were building, testing, and delivering applications using different methodologies, terminologies, pipelines, and toolsets. Different teams meant different things by terms like performance testing and integration testing, and they were carrying them out with different tools and approaches selected to meet each team's unique needs.

Instituting a centralized system to manage test automation in a single place ensures that test automation practices can scale over time, and Dell achieved strong economies of scale with this strategy. Today, no matter what methodology (e.g., Scrum at scale, Kanban, waterfall, SAFe) or test automation framework (homegrown frameworks, Robot, Selenium) a team is using, its tests can be managed and executed in a consistent manner via the centralized test automation system. In addition, this system helps Dell quickly identify and leverage relevant test assets and enables teams to shift resources from duplication to innovation.

"We can easily accommodate variations across each team and project while standardizing our test assets, test data, and our overall test processes. Different teams can share tests and even run each other's tests without even thinking about what test automation engine is used behind the scenes."
Adam Arakelian, Director of Software Engineering at Dell
Case study: LVMH

Moët Hennessy Louis Vuitton (LVMH), the world's leading luxury retailer and home to 75+ "maisons" such as Christian Dior, TAG Heuer, and Dom Pérignon, had a deep-seated corporate commitment to delivering excellence. Each product launch needed to be executed flawlessly, which required rigorous testing across web, mobile, Salesforce, eCommerce, ERPs, WMS, and more.

LVMH built a new QA process with the same customer focus and attention to detail the company is known for. A fundamental mindset shift to "business-driven testing" guided the creation of the new approach. Applying this approach at the necessary speed and scale required automation, integration, and extreme reuse — all facilitated by the adoption of centralized test management and orchestration across Applitools, Selenium, Cucumber, BrowserStack, and more.

The new process created greater alignment between development and QA — clarifying what was tested and when, improving collaboration, and reducing misunderstandings between the two groups. It ultimately reduced test preparation time from 10 days to 3 days per rollout, cut test execution time by 75%, increased the speed of overall testing 4X, and helped reduce testing costs by 75%.
CHALLENGE: BUILDING TEST AUTOMATION FOR FAST AND EFFICIENT FEEDBACK LOOPS IN DEVOPS PIPELINES

Testing is frequently cited by studies and practitioners as the #1 blocker to releasing faster, with average test cycle times (23 days) running longer than the 2-week development cycles that became typical of organizations a few years ago. This not only creates a bottleneck but, in many cases, shows that testing is not in sync with development. The industry's transition to Agile promoted the "fail fast" mentality, yet some organizations still do not adequately address delivering feedback about those failures to developers as quickly as possible. When DevOps came onto the scene around 2008, it aimed to solve one of the largest issues found with traditional Agile models — communication and coordination between stakeholders in design, development, testing, and deployment.

Testing bottlenecks and a lack of "fail fast" feedback lead to slower testing cycles, create friction between teams that should be collaborating (known as the release blame game), and ultimately delay releases and make them more costly. DevOps cannot succeed if this persists in an organization. The ability to quickly and continuously address issues in code is the oil that keeps the DevOps engine running at speed.

Best practice

While no tool alone can solve the cultural challenges present in an enterprise organization, centralizing test orchestration and integrating it with planning, CI/CD, and even ChatOps tools enables instant feedback to developers. This ensures that actionable feedback is delivered to the right people throughout the development lifecycle — when and where they are prepared to act on it. Centralized test orchestration has been shown to support DevOps practices by making collaboration between teams easier, improving developer productivity, and ultimately enabling faster releases with higher-quality code.
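A minimal sketch of the ChatOps half of that loop, assuming a generic incoming-webhook chat integration and an invented result format, might look like the following: a post-test CI step that summarizes failures and pushes them straight to the team channel.

    # Hypothetical CI step: after the automated suite finishes, summarize any
    # failures and push them to the team's chat channel so developers get
    # feedback immediately rather than at the end of the day.
    # The webhook URL and result format are placeholders.
    import requests

    CHAT_WEBHOOK = "https://hooks.example.com/services/T000/B000/XXXX"  # placeholder

    def notify_failures(results: list[dict]) -> None:
        failed = [r for r in results if r["status"] == "failed"]
        if not failed:
            return  # only interrupt people when there is something to act on
        lines = [f"{len(failed)} test(s) failed on commit {failed[0].get('commit', 'unknown')}:"]
        lines += [f"  - {r['name']} ({r['suite']})" for r in failed[:10]]
        # Most ChatOps tools accept a simple JSON payload on an incoming webhook.
        requests.post(CHAT_WEBHOOK, json={"text": "\n".join(lines)}, timeout=30)

    notify_failures([
        {"name": "login_regression", "suite": "auth", "status": "failed", "commit": "a1b2c3d"},
        {"name": "checkout_smoke", "suite": "shop", "status": "passed", "commit": "a1b2c3d"},
    ])

The same principle applies to planning-tool integrations: the sooner a failure shows up where the developer is already working, the cheaper it is to fix.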
Case study: Extreme Networks

Slow feedback was exactly the issue for Extreme Networks after a breakdown in the "fail fast" chain of command started to slow the testing process on high-velocity releases. In addition to having to wait until the end of the day for test results, engineers lacked visibility into the testing processes on critical features.

Consequently, Extreme Networks integrated developer tools such as Atlassian Jira with its central testing tool, aligning collaboration between groups. Test cases are now tied to Jira user stories, where developers see the results of test runs instantly. Over the course of a few months, the team reduced release times by 66% and is working towards 80% faster release times for all products and features.

"At the end of the day, it's all about efficiency and saving the engineers' time."
Kevin Lin, Director of Cloud and Wireless QA
Case study: Specsavers

Specsavers was looking to modernize its development, testing, and delivery methods to support a digital operating model and an omnichannel approach that would improve customer experience. Silos within its development and testing organization made it difficult to share testing data between teams, integrate testing into automated pipelines, or effectively assess release readiness.

Using centralized test orchestration, Specsavers was able to pull together multiple processes to achieve a behavior-driven development (BDD) approach to test-driven development (TDD). All teams now have holistic visibility into testing progress — whether they are testing manually, with Serenity BDD tests, or with any of the other test automation tools in the organization — while also delivering results back to development in real time. Testing is integrated across the full software delivery pipeline, so all collaborators can understand testing progress and assess release readiness at a glance.

By modernizing testing and improving test automation management, the team is now able to test 120% faster while improving quality across the application stack, as well as save time by sharing and reusing test assets across projects.

"We have a fast diagnosis of failing tests. Which, in turn, leads to ensuring better quality of development because we can flag those areas very quickly."
Dave White, Automation Test Manager for Specsavers
CHALLENGE: MEASURING, TRACKING, AND IMPROVING YOUR TEST AUTOMATION

Testing teams are putting an increasing amount of effort into growing automation rates. Leaders need to understand whether those efforts are driving the results they need — higher quality, greater velocity, or overall savings. That understanding only comes with the right metrics, whether for a specific project or across the business. The inability to quantify the benefits of automation stalls both current automation efforts and future investment, because the added time and cost become difficult to justify. A common pitfall is failing to understand and align with overall business objectives, and failing to ensure that the KPIs you define actually measure progress toward them.

Best practice

To fully understand the effectiveness of their test automation and testing practice, organizations should establish quality KPIs that are aligned to their organizational and team goals and based on a complete view of testing activities. These KPIs should also be considered in the context of non-automated testing and centralized quality metrics. They should be determined early and tied to a regular reporting schedule. Organizations that have adopted this process find they can quickly gain greater efficiencies at both the team and organizational levels, while also defining the ROI for automation projects and more easily justifying the costs to the business — escaping the stigma of being just a cost center.
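As a small illustration of what quality KPIs built on a complete view of testing activities can mean in practice, the sketch below rolls automated and manual results into a few trackable numbers per reporting period. The metric definitions are common examples chosen for illustration, not ones prescribed by this paper.

    # Hypothetical KPI rollup combining automated and manual test activity.
    # Metric names and definitions are illustrative examples, not a standard.
    from dataclasses import dataclass

    @dataclass
    class PeriodData:
        automated_run: int           # automated test executions this period
        automated_failed: int
        manual_run: int              # manual/exploratory executions this period
        manual_failed: int
        defects_found_in_test: int
        defects_found_in_prod: int   # "escaped" defects

    def kpis(d: PeriodData) -> dict[str, float]:
        total_run = d.automated_run + d.manual_run
        total_defects = d.defects_found_in_test + d.defects_found_in_prod
        return {
            # Share of executed testing that is automated.
            "automation_rate": d.automated_run / total_run if total_run else 0.0,
            # Pass rate across all testing, not just automation.
            "pass_rate": 1 - (d.automated_failed + d.manual_failed) / total_run if total_run else 0.0,
            # Share of defects caught before production (defect containment).
            "defect_containment": d.defects_found_in_test / total_defects if total_defects else 1.0,
        }

    print(kpis(PeriodData(automated_run=900, automated_failed=45,
                          manual_run=100, manual_failed=10,
                          defects_found_in_test=57, defects_found_in_prod=3)))

Tracked on a regular reporting cadence alongside velocity and cost measures, numbers like these let you show whether growing automation rates are actually translating into the business outcomes you set out to achieve.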
Case study: Guardian Life

Guardian Life Insurance was faced with a difficult task: move faster while supporting more (and more complex) projects — and do it all without driving up costs across 5 disparate QA teams with different policies, procedures, automation tools, and metrics. The team needed to be able to quantify the benefit of automation across 100+ active engagements monthly and 400+ projects annually.

After 6 months of centralizing quality metrics, the business was able to deliver real-time reports on process improvements that include a 29% reduction in defects, 30% faster testing cycles, a 31% cost reduction, and a 38% increase in project capacity.

"I made sure we worked with our Scrum master/enterprise agility team to ensure our tools would integrate well and that we would have guardrails for how those tools would operate — while also giving flexibility to teams to define some of their own workflow."
Robina Laughlin, Vice President, IT Quality Management at Guardian
Case study: Marvell Technology Group

Marvell made testing transformation a key element when it launched its DevOps initiatives. Specifically, it recognized that maximizing the benefits of DevOps would require an enterprise-grade centralized solution to help with asset sharing, approval processes, integrations with the DevOps pipeline, and real-time reporting on test results and quality trends. The previous process was not timely, efficient, or scalable enough to support the speed needed for DevOps delivery without creating backlogs.

The team adopted a unified test management system that correlates requirements in Atlassian Jira to the tests available in the repository. Team members can easily see whether reusable assets already exist before creating new assets from scratch. Approvals are tracked and reviewed at a glance. An integration between the unified test management tool and Jenkins enables automated test execution within the DevOps pipeline, with results reported back through test management into Jira for fast feedback to development. Test results and Marvell's key quality metrics (e.g., failure rate from version to version) are visualized in custom dashboards, giving stakeholders instant insight into test status and release readiness.
CONCLUSION

Testing is not only necessary but critical to the success of every launch in every organization. Test automation, Agile, and DevOps have increased the sophistication of software testing while increasing the speed and accuracy of delivery. However, much like a Formula 1 racecar, it takes a team effort and a focus on precision not only to run the engine of software testing but to foster an environment of excellence that consistently delivers quality at speed.

As organizations decentralize their development and testing to better address more complex applications and code, more teams are created. Those teams develop more tests, leveraging more tools — all of which increases the quality of their respective applications but also increases the likelihood of creating silos. Those silos can sit between products, lines of business, development and QA teams, or engineering and the business.

Silos are the enemy of growth in any organization, especially when scaling to support an enterprise IT team. Not only do silos distract you from solving harder, more valuable problems, they create inconsistencies in execution and outcomes. Inconsistent execution creates barriers to taking what is working well for one team and scaling it across the rest of the organization. Without the ability to quickly scale best practices, growth in your business can stagnate.

So what can we take away from all of this? Automation offers incredible potential for accelerating efficient delivery. However, if not managed properly, automation can also create silos within the organization that threaten its ability to grow and innovate at the rate needed to meet customer expectations. To avoid such pitfalls in your organization, here are a few best practices to help accelerate the success of your testing organization:

• Centralize test assets, execution, orchestration, and data and analytics, in addition to test scripts, throughout the lifecycle. This is vital for reducing redundancy, saving time and effort, and creating coordination across decentralized teams.
• Centralize test automation to ensure you are shipping higher-quality software faster — and to manage shared resources more effectively.
• Ensure your test management solution integrates not only with your testing tools and infrastructure, but also with developer, DevOps, and planning tools to enhance collaboration and uncover opportunities to innovate.
• Remember that automation itself is just part of the larger, unified test management strategy you need to meet your organizational goals — it is critical to put it into context against all of your other testing.

DISCLAIMER: The information provided in this document should not be considered legal advice. Readers are cautioned not to place undue reliance on these statements, and they should not be relied upon in making purchasing decisions or for achieving compliance with legal regulations.
ABOUT TRICENTIS

Tricentis is the global leader in enterprise continuous testing, widely credited for reinventing software testing and delivery for DevOps and Agile environments. The Tricentis AI-based, continuous testing platform provides automated testing and real-time business risk insight across your DevOps pipeline. This enables enterprises to accelerate their digital transformation by dramatically increasing software release speed, reducing costs, and improving software quality. Tricentis has been widely recognized as the leader by all major industry analysts, including being named the leader in Gartner's Magic Quadrant five years in a row. Tricentis has more than 1,800 customers, including the largest brands in the world, such as Accenture, Coca-Cola, Nationwide Insurance, Allianz, Telstra, Dolby, RBS, and Zappos. To learn more, visit www.tricentis.com or follow us on LinkedIn, Twitter, and Facebook.

AMERICAS
2570 W El Camino Real, Suite 540
Mountain View, CA 94040
United States of America
office@tricentis.com
+1-650-383-8329

EMEA
Leonard-Bernstein-Straße 10
1220 Vienna
Austria
office@tricentis.com
+43 1 263 24 09 – 0

APAC
2-12 Foveaux Street
Surry Hills NSW 2010
Australia
frontdesk.apac@tricentis.com
+61 2 8458 0766