DevOps Continuous Testing Best Practices
A publication in the Continuous Everything series
Bart de Best
Edited by Louis van Hemmen
Colophon

More information about this and other publications can be obtained from:
Leonon Media, (0)572 - 851 104
General enquiries: info@leonon.nl
Sales enquiries: verkoop@leonon.nl
Manuscript / author: redactie@leonon.nl

© 2021 Leonon Media
Cover design: Eric Coenders, IanusWeb, Nijmegen
Production: Printforce B.V., Culemborg
Title: DevOps Continuous Testing
Subtitle: A publication in the Continuous Everything series
Date: 8 June 2021
Author: Bart de Best
Publisher: Leonon Media
ISBN13: 978 94 92618 450
Edition: First edition, 8 June 2021

©2021, Leonon Media. No part of this publication may be reproduced and/or made public by means of print, photocopy, microfilm, or in any other way, without the prior written permission of the publisher.

TRADEMARK NOTICES
Scaled Agile Framework and SAFe are registered trademarks of Scaled Agile, Inc.

"We build our computer (systems) the way we build our cities: over time, without a plan, on top of ruins." by Ellen Ullman
Figures
Figure 1-1, DevOps lemniscate (p. 1)
Figure 4-1, Change paradigm (p. 11)
Figure 4-2, Change paradigm - perception (p. 12)
Figure 4-3, Change paradigm - power relations (p. 14)
Figure 4-4, Change paradigm - organisation (p. 17)
Figure 4-5, Change paradigm - resources (p. 19)
Figure 5-1, Ideal test pyramid (p. 21)
Figure 5-2, Non-ideal design pyramid (p. 22)
Figure 6-1, Value stream for Continuous Testing (p. 27)
Figure 6-2, Use case diagram for Continuous Testing (p. 29)
Figure 7-1, Example unit tests (p. 39)
Figure 8-1, DevOps CE spider model (p. 45)
Figure 8-2, DevOps CT spider model (p. 49)

Tables
Table 1-1, Continuous Everything aspects (p. 2)
Table 1-2, Appendices (p. 3)
Table 3-1, Common problems when applying testing (p. 10)
Table 5-1, Test type matrix template (p. 23)
Table 5-2, Test type matrix example (p. 23)
Table 5-3, Test technique matrix template (p. 24)
Table 5-4, Test technique matrix example (p. 24)
Table 5-5, Test object matrix (p. 25)
Table 5-6, Test object matrix example (p. 25)
Table 5-7, Test tool pattern (p. 26)
Table 5-8, Test tool example (p. 26)
Table 6-1, Use case template (p. 30)
Table 6-2, Use case diagram for Continuous Testing (p. 33)
Table 7-1, Gherkin keywords (p. 36)
Table 7-2, Gherkin feature file example (p. 36)
Table 7-3, Python unittest template (p. 37)
Table 7-4, Python unittest example (p. 38)
Table 7-5, Test strategy template (p. 40)
Table 7-6, Test strategy example (p. 40)
Table 8-1, DevOps CE model (p. 43)
Table 8-2, Continuous Everything (p. 44)
Table 8-3, CMMI levels for Continuous Everything (p. 45)
Table 8-4, PR-ORG-009, Maturity levels (p. 46)
Table 8-5, CT maturity characteristics (p. 49)

Appendices
Appendix A, Bibliography (p. 53)
Appendix B, Glossary (p. 55)
Appendix C, Abbreviations (p. 61)
Appendix D, Agile tools (p. 63)
Appendix E, Websites (p. 67)
Appendix F, Index (p. 69)
Preface

This book is based on my practical experience with Development & Operations (DevOps). There is no framework, method or methodology for DevOps. To give the content of this book a structure anyway, the book 'DevOps Assessments' [Best 2019a] has been used as its starting point. That book defines the concept of Continuous Everything, of which Continuous Testing is a part. The other DevOps books I have published have also been used to shape this book.

This book was initially written for the Continuous Testing training. The structure of this book and of the masterclass of the same name has been kept identical. The exercises have also been included, so that this book can be used for self-study as well. The result is a book that is not an A-to-Z book but rather a reference work for giving shape to Continuous Testing best practices. Together with the book DevOps Assessments, it is therefore a practical handle for putting Continuous Testing on the map and developing it further.

I have already shared many of my experiences in the articles on www.ITpedia.nl. I have also translated this knowledge and expertise into various training courses that I provide for IT Management Group; these can be found on www.dbmetrics.nl.

I would like to sincerely thank the following people for their inspiring contribution to this book and for the pleasant collaboration!
• D. (Dennis) Boersen, Argis IT Consultants
• Dr. L.J.G.T. (Louis) van Hemmen, BitAll B.V.
• F. (Freek) de Cloe, smartdocs.com
• J.W. (Jan-Willem) Hordijk, CTO at Plint AB
• N. (Niels) Talens, www.nielstalens.nl
• W. (Willem) Kok, Argis IT Consultants

I hope you enjoy reading this book and, above all, wish you every success in applying Continuous Testing within your own organisation. If you have any questions or comments, please do not hesitate to contact me. A lot of time has been spent making this book as complete and consistent as possible. Should you nevertheless find any shortcomings, I would appreciate it if you would let me know, so that they can be addressed in the next edition.
Appendices
Appendix A, Bibliography

Table A-1 gives an overview of books that are directly or indirectly related to DevOps.

[Agile Project Management 2015] K. Schwaber, "Agile Project Management with Scrum", Microsoft Press, 2015, ISBN13: 978 0 7356 1993 7.
[Best 2011] B. de Best, "SLA best practices", Leonon Media, 2011, ISBN13: 978 90 71501 456.
[Best 2012] B. de Best, "Quality Control & Assurance", Leonon Media, 2012, ISBN13: 978 90 71501 531.
[Best 2014a] B. de Best, "Acceptatiecriteria", Leonon Media, 2014, ISBN13: 978 90 71501 784.
[Best 2014b] B. de Best, "Agile Service Management met Scrum", Leonon Media, 2014, ISBN13: 978 90 71501 807.
[Best 2014c] B. de Best, "Cloud SLA", Leonon Media, 2014, ISBN13: 978 90 71501 739.
[Best 2015a] B. de Best, "Agile Service Management met Scrum in de Praktijk", Leonon Media, 2015, ISBN13: 978 90 71501 845.
[Best 2015b] B. de Best, "Ketenbeheer in de Praktijk", Leonon Media, 2015, ISBN13: 978 90 71501 852.
[Best 2017a] B. de Best, "Beheren onder Architectuur", Dutch language, Leonon Media, 2017, ISBN13: 978 90 71501 913.
[Best 2017b] B. de Best, "DevOps best practices", English language, Leonon Media, 2017, ISBN13: 978 94 92618 078.
[Best 2019a] B. de Best, "DevOps Assessments", Dutch language, Leonon Media, 2019, ISBN13: 978 90 71501 814.
[Best 2019b] B. de Best, "DevOps Architectuur", Dutch language, Leonon Media, 2019, ISBN13: 978 94 92618 061.
[Best 2020] B. de Best, "Agile Design", Dutch language, Leonon Media, 2020, ISBN13: 978 94 92618 375.
[Caluwé 2011] L. de Caluwé and H. Vermaak, "Leren Veranderen", second edition, Kluwer, 2011, ISBN13: 978 90 13016 543.
[Davis 2016] Jennifer Davis, Katherine Daniels, "Effective DevOps: Building a Culture of Collaboration, Affinity, and Tooling at Scale", first edition, O'Reilly Media, 2016, ISBN13: 978 14 91926 307.
[Deming 2000] W. Edwards Deming, "Out of the Crisis", MIT Center for Advanced Engineering Study, 2000, ISBN13: 978 02 62541 152.
[Downey 2015] Allen B. Downey, "Think Python", second edition, O'Reilly Media, 2015, ISBN13: 978 14 91939 369.
[Galbraith 1992] J.R. Galbraith, "Het ontwerpen van complexe organisaties", Samson Bedrijfsinformatie, Alphen aan den Rijn, 1992.
[Humble 2010] Jez Humble, David Farley, "Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation", first edition, Addison-Wesley Professional, 2010, ISBN13: 978 03 21601 919.
[Kim 2014] Gene Kim, Kevin Behr, George Spafford, "The Phoenix Project", IT Revolution Press, 2014, ISBN13: 978 09 88262 508.
[Kim 2016] Gene Kim, Jez Humble, Patrick Debois, John Willis, "The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology Organizations", IT Revolution Press, 2016, ISBN13: 978 19 42788 003.
[Kotter 2012] John P. Kotter, "Leading Change", English, first edition, November 2012, ISBN13: 978 14 22186 435.
[Kaplan 2004] R. S. Kaplan and D. P. Norton, "Op kop met de Balanced Scorecard", Harvard Business School Press, 2004, ISBN13: 978 90 25423 032.
[Layton 2017] Mark C. Layton, Rachele Maurer, "Agile Project Management for Dummies", second edition, John Wiley & Sons Inc, 2017, ISBN13: 978 11 19405 696.
[Looijen 2011] M. Looijen, L. van Hemmen, "Beheer van Informatiesystemen", seventh edition, Academic Service, 2011, ISBN13: 978 90 12582 377.
[MAES] R. Maes, "Visie op informatiemanagement", www.rikmaes.nl.
[Scrum] Ken Schwaber and Jeff Sutherland, "The Scrum Guide", 2017, www.scrumguides.org.
[Toda 2016] Koichiro (Luke) Toda (President, Strategic Staff Services Corporation, and Director, TPS Certificate Institution) and Nobuyuki Mitsui (CTO, Strategic Staff Services Corporation), "Success with Enterprise DevOps", white paper, 2016.

Table A-1, Bibliography.
Appendix B, Glossary

Table B-1 contains a glossary. It has been drawn up in English because many of the terms originate from the English language, and the explanations read more easily when they are given entirely in English.

Acceptance test: For developers the acceptance testcases give the answer to "How do I know when I am done?". For the users the acceptance testcases give the answer to "Did I get what I wanted?". Examples of acceptance testcases are Functional Acceptance Testcases (FAT), User Acceptance Testcases (UAT) and Production Acceptance Testcases (PAT). The FAT and UAT should be expressed in the language of the business.

Agile Infrastructure: Within DevOps both Development and Operations work in an Agile way. This requires an Agile infrastructure that can be changed at the same pace as the application is changed through the deployment pipeline. A good example of an Agile infrastructure is the use of Infrastructure as Code.

Alternate path: See happy path.

Anti-pattern: An anti-pattern is an example of the wrong interpretation of a pattern. The anti-pattern is often used to explain the value of the pattern.

Automated tests: Testcases should be automated as much as possible to reduce waste and to increase the velocity and quality of the products that are to be delivered.

Bad paths: A 'bad path' is a situation where the application does not follow the 'happy path' or the 'alternate path'. In other words, something goes wrong. This exception must be handled and should be monitorable.

Behavior driven development: The development of software requires that the users are asked to define the (non-)functional requirements. Behavior driven development is based on this concept. The difference, however, is that the acceptance criteria of these requirements should be written in terms of the customer's expectation of the behavior of the application. This can be accomplished by formulating the acceptance criteria in the Given-When-Then format.

Business value: Applying DevOps best practices results in increasing business value. Research by Puppet Labs (State of DevOps Report) proves that high-performing organizations using DevOps practices are outperforming their non-high-performing peers in many areas [Kim 2016].

Cloud configuration files: Cloud configuration files are used to initiate a cloud service before using it. In this way cloud service providers enable customers to configure the cloud environment for their needs.

Commit code: Committing code is the action in which the developer adds the changed sourcecode to the repository, making these changes part of the head revision of the repository [Wiki].

Commit stage: This is the phase in the CI/CD secure pipeline in which the sourcecode is compiled to object code. This includes the execution of the unittestcases.

Configuration management: Configuration management refers to the process by which all artefacts, and the relationships between them, are stored, retrieved, uniquely identified and modified.
Containers: A container is an isolated structure that is used by developers to build their application independently of the underlying operating system or hardware. This is accomplished by interfaces in the container that are used by developers. Instead of installing the application in an environment, the complete container is deployed. This avoids a lot of dependencies and prevents configuration errors from occurring.

Cycle time (flow time): Cycle time measures the completion rate or the work capability of a system overall; a shorter cycle time means that less time is being wasted when a request has been made but no progress or work is getting done.

Cycle time (lean): The average time between two successive units leaving the work or manufacturing process.

Development: Development is an activity that is performed by the DevOps role 'developer'. A developer is responsible for the complete lifecycle of a configuration item. Within DevOps there is no longer a distinction between designer, builder or tester.

Error path: See happy path.

Fast feedback: Fast feedback refers to the second of the three ways of Gene Kim. The second way is about getting feedback on the functionality and quality of the product that is created or modified as soon as possible, in order to maximize the business value.

Feature toggles: A feature toggle is a mechanism that makes it possible to enable or disable part of the functionality of an application released in production. Feature toggles enable testing the effect of changes on users in production. Feature toggles are also referred to as feature flags, feature bits or feature flippers.

Feed forward: Feed forward, within the context of DevOps, is the mechanism by which experiences in the present value stream are used to improve the future value stream. Feed forward is the opposite of feedback, since feedback is focused on the past and feed forward on the future.

Feedback: Feedback, within the context of DevOps, is the mechanism by which errors in the value stream are detected as soon as possible; it is used to improve the product and, if necessary, to improve the value stream as well.

Given-When-Then: The Given-When-Then format is used to define acceptance criteria in a way that makes stakeholders understand how the functionality will actually work. GIVEN the fact that... WHEN I do this... THEN this happens.

Happy path: An application supports a business process by receiving, editing, storing and providing information. The assumed sequence of steps in which the information processing is performed is called the happy path. The steps in alternate ways are called the alternate path; in that case, the same result is achieved via another navigation path. The path through the application that causes an error is called an error path.

Infrastructure as Code (IaC): Normally infrastructure components have to be configured in order to provide the requested functionality and quality, for example a rule set for a firewall or the allowed IP addresses for a network. These configurations are normally stored in configuration files, which enable the operators to manage the functionality and the quality of the infrastructure components. Infrastructure as Code (IaC) makes it possible to program these infrastructure component settings and deploy them through the CI/CD secure pipeline by using machine-readable definition files, rather than physical hardware configuration or interactive configuration tools.

Infrastructure management: Infrastructure management consists of the lifecycle management of all infrastructure products and services in order to support the correct working of the applications that run on top of the infrastructure.

Latent defects: Problems that are not visible yet. Latent defects can be made visible by injecting faults into the system.

Lead Time (LT): Lead time is the time from when a request is made to when the final result is delivered; it is the customer's point of view on how long something takes to complete.

Lean tools:
• A3 thinking (problem solving)
• Continuous flow (eliminates waste)
• Kaizen
• Kanban
• KPI (Key Performance Indicator)
• Plan Do Check Act (PDCA)
• Root cause analysis
• Specific, Measurable, Accountable, Realistic, Timely (SMART)
• Value stream mapping (depicts the flow)
• JKK (no defects are passed to the next process)

Micro service: Microservices are a variant of the service-oriented architecture (SOA) architectural style that structures an application as a collection of loosely coupled services. In a microservices architecture, services should be fine-grained and the protocols should be lightweight [Wiki].

Micro service architecture: This architecture consists of a collection of services where each service provides a small amount of functionality and the total functionality of the system is derived from composing multiple services. It also makes it possible to run multiple versions of a service in production simultaneously and to roll back to a prior version relatively easily.

Monitoring framework: A framework of components that together form a monitoring facility that is capable of monitoring business logic, applications and operating systems. Events, logs and measures are routed by the event router to destinations [Kim 2016].

Non-Functional Requirement (NFR): NFRs are requirements that define the quality of a product, like maintainability, manageability, scalability, reliability, testability, deployability and security. NFRs are also referred to as operational requirements.

Non-Functional Requirement (NFR) testing: NFR testing is the testing aspect that focuses on the quality of the product.

Operations: Operations is the team often responsible for maintaining the production environment and helping to ensure that required service levels are met [Kim 2016].

Product owner: The Product Owner is a DevOps role. The Product Owner is the internal voice of the business. The Product Owner is the owner of the product backlog and determines the priority of the product backlog items in order to define the next set of functionalities in the service.

Programming paradigm: A style of building the structure and elements of computer programs.

Pull request process: This is a form of peer review that spans Dev and Ops. It is the mechanism that lets engineers tell others about changes they have pushed to a repository.

QA: Quality Assurance (QA) is the team responsible for ensuring that feedback loops exist to ensure the service functions as desired [Kim 2016].

Release patterns: There are two release patterns to be recognized [Kim 2016]:
• Environment-based release patterns: in this pattern there are two or more environments that receive deployments, but only one environment receives live customer traffic.
• Application-based release patterns: in this pattern the application is modified in order to make selective releases possible and to expose specific application functionality through small configuration changes.

Shared version control repository: In order to be able to use trunk-based development, developers need to share their source code. The source code must be committed into a single repository that also supports version control. Such a repository is called a shared version control repository.

Single repository: A single repository is used to facilitate trunk-based development.

System of Engagement (SoE): SoEs are decentralized Information Communication Technology (ICT) components that incorporate communication technologies such as social media to encourage and enable peer interaction [What-is].

System of Information (SoI): The term SoI includes all the tools that are used to process and visualize information from SoR systems. Typical examples are Business Intelligence (BI) systems.

System of Records (SoR): A SoR is an ISRS (information storage and retrieval system) that is the authoritative source for a particular data element in a system containing multiple sources of the same element. To ensure data integrity, there must be one, and only one, system of record for a given piece of information [What-is].

The ideal testing automation pyramid: The ideal testing automation pyramid is a way of testing that can be characterized as follows:
• Most of the errors are found using unittests, as early as possible.
• Faster-running automated tests (e.g., unittests) are run before slower-running automated tests (e.g., acceptance and integration tests), which are both run before any manual testing.
• Any errors should be found with the fastest possible category of testing.

The non-ideal testing automation (inverted) pyramid: The non-ideal testing automation pyramid is a way of testing that can be characterized as follows:
• Most of the investment is in manual and integration testing.
• Errors are found later in the testing.
• Slower-running automated tests are performed first.
Test driven development: Test driven development is the approach in which the source code is written after the completion of the testcase definition and execution. The source code is written and adjusted until the testcase conditions are met.

Test harness: Software constructed to facilitate integration testing. Where test stubs are typically components of the application under development and are replaced by working components as the application is developed (top-down integration testing), test harnesses are external to the application being tested and simulate services or functionality not available in a test environment.

The Agile Manifesto: The Agile Manifesto (Manifesto for Agile Software Development) was set up during an informal meeting of seventeen software developers. This meeting took place from 11 to 13 February 2001 at "The Lodge" in Snowbird, Utah. The charter and the principles formed an elaboration of ideas that had arisen in the mid-nineties, in response to methods traditionally classed as waterfall development models. Those models were experienced as bureaucratic, slow and narrow-minded, and were seen to hinder the creativity and effectiveness of developers. The seventeen people who drew up the Agile Manifesto together represented the various Agile movements. After the publication of the charter, several signatories set up the "Agile Alliance" to further convert the principles into methods [Wiki].

The Lean movement: An operating philosophy that stresses listening to the customer, tight collaboration between management and production staff, eliminating waste and boosting production flow. Lean is often heralded as manufacturers' best hope for cutting costs and regaining their innovative edge.

Tool-assisted code review: This is a review technique where authors and reviewers use specialized tools designed for peer code review or facilities provided by the source code repositories [Kim 2016].

Value stream: The process required to convert a business hypothesis into a technology-enabled service that delivers value to the customer [Kim 2016].

Value Stream Mapping (VSM): Value stream mapping is a Lean tool that depicts the flow of information, materials and work across functional silos, with an emphasis on quantifying waste, including time and quality.

Virtualized environment: An environment that is based on virtualization of hardware platforms, storage devices and network resources. In order to create a virtualized environment, usually VMware is used.

Virtualization: In computing, virtualization refers to the act of creating a virtual (rather than actual) version of something, including virtual computer hardware platforms, storage devices, and computer network resources. Virtualization began in the 1960s as a method of logically dividing the system resources provided by mainframe computers between different applications. Since then, the meaning of the term has broadened [Wiki].

Waste: Waste comprises the activities that are performed in the manufacturing process that do not add value to the customer. Examples in the context of DevOps are:
• Unnecessary software features.
• Communication delays.
• Slow application response times.
• Overbearing bureaucratic processes.

Waste reduction: Minimization of waste at its source, so as to minimize the quantity required to be treated and disposed of, usually achieved through better product design and/or process management. Also called waste minimization [Businessdictionary].

Work In Progress (WIP): Material that has entered the production process but is not yet a finished product. Work in progress (WIP) therefore refers to all materials and partly finished products that are at various stages of the production process.

WIP limit: This is a Key Performance Indicator (KPI) that is used in the Kanban process to limit the number of items that have been started but are not yet completed. Limiting the amount of WIP is an excellent way to increase throughput in your software development pipeline.

Tabel B-1, Glossary.
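The 'Test driven development', 'Automated tests' and 'Given-When-Then' entries above can be illustrated with a minimal sketch using Python's standard unittest module, the style used for the unit test examples in chapter 7. The Calculator class and its add method are hypothetical names chosen only for this example and do not come from the book.

```python
import unittest


class Calculator:
    """Hypothetical production code; under TDD it is written only after the test below exists."""

    def add(self, a, b):
        return a + b


class TestCalculator(unittest.TestCase):
    def test_add_two_numbers(self):
        # GIVEN a calculator (the pre-condition)
        calculator = Calculator()
        # WHEN two numbers are added (the action)
        result = calculator.add(2, 3)
        # THEN the sum is returned (the expected, testable outcome)
        self.assertEqual(result, 5)


if __name__ == "__main__":
    unittest.main()
```

In the red-green-refactor cycle the test is defined first and fails; just enough source code is then written and adjusted until the testcase conditions are met.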
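The 'Feature toggles' entry lends itself to a similar sketch. The toggle name, the in-memory dictionary and the checkout functions below are assumptions made for illustration only; in practice a toggle is typically read from a configuration file or a feature-flag service rather than hard-coded.

```python
# Minimal feature-toggle sketch: functionality is switched on or off at run time
# without redeploying the application.
FEATURE_TOGGLES = {"new_checkout_flow": False}  # hypothetical toggle, e.g. set per environment


def is_enabled(toggle_name: str) -> bool:
    """Return True when the named toggle is switched on."""
    return FEATURE_TOGGLES.get(toggle_name, False)


def legacy_checkout(order: dict) -> str:
    return f"legacy checkout for order {order['id']}"


def new_checkout(order: dict) -> str:
    return f"new checkout for order {order['id']}"


def checkout(order: dict) -> str:
    # The released but disabled code path is only exposed when the toggle is switched on.
    if is_enabled("new_checkout_flow"):
        return new_checkout(order)
    return legacy_checkout(order)


print(checkout({"id": 42}))  # -> "legacy checkout for order 42" while the toggle is off
```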
Appendix C, Abbreviations

%C/A: Percent Complete / Accurate
AWS: Amazon Web Services
BDD: Behavior Driven Development
BI: Business Intelligence
CA: Continuous Auditing
CE: Continuous Everything
CI: Continuous Integration
CIA: Confidentiality, Integrity & Accessibility (or Availability)
CD: Continuous Delivery
CO: Continuous dOcumentation
CIO: Chief Information Officer
CL: Continuous Learning
CM: Continuous Monitoring
CMMI: Capability Maturity Model Integration
CoP: Communities of Practice
CP: Continuous Planning
CT: Continuous Testing
CTO: Chief Technical Officer
DevOps: Development & Operations
DML: Definitive Media Library
DoD: Definition of Done
DoR: Definition of Ready
DTAP: Development, Test, Acceptance and Production
E2E: End-to-End
ESB: Enterprise Service Bus
FAT: Functionele AcceptatieTest
GAT: Gebruiker AcceptatieTest
GDPR: General Data Protection Regulation
GIT: Global Information Tracker
GSA: Generieke & Specifieke Acceptatiecriteria
GUI: Graphical User Interface
GWT: Given-When-Then
HRR: Hand-off Readiness Review
IaC: Infrastructure as Code
ICT: Information Communication Technology
ID: Identifier
INVEST: Independent, Negotiable, Valuable, Estimatable, Small and Testable
ISMS: Information Security Management System
ISVS: Information Security Value System
IT: Information Technology
ITIL: Information Technology Infrastructure Library
JVM: Java Virtual Machine
KPI: Key Performance Indicator
LRR: Launch Readiness Review
LT: Lead Time
MT: Module Test
MVP: Minimal Viable Product
NFR: Non-Functional Requirement
PAT: Productie AcceptatieTest
PBI: Product Backlog Item
PDCA: Plan Do Check Act
PST: Performance StressTest
PT: Processing Time
QA: Quality Assurance
QC: Quality Control
RACI: Responsible, Accountable, Consulted and Informed
RASCI: Responsible, Accountable, Supporting, Consulted and Informed
REST API: REpresentational State Transfer Application Programming Interface
SAFe: Scaled Agile Framework
SAT: Security AcceptatieTest
SBB: System Building Block
SBB-A: System Building Block Application
SBB-I: System Building Block Information
SBB-T: System Building Block Technology
SIT: Systeemintegratietest
SLA: Service Level Agreement
SMART: Specific, Measurable, Accountable, Realistic, Timely
SME: Subject Matter Expert
SoE: System of Engagement
SoI: Systems of Information
SoR: System of Records
SoX: Sarbanes Oxley
SQL: Structured Query Language
SRG: Standards Rules & Guidelines
ST: Systeemtest
TDD: Test Driven Development
TFS: Team Foundation Server
UAT: User Acceptance Test
UT: Unit Testing
VSM: Value Stream Mapping
WIP: Work In Progress
WoW: Way of Working

Table C-1, Abbreviations.
Appendix D, Agile tools

Based on a survey carried out at ten organisations [Best 2015a], a list has been compiled of the tools that are used and of how they are deployed. The deployment of tools turns out to be adjusted very frequently; Figure D-1 is therefore a snapshot. The model can nevertheless be used well to assess one's own DevOps tools for completeness. Table D-1 also includes a URL per tool to provide more information about the tools.

Figure D-1, Use of tools by ten organisations.

Short name: long name, website
#define: issue-tracker (proprietary)
Bamboo: Bamboo, www.atlassian.com
CCCQ: IBM Rational ClearCase ClearQuest, www.ibm.com
Cherwell: Cherwell Software, www.cherwell.com
ClearCase: IBM Rational ClearCase, www.ibm.com
Clientele: Mproof Clientele IT Service Management, www.mproof.com
Enterprise Architect: Sparx Systems Enterprise Architect, www.sparxsystems.com
FitNesse: FitNesse, www.fitnesse.org
GIT: Global Information Tracker, www.git-scm.com
HP ALM: HP Application Lifecycle Management, www8.hp.com
HP QC: HP Quality Center, www8.hp.com
HP SC: HP ServiceCenter Software, h10076.www1.hp.com
HP Service Manager: HP Service Manager, www8.hp.com
IBM Tivoli Service Management Suite: IBM Tivoli Service Management Suite, www-01.ibm.com
Ice-Scrum: Ice-Scrum, www.icescrum.org
Installshield: Installshield, www.installshield.com
IT Service Management: IT Service Management (Mexon), www.mexontechnology.com
Jenkins: Jenkins, www.jenkins-ci.org
Jira: Jira, www.atlassian.com
JMeter: Apache JMeter, jmeter.apache.org
LeanKit: LeanKit, www.leankit.com
Maven: Apache Maven Project, maven.apache.org
Mavim: Mavim, www.mavim.com
McAfee: McAfee, www.mcafee.com
MRM: Microsoft Release Manager, www.ibm.com/software
MS Excel: Microsoft Excel, www.microsoft.com
MSI: Microsoft Installer, msdn.microsoft.com
MTM: Microsoft Test Manager, msdn.microsoft.com
Nexus: Nexus, www.ichrome.com/solutions/nexus
NPS: Net Promoter Score, www.netpromoter.com
Octopus Deploy: Octopus Deploy, www.octopusdeploy.com
Omnitracker: Omnitracker, www.omnitracker.com
OTRS: Open-source Ticket Request System, www.otrs.org
Powershell: Automated Powershell script, microsoft.com/powershell
ReportServer: SQL Server Reporting Services (SSRS), msdn.microsoft.com
Robot Framework: Robot Framework, www.robotframework.org
ROSS: RepliWeb Operations Suite for SharePoint, www.repliweb.com
ScrumWise: ScrumWise, www.scrumwise.com
Selenium: Selenium, docs.seleniumhq.org
Serena Dimensions: Serena Dimensions, www.serena.com
SOAtest: Parasoft SOAtest, www.parasoft.com
SonarQube: SonarQube, www.sonarsource.org
Subversion: Apache Subversion (SVN), subversion.apache.org
TestComplete: TestComplete, SmartBear Software
TFS: Team Foundation Server, msdn.microsoft.com
TOPdesk: TOPdesk, www.topdesk.nl
Twist: Twist, www.thoughtworks.com
WinMerge: WinMerge, www.winmerge.com

Table D-1, Tools.
Appendix E, Websites

Businessdictionary [Businessdictionary]: http://www.businessdictionary.com
Collabnet [CollabNet]: https://www.collab.net
CleanArchitecture [CleanArchitecture]: https://www.freecodecamp.org/news/a-quick-introduction-to-clean-architecture-990c014448d2/
CleanCode [CleanCode]: https://cvuorinen.net/2014/04/what-is-clean-code-and-why-should-you-care/
dbmetrics [dbmetrics]: http://www.dbmetrics.nl
DevOps [DevOps]: http://DevOps.com
DDD [DDD]: https://www.slideshare.net/skillsmatter/ddd-in-agile
doxygen [doxygen]: http://www.doxygen.nl/manual/docblocks.html
doxygen example [doxygen voorbeeld]: http://www.doxygen.nl/manual/examples/qtstyle/html/class_q_tstyle___test.html#a0525f798cda415a94fedeceb806d2c49
EXIN [Exin]: http://www.exin.nl
Gladwell [GLADWELL]: http://www.gladwill.nl
IIR [IIR]: http://www.IIR.nl
Investopedia [Investopedia]: https://www.investopedia.com
ITMG [ITMG]: http://www.ITMG.nl
ITPedia [ITPEDIA]: http://www.itpedia.nl
UnitTest [UnitTest]: https://docs.python.org/3/library/unittest.html
Wiki [Wiki]: http://nl.wikipedia.org/wiki/Cloud_computing
Wiki docgen [Wiki docgen]: https://en.wikipedia.org/wiki/Comparison_of_documentation_generators

Table E-1, Websites.
Epiloog 69 Bijlage F, Index baseline · 43, 48 BDD · 3, 23, 35, 39, 44, 55, 61 % bedrijfsproces · 43 beeldvorming · 2 %C/A · 61 Behavior Driven Development · 47, Zie BDD beheerder · 12, 13 A beheervoorziening · 7 best practice · 2, 3, 11, 15, 35, 39, 46, 53, A/B testing · 44, 47 55, 82, 84 acceptatiecriterium · 5, 9, 35, 55, 56 best-of-breed tool · 20 acceptatieomgeving · 7, 19, 21, 22, 28, 32, beveiliging · 44 41 BI · 61 acceptatietest · 2, 22, 32, 55 black box test · 6, 7, 30 actor · 30, 31 blameless post mortem · 44 ad hoc testing · 46 blue/green environment · 43 Agile · 1, 3, 15, 20, 35, 37, 39, 45, 53, 54, Boehm · 5 55, 59, 81, 82, 83, 84 bottom-up · 11, 17 - coach · 1 bouwen · 1, 20, 31 - design · 1 branch · 21 - infrastructure · 55 branching · 43 - proces · 45 broken build · 43 - werkwijze · 35 build · 43, 47, 48, 56 Agile Release Trains · 15 build-in failure mode · 44 Agile Scrum · 3, 14, 16, 18 business - aanpak · 10 - case · 14 - framework · 14, 16 - perspectief · 21 - proces · 14, 15 - requirement · 21 - team · 15 - value · 55, 56 alternate path · 55 business DevOps · 44 Amazon Web Services · Zie AWS Business Intelligence · Zie BI annotatie · 44 anti-pattern · 5, 9, 13, 16, 18, 20, 21, 37, 40, 41, 55 C applicatie · 1, 6, 7, 12, 13, 14, 15, 24, 25, 26, 27, 30, 31, 32, 35, 37, 43 CA · 2, 61 applicatieobject · 18 canary releasing · 43 Applicatieobject-matrix · 18 capability · 56 applicatieobjecttype · 18 Capability Maturity Model Integration · Zie architect · 1 CMMI architectuur · 11, 16, 19, 83 CD · 2, 5, 15, 30, 31, 43, 46, 49, 55, 57, architectuurmodel · 19 61 architectuurprincipes · 16, 19, 83 CE · 2, 46, 61 artefact · 2, 8, 28, 55 CE-model · 43 auditing · 44 check-in · 43 automated regression testing · 47 Chief Information Officer · Zie CIO automated sign off · 47 Chief Technology Officer · Zie CTO automated test · 55 CI · 2, 5, 15, 31, 43, 46, 49, 55, 57, automatiseringstools · 35 CI/CD secure pipeline · 2, 5, 9, 12, 15, 20, AWS · 61 21, 22, 31 CIA · 61 CIA-matrix · 7 B CIO · 61 CL · 2, 43, 44, 61 backlog item · 22, 23, 39, 58 cloud · 55 back-up & recovery · 7 cloud configuration file · 55 bad path · 48, 55 cloud service · 55 base use case · 30 CM · 2, 43, 44, 61 CMMI · 43, 45, 46, 61
70 DevOps Continuous Testing CO · 2, 43, 44, 61 Development & Operations · Zie DevOps, codestandaard · 15 Zie DevOps commit code · 55 development team · 15, 16, 17 commit stage · 55 Development, Test, Acceptance and Communities of Practice · Zie CoP Production · Zie DTAP compleetheid · 63 DevOps · 1, 2, 3, 9, 10, 11, 12, 14, 16, 17, Completeness / Accurateness · Zie %C/A 18, 20, 24, 29, 31, 43, 45, 46, 47, 48, compliance · 44 53, 54, 55, 56, 57, 59, 61, 67, 81, 82, compliancy · 44, 48 83, 84 component · 56, 57, 58, 59 - architect · 16 concept · 1 - cyclus · 1 concurrent gebruikers · 7, 22 - team · 3, 24, 47, 48 Confidentiality, Integrity & Accessibility · - tool · 63 Zie CIA DevOps-organisatie · 20 configuratiebeheer · 43 DML · 61 configuration management · 55 documentatie · 37 container · 56 DoD · 10, 44, 61 Continuous DoR · 44, 61 - Auditing · Zie CA driver · 41 - Delivery · 46, 53, Zie CD DTAP · 48, 61 - Deployment · 2 DTAP environment · 48 - dOcumentation · Zie CO - Documentation · 2 - Everything · 1, 2, 10, 43, 44, 45, 46, E 82, Zie CE - Integration · 2, Zie CI E2E · 44, 48, 61 - Learning · 2, Zie CL effectief · 15 - Monitoring · 2, Zie CM efficiënt · 15 - Planning · 2, Zie CP eigenaarschap · 10, 11, 14, 15, 16, 17 - Testing · 1, 2, 3, Zie CT e-mail pass around · 44 - Testing roadmap · 10, 15 emerging design · 31 control · 46, 58 End-to-End · Zie E2E CoP · 15, 16, 61 Enterprise Service Buss · Zie ESB CP · 2, 61 epic · 15, 45 criterium · 1 error guessing · 23 CT · 2, 43, 44, 46, 48, 49, 61 error path · 56 CTO · 61 ESB · 61 Cucumber · 35 E-shaped · 43, 44 cycle time · 44, 56 esource · 11 event · 44, 57 D F data · 46 data driven testing · 48 fast feedback · 9, 10, 11, 13, 21 data masking · 48 FAT · 5, 7, 19, 22, 23, 24, 25, 26, 32, 39, database · 6 55, 61 database management system · 47 feature · 36, 45, 47, 56, 59 defect · 9, 10, 12, 13, 18, 19, 21, 22, 28, feature toggle · 43, 56 31, 32, 44, 48, 57 feedback · 22, 44, 46, 47, 48, 56, 58 defect record · 48 feedforward · 56 defect reductie · 22 file system · 6, 38 Definition of Done · Zie DoD firmware · 9 Definition of Ready · Zie DoR five times why methode · 10 Definitive Media Library · Zie DML flow · 44, 45, 57, 59 dekkingsgraad · 13, 25 fout · 19, 21, 40 deployen · 2, 5, 8 framework · 57 deployment · 48, 55 functie · 6, 15, 16, 20, 31, 37, 38 deployment pipeline · 48, 55 functionaliteiteis · 6 Dev engineer · 1 functioneel beheerder · 19 developer · 8, 15, 17, 20, 25, 55, 56, 58, Functionele AcceptatieTest · Zie FAT 59 functionele requirement · 13 development · 55, 56, 59
Epiloog 71 Information Security Value System · Zie G ISVS Information Technology · Zie IT GAT · 7, 61 Information Technology Infrastructure GDPR · 61 Library · Zie ITIL gebruiker · 41 Infrastructure as Code · Zie IaC Gebruiker AcceptatieTest · Zie GAT infrastructure component · 57 gedragsspecificatie · 35 infrastructure management · 57 General Data Protection Regulation · Zie integrated test tooling · 47 GDPR integrated VSM · 48 Generieke & Specifieke Acceptatiecriteria · INVEST · 61 Zie GSA IP address · 56 gereedschap · 43 I-shaped · 44 Gherkin · 5, 27, 35, 36 I-shaped people · 17, 20 GIT · 41, 61 ISMS · 61 Given When Then · 56, Zie GWT ISVS · 61 Global Information Tracker · Zie GIT IT · 61 governance · 16, 17 iteratief · 2, 13, 27 Grady Booch · 8 ITIL · 61 Graphical User Interface · Zie GUI green build · 43 GSA · 61 GUI · 6, 7, 18, 25, 61 J Guild · 16 GWT · 5, 21, 22, 35, 36, 56, 61 Java Virtual Machine · Zie JVM JVM · 62 H K handmatige testcase · 22 Hand-off Readiness Review · 61 Kaizen · 48, 57 happy path · 23, 48, 55, 56 Kanban · 57, 60 hardware · 5, 9, 18, 56, 57, 59 Ken Schwaber · 14, 54 Hero gedrag · 16 kennis · 6, 9, 12, 16, 17, 18, 20, 41, 82 holistisch · 9, 12 kennisdomein · 20 holistische aanpak · 5, 9 kennisoverdracht · 16, 84 homoniem · 10, 12 keten · 15, 16, 27, 44 HRR · 43, 61 Key Performance Indicator · Zie KPI hypothesis driven development · 43 kostprijs · 13, 20 KPI · 57, 60, 62 KPI trend measurement · 49 kwalitatieve requirement · 13 I kwaliteit · 1, 2, 5, 16, 19, 40, 41 kwaliteitscriteria · 44 IaC · 55, 56, 57, 61 kwaliteitseis · 6, 7, 18 ICT · 61 ID · 61 ideal test pyramid · 3, 18, 19, 21, 22, 44, 47, 58 L IDentifier · Zie ID inchecken · 40 late feedback · 3, 13, 19, 21 incident · 14 latent defect · 57 incrementeel · 2, 13, 27 Launch Readiness Review · Zie LRR Independent, Negotiable, Valuable, Layton · 54 Estimatable, Small and Testable · Zie Lead Time · Zie LT INVEST Lean · 9, 57, 59 informatiesysteem · 1, 15, 16, 18, 19, 35, Lean testaanpak · 9 36, 84 Lean tool · 57 informatievoorziening · 1 lemniscaat · 1, 2 Information Communication Technology · lifecycle · 5, 9, 20, 47, 56, 57, 84 Zie ICT lijnmanager · 1 Information Security Management System · log · 44, 57 Zie ISMS loosely coupled · 57 LRR · 43, 62
72 DevOps Continuous Testing LT · 57, 62 PAT · 7, 22, 23, 24, 25, 26, 39, 55, 62 pattern · 23, 37, 40, 41, 55, 58 PBI · 18, 21, 22, 23, 24, 25, 26, 62 M PDCA · 57, 62 peer review · 47 machtsverhouding · 3 PEN-test · 41 management · 46 performance · 22, 48, 55, 57 manual architecture · 44 Performance StressTest · Zie PST manual testing · 46 performancecriterium · 29 manuele taak · 13 pipeline · 43, 44, 47, 48, 55, 57, 60 manufacturing process · 59 pipeline phase · 47 mastertestplan · 13, 27 Plan Do Check Act · Zie PDCA MD · 6 planningsobject · 22, 23 merge-hell · 21 policy · 3 merging · 43 PPT · 2, 5, 9, 43 metadata · 14, 19, 48 PPT-aspect · 44 methodiek · 1 pre-condition · 36 methodology · 46 principe · 12 metrics · 44 probleem · 2, 9, 10 microservice · 57 problemen · 10 microservice architecture · 57 proceseigenaar · 1 Minimal Viable Metadata · 48 proceseigenaarschap · 15 Minimal Viable Product · Zie MVP procesmanager · 1 module · 6, 24, 25, 31 Processing Time · Zie PT Module Test · Zie MT product backlog · 22, 23, 39, 58 monitoring · 43, 57 product owner · 1, 57 monitorvoorziening · 2 Productie AcceptatieTest · Zie PAT MT · 6, 28, 62 Productie Backlog Item · Zie PBI multiple expertise · 20 productieomgeving · 5, 13, 22, 28 MVP · 62 production data · 47, 48 programmeerstijl · 40 programmeur · 1, 6, 37, 38, 40, 41 programming paradigm · 58 N promoten · 5, 15 PST · 7, 22, 23, 24, 25, 26, 39, 62 navigator · 41 PT · 62 NFR · 57, 62 pull request process · 44, 58 Non Functional Requirement · Zie NFR pull-request · 41 non-ideal test pyramid · 21, 22 Python · 37 O Q ontwerper · 31 QA · 44, 58, 62 ontwikkelaar · 12, 13, 16, 38 QC · 62 ontwikkelomgeving · 5, 6, 7, 9, 10, 12, 13, quality · 46 18, 19, 21, 22, 28, 30, 31, 38 Quality Assurance · Zie QA ontwikkelproces · 37 Quality Control · Zie QC oorzaak · 2 operations · 55, 57 operator · 31 Ops engineer · 1 R Organisatievormgeving · 3 OTAP-straat · 8, 18 RACI · 16, 62 outcome · 20, 36 RASCI · 16, 62 over-the-shoulder · 44 redundantie · 13 refactoring · 28, 32, 37 referentie architectuur · 16 regressietest · 13, 21, 30, 32, 38 P regressietestbasis · 10 regression testing · 47 pair programming · 41, 44, 47 release · 2, 58, 64 paperless · 13 release pattern · 58 paperware · 9 repository · 37, 43, 45, 46, 55, 58
Epiloog 73 REpresentational State Transfer Application Specific, Measurable, Accountable, Programming Interface · Zie REST API Realistic, Timely · Zie SMART requirement · 2, 5, 9, 12, 13, 14, 19, 20, Spotify · 15, 16 22, 27, 30, 31, 35, 43, 55, 57, 62, 84 Spotify organisatie · 15 Resources · 3 sprint backlog · 22, 23, 24, 26 Responsibility, Accountable, Consulted and sprint tijd · 16 Informed · Zie RACI SQL · 62, 64 Responsibility, Accountable, Supporting, squad · 47, 48, 49 Consulted and Informed · Zie RASCI SRG · 43, 44, 48, 62 REST-API · 6, 7, 18, 25, 62 ST · 6, 14, 28, 62 risico · 1, 10, 22, 23, 25, 26, 35, 37, 39, stakeholder · 56 40, 41, 44, 48 Standard Rules & Guidelines · Zie SRG risicoanalyse · 7, 27 story · 48 roadmap · 15 strategy · 46 roadmap planning · 48 Structured Query Language · Zie SQL rollback technique · 43 Subject Matter Expert · Zie SME rootcause · 9 super use case · 30 rootcause analyse · 57 synoniem · 10, 12 systeemtest · 6, 18 System Building Block · Zie SBB S System Building Block Application · Zie SBB-A sad path · 48 System Building Block Infrastructure · Zie SAFe · 62 SBB-I SAFe framework · 16 System Building Block Technology · Zie Sarbanes Oxley · Zie SoX SBB-T SAT · 7, 22, 23, 24, 25, 26, 39, 62 System Integration Test · Zie SIT SBB · 30, 62 System of Engagement · Zie SoE SBB-A · 62 System of Records · Zie SoR SBB-I · 36, 62 System Test · Zie ST SBB-T · 30, 62 Systems of Information · Zie SoI Scaled Agile Framework · Zie SAFe scheduler · 47 S-CI · 48 T scrum master · 1, 14, 15, 16 secure code review · 48 TDD · 2, 3, 9, 21, 23, 30, 31, 35, 37, 39, security · 46, 48, 57 41, 44, 46, 59, 62 Security AcceptatieTest · Zie SAT Team Foundation Server · Zie TFS Service Level Agreement · Zie SLA technical debt · 46 shift left organisatie · 21 technical debt backlog · 11, 15, 17, 46 silo · 59 telemetry · 44 Simian army · 44 template · 23, 24, 25, 26, 29, 30, 37, 40, SIT · 6, 22, 23, 24, 25, 26, 28, 39, 62 43 SLA · 44, 62 test · 7, 21 SMART · 57, 62 - aanpak · 5, 9 SME · 16, 62 - architectuur · 10 SoE · 15, 16, 58, 62 - automation · 19, 41, 47 software · 5, 8, 9, 13, 16, 24, 30, 31, 35, - basis · 5, 6, 7, 27, 30, 31 46, 47, 48, 55, 59, 60, 64, 81, 82 - case · 2, 5, 9, 10, 12, 14, 18, 19, 20, Software · 53, 59, 63, 64 21, 22, 24, 28, 31, 32, 35, 37, 41, 46, software lifecycle · 9 47, 48, 55 softwareontwikkelaar · 13 - cyclus · 31 softwareontwikkeling · 3, 37 - data · 47, 48 softwareontwikkelproces · 5, 9, 12, 46 - data generating tool · 47 SoI · 62 - Driven Development · Zie TDD SoR · 15, 16, 27, 58, 62, Zie - generation · 47 source code · 43, 44, 46 - level · 47 sourcecode · 9, 12, 13, 24, 32, 35, 37, 40, - lifecycle · 47 41, 55, 58, 59 - management · 2, 5, 6, 9, 10, 27, 44, sourcecode header · 13 46, 47, 48 sourcecodestandaard · 13 - object · 24, 25, 26, 31, 41, 47, 48 SoX · 62 - object-matrix · 21, 25, 26 - omgeving · 6, 7
74 DevOps Continuous Testing - pattern · 18, 41, 46, 47 - script · 47 U - situatie · 41 - skills · 10 UAT · 62 - soort · 3, 5, 6, 7, 9, 10, 14, 18, 21, 22, uitwijkvoorziening · 7 23, 24, 25, 26, 27, 28, 29, 31, 39, 41 uniforme - soorten · 2, 3 - meta data · 48 - soort-matrix · 18, 22, 23, 39 - test terminology · 47 - strategie · 2, 3, 5, 18, 23, 25, 27, 28, - test tooling · 47 30, 35, 39, 40 - testproces · 47 - strategie pattern · 18 - WoW · 18 - strategy · 46, 47, 48 unit test case · 47 - taal · 10 Unit Testing · Zie UT - techniek · 23, 24, 25, 26, 39 unittest · 3, 5, 6, 18, 21, 37, 38, 39, 67 - techniek-matrix · 23, 24, 25, 26, 39 unittestcase · 37, 41 - tool · 3, 11, 19, 25, 26 unittestcases · 12, 13, 18, 35 - type · 47 use case · 3, 5, 27, 28, 29, 30, 31 testen · 1, 2, 5, 6, 7, 9, 12, 13, 14, 15, 16, use case diagram · 3, 5, 18, 27, 28, 29, 18, 20, 21, 22, 23, 24, 25, 26, 27, 28, 30, 31 30, 31, 32, 35, 37, 38, 39, 41 User Acceptance Test · Zie UAT tester · 20, 31, 56 User eXperience design · 43 testtechniek · 3, 5, 7, 10, 18, 21 user interface test · 41 - API-testing · 40 UT · 6, 28, 38, 48, 62 - branch testing · 40 - code-driven testing · 40 - error-handling testing · 40 V - interface testing · 40 - loop testing · 40 value stream · 3, 5, 12, 19, 20, 27, 28, 29, - negative testing · 40 44, 48, 56, 57, 59, 62, 84 - static testing · 40 Value Stream Mapping · Zie VSM testtechniek-matrix · 21, 39 velocity · 16, 55 TFS · 62 veranderparadigma · 2, 11, 12, 14, 17, 19 The Agile Manifesto · 59 versiebeheer · 43 the ideal testing automation pyramid · 58 versioning · 46 The Lean movement · 59 virtualized environment · 59 the non-ideal testing automation inverted visibility · 44 pyramid · 58 visie · 12 theme · 45 visualization · 59 time-to-market · 13 volwassenheidsniveau · 45, 46 toetsen · 2, 5, 16, 18, 30 volwassenwording · 15 tool-assisted code review · 44, 59 VSM · 44, 62 tooling portfolio · 16 toolintegratie · 20 toolleverancier · 20 W top-down · 59 traceability · 43 waste · 5, 9, 12, 13, 14, 16, 18, 22, 39, traceerbaarheid · 43, 44 41, 55, 57, 59, 60 transactieverwerking · 15 waste reductie · 5, 9, 12, 19, 60 tribe · 47, 48 Way of Working · Zie WoW trunk · 58 white box test · 6, 30, 38 T-shaped · 44 WIP · 62 T-shaped people · 20 Work In Progress · Zie WIP WoW · 9, 11, 13, 14, 15, 17, 18, 19, 62
Epilogue

My experience is that the ideas I record in an article or a book keep evolving. If you start working with a particular topic from this book in your own DevOps organisation, I recommend that you contact me; there may be additional articles or experiences in this area that I can share with you. The reverse applies as well: if you have experiences that complement what is described in this book, I invite you to share them with me. You can reach me at my e-mail address bartb@dbmetrics.nl.

About the author

Drs. Ing. B. de Best RI has been working in ICT since 1985. He has worked mainly for the top 100 of Dutch businesses and for the government. For 12 years he held positions in all phases of system development, including operation and management. He then focused on the field of service management. As a consultant he currently covers all aspects of the knowledge lifecycle of service management, such as writing and delivering training courses for ICT managers and service managers, advising operations organisations on setting their direction, designing the operations organisation, improving operations processes, outsourcing (parts of) the operations organisation, and reviewing and auditing operations organisations. He holds degrees in the field of IT operations management at both polytechnic and university level.