Join Curiosity and Jim Hazen on October 21st for “In the beginning there was a model: Using requirements models to drive rigorous test automation”

5 Reasons to Model During QA, Part 5: A “Single Pane of Glass” for technologies and teams

Welcome to the final instalment of 5 Reasons to Model During QA! If you have missed any of the previous four articles, jump back in to find out how modelling can:

  1. Identify bugs during the requirements analysis and design phase, where they require far less time and cost to fix;
  2. Drive up testing efficiency, automating the creation of test cases, test data and automated test scripts;
  3. Maximise test coverage and shorten test cycles, focusing QA on the most critical, high-risk functionality;
  4. Introduce QA Resilience and Flexibility to change, automatically updating a rigorous test suite as requirements evolve.

This last article in the series shifts focus to consider modelling within the broader context of the Software Delivery Lifecycle. It goes beyond QA, considering how models deliver value to the BAs, developers and testers who can work collaboratively from them.

The need for a Ubiquitous Language

The need for greater collaboration between the “three musketeers” of software delivery has been in sharp focus since at least the advent of Agile principles. Methodologies like Behaviour Driven Development (BDD) have reiterated the need to work from a common understanding of how the system should function. The goal is to avoid miscommunication and the frustrating rework it creates.

This close collaboration requires a ‘ubiquitous language’ that BAs, developers and testers can all speak, regardless of their technical skillset. It must be a bridge language that can be used to agree a shared vision of the system being delivered.

However, different stakeholders require different information about a system in order to perform their roles effectively. Their roles also demand different levels of technical granularity. Any ubiquitous language must therefore enable discussions between stakeholders, while allowing individuals to drill down into the level of detail they need.

Working from an agreed picture of the system not only avoids misunderstanding, it also facilitates a parallel and iterative approach to software delivery. Testers, for instance, no longer need to wait for the design and development of a system to finish, but can begin creating tests directly from the designs themselves. They can also improve the designs, performing the “shift left” QA that was set out in part one of this series.

However, the ideal of parallel development and a shared understanding of the system is often difficult to achieve in practice. A siloed approach often persists, and iterations frequently become mini-Waterfalls in which systems are designed, developed and tested in a linear order. Furthermore, testers and developers must still translate requirements into formats that they can use, introducing further disruption and bottlenecks:

Common challenges in the SDLC
A linear approach to the SDLC, with numerous manual “information hops” between artefacts.

Modelling offers a way to overcome these challenges, particularly when using a format already familiar to BAs, developers and testers.

Flowcharts: A Bridge Language for the SDLC

Flowcharts provide a modelling language capable of specifying complex software logic, while remaining accessible to all stakeholders. In principle, anyone can read a flowchart, without requiring specialist coding skills or subject matter expertise in given components.

Flowcharts therefore provide a way of pooling and centralising knowledge from across an SDLC. Test Modeller can generate flowcharts from existing requirements models, while models can also be built from scratch. A range of additional connectors augment the models, for instance by importing test cases and Gherkin specifications. The same models can also be supplemented after a system has been built, importing process logs, page object scans and UI recordings. This builds a more complete picture of the system under development.

Flowcharts offer the significant advantage of being widely used by Business Analysts, who often already use specification languages like Business Process Model and Notation (BPMN).

Critically, they also deliver value directly to testers and developers. Subflows can be used to embed lower-level, technical specifications within master flowcharts. The master models provide an overview of the system, while also allowing developers to drill down into logically precise models of individual components.

Testers can also work collaboratively from the same models, generating test cases directly from the models. They can overlay automation logic and test data variables, creating a complete test suite from the flowcharts. This testing can occur from the moment the flowcharts are created, for truly parallel and “shift left” QA.
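The idea of deriving test cases directly from a model can be sketched as path enumeration over a graph: every distinct route through the flowchart becomes one test case. The sketch below is a minimal illustration under assumed node names for a hypothetical login flow; a tool like Test Modeller applies far more sophisticated coverage algorithms.

```python
# Minimal sketch: a flowchart as a directed graph, where each path
# from "Start" to "End" becomes one generated test case.
# All node names are hypothetical.

def all_paths(graph, node, end, path=None):
    """Depth-first enumeration of every path from node to end."""
    path = (path or []) + [node]
    if node == end:
        return [path]
    paths = []
    for nxt in graph.get(node, []):
        paths.extend(all_paths(graph, nxt, end, path))
    return paths

# Hypothetical login flow with a valid and an invalid branch.
flow = {
    "Start": ["Enter credentials"],
    "Enter credentials": ["Valid?"],
    "Valid?": ["Show dashboard", "Show error"],
    "Show dashboard": ["End"],
    "Show error": ["End"],
}

test_cases = all_paths(flow, "Start", "End")
for i, case in enumerate(test_cases, 1):
    print(f"Test case {i}: " + " -> ".join(case))
```

Here the two branches of the “Valid?” decision yield two test cases, one per route through the model; adding a block to the flowchart automatically changes the generated paths, which is what keeps the test suite in step with the design.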

Flowchart Models as Ubiquitous Language
Flowchart models: A BPMN Requirement that houses the complete test suite needed to test it.

The process of Model-Based test creation is set out in parts two and three of this series. What’s important to emphasise here is how this process does not undermine the value of the flowcharts to BAs and developers: the test automation logic and data is assigned to blocks of the model without altering the look and feel of the model itself. The flowchart remains a BPMN model for BAs; for testers it houses the complete test suite.
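One way to picture this separation is as an overlay: automation logic and test data live in a structure keyed by block name, while the model itself stays untouched. The sketch below is a rough illustration under assumed names, not Test Modeller’s actual mechanism.

```python
# Minimal sketch: automation logic and test data as an overlay on
# model blocks. The model (flow_blocks) is never modified; the
# overlay is consulted only when rendering test steps.
# All block names, actions and data values are hypothetical.

flow_blocks = ["Start", "Enter credentials", "Valid?", "Show dashboard", "End"]

# Per-block automation actions and test data variables.
overlay = {
    "Enter credentials": {
        "action": "type_text",
        "data": {"username": "user1", "password": "s3cret"},
    },
    "Show dashboard": {"action": "assert_visible"},
}

def to_test_step(block):
    """Render one block as a test step, using the overlay if present."""
    meta = overlay.get(block)
    if meta is None:
        return f"Step: {block}"
    return f"Step: {block} [{meta['action']}] {meta.get('data', '')}"

script = [to_test_step(b) for b in flow_blocks]
print("\n".join(script))
```

Because the overlay is a separate structure, a BA reading the model sees only the blocks, while a tester rendering the same blocks gets executable steps: one artefact, two views.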

Flowcharts in this instance serve as a single source of truth for designing, developing and testing a system. They house a shared understanding that can be added to iteratively, pooling and preserving knowledge. The more accurate the picture, the more reliable and informed the development and testing.

A model will never technically be “complete”, as systems evolve constantly while some information will always remain unknown in complex systems. However, storing known information in a central asset works to avoid miscommunication and technical debt. It reduces the frustration of thinking “If only someone had told me that!”[1], and the rework that follows.

Along with the quality and efficiency gains associated with “shift left” QA and automated test asset generation, modelling appears to be a sure-fire way to reliably deliver software earlier, with defects that cost less to fix.


[1] Dan North & Associates (2006), Introducing BDD, retrieved 30 May 2019.

[Image: Pixabay]