Today, organisations adopt a range of technologies, both old and new, to enable their “agile” delivery methodologies. Yet, despite the focus on fast delivery, modern Test Data Management (TDM) practices and technologies are repeatedly overlooked. In turn, test data continues to undermine both testing speed and quality. Parallel teams, pipelines, and frameworks remain dependent on outdated TDM practices, as organisations cannot supply data at the speed demanded by agile, DevOps and CI/CD delivery pipelines.
This TDM challenge is not new. Yet a failure to adapt to evolving technologies and delivery methodologies remains the primary reason why test data strategies struggle and delivery is delayed.
This blog considers how technologies and their integrations can create challenges when you test, and how a successful test data strategy must also be agile, in order to deliver data to your systems and teams at speed.
This blog is part 2/4 in our series focused on test data strategy success. Check out other parts here:
- Test Data Strategy Success: Data Regulation
- Test Data Strategy Success: Tech Debt & Data Delivery
- Test Data Strategy Success: Tooling to Meet The Strategy
Learn how you can transform the relationship that your teams and frameworks share with data, shifting from slow and manual data provisioning to streaming rich test data in real-time. Read our Test Data-as-a-Service solution brief!
Integrated Test Data is More Complex Than Ever
With numerous database technologies already available and new ones emerging constantly, organisations are continually adopting new technologies. This in turn creates new challenges for test data management, and for testing in general. When adopting new technologies, you must also consider how they and their integrations will challenge the way you test, and the way you get data into your systems.
The most common challenge when adopting new technologies is extracting data from, and injecting data into, those technologies. Data injected into your new technologies must be traceable, controlled, and compliant. This is pivotal to the success of your test data strategy.
Additionally, consider the challenge of managing integrated systems: for instance, a system that passes data through mainframes into SQL databases, and on to NoSQL databases. If your solution is an event streaming architecture built on Kafka, and you are moving to a microservices-based architecture, how should your test data approach change?
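In an event streaming architecture, for example, test data is no longer just rows in a table; it is a stream of well-formed events. A minimal sketch of the idea, where the event schema and field names are hypothetical, is to generate synthetic, schema-conformant events that could then be published to a topic:

```python
import json
import uuid
from datetime import datetime, timezone

def make_order_event(customer_id: str, amount: float) -> str:
    """Build a synthetic order event as a JSON string.

    In a real pipeline this payload would be published to a Kafka topic,
    but the principle is the same regardless of broker: test data must be
    generated in the shape each technology expects, not simply loaded as
    database rows.
    """
    event = {
        "event_id": str(uuid.uuid4()),  # unique per event
        "event_type": "ORDER_PLACED",
        "occurred_at": datetime.now(timezone.utc).isoformat(),
        "payload": {"customer_id": customer_id, "amount": amount},
    }
    return json.dumps(event)

event = make_order_event("CUST-001", 49.99)
parsed = json.loads(event)
print(parsed["event_type"])  # ORDER_PLACED
```

The same generator can then feed a mainframe extract, a SQL load script, or a topic producer, keeping one traceable definition of the data across every integrated technology.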
Even a “simple” architecture will have multiple, integrated data sources and interfaces. Your test data must seamlessly traverse them. Original Image: EpochFail, Wikimedia Commons, published under CC BY-SA 3.0 license.
Test Data Strategies are Lagging Behind Delivery Methodologies
All too often, test data approaches and test approaches remain static, regardless of the technologies being adopted. All technologies should be introduced into your organisation through a set of architectural principles; one such principle is loose coupling.
Loose coupling describes systems whose components are connected in such a way that a change in one component does not affect the performance or existence of the others. This principle is extremely useful for concentrating test effort and minimising the volume of test data needed. Likewise, the principles discussed in part one of this blog series should also be reflected in how your systems are architected.
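To illustrate why loose coupling shrinks test data volumes, here is a hedged sketch (the component names and interface are invented for illustration): a pricing component depends only on a narrow interface, so it can be tested with a two-record stub instead of a fully provisioned customer database.

```python
from typing import Protocol

class CustomerStore(Protocol):
    """The only contract the pricing component depends on."""
    def get_tier(self, customer_id: str) -> str: ...

def discount_for(customer_id: str, store: CustomerStore) -> float:
    """Pricing logic, coupled only to the CustomerStore interface."""
    return {"gold": 0.20, "silver": 0.10}.get(store.get_tier(customer_id), 0.0)

class StubStore:
    """Test double: two synthetic records replace a full customer database."""
    def get_tier(self, customer_id: str) -> str:
        return {"c1": "gold", "c2": "silver"}.get(customer_id, "none")

store = StubStore()
print(discount_for("c1", store))  # 0.2
print(discount_for("c3", store))  # 0.0
```

Because the component never sees the concrete database, the test data needed is exactly the data the contract requires, and nothing more.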
If you focus on how you are injecting and extracting data from your technologies, your organisation will start concentrating test effort, gain a better understanding of the data passing through your systems, and minimise the volume of test data needed across your whole software delivery lifecycle.
Learn more by watching the Architecture and Technology episode of Test Data at The Enterprise by Rich Jordan below:
Test Data Delivery Methodology & Test Approach
Being conscious of delivery methodology, and importantly of delivery timelines and cadence, is a key aspect of an effective test data strategy. Most organisations deliver IT change through waterfall or agile methodologies, or somewhere in between the two. Either way, an effective test strategy, and indeed test data strategy, must align with this delivery methodology. However, that is rarely the case.
It is crucial for your organisation to ensure clear communication between the test team and those responsible for provisioning test data. Otherwise, you risk serious problems. For example, if it takes four weeks to deliver data, but your test team is working in two-week sprints, then your provisioning is insufficient, and your test data strategy needs to be re-considered.
This is a common problem, and it has led to testers spending 44% of their time on test data related activities [1]. If test data were provisioned on demand, or if these teams had access to synthetic data generation utilities, this problem could be overcome, and test time could be better spent on ensuring a quality release.
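As an illustration of the kind of synthetic data generation utility mentioned above (the record fields here are hypothetical, not a prescribed schema), a seeded generator can give every sprint team reproducible, compliant-by-construction data on demand:

```python
import random
import string

def synthetic_customers(n: int, seed: int = 42) -> list[dict]:
    """Generate n synthetic customer records.

    No field is copied from production, so there is no personal data to
    mask; a seeded generator makes the data reproducible across parallel
    teams and pipeline runs.
    """
    rng = random.Random(seed)
    records = []
    for i in range(n):
        name = "".join(rng.choices(string.ascii_uppercase, k=8))
        records.append({
            "customer_id": f"CUST-{i:05d}",
            "name": name,
            "email": f"{name.lower()}@example.test",
            "credit_limit": rng.choice([500, 1000, 5000]),
        })
    return records

batch = synthetic_customers(3)
print(len(batch), batch[0]["customer_id"])
```

A utility like this turns a four-week provisioning wait into a pipeline step that runs in seconds at the start of each sprint.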
Understand how you deliver – and the data you need
Understanding your delivery method and the needs of the test team is key to creating an effective test data capability within your organisation. If a test team following an agile methodology is planning endless end-to-end testing, you shouldn’t be provisioning full-scale environments with full-scale data. Rather, you should step back and challenge the test team’s scope.
Most testing should not be end-to-end. You should seek ways to limit the scope of testing within your system architecture, creating test data for these isolated “blast radiuses”. Original Image: EpochFail, Wikimedia Commons, published under CC BY-SA 3.0 license.
Not all problems seen as test data problems today are really problems with the data. Many boil down to a lack of understanding of the system under test, or a lack of collaboration between the people carrying out the testing and those provisioning the test data. Effective test data strategies bring harmonisation between teams and test assets.
This harmony requires alignment of your test approach, delivery methodology and technology. Model-based testing offers one way of achieving this: data, or the criteria for generating it, can be embedded into living, easy-to-understand models, eliminating as far as possible the handoff between test assets and data.
An example sign-up page model within Test Modeller. This model utilises Test Data variables to automatically assign data to nodes and actions ready for automation.
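To make the idea concrete without assuming any particular tool’s API, here is a minimal, hypothetical flavour of model-based testing: each step of a modelled sign-up flow carries a data variable, so generated tests arrive with their data attached rather than waiting on a separate provisioning handoff.

```python
# Hypothetical sign-up flow model: each path is a list of (action, data) steps.
model = {
    "valid_signup": [
        ("enter_email", "user@example.test"),
        ("enter_password", "S3cure!pw"),
        ("submit", None),
    ],
    "invalid_email": [
        ("enter_email", "not-an-email"),
        ("submit", None),
    ],
}

def generate_tests(model: dict) -> list[dict]:
    """Flatten each modelled path into an executable test case with its data."""
    return [
        {"name": path, "steps": [{"action": a, "data": d} for a, d in steps]}
        for path, steps in model.items()
    ]

tests = generate_tests(model)
print(len(tests), tests[0]["steps"][0]["data"])
```

Because the model is the single source of truth for both the steps and their data, updating a data criterion in one place updates every generated test.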
Learn more by watching the Delivery Methodology and Test Approach episode of Test Data at The Enterprise by Rich Jordan below:
Delivering Successful Test Data Technologies
The technology and methodology principles discussed in this blog, and in the accompanying Test Data at The Enterprise videos, go a long way towards creating a truly effective test data capability within your organisation.
Here are four questions you must be able to answer when setting up your Test Data architecture and technology:
- Which core technologies in your organisation hold data today?
- What are the architectural principles that underpin the systems that are already implemented in your organisation, and how do they pass data around?
- What is the delivery methodology your teams are using, and does your test data strategy align to it?
- What is your test team’s approach and is test data strategy at the heart of it?
Knowing the answer to these questions will help you test better and deliver a far more effective test data strategy!
Watch the complete Test Data at The Enterprise series by Rich Jordan to learn how you can introduce six tenets for a better Test Data Strategy at your organisation!
Footnotes:
[1] Capgemini, Sogeti (2020), The Continuous Testing Report 2020. Retrieved from https://www.sogeti.com/explore/reports/continuous-testing-report-2020/ on 16/03/2023.