Key risk factors to mitigate during a data migration
The value of migrating from mainframe to web and cloud architecture often appears obvious to businesses: reduced infrastructure costs, increased speed, and all the flexibility of open system architecture. Overall, that promises reduced capital expenditure, a promise that is driving widespread business interest in moving away from the mainframe.
However, this business interest carries technical complexity and, where essential systems are involved, risk. That’s why the team at Curiosity have built an automated approach to migration testing, iteratively testing components as they move from legacy to new systems.
Let’s first consider how mainframe systems can derail migration projects, before seeing how this automated approach de-risks migration projects.
Mainframe components add uncertainty to lengthy and sometimes costly migration projects. They are frequently poorly documented black boxes, built long ago or inherited through mergers. Yet, they are still relied upon for essential business processes, making them paramount to a successful migration.
Developers must understand the inner workings of these poorly documented systems, ready to migrate them and integrate them within a wholly new system. However, there is little help available for understanding the forgotten logic, which frequently carries business data through mazes of interrelated legacy systems.
Testers are often similarly left guessing at how the legacy systems work. Yet, they must build rigorous tests and data to validate that migrated systems run the same essential business processes reliably.
This rigorous testing must identify the distinct journeys that data can take through the interrelated legacy systems, along with expected results for each complex journey. Without a clear understanding of the systems, it's impossible to define these reliably, and manual test design instead hits just a fraction of the system logic.
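Treating the modelled system as a directed graph makes the idea of "distinct data journeys" concrete: each journey is a path from an entry point to an exit point. The sketch below is purely illustrative, with a hypothetical toy model of legacy components rather than any real system.

```python
# Hypothetical sketch: enumerating the distinct "data journeys" through a
# legacy system modelled as a flowchart (a directed graph). All node names
# are illustrative, not taken from any real system.

def enumerate_journeys(graph, start, end, path=None):
    """Depth-first enumeration of all simple paths from start to end."""
    path = (path or []) + [start]
    if start == end:
        return [path]
    journeys = []
    for nxt in graph.get(start, []):
        if nxt not in path:  # skip nodes already on the path, avoiding cycles
            journeys.extend(enumerate_journeys(graph, nxt, end, path))
    return journeys

# A toy model of interrelated legacy components:
legacy_model = {
    "intake":   ["validate"],
    "validate": ["enrich", "reject"],
    "enrich":   ["post", "archive"],
    "post":     ["archive"],
}

journeys = enumerate_journeys(legacy_model, "intake", "archive")
for j in journeys:
    print(" -> ".join(j))
```

Even this four-component toy model yields multiple journeys; real mazes of interrelated legacy systems produce far more, which is why manual enumeration covers only a fraction of them.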
Creating and executing the tests is also slow and complex, especially if performed manually. In this instance, testers must script or execute a large number of highly repetitive tests. These tests must reflect the complex data journeys through migrated systems, and test teams often struggle to find or make linked-up data sets for end-to-end test execution.
Often, testers need to run tests against both the new and legacy system to verify their understanding, and it’s still unfortunately common to find testers entering repetitive text into green screens. This doubles the time invested in test creation and execution, forcing migration testing ever-later in the project.
The time, complexity and uncertainty of migrating mainframe components means that they are often left until the last minute in migration projects. However, leaving these business-critical components until late in the project risks undoing all the time and cost invested in the project so far.
Rigorous mainframe migration testing should instead be performed iteratively, identifying and resolving issues as they emerge. This minimises the risk of finding critical bugs in migrated systems when it's too late to fix them, in turn mitigating the risk of costly project failure.
Using Test Modeller, test and development teams can now iteratively test migrations as they move components, resolving issues in new systems as they arise. They can use the same visual models to execute tests for both legacy and new systems, performing snapshot comparisons to ensure that the new system performs reliably.
Meanwhile, the intuitive flowcharts maintain the knowledge and visibility needed for reliable testing and development in future:
Figure 1 – Executable flowcharts perform rapid before/after comparisons as systems migrate and maintain knowledge of both systems.
With Test Modeller, test teams can rapidly create automated tests for both new and legacy systems, performing iterative snapshot comparisons during a migration. Automating tests for both systems is as quick and simple as dragging and dropping actions to the same flowcharts, selecting different frameworks to test each.
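The snapshot comparison idea can be sketched in a few lines: run identical generated inputs against a legacy implementation and its migrated replacement, and report any divergence. The two interest calculations below are hypothetical stand-ins for real system calls, not an actual migrated component.

```python
# Illustrative before/after "snapshot comparison". The same test inputs run
# against both systems; mismatched outputs flag migration defects. Both
# functions are hypothetical examples, not real system logic.

def legacy_interest(balance_cents: int) -> int:
    return balance_cents * 3 // 100      # legacy integer truncation

def migrated_interest(balance_cents: int) -> int:
    return round(balance_cents * 0.03)   # new system rounds to nearest

def snapshot_compare(inputs, old_system, new_system):
    """Run both systems on identical inputs; collect any mismatches."""
    return [
        (i, old_system(i), new_system(i))
        for i in inputs
        if old_system(i) != new_system(i)
    ]

mismatches = snapshot_compare([0, 100, 150, 999],
                              legacy_interest, migrated_interest)
for inp, old_out, new_out in mismatches:
    print(f"input={inp}: legacy={old_out}, migrated={new_out}")
```

Subtle behavioural differences like rounding rules are exactly what iterative snapshot comparisons surface early, while they are still cheap to fix.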
This avoids the bottleneck of test creation and execution, while test design is further optimised for rigour. Test Modeller uses automated coverage algorithms to generate the smallest set of tests needed to test the modelled logic, along with the “just in time” data journeys needed to execute every test.
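To give a flavour of coverage-driven test selection, the sketch below applies a greedy set-cover heuristic: from a pool of candidate paths through a modelled flowchart, it picks a small set that still exercises every edge. This is a generic illustration of the technique, not Test Modeller's actual algorithms.

```python
# Greedy set-cover sketch of coverage-based test selection: choose paths
# until every edge in the modelled flowchart is exercised. Path names are
# illustrative only.

def greedy_cover(paths):
    """Pick paths until every edge appearing in any candidate is covered."""
    all_edges = {e for p in paths for e in zip(p, p[1:])}
    chosen, covered = [], set()
    while covered != all_edges:
        # Pick the path contributing the most uncovered edges.
        best = max(paths, key=lambda p: len(set(zip(p, p[1:])) - covered))
        chosen.append(best)
        covered |= set(zip(best, best[1:]))
    return chosen

candidates = [
    ["start", "check", "approve", "end"],
    ["start", "check", "reject", "end"],
    ["start", "check", "approve", "end"],  # duplicate adds no new coverage
]
tests = greedy_cover(candidates)
print(f"{len(tests)} tests cover all edges")
```

Here three candidate paths reduce to two tests with no loss of edge coverage; at the scale of real interrelated systems, this kind of reduction is what keeps rigorous testing feasible in-sprint.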
Figure 2 – Testing iteratively throughout the migration project reduces the risk of late project collapse.
Each modelled component further becomes a re-usable subflow in this approach, a Lego brick that can be leveraged in future testing and development. This simplifies end-to-end and integration testing, rapidly assembling components to generate tests. Test optimisation then generates comprehensive, linked-up tests and data journeys for the integrated systems, ready to execute against both the old and new system logic.
Visual run results for both the migrated and legacy system are then available side-by-side at the model level, allowing test and development teams to evaluate the risk of pushing migrated systems live.
Using this approach, test and development teams can iteratively compare new and old systems, identifying and resolving issues as they arise. Meanwhile, they maintain precise documentation of both systems, avoiding the challenges of forgotten system knowledge and technical debt in future.
Using visual models to create tests for both systems thereby helps to ensure that mainframe migration projects deliver their full value, minimising the risk of late project collapse.
You can see this approach in action here. We’d love to hear your thoughts and to chat about your techniques for mainframe migration project success.