Accelerating complex Data Migrations

Our confidential government client’s programme objective was to optimise investment and operations by centralising the management and infrastructure of a vital service. A major part of the challenge was the mass data migration required; we designed and built a custom product to deliver this work within a tight timeframe.

Client & location:

DIA - New Zealand

Date:

2023

Background

We successfully accelerated a complex government ERP consolidation by reducing the burden of extracting and prioritising the migration of data from more than 1,000 disparate source systems across almost 70 organisations. Our decentralised tool took an innovative ‘target to source’ approach, with built-in self-service and feedback capability. This enabled effective mobilisation of non-technical SMEs, who could onboard and map their own data and collaborate with data engineers through to load, freeing the engineers to refocus on the most difficult tasks that truly needed them. The large number of locations and staff schedules added significant collaboration and coordination challenges, which our solution and associated processes had to overcome. Our approach saved money and time, and optimised deployment of resources to de-risk and enable a seamless, faster data migration.
Web portal to manage data migration tasks

Overview of the Data Migration tool mapping page.

Highlights

Data migration tools typically present mapping tasks in ‘source to target’ order. With this layout, users are encouraged to focus on their source data and try to find a home for all of it. While that may be important in the long run, it may not actually be necessary for Day 1 go-live, and encouraging users to do more than needed can jeopardise deadlines.

For the new base systems to become operational on Day 1, organisations only needed to map a subset of their source data to the mandatory target attributes. Post-launch, additional source data could be mapped to optional target attributes with the flexibility of Business as Usual (BAU) time in the new entity.

With our Data Migration tool, we instead presented mapping tasks as ‘target to source’. Coupled with the ability to filter to show only mandatory target attributes, this guided users to perform their mapping tasks in priority order.
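The prioritisation logic described above can be sketched in a few lines. This is an illustrative model only, not the tool’s actual implementation: the attribute names, the `TargetAttribute` class, and the `prioritised_tasks` helper are all hypothetical, assuming each target attribute carries a mandatory flag and an optional link to a mapped source column.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TargetAttribute:
    """A hypothetical target-schema attribute awaiting a source mapping."""
    name: str
    mandatory: bool
    source_column: Optional[str] = None  # filled in as SMEs map their data

def prioritised_tasks(attributes, mandatory_only=False):
    """List mapping tasks 'target to source': start from unmapped target
    attributes, optionally filter to mandatory ones, and surface
    mandatory attributes ahead of optional ones."""
    tasks = [a for a in attributes if a.source_column is None]
    if mandatory_only:
        tasks = [a for a in tasks if a.mandatory]
    return sorted(tasks, key=lambda a: not a.mandatory)

# Illustrative catalogue: one attribute is already mapped, so it drops
# out of the task list; the Day 1 filter then hides the optional one.
catalogue = [
    TargetAttribute("legal_name", mandatory=True),
    TargetAttribute("twitter_handle", mandatory=False),
    TargetAttribute("business_number", mandatory=True, source_column="BusinessNo"),
]

for task in prioritised_tasks(catalogue, mandatory_only=True):
    print(f"Map a source column to target attribute: {task.name}")
```

Driving the worklist from the target schema means the Day 1 scope is simply “every mandatory attribute has a source”, rather than “every source column has a home”.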
Raw data in a spreadsheet to be migrated via portal

Example of the initial raw data.

Portal data quality checking page

Presenting mapping tasks ‘target to source’ and allowing users to filter by ‘mandatory’ attributes first ensures the priority workflow can be followed.

MVP in just 5 months

No off-the-shelf mapping tool anywhere could meet the programme’s needs; in particular, none had the flexibility to let the data standards evolve along the way.

Existing tools were too rigid and would have demanded too many hours of manual tasks and workarounds; the business case favoured building a custom tool.

We expedited the usual timeline of a custom build by working with real users early to understand what really mattered. Our early design prototypes and testing allowed us to fail fast and learn. Total time from early discovery through to design and MVP build (where the mapping feature was available) was just 5 months. All planned major features were live by the 1 year mark.
Data specialist feedback inside portal

Feedback functionality was incorporated into the tool to streamline updates and to create a single source of truth.

“Ghost is by far the easiest partner I work with… Most of the time, partners have a view of how they add value, often by dictating the process, and Ghost have a special ability to give advice and drive, but stay so flexible as well.”

/ Daryl Shing, Chief Architect




A Ghost, Cyma and SECTION6 company

© GC6 2024