Data Migration
Data Migration is the key to upgrading applications
A successful data migration is usually delivered by an organisation that has done one before. Even so, no two migrations are the same, even when the existing and target applications come from the same vendor!
Unfortunately, in Australia the selected vendor and their chosen systems integrator (SI) often downplay the detail and the difficulties that arise when migrating data from the old application to the new one. They focus instead on functional comparisons and the benefits of the new cloud software, and present the data migration as a foregone conclusion, when in fact it usually becomes the most difficult part of the move. Thinking about it from the following perspective should help you understand the complexities of the data transition.
When you were deciding on the new software, you would have reviewed how well the proposed vendor’s product satisfied your functional requirements and looked for significant performance improvements over existing functions. These improvements, and the differences in how the proposed software performs your business processes, are what cause the data migration challenges. In essence, both the old and the new software have an underlying data model. The two models will be different (most often very different), and those differences create the significant migration challenges.
These differences flow through to the data model configuration during the migration. If they are not discovered and resolved before the existing data is migrated, they will cause failures at testing, extend the time to finish, and increase the spend on the project. How much depends on the data and on the differences in business rules; historically, in most instances, the impact has been significant!
For an enterprise implementation, this is because Financial and ERP applications carry a degree of functional and configuration change unique to every organisation. These business rules create significant translation challenges when the data is moved from the old application to the new cloud-based one.
Notwithstanding the data model differences, several items are common to every migration and should always be adopted.
Some of these items are:
– Define your architecture first
– Agree on the number of checkpoints for data validation
– Discover all business rules logic that might affect the migration
– Define key mappings for the data (see the sketch after this list)
– Resolve all data transformations and translations from the old to the new data model
– Test data often
– Correct issues quickly
– Decide whether the final migration will follow a big-bang or a gradual approach.
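As a simple illustration of the key-mapping and transformation work above, the sketch below translates a legacy chart-of-accounts code into a hypothetical segmented format used by the new application. The mapping values, field names and format are assumptions for illustration only; the real rules come out of business-rule discovery with the organisation.

```python
# Minimal sketch of a key-mapping and transformation rule, assuming a legacy
# chart-of-accounts code must be translated into the new application's
# segmented account format. All codes and formats are illustrative only.

# Hypothetical mapping table agreed with the business during rule discovery.
ACCOUNT_MAP = {
    "1000": "01-1000-000",   # legacy cash account -> new segmented account
    "2000": "01-2000-000",   # legacy payables account
    "4000": "02-4000-100",   # legacy revenue account, now split by division
}

def translate_account(legacy_code: str) -> str:
    """Translate a legacy account code, failing loudly on unmapped values."""
    try:
        return ACCOUNT_MAP[legacy_code]
    except KeyError:
        # Unmapped keys are exactly the gaps that should surface at a
        # validation checkpoint, not during the final cut-over.
        raise ValueError(f"No mapping defined for legacy account {legacy_code!r}")

if __name__ == "__main__":
    print(translate_account("1000"))   # 01-1000-000
```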
After the data has been extracted from the source application, it is moved into the Common Data Store (CDS). The transformation tool used to do this depends on the site requirements and budget. Generally an ETL tool is used to load the CDS, although a combination of custom code and native database tools can achieve the same result. The transformation from the application data format into the common data structure is unique to each source format; the delivery of the data into the CDS, however, is common to every data type.
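The sketch below illustrates the general pattern, assuming the CDS is a relational store and the source extract arrives as a CSV file. The table, column and system names are illustrative; in practice a commercial ETL tool would usually take the place of this kind of custom code.

```python
# A minimal extract-transform-load sketch: reshape a legacy CSV extract into a
# common CDS table. Table and column names (cds_customer, CUST_NO, CUST_NAME)
# are assumptions for illustration.
import csv
import sqlite3

def load_customers_into_cds(extract_path: str, cds_path: str = "cds.db") -> None:
    cds = sqlite3.connect(cds_path)
    cds.execute(
        """CREATE TABLE IF NOT EXISTS cds_customer (
               source_system TEXT,
               source_key    TEXT,
               name          TEXT,
               PRIMARY KEY (source_system, source_key)
           )"""
    )
    with open(extract_path, newline="") as f:
        for row in csv.DictReader(f):
            # Source-specific transformation: reshape the legacy extract into
            # the common structure shared by every source format.
            cds.execute(
                "INSERT OR REPLACE INTO cds_customer VALUES (?, ?, ?)",
                ("LEGACY_FIN", row["CUST_NO"].strip(), row["CUST_NAME"].title()),
            )
    cds.commit()
    cds.close()
```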
Common business and validation rules are applied to the data within the CDS. These rules are normally run against a specific object type, regardless of its source. Within the CDS, key values such as policy numbers, product IDs, customer IDs and chart of account numbers are cross-referenced into application-neutral values. Outbound processing then translates these keys into the values required by the target application, where a relationship has been defined. The cross-reference is dynamic, so as further relationships are defined they can be updated within the CDS.
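A minimal sketch of the key-neutralisation idea follows, assuming an in-memory cross-reference keyed by source system and source key. The identifiers and naming scheme are assumptions for illustration.

```python
# Illustrative sketch of neutralising keys inside the CDS: each source key is
# replaced with an application-neutral identifier so that rules can run per
# object type regardless of where the record came from.
import itertools

_neutral_seq = itertools.count(1)

# (source system, source key) -> application-neutral id held in the CDS
xref_in: dict[tuple[str, str], str] = {}

def neutral_id(source_system: str, source_key: str) -> str:
    """Return the neutral id for a source key, minting a new one if unseen."""
    key = (source_system, source_key)
    if key not in xref_in:
        xref_in[key] = f"N{next(_neutral_seq):06d}"
    return xref_in[key]

# The same source key always resolves to the same neutral id, so records from
# repeated extracts line up inside the CDS.
print(neutral_id("LEGACY_FIN", "C0042"))   # N000001
print(neutral_id("LEGACY_FIN", "C0042"))   # N000001 (same key, same id)
```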
This dynamic cross-reference technique comes into its own during the migration to the target system(s). Cross-reference data may be created by the target system itself (for example, when a new data object instance is created and its key is automatically assigned), and the values created during the import process can be fed back into the CDS. This means that if the data later needs to be extracted to a peripheral system or another target, the target values can be included in the data stream as required.
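Continuing the same idea, the self-contained sketch below shows that feedback loop: once the target application assigns its own key at import time, the value is written back against the neutral identifier so later extracts can carry it. All system names and key values are illustrative.

```python
# Sketch of the cross-reference feedback loop, assuming the target application
# assigns its own keys at import time.

# (neutral id, target system) -> key assigned by that target
xref_out: dict[tuple[str, str], str] = {}

def record_target_key(neutral: str, target_system: str, target_key: str) -> None:
    """Write the target-assigned key back into the CDS cross-reference."""
    xref_out[(neutral, target_system)] = target_key

def to_target(neutral: str, target_system: str) -> str | None:
    """Translate a neutral id for an outbound extract, if a key is defined."""
    return xref_out.get((neutral, target_system))

# After the cloud ERP creates the customer and returns its new id, the value
# is available to any later extract that needs the target key.
record_target_key("N000001", "CLOUD_ERP", "88801")
assert to_target("N000001", "CLOUD_ERP") == "88801"
```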
Novon (including the Lonispace acquisition) has been migrating data from one application to another for more than 19 years and has completed many successful data migrations in that time: from on-prem back-office financial applications to on-prem supply-chain ERP solutions, and more recently from on-prem Financials/ERP applications to cloud-based ERP/Financial applications.