Doing an SAP HANA implementation, whether a Greenfield or Brownfield rollout, is a sizable change in itself. Couple this with all the coordination required to ensure the consistency of data, and you have the perfect recipe for a migraine.
This paracetamol-popping exercise can be avoided, though, with enough preparation, timely communication, and the use of automation to speed up the different stages.
The first and logically most crucial step is identifying local data leads within each of the functional streams, or business units, who can act as points of contact for data migration activities. Historically, this is more of an afterthought: once things get going, the project realizes there is no SPOC (Single Point of Contact) for data migration tasks at the local level. These SPOCs do not necessarily have to be independent roles; they can be performed by key users who are well connected to the rest of the teams. Testers can be ideal candidates due to their functional knowledge. The overall data migration lead cannot be present at every location, and the local data leads can act as his or her feet on the ground for managing some of the activities below.
As any seasoned SAP user will tell you, in 'Business As Usual' mode an SAP system accumulates a lot of irrelevant data due to changes to tax laws, vendor acquisitions or insolvencies, legislative changes, and so on. Nowhere is this additional overhead(ache) more pronounced than during an SAP implementation or rollout. It is key to identify which data must be migrated versus what can be truncated or left behind in legacy. The sooner this exercise is performed, the less likely the project is to encounter issues when performing UAT with the migrated data.
Similar to the activity above, the data in the legacy solution cannot always simply be left behind, for example in the case of carve-outs; the data identified as not required should be cleansed from the SAP system. We dedicate a separate point to this because it requires effort on the legacy system: new programs may be needed to delete the extra data, and this additional effort is often not considered, leading to conflicts between stakeholders over budget and project scope.
Once the data is cleansed and identified as migration-relevant, it must be extracted from the system. This task sounds a lot easier than it is. "Just select the data and run an extraction query, right?" No. Extracting data out of an SAP system means working with the tables and defining detailed queries based on the transformation rules to be applied to the extracted data. The same table must therefore be extracted with different selection criteria and value ranges; anyone who has worked with SAP GUI will agree this is a tedious task. What adds to the tedium is that these extractions must be repeated for each Mock (trial conversion) exercise, as the legacy data changes between runs. As with all repetitive actions, automation tools such as Qualibrate can go a long way in improving the speed and quality of this data preparation activity.
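To make the repetition concrete, one way to avoid re-keying the same selections for every Mock is to keep the criteria in a small configuration and drive the extractions from it. The sketch below is purely illustrative: the table name, fields, and rows are invented, and a real run would read from the legacy system rather than an in-memory stand-in.

```python
# Selection criteria per extraction run: same table, different value ranges.
# Table/field names (KNA1, LAND1, KUNNR) mimic SAP conventions but the data
# is made up for the example.
EXTRACTION_RUNS = [
    {"table": "KNA1", "field": "LAND1", "values": {"NL", "DE"}},  # NL/DE customers
    {"table": "KNA1", "field": "LAND1", "values": {"FR"}},        # FR customers
]

# Stand-in for the legacy table contents (normally fetched via RFC or SQL).
LEGACY_TABLES = {
    "KNA1": [
        {"KUNNR": "1000", "LAND1": "NL"},
        {"KUNNR": "1001", "LAND1": "FR"},
        {"KUNNR": "1002", "LAND1": "DE"},
    ],
}

def extract(run):
    """Return only the rows of the run's table that match its value range."""
    rows = LEGACY_TABLES[run["table"]]
    return [r for r in rows if r[run["field"]] in run["values"]]

if __name__ == "__main__":
    for run in EXTRACTION_RUNS:
        print(sorted(run["values"]), "->", len(extract(run)), "rows")
```

Because the criteria live in one place, re-running all extractions for the next Mock is a loop rather than an afternoon in SAP GUI.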
Rarely can the data extracted from legacy be loaded directly into the To-Be system, due to variations in table structures, technical validation of business rules, and so on. The extracted data is therefore subject to transformation rules based on the table structure of the To-Be system and the tools used for loading. The rules vary from simple value replacements to conditional rules based on values in other tables altogether. They are also heavily impacted by changes to the business requirements and the related CRs (Change Requests). With anything this complex and volatile, there is nothing more valuable than documenting the rules in the minutest detail possible: firstly to avoid misunderstandings, and secondly as a point of reference for validation later in the program.
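One way to document rules "in the minutest detail" is to keep them as structured data rather than prose, so the same file serves both as the reference document and as input to the transformation step, and each rule can be traced back to its CR. The field names, values, and CR numbers below are invented for illustration.

```python
# A machine-readable rule catalogue: one entry per transformation rule.
# "value_map" covers simple replacements; "conditional" covers rules that
# depend on another field. All identifiers here are hypothetical examples.
TRANSFORMATION_RULES = [
    {
        "id": "R-001",
        "target_field": "PAYMENT_TERMS",
        "type": "value_map",                      # simple replacement of values
        "map": {"Z030": "NT30", "Z060": "NT60"},
        "change_request": "CR-0142",              # traceability for later validation
    },
    {
        "id": "R-002",
        "target_field": "SALES_ORG",
        "type": "conditional",                    # depends on another field's value
        "condition_field": "COUNTRY",
        "cases": {"NL": "1000", "DE": "2000"},
        "default": "9999",
        "change_request": "CR-0155",
    },
]
```

When a CR changes a rule, the catalogue entry is updated in one place, and the validation phase can diff catalogue versions instead of hunting through e-mail threads.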
With the rules defined, the extracted data can now be transformed into a format the data loading tools understand. Depending on the complexity of the rules, the transformation can be performed manually in MS Excel or with ETL tools such as IBM DataStage or Informatica PowerCenter, which are the go-to tools. Complex transformation rules range from merging several tables and sorting unstructured data to adding values during the transformation, and it is rarely advisable to perform such transformations manually or by repurposing tools like MS Excel.
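To show why "merge, sort, add" quickly outgrows a spreadsheet, here is a minimal sketch of one such transformation: a customer table merged with its payment-terms table, sorted, and enriched with a derived value. The table and field names are illustrative, not from a real project.

```python
# Extracted customer rows (unsorted) and a lookup table keyed by customer number.
customers = [
    {"KUNNR": "1001", "NAME": "Acme"},
    {"KUNNR": "1000", "NAME": "Globex"},
]
terms = {
    "1000": "NT30",   # payment terms per customer, from a second extracted table
    "1001": "NT60",
}

def transform(customers, terms):
    """Merge two tables, add a derived value, and sort for the load tool."""
    merged = []
    for c in customers:
        row = dict(c)
        row["ZTERM"] = terms.get(c["KUNNR"], "MISSING")  # merge step
        row["MIGRATED"] = "X"                            # value added during transform
        merged.append(row)
    return sorted(merged, key=lambda r: r["KUNNR"])      # sort step

result = transform(customers, terms)
```

Written as code, the transformation is repeatable across Mocks and reviewable line by line, which Excel formulas scattered over a workbook are not.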
Transformed and ready to load, the data has one final checkpoint to pass before it can find its new home in the To-Be system. Even though the data has been transformed by the tools, the complexity of the rules mandates a four-eye check before loading. Performing this check prior to the load is purely about saving time on re-loading: the data typically runs to several hundred thousand rows (if not millions) across a multitude of tables, which takes a substantial amount of time to load. Any issue found before loading saves the effort of a reload whenever a quick fix directly in the To-Be system is not possible. This validation can be manual, or Excel-based macros can be employed to increase throughput when multiple Mocks are to be performed.
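Part of that four-eye check can itself be automated: mechanical rules such as mandatory keys, duplicates, and allowed value ranges can be checked in seconds, leaving the human reviewers to judge the business content. The sketch below assumes invented field names and rules; a real check list would come from the documented transformation rules.

```python
def validate(rows):
    """Return a list of (row_index, problem) tuples; an empty list means load-ready."""
    issues = []
    seen_keys = set()
    for i, row in enumerate(rows):
        if not row.get("KUNNR"):
            issues.append((i, "missing key KUNNR"))
        elif row["KUNNR"] in seen_keys:
            issues.append((i, "duplicate key " + row["KUNNR"]))
        else:
            seen_keys.add(row["KUNNR"])
        if row.get("LAND1") not in {"NL", "DE", "FR"}:   # example allowed range
            issues.append((i, "unknown country code"))
    return issues

rows = [
    {"KUNNR": "1000", "LAND1": "NL"},
    {"KUNNR": "1000", "LAND1": "XX"},   # duplicate key and bad country code
]
problems = validate(rows)
```

Rejecting two bad rows here costs seconds; discovering them after a multi-hour load costs the load.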
After loading the data into the To-Be system, it is crucial to validate the system before the start of UAT to ensure all the data has been loaded as per the requirements and is coherent with the business rules. Tools such as Qualibrate can accelerate UAT, either by guiding users or by executing the tests automatically. Significant time can be saved in validating the system by using automation tools that help ensure the consistency not just of the data but also of the functionality of the SAP system.
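A simple first pass of that post-load validation is a reconciliation: compare record counts (or checksums) per table between the staging extract and the To-Be system before releasing the environment to UAT. The table names and counts below are illustrative.

```python
def reconcile(source_counts, loaded_counts):
    """Return the tables whose loaded count differs from the source count."""
    mismatches = {}
    for table, expected in source_counts.items():
        actual = loaded_counts.get(table, 0)
        if actual != expected:
            mismatches[table] = (expected, actual)
    return mismatches

source = {"KNA1": 125_000, "LFA1": 48_200}   # counts from the staging extract
loaded = {"KNA1": 125_000, "LFA1": 48_150}   # counts queried from the To-Be system
diff = reconcile(source, loaded)
```

A count match does not prove the data is correct, but a mismatch proves something went wrong, which is exactly the kind of cheap signal worth having before users start UAT.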
To summarize, few activities in an SAP project are straightforward, and most must be performed at a scale that can seem daunting. The daunting can be prevented from haunting the project teams by organizing the activities, establishing processes, and leveraging the right tooling for the right phase.