The toolsets SAP offers for data migration are often overlooked, with consideration only given to them once the project has already kicked off. This may be partly because many customers are not aware, at a technical level, of the data migration tools on offer and which would best suit their project.

“Often the pre-migration and data analysis activities are seen as critical to the data migration project and focus is placed on this,” states Paul McCormick, executive and lead migration consultant at GlueData.

“Our experience has taught us that it’s just as critical to consider which toolset to use. Choosing the right tool for the job will dramatically reduce the risk of the data migration, speed up the technical element of the data migration build, enable data errors and issues to be identified earlier, and provide the most effective way to automate data transformations.”

Top reasons to focus on SAP migration toolsets:

  1. There are different toolsets available for on-premises versus cloud environments.
  2. The build and execution methodology is largely driven by what the specific toolset can achieve.
  3. Resources and their skill sets: skills in the various toolsets are highly specialised, and you will rarely find consultants with expertise in all the toolsets on offer.
  4. The loading mechanism to be used will influence the choice of toolset.
  5. The volume of data will play a role in determining the toolset – will the volumes be small or large?
  6. The data source will play a role in deciding toolsets – is the data source coming from an SAP system or a non-SAP system?
  7. The current state of the data quality must be considered – will the transformation be light or heavy?

SAP’s most common data migration toolsets

  • SAP Data Services (DS) and SAP Information Steward (IS)
  • SAP Migration Cockpit (SMC)
  • SAP Agile Data Preparation (ADP)
  • SAP HANA EIM Smart Data Integration (SDI) / Cloud Platform SDI
  • SAP HANA EIM Smart Data Quality (SDQ)
  • Legacy Systems Migration Workbench (LSMW) – (no longer supported by SAP on S/4HANA)

SAP Data Services (DS) and Information Steward (IS) are typically used when there are:

  • Substantial data volumes.
  • A large number of data sources and targets.
  • Complex transformation requirements, including de-duplication.
  • A requirement to perform analysis of the data ahead of the migration (we recommend this as best practice).
  • A need for data quality reporting, either as a one-off or as a continuous activity.

Data Services can load data directly into SAP using an IDoc or a BAPI. If an SAP ABAP load program is used as the loading mechanism, DS can output to a .txt file. DS can also load to a HANA staging schema for use by Migration Cockpit.

SAP Information Steward is a tool for data analysis, profiling and dashboard reporting. SAP IS is often used together with SAP DS when the project includes data assessment and quality-related tasks.
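The kind of column-level profiling a tool like SAP Information Steward automates can be illustrated with a small, self-contained sketch. This is not SAP code – the records, field names and metrics below are hypothetical sample data chosen purely to show the idea of fill rates, distinct counts and duplicate detection:

```python
# Minimal illustration of column profiling of the kind a data quality
# tool automates: fill rate, distinct values and duplicate detection.
# All records and field names are hypothetical sample data.
from collections import Counter

records = [
    {"customer_id": "C001", "country": "ZA", "email": "a@x.com"},
    {"customer_id": "C002", "country": "ZA", "email": None},
    {"customer_id": "C003", "country": "",   "email": "a@x.com"},
]

def profile(records, field):
    """Return simple quality metrics for one field across all records."""
    values = [r.get(field) for r in records]
    non_empty = [v for v in values if v not in (None, "")]
    counts = Counter(non_empty)
    return {
        "fill_rate": len(non_empty) / len(values),
        "distinct": len(counts),
        "duplicates": {v: n for v, n in counts.items() if n > 1},
    }

for field in ("customer_id", "country", "email"):
    print(field, profile(records, field))
```

In practice a profiling tool runs rules like these across millions of rows and surfaces the results in dashboards, but the underlying metrics are of this shape.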

SAP Migration Cockpit (SMC) is a relatively new tool released by SAP and embedded within S/4 HANA – both the cloud and on-premises versions. It is positioned for use in the transformation and loading of data into S/4 HANA. SMC takes data held in a predefined data template format, either in spreadsheets or in a staging database, applies value mapping and technical validation to this data, and reports any technical validation issues. It uses standard load APIs developed by SAP.

  • Replaces LSMW in an S/4 HANA target environment.
  • Does not require any separate infrastructure as it runs in the same environment as the data is to be loaded. SAP Transaction LTMC starts up Migration Cockpit.
  • Can load data using .xml templates (File Based Load) for small data volumes.
  • Can load data using a HANA schema (Staging Based Load) for larger data volumes. (If a customer does not have an S/4 HANA Enterprise licence, additional licensing will be required for the HANA DB where the SAP load-ready data can be staged.)
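The value mapping and technical validation steps described above can be sketched in a few lines. This is only an illustration of the concept – the mapping table, field-length limits and sample row are hypothetical, not SAP's actual template definitions or APIs:

```python
# Illustrative sketch of value mapping plus technical validation of the
# kind a migration tool applies to template data before loading.
# VALUE_MAP and MAX_LEN are hypothetical, not real SAP metadata.
VALUE_MAP = {"country": {"South Africa": "ZA", "Germany": "DE"}}
MAX_LEN = {"name": 35, "country": 3}  # hypothetical target field lengths

def transform(row):
    """Apply value mapping, then collect technical validation errors."""
    mapped = dict(row)
    for field, mapping in VALUE_MAP.items():
        if row.get(field) in mapping:
            mapped[field] = mapping[row[field]]
    errors = [
        f"{field}: value '{mapped[field]}' exceeds length {limit}"
        for field, limit in MAX_LEN.items()
        if len(str(mapped.get(field, ""))) > limit
    ]
    return mapped, errors

row = {"name": "ACME Industrial Holdings (Pty) Ltd, Johannesburg",
       "country": "South Africa"}
mapped, errors = transform(row)
print(mapped["country"])  # mapped target code
print(errors)             # validation issues to report back, not load
```

Separating mapping from validation in this way is what allows issues to be reported against the data before any load is attempted.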

“In our view, the use of SAP Migration Cockpit is entirely complementary to the use of SAP DS, as we would typically recommend a separation between the transformation and load steps of ETL. It works well in combination with SAP Data Services, particularly when loading data into a cloud-based S/4 HANA system, but should only be used as a standalone solution when loading low-complexity data with minimal transformation into S/4 HANA. It is also fully compatible with SAP’s ADP, as this uses the same data templates to construct data for loading. Reporting on data loaded, or on errors during load, is weak, and troubleshooting issues using MC is not easy at this point,” continues McCormick.

SAP Agile Data Preparation (ADP) has been developed as a web tool that incorporates simplified versions of transforms from SAP DS to enable profiling, cleansing, de-duplication and data preparation. It provides an extract-and-transform tool to complement the load functionality of SAP MC and works best with limited volumes of data.
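The de-duplication such a data preparation tool performs typically works by normalising candidate match fields into a key and collapsing records that share it. A minimal sketch, with hypothetical sample records and a deliberately simple match rule (real tools use fuzzy matching):

```python
# Key-based de-duplication sketch: normalise name and city, then keep
# the first record per match key. Sample data is hypothetical.
def match_key(record):
    """Build a normalised key from name and city for duplicate matching."""
    name = "".join(ch for ch in record["name"].lower() if ch.isalnum())
    city = record["city"].strip().lower()
    return (name, city)

def deduplicate(records):
    seen, unique = set(), []
    for rec in records:
        key = match_key(rec)
        if key not in seen:       # first occurrence wins
            seen.add(key)
            unique.append(rec)
    return unique

records = [
    {"name": "Acme Ltd.", "city": "Cape Town"},
    {"name": "ACME LTD",  "city": " cape town "},  # same company, messy entry
    {"name": "Beta GmbH", "city": "Berlin"},
]
print(deduplicate(records))  # two unique records remain
```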

“Some might think that SAP IS and SAP ADP compete, but I believe they satisfy different needs based on the specific customer requirement. SAP IS is a fully fledged data quality monitoring and data preparation tool installed on-premises. Where a customer needs a data quality assessment and monitoring tool before a data migration, during a data migration and after go-live, they would invest in SAP IS. Where the data migration is less complex, with smaller data volumes, but the customer still wants a tool that can assist in enhancing data quality before migration, they could look at using something like SAP ADP,” says McCormick.

SAP Smart Data Integration (SDI) and Smart Data Quality (SDQ) are SAP HANA-based tools that allow you to replicate and transform data from (and in some cases to) remote sources into SAP HANA. (Note that these tools apply to SAP HANA generally rather than S/4 HANA specifically.)

  • Do not require any separate infrastructure, as they run in the same HANA environment.
  • Support loading to HANA systems in a public cloud.
  • Being a native HANA product, SDQ is great at processing large volumes of data.
  • Can connect to multiple sources but can only load to HANA as a target.
  • SAP HANA SDI and SAP Cloud Platform SDI deliver transformation and migration/integration capabilities as part of the SAP HANA platform. SAP HANA SDI is primarily for HANA on-premise systems and SAP Cloud Platform SDI is for use with SAP Cloud Platform.
  • Smart Data Quality is often used to report on the quality of data and to provide auto correction/enrichment of that data.

“SAP SDI and SDQ have much in common with SAP DS, both in terms of their user interface and the functionality available. They are often used where big data source systems are in scope and SAP DS is not available. However, the transformation components available are lighter and not as mature as those in SAP DS. The interface uses a web IDE, latency can be an issue, and it is not necessarily the easiest development environment,” states McCormick.

Historically, Legacy Systems Migration Workbench (LSMW) has been one of the most widely used tools for data migration in SAP landscapes. Although it provides no extraction functions, it does have extensive mapping, transformation and loading capabilities and is limited only by what is possible within an ABAP programming environment. However, it must be noted that SAP has stated that LSMW is no longer supported for loading data into an S/4 HANA system and has been superseded by the SAP Migration Cockpit (SAP Note 2287723).

GlueData’s top considerations when selecting an SAP tool for your data migration project. Most importantly, do not focus only on the data migration project itself; also consider future use.

  • Complexity of the source and target landscapes

To what extent do you need to combine data from multiple sources, remove duplicates and provide enrichment to the data to be loaded?

Will the data remain in multiple systems rather than be consolidated into the new environment after go-live and how do you need the data to remain synchronised?

  • Volumes and complexity of data

What volumes of data need to be read from the sources, mapped, transformed and loaded into the new system?

How critical a consideration is data migration performance?

What downtime windows will the tool need to work within?

To what extent does the tool need to accommodate any complex interdependencies between different sets of data?

  • Security and sensitivity requirements

Is there any information that should not be visible to certain people?

To what extent is it necessary to control access to data, documents and reports?

Will any of the data need to be encrypted and if any specific algorithms are to be used, can the tool support these?

  • Reuse of the developments/landscapes after go-live to support data quality governance

To what extent will transformation rules, validations and connections to other systems used during the data migration need to be reused after go-live to ensure the continued quality of your data?

What other forthcoming projects requiring data migration are being considered?

Do you have an ETL tool to deal with post-go-live data quality initiatives or data integration projects?

  • Level of In-house expertise

Does your organisation already have experience/skills in using one or more of the tools?

What additional skills will be needed to support data quality after go-live, or to support multiple ongoing interfaces?

  • Quality of existing data (known or expected quality)

Are there known deficiencies in the current data, or might analysis reveal them?

Will data cleansing be a once-off exercise, or will you continue to cleanse data throughout the migration?

At GlueData, we have developed a complete SAP data migration methodology, which includes a best-practice strategy and architecture as well as multiple accelerators to streamline the data migration and ensure ongoing data quality.

We recently completed a large-scale data migration using a three-tier Data Services landscape. Data was staged in MS SQL Server, and the load mechanisms included IDocs, custom programs and LSMW. The scope of the migration included:

  • 77 ERP Data Objects migrated to S/4HANA.
  • 38 Data Objects migrated from Oracle to HANA for BW consumption.
  • 8 terabytes and over 40 billion rows of data migrated.

We are currently busy with multiple SAP S/4 HANA data migrations in various countries across the world and across multiple industries, including banking, pharmaceuticals, retail and mining.

See our website www.gluedata.com or contact us at info@gluedata.com to find out more.