Uniform master data is a competitive factor that should not be underestimated, but is often not fully exploited, because many companies store and maintain their master data in several locations and a number of different applications. This leads to redundancy and discrepancies that hold up business processes, reduce the quality of customer service, and make reporting more difficult. This not only generates costs, but also considerably reduces a company’s competitiveness in other areas – for example if misleading results of analyses lead to the wrong business decisions.
SAP Master Data Management (SAP MDM) provides support in this area. This component of SAP NetWeaver enables master data to be stored, extended, and consolidated across systems and locations, and to be distributed among all applications within the IT landscape. The solution can search for master data across all applications and merge it on the basis of a unique identification number. In addition, SAP MDM offers the infrastructure required for central maintenance and distribution (One Stop Data Maintenance). For this, SAP MDM makes use of the integration solution SAP Exchange Infrastructure (SAP XI), which provides the technical extractors for SAP applications as well as for applications from third-party providers. SAP MDM also contains portal-based input screens and predefined workflows, which can be used to map the maintenance processes. Existing solutions can be integrated and enhanced within this open infrastructure.
The three steps in a master data strategy
To ensure successful master data management in a heterogeneous IT environment, it is not enough to simply integrate the applications. Master data must be traceable and supported in all business processes across the whole of the system. For this reason, the SAP MDM approach employs three main steps for a company-wide master data strategy.
The first step involves cleaning up the master data. In this process, the master data objects across the whole system must be identified and all information available for these objects must be merged. The “business partner” object, for example, would be described with attributes such as company name, address, telephone number, fax number, Internet address, and sales tax ID. One of these objects – for example the vendor “ABC Parts Inc.” – may have different values and be stored differently in different applications: in system A with the number 123 and in system B with the number 456, for example. As a result, identical or similar master data objects must be assigned to each other using qualifying object type attributes that are valid across the whole company – that is to say, those field entries that are maintained in all or at least most of the applications. For the “business partner” object, this could be the company name, address, and telephone number. In this way, master data can be consolidated without the need to adjust the original systems.
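The matching step described above can be sketched in code. The following is a minimal, hypothetical illustration (the record fields, normalization rules, and system names are assumptions, not SAP MDM APIs): records from two systems keep their local numbers, but are grouped under one cross-system key built from the qualifying attributes.

```python
# Hypothetical sketch: consolidating "business partner" records from two
# systems by matching on company-wide qualifying attributes, without
# changing the numbers the records carry in their source systems.

from dataclasses import dataclass

@dataclass(frozen=True)
class PartnerRecord:
    system: str    # source application, e.g. "A" or "B"
    local_id: str  # the number this record has in that system
    name: str
    city: str
    phone: str

def match_key(rec: PartnerRecord) -> tuple:
    # Qualifying attributes that are maintained in (almost) all
    # applications, normalized so formatting differences do not
    # prevent a match; the last 10 phone digits ignore country codes.
    return (rec.name.lower().strip(),
            rec.city.lower().strip(),
            "".join(ch for ch in rec.phone if ch.isdigit())[-10:])

def consolidate(records):
    # Group records from all systems under one cross-system key.
    groups = {}
    for rec in records:
        groups.setdefault(match_key(rec), []).append(rec)
    return groups

records = [
    PartnerRecord("A", "123", "ABC Parts Inc.", "Chicago", "+1 312 555-0100"),
    PartnerRecord("B", "456", "abc parts inc.", "Chicago", "(312) 555 0100"),
]
for recs in consolidate(records).values():
    # Both local numbers now point to one logical business partner.
    print({r.system: r.local_id for r in recs})
```

The point of the sketch is that consolidation happens in the grouping layer; neither system A nor system B has to change how it stores the vendor.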
Clean-up, harmonization, and central management
However, this alone is not sufficient for effective master data management. In globally distributed IT landscapes, in particular, uniform maintenance and distribution are essential in order to harmonize the master data over the long term. In the second step, global attributes – for example those already used in the clean-up – are therefore used to guarantee that all applications receive the same data. Users have the opportunity to add further attribute values to the distributed objects in their own solutions. This is important if not all the attributes are maintained centrally. While general product attributes such as the short description, long text, price, and color may be maintained centrally and distributed to the local applications as the basic data, attributes relevant for logistics – such as the package size or weight per unit of measure – are maintained in the logistics application.
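The split between centrally and locally maintained attributes can be made concrete with a small sketch. This is a hypothetical illustration of the principle, not SAP MDM code; the attribute names follow the product example above, and the set of global attributes is an assumption.

```python
# Hypothetical sketch: a local logistics application receives centrally
# maintained basic data and merges it with its own locally maintained
# attributes, which the central update must not overwrite.

# Global attributes maintained centrally and distributed to all applications.
CENTRAL_ATTRS = {"short_description", "long_text", "price", "color"}

def apply_central_update(local_record: dict, central_record: dict) -> dict:
    merged = dict(local_record)
    for attr, value in central_record.items():
        if attr in CENTRAL_ATTRS:  # only global attributes are overwritten
            merged[attr] = value
    return merged

# Locally maintained logistics attributes stay untouched.
local = {"short_description": "old text",
         "package_size": "12 per box",
         "weight_per_unit": 0.4}
central = {"short_description": "Widget, blue", "price": 9.90, "color": "blue"}
print(apply_central_update(local, central))
```

The design choice shown here mirrors the text: the central system owns the basic data, while attributes such as package size remain under the control of the logistics application.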
In addition to this, objects that logically belong together can be changed and distributed together. For example, if a user wants to work with the master data of a certain product, then the specifications, implementation plans, marketing documents and other objects that belong to this product are grouped together in a package and provided to the particular target application within a single context.
In the third step, it is no longer necessary to maintain the objects in the local applications. Instead, a central master data server, which is responsible for managing the complete object definition including object dependencies, directly distributes the objects to the applications. In this process, active status management updates each step of the distribution: it logs all changes to an attribute and evaluates this information for the connected applications. If an application contains the changed attribute, the attribute is changed there via SAP Exchange Infrastructure. As a result, distribution is controlled and can be tracked at all times.
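The status management described above can be sketched as follows. This is a simplified, hypothetical model (class and method names are invented for illustration): the central server logs every attribute change and forwards it only to those connected applications that actually carry the changed attribute, which is what keeps distribution traceable.

```python
# Hypothetical sketch of active status management on a central master
# data server: each attribute change is logged per target application,
# and only applications that contain the attribute receive it.

class MasterDataServer:
    def __init__(self):
        self.subscriptions = {}  # application name -> attributes it carries
        self.change_log = []     # one entry per distribution step

    def register(self, app: str, attributes: set):
        self.subscriptions[app] = attributes

    def change_attribute(self, object_id: str, attr: str, value):
        targets = [app for app, attrs in self.subscriptions.items()
                   if attr in attrs]
        for app in targets:
            # In SAP MDM the transfer itself would run via SAP XI;
            # here we only record the step so it can be tracked.
            self.change_log.append((object_id, attr, value, app))
        return targets

server = MasterDataServer()
server.register("sales_app", {"price", "short_description"})
server.register("logistics_app", {"package_size", "weight_per_unit"})
print(server.change_attribute("P-100", "price", 9.90))  # only sales_app
```

Because every step is appended to the log, the question "which application received which change, and when" can always be answered from the server side.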
Applications determine the required data
SAP MDM offers a series of functions and services for correctly harmonizing master data in a distributed system landscape:
- The grouping by business context determines which objects belong together from various business perspectives. Here, the time and scope of distribution must be defined to ensure better control. As a result, one particular distribution scenario defines that all sales applications are supplied with the general product data and the related marketing documents, for example. Another scenario distributes the general data and technical documents to the technical services applications.
- Client-specific data control: If companies want to control data at the local level, the individual applications contain only the required data, and only when it is requested. The applications can differentiate between the parts of a cross-company data object and identify only those parts that they need. Distribution scenarios can also be created for this purpose to separate the relevant attributes from those that are not needed.
- Synchronous duplicate check: At the same time that master data is maintained, special functions search for identical objects. This ensures the quality of data without disrupting time-critical tasks.
- Seamlessly integrated workflows support customers in checking master data for accuracy and redundancy, enhancing the objects in line with the specified requirements and releasing them for distribution.
- Automatic distribution: Master data is distributed in a controlled way by events. The business context of an event determines which target applications the data is transferred to. If, for example, a sales-relevant attribute is changed, a distribution scenario ensures data consistency in all the sales applications in which this attribute is used.
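The event-driven distribution scenarios in the list above can be sketched as a simple routing table. The scenario names, target applications, and object lists below are hypothetical and only illustrate the principle: the business context of a change event determines which target applications receive which objects.

```python
# Hypothetical sketch: distribution scenarios keyed by business context.
# A "sales" event supplies all sales applications with general product
# data and marketing documents; a "service" event supplies the technical
# services application with general data and technical documents.

SCENARIOS = {
    "sales":   {"targets": ["sales_eu", "sales_us"],
                "objects": ["general_data", "marketing_documents"]},
    "service": {"targets": ["tech_service"],
                "objects": ["general_data", "technical_documents"]},
}

def route_event(context: str):
    scenario = SCENARIOS.get(context)
    if scenario is None:
        return []  # no scenario defined for this context: nothing is sent
    return [(target, scenario["objects"]) for target in scenario["targets"]]

for target, objects in route_event("sales"):
    print(target, objects)
```

A change to a sales-relevant attribute thus fans out to every sales application in the scenario, while contexts without a defined scenario distribute nothing.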
Collaboration with the user departments is required when it comes to structuring the functions and related processes for data maintenance. Among other things, it is necessary to determine who is allowed to maintain, release, and distribute data, and where a duplicate check should be carried out. This makes it clear that master data maintenance is not just a matter for the IT department.
Process templates provide orientation
A central master data server will prove ineffective if employees simply create data records as they have done in the past. Organizational changes are also required. SAP MDM therefore supports the release and enhancement processes with the help of predefined workflows. For example, a buyer working locally searches on the central server for all vendors that the company works with, but cannot find the required data record. In this case, he or she can request a new data record centrally. This object must then be released by a “master data specialist”. This dual control increases the quality of the data. With flexible process templates, SAP Master Data Management provides solution scenarios for a number of problems, which can be adapted to a company’s particular requirements.