SAP HANA for Finance

July 3, 2012 by Gabi Visintin

As an IT consultancy to the finance industry and one of SAP's Special Expertise Partners for banking, Reply Deutschland knows all about the increasing reporting and risk management requirements in the finance sector. Reply consultants Thomas Zachrau and Christian Hadders share their insights and predictions.

SAP.info: Banks are not exactly pioneers when it comes to adopting the latest technology. Are banks showing any interest in SAP HANA or in-memory technology at this relatively early stage?

Thomas Zachrau: Of course! Banks handle millions of customer and transaction records. Anyone who can process them in fractions of a second has a clear competitive advantage. And this is precisely where SAP's in-memory technology comes in. SAP HANA not only speeds up today's reporting and risk management use cases, it also makes possible new processes that until now seemed out of reach. That saves money and opens up new business. We are talking to a lot of IT managers, identifying reliable business cases, and carrying out proofs of concept.

SAP.info: Which scenarios are these?

Christian Hadders: Basically, anywhere big data analyses need to be accelerated. Take calculations for liquidity risk and the sale of loan portfolios, or for banks' capital and liquidity as required by Basel III. Banks have to trawl through millions of data records to get the metrics they need. And that is before you even think about getting a 360-degree view of customer processes or the time-dependency of master data. Until now, system performance has held banks back from doing this. One important aspect is how SAP HANA can be connected to their existing business intelligence or business warehouse infrastructure.
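
For illustration only, here is a minimal sketch of the kind of metric hidden behind such a calculation: a heavily simplified Basel III liquidity coverage ratio (high-quality liquid assets divided by net cash outflows over 30 days) computed on a toy data set. All figures and category names are invented, and a production calculation would apply per-asset haircuts and run-off rates over millions of position and cash-flow records.

```python
# Minimal, illustrative sketch only: a highly simplified Basel III liquidity
# coverage ratio (LCR) on invented figures. Real calculations are far more
# granular and run over millions of records.
import pandas as pd

positions = pd.DataFrame({
    "category":   ["hqla_level1", "hqla_level2a", "expected_outflow", "expected_inflow"],
    "amount_eur": [120_000_000, 40_000_000, 95_000_000, 30_000_000],
})

hqla = positions.loc[positions["category"].str.startswith("hqla"), "amount_eur"].sum()
outflows = positions.loc[positions["category"] == "expected_outflow", "amount_eur"].sum()
inflows = positions.loc[positions["category"] == "expected_inflow", "amount_eur"].sum()

# Basel III caps recognised inflows at 75% of gross outflows.
net_outflows = outflows - min(inflows, 0.75 * outflows)
print(f"LCR: {hqla / net_outflows:.0%}")  # supervisors expect >= 100%
```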

SAP.info: Is there an ideal solution architecture?

Thomas Zachrau: Perhaps on paper. In reality, though, different architectures are equally valid – regardless of whether you run SAP HANA underneath an existing SAP NetWeaver Business Warehouse, or side-by-side with a traditional SAP NetWeaver BW for new tasks. It depends on the customer’s actual situation. Technical considerations are secondary. Much more crucial is the extent to which the business intelligence organization and the company’s processes have to be adapted.

SAP.info: Modeling the organization and its processes has always been crucial to getting the most out of data warehouses.

Christian Hadders: Speed alone does not solve any problems. Even these days – when everyone is talking about big data and in-memory – cost effectiveness, quality, and compliance are still important. This is why there has never been a greater need for a business intelligence competency center, BICC for short – an independent and strategic organizational entity that coordinates all business intelligence. And in operations, the central application management services (AMS) have to be revised to keep costs under control and ensure stability.

Christian Hadders, Consultant at Reply Deutschland

SAP.info: What is the impact of SAP HANA?

Thomas Zachrau: The BICC's basic processes stay more or less the same, but some of the methods, concepts, and technical approaches have to change significantly. Many things go hand in hand and have to be decided for each company on a case-by-case basis. What's particularly interesting is how the data should be modeled. If SAP HANA's in-memory technology lets me keep all the data in main memory, the question arises whether I still need to go to the trouble of creating InfoCubes to stage data. Being able to streamline data modeling and create new reports much faster makes a strong case for agile project methods. This also requires greater knowledge of business processes and analytics: on the one hand to identify suitable business cases for real-time processing, and on the other to make business processes more sophisticated through in-depth forecasting and analysis.
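
To make the InfoCube point concrete, here is a hedged sketch of a report that aggregates the detail table directly at query time instead of relying on a pre-aggregated staging layer. All schema, table, and column names are hypothetical, and the connection uses SAP's hdbcli DB-API driver, which is assumed to be available.

```python
# Hedged sketch only: the report aggregates the line-item table on the fly
# rather than reading from a pre-aggregated InfoCube. Schema, table, and
# column names are hypothetical.
from hdbcli import dbapi  # assumption: SAP's Python DB-API driver is installed

ON_THE_FLY_REPORT = """
    SELECT counterparty_id,
           product_type,
           SUM(exposure_eur)   AS total_exposure,
           SUM(collateral_eur) AS total_collateral
    FROM   "RISK"."LOAN_LINE_ITEMS"      -- detail data, no InfoCube staging
    WHERE  as_of_date = CURRENT_DATE
    GROUP  BY counterparty_id, product_type
"""

conn = dbapi.connect(address="hana-host", port=30015,
                     user="REPORTING", password="***")
try:
    cursor = conn.cursor()
    cursor.execute(ON_THE_FLY_REPORT)
    for counterparty, product, exposure, collateral in cursor.fetchall():
        print(counterparty, product, exposure, collateral)
finally:
    conn.close()
```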

Dr. Thomas Zachrau, Consultant at Reply Deutschland

Having a simpler data model makes the job of the company's application management service team easier, and performance problems occur much less often. Generally, mistakes in the data modeling are also much less noticeable. As for real-time processing, data is loaded at different times of the day, which means you have to pay closer attention to the timing of data loads and report runs. Because there are fewer layers overall, fewer transport and transformation steps are needed, but each data loading step and the analysis of the data become more complex. And you should not fall into the trap of simply letting data build up over time; good housekeeping and data life cycle management are still important. This is not just to ensure that you have better data. It affects total cost as well, because the SAP HANA license model is based on memory size.
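
As a hedged illustration of that housekeeping point, the sketch below periodically moves closed reporting periods out of the hot, in-memory table into an archive table so the main-memory footprint (and with it the memory-based license cost) stays bounded. The object names, the two-year cut-off, and the two-statement approach are assumptions, not a recommended data life cycle design.

```python
# Illustrative housekeeping sketch: archive old periods, then purge them from
# the in-memory table. Object names and the cut-off are invented.
ARCHIVE_CLOSED_PERIODS = """
    INSERT INTO "RISK"."LOAN_LINE_ITEMS_ARCHIVE"
    SELECT * FROM "RISK"."LOAN_LINE_ITEMS"
    WHERE  as_of_date < ADD_YEARS(CURRENT_DATE, -2)
"""
PURGE_CLOSED_PERIODS = """
    DELETE FROM "RISK"."LOAN_LINE_ITEMS"
    WHERE  as_of_date < ADD_YEARS(CURRENT_DATE, -2)
"""

def run_housekeeping(conn):
    """Run the archive step and then the purge via a DB-API connection."""
    cursor = conn.cursor()
    cursor.execute(ARCHIVE_CLOSED_PERIODS)
    cursor.execute(PURGE_CLOSED_PERIODS)
    conn.commit()
```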

SAP.info: When do you expect to have the first reliable results from live projects?

Christian Hadders: We will be able to present the first results at the end of the year. Our proof-of-concept projects already promise to deliver impressive results.
