Every day, companies’ IT departments are confronted with the complexity of SAP landscapes. Users and board members alike expect them to maintain optimal performance without cutting corners, while keeping a lid on operating costs. Ralph Treitz, CEO of benchmarking experts VMS, spoke with SAP.info about how benchmarking helps companies get to grips with the complexity of their SAP landscapes.
SAP.info: Complex and expensive. Complaints about operating SAP software are not unheard of – and certain interested parties like to pick up on them. But is it all true?
Treitz: You mustn’t confuse complex with complicated. Even the simplest of things can be presented in a complicated way. Complexity in application landscapes, by contrast, can’t be avoided entirely.
SAP.info: Why isn’t it possible to “keep it simple, stupid”?
Treitz: The real business world is multifaceted and complex. If a software provider wants to support companies as well as it can, this complexity inevitably finds its way into the IT environment. SAP systems offer a vast range of options for differentiating business processes – which makes them genuine masterpieces in the world of IT. They contain a great deal of logic, creativity, and domain knowledge.
SAP.info: Regardless of this, SAP systems tend to grow once they are used in production operation. Response times get worse, and operating costs spiral out of control.
Treitz: Each modification, new implementation, custom development, or additional user community makes an SAP system more complex and drives up operating costs to some extent. That is simply in the nature of things. Even projects that ultimately have a positive economic impact on business processes take their toll on the IT budget before that impact materializes. It is the IT department’s task to master this complexity.
SAP.info: How should an IT department react if user departments start to complain about applications performing poorly? Is benchmarking a solution?
Treitz: Yes, it is. But benchmarking is only a tool; its purpose is ultimately to give the IT department sensible, targeted recommendations for improvement. That’s why simple peer-group comparisons are of little help. Such procedures can be applied only to very general questions. For example, it’s certainly interesting to find out whether a company spends more or less on IT as a whole than its direct competitors. But as soon as you delve into the details, the individual characteristics of each peer-group member get in the way of generally meaningful statements. Misinterpretation then becomes inevitable.
SAP.info: So, even though a company uses standard software, it’s still impossible to make comparisons?
Treitz: Every SAP landscape is unique – even within an industry. Benchmarking needs to cater to this. If a benchmark only demonstrates that the hardware is 20% more expensive than the industry average, its value is doubtful. Instead of a simple industry comparison, the question we need to ask is more like: How much would other companies spend if they had to provide the mix of IT services that my company provides? The answers in such a DNA-level benchmark can then be used to derive suitable measures to optimize a company’s IT operations and to take a step toward best practices.
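The question Treitz poses – what would comparable organizations spend to deliver *my* mix of IT services? – can be illustrated with a toy model. VMS’s actual approach is a stochastic model whose details are not public; purely as a sketch of the idea, the snippet below fits a plain least-squares cost model over an invented benchmark base and compares the predicted cost for one company’s service mix against its actual spend. All feature names and figures are made up for illustration.

```python
import numpy as np

# Each row: one measured SAP system, described by its service mix
# (invented features, e.g. millions of dialog steps, thousands of batch
# jobs, thousands of custom code objects).
service_mix = np.array([
    [5.0, 1.2, 0.8],
    [3.5, 2.0, 0.3],
    [7.1, 0.9, 1.5],
    [4.2, 1.6, 0.6],
])
annual_cost = np.array([2.4, 1.9, 3.3, 2.1])  # operating cost, million EUR

# Fit a simple linear cost model over the benchmark base (least squares).
X = np.column_stack([service_mix, np.ones(len(service_mix))])
coef, *_ = np.linalg.lstsq(X, annual_cost, rcond=None)

# Predict what others would spend to provide *our* service mix
# (three features plus the intercept term).
my_mix = np.array([6.0, 1.4, 1.0, 1.0])
expected_cost = my_mix @ coef

my_actual_cost = 3.1
print(f"expected: {expected_cost:.2f}M, actual: {my_actual_cost:.2f}M, "
      f"gap: {my_actual_cost - expected_cost:+.2f}M")
```

The gap between actual and model-predicted cost – not the distance from an industry average – is what points to concrete optimization potential.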
SAP.info: How exactly does a DNA-level benchmark – as advocated by VMS – differ from conventional benchmarking approaches?
Treitz: We use a stochastic model to evaluate SAP systems. The modeling tool gives us a clear picture of the current status of an SAP system. Small ABAP programs provide the necessary information – and they can measure an SAP system in great depth, from the millisecond it takes to save entered data through to the annual balance of costs and resources.
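VMS’s measurement programs are small ABAP reports and are not public; purely to illustrate the span of granularity Treitz describes – millisecond-level response-time samples rolled up into annual resource and cost figures – here is a hedged Python sketch. All transaction names, counts, and prices are invented.

```python
from statistics import mean

# Hypothetical measurement samples: (transaction, response time in ms,
# CPU seconds consumed per execution).
samples = [
    ("save_order", 480, 0.12),
    ("save_order", 510, 0.13),
    ("post_invoice", 950, 0.30),
    ("post_invoice", 910, 0.28),
]

# Assumed annual execution counts and an assumed CPU price.
executions_per_year = {"save_order": 1_200_000, "post_invoice": 400_000}
eur_per_cpu_hour = 0.05

# Group the fine-grained samples by transaction.
by_tx = {}
for tx, ms, cpu_s in samples:
    by_tx.setdefault(tx, []).append((ms, cpu_s))

results = {}  # tx -> (average response ms, annual CPU hours)
for tx, vals in by_tx.items():
    avg_ms = mean(v[0] for v in vals)
    annual_cpu_h = mean(v[1] for v in vals) * executions_per_year[tx] / 3600
    results[tx] = (avg_ms, annual_cpu_h)
    print(f"{tx}: avg {avg_ms:.0f} ms, ~{annual_cpu_h:.0f} CPU-h/yr, "
          f"~EUR {annual_cpu_h * eur_per_cpu_hour:.2f}/yr")
```

The point of the sketch is the bridge between the two extremes of the scale: the same raw samples yield both the per-save response time and an annual cost-and-resource balance.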
SAP.info: Can SAP Solution Manager be used for such tasks?
Treitz: SAP Solution Manager is the tool of choice for high-quality daily operations. It offers various ways of accessing technical data and processes to ensure that things run without a hitch. Comparing SAP Solution Manager with DNA-level benchmarking is like comparing a traffic police officer with a traffic planner: for smooth operations at any given moment, a company needs the former; for analyzing the architecture and averting problems, it needs the latter. Essentially, our DNA-level benchmark picks up the fundamental idea of the SAP EarlyWatch Alert service and takes it to a strategic level – the level of costs, benefits, and IT quality.
SAP.info: What conclusions can you draw from comparing the current status with the desired status of an SAP landscape? What measures can be implemented to get fast optimization results?
Treitz: There’s no simple answer to this question. In only a very few cases is the root of all evil simply too little hardware. That’s why we need to look closely at the whole picture. It’s all interwoven – customizing, program processes, data structures, and data flows. For example, getting the customizing of SAP applications right can improve response times considerably – without any additional programming. Shifting certain processes to off-peak hours or overnight can also speed up response times during normal day-to-day operations. But sometimes the problem is an organizational one – for instance, if the IT department finds out too late that a user department or a customer is set to make greater use of self-service functions in the future. Here, action-driven benchmarking helps to identify and evaluate pain points. Particularly in times of tight IT budgets, evaluation is decisive: only those who can clearly assess measures by cost and economic impact can obtain the means necessary to make improvements.
SAP.info: But just having transparent SAP operations doesn’t give us any information about the quality that can actually be achieved. You already pointed to the difficulty of using the peer group approach.
Treitz: That’s right. The VMS benchmark base comprises data on around 2,000 measured SAP systems, with approximately 1,600 years of SAP operations recorded in detail. Performance highs and lows only become visible when we compare against other systems with a similar structure and a similar mix of applications. By broadening our horizons and looking at comparable organizations regardless of industry, we can find reliable answers to questions such as whether decentralized installations at branch offices are better or whether centralization makes more sense, or whether virtualization can bring tangible cost benefits. And by the way, cross-industry benchmarking isn’t new. In the 1990s, Dallas-based Southwest Airlines compared its processes for turning airplanes around (arrival, service, departure) with pit stops in Formula One motor racing to find out how to optimize at a fine-grained level. It was highly successful – and one of the first low-cost airlines was born.