In-memory computing was the main topic at SAP’s customer conference this past spring. Why the focus on it?
Gunter Liebich: It’s an idea whose time has come. After decades of development, multicore systems and greater amounts of main memory have reached the point where companies – even those with very large installations – can use in-memory databases. Because larger server memory and increased processing power now make it possible to analyze mass quantities of data in real time and instantly draw more comprehensive conclusions, we’re looking at tremendous growth in the information available.
This typically applies to processes that currently take a long time. Here’s an analogy: For a long time, people tried to build chess computers that simulated human thought. The programs became more and more complex, until one day processors had gotten so fast that developers went back to a simple computational approach. Since then, computers just run through all of the possible moves at incredible speed; not even the greatest chess genius can hope to keep up.
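The brute-force idea behind that analogy can be sketched in a few lines. The toy game below (players alternately take one or two stones; whoever takes the last stone wins) is an assumption standing in for chess, which is far too large to enumerate fully here; the point is only the exhaustive-search pattern the interview describes, not any SAP implementation:

```python
def best_outcome(stones: int) -> int:
    """Return +1 if the player to move can force a win, -1 otherwise,
    by exhaustively enumerating every possible move sequence (negamax)."""
    if stones == 0:
        # No move available: the previous player took the last stone and won.
        return -1
    # Try every legal move; an outcome that is a loss for the opponent
    # is a win for us, so we negate and take the maximum.
    return max(-best_outcome(stones - take) for take in (1, 2) if take <= stones)
```

With perfect play the player to move loses exactly when the pile size is a multiple of three, and the search rediscovers this purely by enumeration, with no built-in strategy.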
That’s what it’s like with in-memory computing: Everything is getting faster and more streamlined. To process large amounts of data, developers used to parallelize their applications to avoid bottlenecks. Today, that no longer has to happen at the application level, and systems as a whole are becoming less complex. Plus, memory keeps getting cheaper.
This article appeared in SAP SPECTRUM Issue 3 | 2010.