In-Memory Computing – It’s Groundhog Day (All Over Again)!

In-memory computing is driving an inflection point in the IT industry quite similar to one we have seen in the past – hence the reference to one of my favorite Bill Murray movies, Groundhog Day. I was recently speaking with a member of my team, David P. Reed, about the implications of “real real-time analytics” on the IT industry, and specifically the need for large amounts of addressable computing memory. He shared a great story about his experiences from his time as chief scientist at both Software Arts (creators of VisiCalc) and Lotus (creators of 1-2-3). I thought I would share his story.

Back in the days of VisiCalc and Lotus 1-2-3, many thought the spreadsheet concept was trivial and not worth a serious person's time. The thinking was that there were "better" analytic tools based on "real databases." But it turned out that the "what-if" analytics and the instant "end-user programmability of large data sets" that spreadsheets provided were actually what made the PC revolution happen.

The whole point is that VisiCalc and then Lotus 1-2-3 could not have been built without fast access to "in-memory" data. The spreadsheet was, single-handedly, the application that drove the need for more DRAM and in effect "sold DRAM" to PC users. Spreadsheets needed all of the data in DRAM in order to make real-time analytics, decision support and "what-if" analysis work.

All of the major industry players at the time (Intel, Microsoft, IBM and Apple) had to react. The need for larger spreadsheets caused Lotus, Intel and Microsoft to work together on the Lotus-Intel-Microsoft Expanded Memory Specification (LIM EMS); expanded memory boards were then sold with sufficient memory for Lotus 1-2-3 customers to build large, complex models. At the same time, Apple increased the memory capacity of the original 128K Macintosh to 512K in part to support Lotus.

The world of business changed. Large financial companies put spreadsheet-style modeling (with its ease of programming) right into the middle of trading processes: real-time trading data was fed into spreadsheets, and output trades were sent back to traders. Lotus also found ways to get large volumes of archival data into PCs, becoming the first data publisher on optical drives and CD-ROMs.

Here at SAP, we are experiencing a story remarkably parallel to those early days. In-memory computing is going to drive great changes in the industry. Businesses are going to need systems with far more addressable memory, holding entire analytic "what-if" databases; and, more importantly, these need not be merely off-line computations, but an integral part of the decision-making process. For the first time, with an integrated software suite and in-memory data, one will be able to ask simple what-if questions and track the overall impact across an entire enterprise in a fraction of a second.

It’s Groundhog Day (all over again)!