“One Size Doesn’t Fit All”

February 7, 2007 by admin

For almost two decades, enterprise architects assumed that, for optimum performance in core applications such as financial programs, data must be organized in rows of tables. That assumption underpins the design of the relational databases common to organizations throughout the world.
So if the row organization is good for transactional applications, then it must be good for every other type of program, too. Right? Actually, no, according to SAP. It turns out that business-intelligence (BI) analytical applications run much faster when the information is arranged in columns and then compressed to reside in fast random access memory (RAM) rather than on hard drives, another staple of the relational database approach.
“The paradigm over the last 15 to 20 years was that one solution – the hard-disk-centric RDBMS – is right for everyone,” says Vishal Sikka, executive vice president of architecture for SAP. “Now we’re at a point where we know that one size does not fit all. Different types of applications have different requirements for data management.”

A new approach

Enter the in-memory data-management architectures that SAP is developing. This approach differs from traditional relational databases in three ways. First, the column versus row organization, which can boost performance for BI analytics. “Analytical data is better off when it’s stored in columns because you typically look for information, such as regional sales numbers or product attribute data, in one particular column,” says Sikka. For example, while rows might list information relevant for each customer, it’s the column headings for total sales or quarterly profits that provide the larger view of performance important for spotting trends and opportunities.
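The row-versus-column distinction can be sketched in a few lines. This is an illustrative toy only, not SAP's engine; the data and field names are invented:

```python
# Row organization: each record holds every attribute for one customer.
rows = [
    {"customer": "Acme", "region": "West", "sales": 120.0},
    {"customer": "Bolt", "region": "East", "sales": 95.5},
    {"customer": "Cora", "region": "West", "sales": 210.0},
]

# Column organization: each attribute is stored contiguously.
columns = {
    "customer": ["Acme", "Bolt", "Cora"],
    "region":   ["West", "East", "West"],
    "sales":    [120.0, 95.5, 210.0],
}

# An analytical query such as "total sales" touches only one attribute.
# In the row layout, every full record must be visited...
total_from_rows = sum(r["sales"] for r in rows)

# ...while the column layout scans just one contiguous array, which is
# why single-column aggregates run faster column-wise.
total_from_columns = sum(columns["sales"])

assert total_from_rows == total_from_columns == 425.5
```

The transactional case is the mirror image: inserting or reading one whole customer record is a single operation in the row layout but touches every array in the column layout.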
Next, compression algorithms squeeze information to a fifth or less in size so companies can place complete data sets in RAM. Finally, in-memory data management handles both highly organized data, such as statistics, and free-form facts found in text documents and e-mails. Enterprises increasingly need data-management options that address the entire range of formats that hold information.
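Columnar data compresses well because a single column tends to hold many repeated values. A minimal sketch of the idea, using dictionary encoding plus a general-purpose compressor (the column values are made up; SAP's actual compression scheme is not shown):

```python
import zlib

# A hypothetical column of repetitive attribute values, as analytical
# columns often are (many repeated regions, statuses, product codes).
region_column = ["West", "East", "West", "West", "East"] * 20_000

# Dictionary-encode: store each distinct value once, plus small
# integer codes for the occurrences.
distinct = sorted(set(region_column))
code_of = {value: i for i, value in enumerate(distinct)}
codes = bytes(code_of[value] for value in region_column)

raw_size = len("".join(region_column).encode())
packed = zlib.compress(codes)

# The encoded-and-compressed column is a small fraction of the raw
# strings, easily beating the "fifth or less" figure on data this regular.
print(raw_size, len(packed))
```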
These characteristics provide a potent blend of high performance and design flexibility. “Because data is stored in RAM, the access times are orders of magnitude faster than when [the application goes] out to a disk farm” to retrieve information, says Joshua Greenbaum, principal of Enterprise Applications Consulting, Berkeley, Calif. The information is “physically adjacent to the processor, which means a significantly faster environment that’s potentially less expensive. Large arrays of disk farms can be costly,” he adds.
SAP’s commercialization of the in-memory approach within the SAP NetWeaver Business Intelligence Accelerator (SAP NetWeaver BI Accelerator) uses a combination of hardware and software the company developed with Intel, Hewlett-Packard, and IBM (for microprocessors, servers, and storage, respectively). The appliance plugs into the SAP NetWeaver environment.
To take advantage of the in-memory environment, the SAP NetWeaver BI Accelerator indexes data columns retrieved from InfoCubes, the multidimensional data storage containers from the SAP NetWeaver Business Intelligence environment. The Accelerator lets organizations perform queries and analyses on the fly without first pre-aggregating information, a step common to relational database architectures.
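On-the-fly aggregation amounts to scanning the relevant columns at query time instead of consulting a pre-built summary table. A minimal sketch of the idea (column names and values are invented; the InfoCube indexing step itself is not shown):

```python
from collections import defaultdict

# Two columns of a hypothetical fact set.
region = ["West", "East", "West", "East", "West"]
sales  = [120.0, 95.5, 210.0, 40.0, 15.0]

# Aggregate on the fly: one pass over the columns answers
# "sales by region" with no pre-aggregated summary table.
totals = defaultdict(float)
for r, s in zip(region, sales):
    totals[r] += s

print(dict(totals))  # {'West': 345.0, 'East': 135.5}
```

Because RAM scans are so fast, this per-query work stays cheap even at scale, which is what makes skipping the pre-aggregation step practical.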
SAP says 30 companies used initial versions of SAP NetWeaver BI Accelerator, including Novartis, The Coca-Cola Co., and British Petroleum. Coca-Cola reported that the time to launch a query against 60 million records dropped from 30 seconds to three seconds thanks to the accelerator, according to SAP.

Taking advantage of technological developments

In addition to acknowledging that different types of applications have different data-management needs, the in-memory approach also takes advantage of technological developments that have occurred over the last two decades, Sikka says.
Computing power has increased dramatically. For example, for each U.S. dollar companies invest in CPUs today they receive about seven MIPS (millions of instructions per second) of performance, or 143 times more than in 1990. Similarly, RAM, once a scarce commodity, now costs about 5 megabytes per dollar, a 250-fold improvement on 1990’s ratios. Today’s 64-bit operating systems can access almost limitless reserves of memory. Thus, in-memory data management gives companies a high-speed, practical design for boosting certain types of applications.
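Working the quoted ratios backward gives a sense of how far prices have fallen since 1990 (the 2007 figures are from the article; the 1990 values are simply derived from them):

```python
# CPU price/performance: ~7 MIPS per dollar in 2007, 143x the 1990 figure.
mips_per_dollar_2007 = 7.0
mips_per_dollar_1990 = mips_per_dollar_2007 / 143  # ~0.05 MIPS per dollar

# RAM price: ~5 MB per dollar in 2007, a 250-fold improvement.
mb_per_dollar_2007 = 5.0
mb_per_dollar_1990 = mb_per_dollar_2007 / 250      # 0.02 MB per dollar
dollars_per_mb_1990 = 1 / mb_per_dollar_1990       # i.e., about $50 per megabyte

print(round(mips_per_dollar_1990, 3), dollars_per_mb_1990)
```

At roughly $50 per megabyte in 1990, holding a multi-gigabyte data set in RAM was out of the question; at $0.20 per megabyte it becomes routine.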
These advantages come at an important time in the evolution of BI programs. Demand for analytical applications continues to grow as more companies give staff members throughout the enterprise dashboards and near real-time summaries of key performance indicators to boost the efficiency of business processes. Late last year, technology researcher Gartner reported the BI platform market grew 15.5 percent to about $4 billion in license revenues. The market could reach $6.3 billion in 2010, Gartner added. “[Enterprises] are building bigger data warehouses, and they also want faster response times and to do more analytics,” Greenbaum says.

What’s planned for the future?

Because it can accommodate structured and unstructured data, in-memory data management offers a solution for enterprise search applications, which must handle all the various ways organizations store data. Accordingly, SAP’s next application of an in-memory architecture will be its Enterprise Search Appliance. SAP released trial versions of the search appliance at the SAP TechEd ’06 conference, and plans now call for the commercial version to arrive in 2007, Sikka says.
The new appliance will allow users in large organizations to find information from within storehouses of structured and unstructured data. The application will be able to search both SAP and non-SAP applications and will include interfaces into external sources of information. Enterprises will be able to set controls that authorize what information each user can see based on his or her access profile or role in the company. SAP says the new search product will be able to work closely with SAP NetWeaver BI Accelerator, thanks to their common use of in-memory data-management architectures.
Longer-term, SAP is investigating how it can use the in-memory approach with SAP NetWeaver Master Data Management, the component within SAP NetWeaver that consolidates customer, supplier, product, and employee data into single, centralized views.
How widespread the application of in-memory data management eventually becomes is still being determined. Enterprises can continue to use older relational databases to support their transactional systems. But what’s already clear, according to Sikka, is that going forward organizations will have an alternative to the relational database model for a growing number of applications. “Relational databases worked well for transactional applications. But for modern analytics, enterprise searching, and master data management, we no longer consider it the right approach,” says Sikka.

Alan Joch
