How a Kinetic Infrastructure Facilitates the Use of Emerging Technologies

The promise of a great customer experience is no longer just for consumers. In our conversations with hundreds of thousands of Dell customers, we find that business leaders increasingly expect to get precisely what they need from technology vendors too.

Our internal research supports this finding. By a ratio of four to one, our customers believe that a modular IT architecture is the right way to get what they need. And we agree. Modular hardware brings compute, storage, and networking together seamlessly, so IT experts can precisely provision technology resources to meet the company’s needs.

SAP has launched its new thought leadership journal, Horizons by SAP, which brings together global tech leaders from various companies to share their perspectives on the future of IT. In the coming weeks, one article from the journal will appear on the SAP News Center each week. Here, Ravi Pendekanti, senior vice president of Server Product Management and Product Marketing at Dell, explains how kinetic infrastructures help reduce purchase costs.

Modularity helps enable something that we call a kinetic infrastructure. Like kinetic energy, which can be transferred between objects and transformed into other kinds of energy, kinetic infrastructures enable companies to assign the right resources to the right workload. They can also change resources dynamically as business needs evolve.

Think of your data center: Whatever its needs or workloads are today, they will be different 12 months from now. As requirements change, a kinetic infrastructure allows you to add or subtract necessary technology elements to and from your IT landscape. It scales seamlessly to match the current workloads, data center requirements, and storage needs.

How does this approach differ from traditional IT approaches? With legacy solutions, companies typically buy compute power for peak performance, which only occurs occasionally. They start off with a fixed amount of storage, a certain number of CPUs or compute power, and an associated amount of memory.

When demand rises and resources are insufficient, the organization buys more technology – typically added as a bundle of compute, storage, and memory. If you increase memory, you probably have to buy storage and CPUs too.

In contrast, a kinetic infrastructure begins with exactly what the business needs now. As needs change, the IT team adds more resources on the fly. If the company needs more memory, IT adds only memory. There is no need to procure additional CPUs and storage.

As a result, kinetic infrastructures reduce the purchase costs of unnecessary technology, and they lower the cost of administration and maintenance. By acquiring and managing only what they need, companies also slash their total cost of ownership compared with legacy technologies.
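The cost argument above can be made concrete with a little arithmetic. The sketch below uses purely illustrative prices (the figures are assumptions, not vendor list prices) to compare a legacy bundled expansion with a modular, memory-only one:

```python
# Illustrative comparison of bundled vs. modular expansion costs.
# All prices are hypothetical placeholders, not vendor list prices.
BUNDLE_PRICE = {"cpu": 2000, "storage": 1500, "memory": 800}  # per expansion unit

def bundled_expansion_cost(units_needed: int) -> int:
    """Legacy approach: every expansion adds CPU, storage, and memory together."""
    return units_needed * sum(BUNDLE_PRICE.values())

def modular_expansion_cost(needs: dict[str, int]) -> int:
    """Kinetic approach: pay only for the resource types the workload requires."""
    return sum(BUNDLE_PRICE[resource] * qty for resource, qty in needs.items())

# A workload that only needs two extra memory modules:
legacy = bundled_expansion_cost(2)              # forced to buy CPU and storage too
kinetic = modular_expansion_cost({"memory": 2}) # memory alone
print(legacy, kinetic)  # 8600 1600
```

Even with these made-up numbers, the shape of the saving is clear: the bundled model charges for two resource types the workload never asked for.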

Now Available: Horizons by SAP

Horizons by SAP is a future-focused IT journal. Thought leaders from the global tech ecosystem share their thinking about how new technologies and major business trends will impact our customers’ landscapes in the fast-arriving future. The first issue revolves around the implications and opportunities of modular IT.

Emerging Technologies on Modular Hardware

This hardware flexibility is especially valuable with data-intensive emerging technologies, such as artificial intelligence (AI), machine learning, and the Internet of Things (IoT). Let’s consider an example. When IT professionals think of deploying a machine learning application, they usually assume that they need many graphics processing units (GPUs). In a kinetic infrastructure with modular hardware, that’s not necessarily true.

Depending on your company’s requirements, you can perform machine learning with your existing processing capabilities. You can, for example, use field-programmable gate arrays, which consume less power and are more affordable but require programming skills. Or you can buy more GPUs, although they cost more and require more power.

There is no one correct approach to a machine learning deployment. When using modular hardware, you can make the best choice based on your technology needs, workloads, and budget.
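The trade-off described above can be reduced to a simple selection rule. The thresholds and dollar figures in this sketch are assumptions for illustration only, not a Dell sizing guideline:

```python
def choose_ml_hardware(power_budget_watts: int, has_fpga_skills: bool,
                       budget_usd: int) -> str:
    """Pick one of the three accelerator strategies discussed in the text:
    existing CPUs, FPGAs, or additional GPUs. Thresholds are illustrative."""
    if budget_usd < 5_000:
        # Small budget: start with the processing capability already in place.
        return "existing CPUs"
    if power_budget_watts < 300 and has_fpga_skills:
        # FPGAs draw less power and cost less, but need programming expertise.
        return "FPGAs"
    # Otherwise accept the higher cost and power draw of GPUs.
    return "additional GPUs"

print(choose_ml_hardware(250, True, 20_000))  # FPGAs
```

A real sizing decision would weigh many more factors (model size, latency targets, existing fleet), but the structure is the same: needs, workloads, and budget drive the choice, not a default assumption that GPUs are required.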

In fact, kinetic infrastructure is so flexible that hardware concerns generally take a back seat to more fundamental issues, such as collecting the right data and choosing the right framework (for example, TensorFlow from Google or Microsoft Cognitive Toolkit). Spending time on these issues will help ensure that you gain maximum value from your machine learning, AI, or analytics application.

IoT Data Collection

Another critical emerging technology catalyzed by modularity is IoT. Experts estimate that there will be billions of devices in the world in the next few years. The devices alone have little value. But what about the data they produce? That’s where companies can really develop market-changing innovation.

For an IoT use case, organizations must collect and synthesize the data and analyze it at the data center. Some of the applications for this are impressive. I’ve visited some smart buildings with about 30,000 endpoints that collect all kinds of data on the facility environment. Based on analytics, businesses can lower the air-conditioning temperature in a crowded conference room or turn off the lighting after the workday ends.

With so many endpoints at the network edge, however, companies need to deploy the right servers in the right places. Not all devices are equal. Purpose-built servers can help capture and recognize the data that’s being processed. At the edge, your best choice might be a low-powered, one-socket server.

In the data center, a more modular approach will allow you to scale analytical capabilities to meet the changing needs of the business. In this case, a four-socket server in the data center might be the right option.

Companies also need a consistent, seamless way to manage all of these devices. Comprehensive solutions, such as our Dell EMC OpenManage Enterprise, manage everything from the edge to the data center. Based on open RESTful APIs, the tool plugs into all popular management frameworks. IT professionals can use it to manage complex deployments from the edge to the core to the cloud.
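As a sketch of what edge-to-core management through a RESTful API can look like, the snippet below parses a device-inventory response and flags unhealthy endpoints. The JSON shape and field names are hypothetical stand-ins for illustration, not the actual OpenManage Enterprise schema:

```python
import json

# Hypothetical inventory payload, loosely modeled on a RESTful
# management API response; field names are illustrative only.
SAMPLE_RESPONSE = """
{
  "value": [
    {"name": "edge-node-01", "location": "edge", "health": "OK"},
    {"name": "edge-node-02", "location": "edge", "health": "Warning"},
    {"name": "core-srv-01", "location": "data center", "health": "OK"}
  ]
}
"""

def unhealthy_devices(payload: str) -> list[str]:
    """Return the names of devices whose reported health is not 'OK'."""
    devices = json.loads(payload)["value"]
    return [d["name"] for d in devices if d["health"] != "OK"]

print(unhealthy_devices(SAMPLE_RESPONSE))  # ['edge-node-02']
```

The value of an open REST interface is precisely that the same few lines work whether the device sits at the edge, in the core, or in the cloud.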

There’s another wrinkle to this, though. With applications like IoT, some companies are thinking about processing data at the edge instead of in the data center. Think about cases such as a connected sports stadium, an oil rig, or a factory floor. In these environments, it might not make sense to pump data back to the core for processing. At Dell, we’re working on new innovations to address this issue.

More Proactive, Predictive Capabilities Ahead

Looking ahead, I expect we will soon see fully kinetic infrastructures in which companies can use exactly the resources they require. A company that needs more processing power for machine learning will be able to plug in additional GPUs without the added payload of more CPUs, memory, or hard drives.

Today, most of the server deployments we see are very reactive. Something breaks, and companies deploy new servers to fix a problem. Over time, I expect to see IT become more proactive as predictive analytics becomes a key component of the kinetic infrastructure.

Modular IT will support this evolution. IT professionals will be able to use machine learning and deep learning to identify when the IT infrastructure needs more CPU resources or more memory, for example. It will also provide a tight feedback loop that measures performance and recommends resource changes based on the specific workloads running. With that insight, IT leaders can deploy additional resources or correct problems before something breaks.
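In its simplest form, a feedback loop like the one described compares recent utilization against thresholds and recommends a change for only the resource that is saturating. The threshold and measurement window below are assumptions for illustration:

```python
from statistics import mean

def recommend_resources(cpu_util: list[float], mem_util: list[float],
                        threshold: float = 0.85) -> list[str]:
    """Naive predictive loop: if average utilization over the measurement
    window exceeds the threshold, recommend adding only that resource."""
    recommendations = []
    if mean(cpu_util) > threshold:
        recommendations.append("add CPU")
    if mean(mem_util) > threshold:
        recommendations.append("add memory")
    return recommendations or ["no change"]

# Memory is saturating while CPU is fine: recommend memory only.
print(recommend_resources([0.55, 0.60, 0.58], [0.90, 0.93, 0.95]))  # ['add memory']
```

A production system would forecast utilization rather than average it, but the principle is the same: act on the measurement before something breaks, and add only what the workload needs.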

Our customers continue to demonstrate demand for a more dynamic environment that delivers business value through technology – in a faster, more differentiated way. As the number of endpoints grows, there will be greater volumes of data to analyze and more technology to manage.

To deal with these challenges, we depend more than ever before on open, collaborative partnerships, such as those Dell has with SAP and Intel. Being able to co-innovate openly and honestly with our partners through these established relationships will be especially important as the number of geographically dispersed devices grows and IT architectures become more complex.

Complexity is a huge factor in each of these deployments. But modular hardware solutions that create a single interface, streamline technology deployment, and simplify management are an essential foundation for creating an excellent customer experience.

Ravi Pendekanti is senior vice president of Server Product Management and Product Marketing at Dell.

This article also appeared on LinkedIn.