First things first, in case you missed SAPPHIRE NOW 2017, SAP announced SAP Leonardo as its digital business system that enables you to innovate at scale to redefine your business.

Translation: this is quite the portfolio of solutions to help any business go through a “digital transformation” by taking “systems of record” — all those back-end systems, like ERP or CRM, focused on managing workflow, business processes, and records — and transforming them into “systems of intelligence” that automatically detect, interpret, and act upon piles of data.

So then, where does that data come from and how is it acted upon exactly? Well, the Internet of Things (IoT) is definitely a key contributor. This is because so much data is now being generated by a geographically dispersed set of sensor-enabled “intelligent” devices or assets — such as vehicles, containers, manufacturing machines, tools, and more — being sourced, created, shipped, and monitored all over the world. It’s estimated that 45 percent of IoT-created data will be stored, processed, analyzed, and acted upon close to, or at, the edge of the network. Perhaps paradoxically, while the world has gotten smaller thanks to real-time communication, it has also grown more complex with the massive data sets that result from everything being connected.

The trick is to capture and act upon the signal while not getting lost in all the noise that comes with it. This means real-time data processing really is necessary at the edge of the network, where it’s actually being generated. There’s no sense capturing and transmitting trillions of points of data every day if they’re just to be thrown away tomorrow with no insights derived or actions taken. So how is actionable insight best accomplished with high volumes of geographically distributed data?

Please welcome to the stage: SAP Edge Services, a new solution that expands the capabilities available as part of SAP Leonardo IoT Edge. SAP Edge Services enables business context and intelligence at the edge, close to where data originates. Let’s illustrate this with a scenario: you manage several warehouses that use hundreds of conveyors to optimize material flow for creating goods and fulfilling orders. Centralized access to real-time data could be quite valuable in determining capacity, such as what percentage of conveyor capacity is in use at different times; usage patterns, such as how and when each conveyor is used; and compliance, such as whether each conveyor is completing orders to the precise specifications required.

But what happens if you send all this generated data to the cloud for processing and analysis? Is it economically feasible to transmit? What if connectivity is interrupted? If an incident occurs, such as non-compliance, how long will it take to recognize if that data is treated just like all the other business-as-usual data? You can already guess the answer to all of these questions. But let’s put a finer point on it: “dumb pipe” data transmission, especially in a world where everything is connected, amounts to expensive, slow, and intermittent outcomes.

Now reimagine the scenario with a series of Edge micro-services. Let’s start with the business essentials function, which contextualizes data by integrating it with the back-end systems already in place, enriching existing business processes and triggering actions such as work orders. Next, a persistence service is used to configure what data to store locally and to implement a data aging policy. Then, a streaming service enables data streams to be analyzed against definable conditions with adjustable time windows, identifying patterns that serve as the basis for automated events. For example, certain conditions can initiate notifications to the appropriate parties. More advanced analytics can also be triggered when further investigation is needed: an alert sent to the cloud can be followed by analysis of pre-selected data contextually relevant to that alert. And let’s not forget the possibilities soon to come with the machine learning service, which will aggregate trends and patterns across multiple data feeds and locations to offer even deeper insights.
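To make the streaming idea concrete, here is a minimal sketch of a windowed streaming rule in Python. It is purely illustrative, not the SAP Edge Services API: the `StreamingRule` class, its thresholds, and the conveyor-load numbers are all assumptions. The rule keeps only the readings that fall inside a sliding time window and fires an event when the window average crosses a configured threshold — the kind of definable condition described above.

```python
from collections import deque

class StreamingRule:
    """Hypothetical windowed streaming rule (illustrative only):
    retain readings inside a sliding time window and emit an event
    when the window average exceeds a threshold."""

    def __init__(self, window_seconds, threshold):
        self.window_seconds = window_seconds
        self.threshold = threshold
        self.readings = deque()  # (timestamp, value) pairs

    def ingest(self, timestamp, value):
        self.readings.append((timestamp, value))
        # Age out readings that have fallen outside the time window.
        while timestamp - self.readings[0][0] > self.window_seconds:
            self.readings.popleft()
        avg = sum(v for _, v in self.readings) / len(self.readings)
        if avg > self.threshold:
            return {"event": "threshold_exceeded", "window_avg": avg}
        return None

# Example: conveyor load readings (percent of capacity); alert when the
# 60-second average load exceeds 90 percent.
rule = StreamingRule(window_seconds=60, threshold=90.0)
samples = [(0, 80.0), (20, 95.0), (40, 99.0)]
events = [e for e in (rule.ingest(t, v) for t, v in samples) if e]
```

Only the third reading pushes the window average over the threshold, so a single event is produced at the edge rather than three raw readings being shipped to the cloud.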

This is just one use-case example and a sampling of the Edge services available. The larger point is that by enabling business context and intelligence at the edge, close to where data originates, SAP Edge Services addresses latency, bandwidth, cost considerations, and much more. When it comes to geographically distributed data, it also addresses other requirements, such as ownership, geographic regulations, and privacy. It effectively merges the best of both worlds: the power of the edge is combined with the power of the cloud for near real-time use cases and deterministic performance of business processes.

In short, SAP Edge Services provides the ability to set and distribute policies from the cloud, so that raw data can be enriched with business context at the edge, converted into meaningful insights for decision-making, all while triggering immediate business action without requiring a round-trip to the cloud every time. Going digital just got even more powerful.
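As a rough sketch of that cloud-to-edge policy idea, consider the snippet below. The policy format and the `route` function are hypothetical, invented for illustration, and do not reflect SAP’s actual policy schema: the point is simply that a small, centrally distributed policy can let an edge node decide locally which readings to store, which to escalate, and which to forward, without a round-trip to the cloud per reading.

```python
# Hypothetical edge policy of the kind a cloud console might distribute
# to edge nodes (illustrative only; not SAP's actual policy format).
policy = {
    "alert_threshold": 90.0,          # percent of conveyor capacity
    "forward_only_events": True,      # raw readings stay at the edge
}

def route(reading, policy):
    """Apply the distributed policy to one reading at the edge:
    store locally, raise an immediate alert, and/or forward upstream."""
    actions = {"local_store": True, "alert": False, "forward": False}
    if reading["value"] > policy["alert_threshold"]:
        actions["alert"] = True       # immediate business action at the edge
        actions["forward"] = True     # only the event travels to the cloud
    elif not policy["forward_only_events"]:
        actions["forward"] = True
    return actions

normal = route({"value": 50.0}, policy)   # stored locally, nothing forwarded
incident = route({"value": 95.0}, policy) # alert raised, event forwarded
```

Under this assumed policy, business-as-usual readings never leave the edge, while the non-compliance case triggers an immediate local alert and forwards just the event upstream.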

Learn more about SAP Edge Services here.

Read more news and coverage from the SAP Leonardo Live event.

Bob Caswell is product manager for SAP IoT Smart Connected Business