In a recent Q&A with SAP, “The Value of Data and Analytics in Digital Transformation,” Dan Vesset at IDC makes an interesting observation. Deep into the exchange, he points out that today, “there is significantly greater acceptance by IT that it shouldn’t control all things analytics.”

The reasons why are intriguing. IT control, of course, is a long-standing issue when it comes to enterprise technology in general – and models have swung from one extreme to the other. In the interest of security, some IT groups wield considerable control over who uses what technology. In the interest of flexibility, others are quite open – allowing almost anyone to access anything with minimal controls.

Ideally, you want to find the sweet spot that balances these two extremes so that your company can minimize risk on the one hand while enabling the flexibility to innovate and serve customers more effectively on the other. This, I think, is fairly obvious. But the dichotomy between security and flexibility is not really what Dan has in mind.

Complexity and Control

What he has in mind is complexity. Many IT organizations exert control over enterprise technology simply because it is complex. Let’s call it “control by necessity.” A classic case is analytics, which until recently was rarely more than a step away from the complexities of data management: the tools were complex because using them demanded a fair amount of data management expertise.

In the past, analytics meant data warehouses stored on disk, where experts ran batch jobs on data subsets and generated reports that were then delivered to the business. Dashboards were a nice advance, but IT had to build them, and they quickly lost relevance. IT controlled analytics because it had to.

The New Face of Analytics

Today, things have shifted. Businesses are using new technologies in new ways to bring analytics directly to the consumer. Dan runs through a few examples:

  • Advances in user interface design: Today’s state-of-the-art UIs take a cue from the mobile world, where data can be accessed and manipulated with touch-screen simplicity. Not only are interfaces increasingly designed with the user in mind; they’re designed by the user to meet individual needs and preferences.
  • In-memory databases: Rather than storing transactional data and analytical data in separate silos, companies can now store both kinds of data in active memory, where it is easier to work with (see the sketch after this list). This is helping companies sense and respond to developments faster and more effectively.
  • Cloud computing: Modern analytics, Dan says, is architected for the cloud. New data platforms that run in the cloud – or at least deliver analytics for consumption via the cloud – are helping companies meet business users’ demand for data with greater flexibility at lower cost.
  • Machine learning: Companies can now use intelligent algorithms to analyze process data, identify issues, and take action to improve processes – often without human intervention. This makes life even easier for consumers of analytics, who can spend more time on higher-value tasks.
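
To make the in-memory point concrete, here is a minimal sketch in Python using the standard library’s in-memory SQLite database. The table, columns, and figures are invented for illustration, not taken from the article, and real in-memory platforms are far more capable; the idea it illustrates is simply that transactional writes and analytical queries can share one live store, with no batch copy to a separate warehouse in between.

```python
import sqlite3

# Toy example: one in-memory store serves both workloads.
# (Schema and values are hypothetical, for illustration only.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")

# Transactional side: writes land as they happen.
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("EMEA", 120.0), ("EMEA", 75.5), ("APAC", 210.0)],
)

# Analytical side: the same live rows are queried immediately,
# with no extract-transform-load step in between.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"
):
    print(region, total)
```

The GROUP BY here runs against the very rows the inserts just wrote, which is the “sense and respond faster” claim in miniature.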

Bring Analytics to the Masses – but Start Small

Other technologies and trends have surely played a role in simplifying analytics for the business consumer, and we can expect more to emerge over time. But whatever specific technologies a company chooses to adopt, Dan warns against big-bang digital transformation projects implemented enterprise-wide for ill-defined reasons.

Most companies are better off with targeted projects that fulfill defined needs or answer specific questions. Fortunately, today’s leading-edge cloud analytics platforms are designed for rapid expansion. The best approach is to get comfortable first, expand as needed, and then evaluate whether cloud analytics could help enterprise-wide.

For more on the issues covered in Dan’s interview, get the full text here. It’s worth a read.

This story originally appeared on The Digitalist.