If the forecasts are to be believed, we are standing at the doorstep of a new era. Thanks to artificial intelligence (AI), machines will soon drive our cars and our orders of cat food or paper towels will arrive at our local helipad, strapped to a tiny drone.
In other words, the future is almost here; we just have to tinker with the AI a bit more to get us over the doorstep.
But the era of AI has already arrived, albeit in ways more subtle than driverless cars and airlifted groceries. Thanks to recent advances in machine learning and deep learning, two subsets of artificial intelligence, AI now fuels everything from digital assistants — think Google Assistant, Alexa, and Siri — to customer relationship management (CRM) platforms. And demand is only growing. Worldwide, revenue from cognitive and AI platforms will nearly quintuple in the next five years, going from $2 billion in 2017 to $9.5 billion by 2022, according to International Data Corporation (IDC) forecasts.
AI requires two key ingredients: a colossal amount of clean, easily accessible data and enormous processing power. In the past, both were prohibitively expensive. Thanks in part to the ascendance of the cloud and advances in memory storage (moving from discs to flash), both data storage and data performance have gotten vastly cheaper and faster in recent years.
But the real fuel for the AI explosion has come from an unlikely corner: gamers’ insatiable appetite for ever more realistic video games. To render continually changing environments as a player moves through a virtual landscape requires enormous processing power. And that led to the development of ever more powerful graphical processing units (GPUs) capable of the parallel processing that AI and other advanced computing tasks require.
GPU makers have built their GPUs to handle highly parallel mathematical operations, the ones that generate beautiful graphics on your screen. Those same parallel operations also scale down naturally to simpler mathematics, which means GPUs are well suited to the kind of math you actually find in machine learning.
“The processors are lightning fast,” says Eric Kavanagh, CEO of the Bloor Group. As a result, companies figured out that GPUs are very effective for a host of applications outside the gaming industry.
Analysts point to one GPU producer in particular: “The reason we’re in this new wave of capabilities is because of this incredible growth in compute power that was brought on by NVIDIA’s GPU,” says Patrick Moorhead, president and principal analyst at Moor Insights & Strategy.
Silicon Valley-based NVIDIA has around a 70 percent share of the world market for GPUs, according to some estimates. Its processors are used in everything from supercomputers used by scientists to model weather systems, nuclear explosions, and the early universe to the growing number of AI-powered enterprise applications being developed by big industry players like SAP.
With the combined power of GPUs and massive troves of raw data, today AI is off to the races. It touches every industry imaginable: healthcare, product design, search, retail, manufacturing — the list goes on.
A machine learning algorithm is able to “learn” much the way a human does. With practice and experience — read: data — it gets better and better at a given task over time. In other words, the algorithm can pore through mountains of historical data and recognize patterns and correlations. Machine learning and deep learning are essentially forms of pattern recognition, but done on a scale beyond human capacity.
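The idea that an algorithm improves with data can be made concrete with a minimal sketch, assuming nothing about any particular product: fitting a line to a handful of (x, y) points by gradient descent. The function name and the toy data below are purely illustrative.

```python
# A minimal sketch of "learning from data": fit y = w*x + b by
# gradient descent. With each pass over the data, the estimates of
# w and b get a little better — the algorithm "learns."

def fit_line(points, lr=0.01, epochs=2000):
    """Learn slope w and intercept b that best fit (x, y) pairs."""
    w, b = 0.0, 0.0
    n = len(points)
    for _ in range(epochs):
        # Gradients of the mean squared error with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in points) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in points) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# "Historical data": points that roughly follow y = 2x + 1
data = [(0, 1.0), (1, 3.1), (2, 4.9), (3, 7.2), (4, 9.0)]
w, b = fit_line(data)
print(round(w, 1), round(b, 1))  # close to 2.0 and 1.0
```

Real machine learning and deep learning systems apply this same iterative pattern-fitting idea to millions of parameters and variables at once, which is exactly the parallel arithmetic GPUs accelerate.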
SAP has access to a lot of the data required to properly train machine learning and AI systems. And SAP can do it “blindly” — without the need to know what the data actually contains.
“There’s a lot of value for this stuff in marketing,” says Kavanagh. Machine learning algorithms can look at a huge number of variables: When did a customer decide not to buy? What was the weather like that day? What time of day do they usually answer the phone? They can then deliver prompts to a sales or marketing team based on those patterns. And as an algorithm gets more and more data, it gets better and better. It “learns.”
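The kind of pattern a marketing system might surface from those variables can be sketched in a few lines. Everything here is hypothetical: the field names and the call records are invented for illustration, not drawn from any real CRM.

```python
# Toy sketch: from a log of (hour_of_day, answered?) records,
# find the hour at which customers most often answer the phone.
from collections import Counter

# Hypothetical historical call records
call_log = [
    (9, False), (10, True), (11, False),
    (14, True), (14, True), (14, False), (16, True),
]

# Tally answered calls by hour
answered_by_hour = Counter(hour for hour, answered in call_log if answered)
best_hour, hits = answered_by_hour.most_common(1)[0]
print(best_hour)  # 14 — the hour with the most answered calls
```

A production system would weigh far more variables at once, but the principle is the same: count, correlate, and turn the strongest patterns into prompts for the sales team.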
“By and large, machine learning today is very useful for tackling very tedious tasks,” Kavanagh says. In some cases, those tedious tasks were previously done by humans. Automation and AI are vastly increasing efficiency in some arenas, but they can also put people out of a job.
“I think it’s going to be important for us to think really hard about the re-skilling of society,” says Moorhead. “I don’t think we did a very good job, particularly in western nations, in the post-industrial revolution.”
But, Moorhead says, jobs will also be created. AI tackles some boring work that humans never dreamed of attempting — the data sets and number of variables are just too large and complex.
So machine learning is also delivering new information and correlations to the world. And that can spawn new kinds of jobs.
Andrea Mustain is a New York-based freelance writer for SAP.