4 Years, 10 Trends

August 1, 2012 by Heather McIlvaine

In a rapid-fire 60-minute webinar, Gartner analyst Raymond Paquet presented his list of the top trends in technology that everyone should be keeping an eye on over the next four years. Some of the trends are already well-established, like the adoption of tablets in the enterprise, but others haven’t really taken off yet. What’s the deal with the service desk trend, you ask?

Despite the eclectic look of the list, Paquet feels there is one common denominator driving all ten of these trends: content. More than eight trillion text messages were sent in the past year, and more than 30 billion pieces of content are added to Facebook every month. It is our need to access this (and other) content from anywhere, at any time, and on any device that is creating change in the IT sector.

In this article, we give an overview of all ten trends. You can also skip directly to the ones that pique your interest using the list below.

1. Consumerization of IT and tablets

2. Infinite data center

3. Resource management

4. Mobility

5. Hybrid clouds

6. Fabric data centers

7. Complexity

8. Big data

9. Service desk

10. Virtual and software-defined networks

1. Consumerization of IT and tablets (2012-2014)

It used to be (in a time before some of us can remember) that the computer we had at work was the one we wished we had at home. In many cases today, the reverse is true. Now, cutting-edge technologies are adopted in the consumer market first and in the enterprise second. This process is known as the consumerization of IT. And it extends not only to desktops and notebooks, but also to tablets.

There is a huge drive to adopt tablets in the workplace today. Some of this pressure comes from employees who use a tablet at home and want to experience that same convenience at work. But there is also a big push from executives to adopt tablets for their “cool factor,” says Paquet. He predicts that in the next few years, more and more companies will be implementing tablets for these and other reasons. As this trend becomes the norm, IT will have two main concerns: security and cost.

At the moment, Apple dominates the tablet market, and Paquet questions whether iOS-based devices will meet the security needs of enterprises. In addition, users won't be replacing their desktops with tablets. Rather, employees will simply have more devices: a desktop PC, a notebook, a smartphone, and a tablet. That means additional costs for the company. It also brings up a third complication. Tablets, unlike PCs, are essentially output-only devices. Will organizations have to redesign existing applications, or deploy new ones, better suited for use on tablets?

2. Infinite data center (2013-2015)

As content continues to grow at an exponential rate, we also need more space to store it. Thus, data centers are expanding: not in physical size, but in computing power. Machines are becoming smaller and racks denser, making it possible to increase performance in the existing space. Of course, this means that data centers are also consuming more energy to power and cool the servers. So how can companies grow their data centers without blowing up their electric bill?

The first step will be to take advantage of new cooling technologies, such as in-row and in-rack cooling and liquid cooling. While forced-air cooling (fans) is the norm in most data centers, very dense environments are seeing an increase in liquid-based cooling. This method is 40 percent more efficient than simply forcing air through the systems, but it is expensive to implement. That – and people’s natural aversion to mixing electricity and water – could account for the relatively slow adoption of this technology.
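
To get a feel for what a 40 percent efficiency gain means in practice, here is a minimal back-of-the-envelope sketch in Python. The IT load, cooling overhead, and electricity price are invented figures for illustration; only the 40 percent improvement comes from the webinar.

```python
# Back-of-the-envelope cooling cost comparison. All inputs except the
# 40 percent efficiency gain are assumptions made for illustration.
IT_LOAD_KW = 500          # assumed IT equipment load
AIR_COOLING_RATIO = 0.5   # assumed: 0.5 kWh of cooling per kWh of IT load
PRICE_PER_KWH = 0.10      # assumed electricity price in dollars
HOURS_PER_YEAR = 24 * 365

air_kwh = IT_LOAD_KW * AIR_COOLING_RATIO * HOURS_PER_YEAR
liquid_kwh = air_kwh * (1 - 0.40)  # liquid cooling: 40% more efficient

print(f"Forced-air cooling: ${air_kwh * PRICE_PER_KWH:,.0f} per year")
print(f"Liquid cooling:     ${liquid_kwh * PRICE_PER_KWH:,.0f} per year")
print(f"Annual savings:     ${(air_kwh - liquid_kwh) * PRICE_PER_KWH:,.0f}")
```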

Replacing servers more frequently is another way for companies to increase computing power while reducing energy costs. Most companies replace servers every four years, says Paquet. In the meantime, the newest servers will have upped their computing power without requiring any increase in kilowatts. By replacing servers every two years, for example, companies could get more CPUs for the same amount of power and cooling.
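
The arithmetic behind a shorter refresh cycle is easy to sketch. Assuming, purely for illustration, that new server hardware delivers 1.5 times the computing power per watt every two years, a few lines of Python show how much more capacity a two-year cycle keeps inside the same power envelope than a four-year one:

```python
# Installed compute per watt under different server refresh cycles.
# The 1.5x-every-two-years improvement rate is an assumption for
# illustration, not a figure from the webinar.
GAIN_EVERY_TWO_YEARS = 1.5

def deployed_compute(refresh_years: int, years: int = 8) -> list[float]:
    """Compute per watt of the installed servers, relative to year 0."""
    out = []
    for year in range(years + 1):
        # Servers date from the most recent refresh before this year.
        last_refresh = (year // refresh_years) * refresh_years
        out.append(GAIN_EVERY_TWO_YEARS ** (last_refresh / 2))
    return out

for cycle in (4, 2):
    capacities = ", ".join(f"{c:.2f}" for c in deployed_compute(cycle))
    print(f"{cycle}-year cycle: {capacities}")
```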

3. Resource management (2013-2015)

Fitting in nicely with the trend of the infinite data center, resource management is going to become a critical issue for companies in the coming years. As Paquet points out, data centers can consume 100 times more energy than the offices they support. Whether the motive is to be green or just to save money no longer matters: in the near future, all companies will need to look for ways to cut energy costs and better manage resources.

To do so, organizations should keep two things in mind. First, they should consider the ratio of computing power to energy consumption. It may be worth purchasing a new server if you can get more computing power for the same amount of energy. Second, they should look at the amount of work being done in the data center. Energy efficiency isn't just determined by the amount of energy a data center consumes for power and cooling; it also depends on the output. If a data center is full of servers running at minimum or low capacity, then no amount of liquid cooling will improve efficiency.
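
Paquet's second point can be made concrete with a toy metric: useful work per kilowatt-hour. The per-server idle and peak power figures below are invented, but they show why a few well-utilized servers beat many idle ones:

```python
# Toy efficiency metric: useful work per kWh. The power figures are
# assumptions for illustration; servers draw power even when idle.
IDLE_W, PEAK_W = 200, 400   # assumed per-server draw at 0% and 100% load

def work_per_kwh(n_servers: int, utilization: float) -> float:
    """Relative useful work per kWh for n servers at a given utilization."""
    power_kw = n_servers * (IDLE_W + (PEAK_W - IDLE_W) * utilization) / 1000
    work = n_servers * utilization   # work scales with utilization
    return work / power_kw

# Same total workload, consolidated onto fewer, busier machines:
print(f"100 servers @ 10%: {work_per_kwh(100, 0.10):.2f} work units per kWh")
print(f" 20 servers @ 50%: {work_per_kwh(20, 0.50):.2f} work units per kWh")
```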

4. Mobility (2012-2015)

According to Paquet, we have entered a new era of personal computing – and it's a mobile one. The majority of PCs today are notebooks rather than desktops, and many people also have smartphones and tablets. Users want to be able to share their information and access it across multiple devices. The easiest way to do that lies in the cloud. In the next few years, says Paquet, the cloud will replace the PC as the place where we store our personal content.

The problem with this shift? More data stored in the cloud puts more pressure on our data centers, which are already suffering from information overload (see trends 2 and 3). Further complications will arise when the enterprise takes on the mobility trend. Mobile devices are essentially user-controlled, so IT departments will need to develop bring-your-own-device (BYOD) policies, security concerns or not.

App developers face a separate issue. In the past, you could assume that users would be running an application on Windows, on a desktop or laptop. Now developers need to design applications that run on a variety of devices and operating systems. While HTML5 web apps offer one way to solve that problem, many users still prefer a native app environment.

5. Hybrid clouds (2013-2015)

With the increasing pressure to both expand data centers and run them more efficiently, many people are singing the praises of the hybrid cloud. By 2014, Paquet estimates, 80 percent of enterprise cloud initiatives will be for private and hybrid clouds. For many companies, the appeal of the hybrid cloud is that the application runs internally as much as possible; only during peak hours does it scale out to a public cloud.

In theory, that sounds ideal. But the problem with this arrangement, according to Paquet, is that most applications today aren't built to scale out to this degree. Unless companies deploy software that takes advantage of cloud architecture, they won't get the full benefits of this approach. Unfortunately, most companies aren't doing so. Over the next two years, says Paquet, more than 60 percent of enterprise cloud adoptions will simply redeploy current applications unchanged.
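
The basic arrangement is simple to sketch. In the toy Python example below, the capacity figure is hypothetical and the routing function merely stands in for whatever provisioning a real deployment would use; the point is just the overflow logic:

```python
# Minimal sketch of hybrid-cloud "bursting": run internally up to
# capacity, overflow the remainder to a public cloud. All numbers
# and names are hypothetical placeholders.
PRIVATE_CAPACITY = 100  # assumed requests/sec the internal cloud can absorb

def route_load(demand: int) -> tuple[int, int]:
    """Split demand between the private cloud and the public overflow."""
    private = min(demand, PRIVATE_CAPACITY)
    public = demand - private
    return private, public

for hour, demand in [("03:00", 40), ("10:00", 95), ("12:00", 180)]:
    private, public = route_load(demand)
    print(f"{hour}  demand={demand:3d}  private={private:3d}  public={public:3d}")
```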

With the right software, hybrid clouds do offer enterprises an opportunity to shrink their data centers, cutting costs and reducing energy consumption. Still, companies should think carefully about the risks associated with a public cloud environment. What happens when there is a service interruption? How will this affect the application and users? Is there an agreed-upon response time in place?

6. Fabric data center (2012-2016)

Fabric computing is being touted as the foundation of next-generation enterprise IT architectures. According to Paquet, many organizations are already moving to a fabric-based infrastructure. This type of computing involves a set of storage, networking, and processing functions that are linked or "woven together" by high-bandwidth interconnects (bandwidth being the amount of information that can flow through the connection in a given period of time).

Managed appropriately, a fabric-based infrastructure can provide just enough capacity and energy for optimum efficiency in the data center. (See trend 2 to learn why this is a big deal.)

7. Complexity

Many people have heard of Moore's Law, but have you ever heard of Glass' Law? This rule says that for every 25 percent increase in a system's functionality, there is a 100 percent increase in complexity. Code is getting bigger and more complicated. Running Microsoft software on VMware, for example, entails over 100 different performance and capacity settings, resulting in millions of possible combinations of these settings. According to Paquet, there's no push in the industry to make things simpler.
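
The compounding effect is easy to see with a few lines of Python: per Glass' Law, functionality grows by a factor of 1.25 per step while complexity doubles.

```python
# Glass' Law: every 25% increase in functionality doubles complexity.
for step in range(6):
    print(f"after {step} steps: "
          f"functionality x{1.25 ** step:.2f}, complexity x{2 ** step}")
# After five 25% steps, functionality is ~3x but complexity is 32x.
```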

Why is that bad? When you make a system more complex, the cost of running and maintaining it inevitably goes up, and the quality of service goes down. According to Paquet, a reduction in system complexity has an incredible return on investment. In the coming years, companies should focus on making simplicity a priority.


8. Big data (2012-2014)

At the beginning of the webinar, Paquet proposed that content is the driving force behind these ten movements in the tech industry. In a way, though, content is a trend in and of itself. Big data is probably the most talked-about topic in IT right now. Studies reveal that companies making data-driven decisions show productivity gains that other factors cannot explain. That's reason enough for most companies to start running big data analyses. The challenge lies in making the data available for analysis, i.e. storing and managing it.

The amount of data to be stored is growing at a rate of 65 percent annually, and 80 percent of it will be unstructured data, which is fundamentally more difficult to manage. Companies need to find a way to store big data without overloading their data centers (see trends 2 and 3). Since unstructured data usually isn't the mission-critical information that requires high-performance storage, Paquet suggests companies use a lower-cost tier of storage to save money. In addition, most unstructured data is untouched after 90 days. Companies could easily archive this data, which means they wouldn't have to keep replicating (and paying for) information they don't use.
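
Those growth figures compound quickly. The short projection below uses the article's 65 percent annual growth rate and 80 percent unstructured share; the 100 TB starting volume is an invented figure for illustration.

```python
# Storage growth projection. The growth rate and unstructured share
# come from the article; the starting volume is assumed.
GROWTH_RATE = 0.65         # data grows 65% per year
UNSTRUCTURED_SHARE = 0.80  # 80% of the data is unstructured
volume_tb = 100            # assumed starting volume

for year in range(1, 5):
    volume_tb *= 1 + GROWTH_RATE
    unstructured_tb = volume_tb * UNSTRUCTURED_SHARE
    print(f"year {year}: {volume_tb:6.0f} TB total, "
          f"{unstructured_tb:6.0f} TB unstructured")
```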

9. Service desk (2014-2015)

The consumerization of IT is not only bringing tablets to the enterprise (see trend 1), it's also disrupting other areas of the business, such as the service desk. More and more companies in the consumer market are providing customer support online and via social and mobile tools. People are getting used to tweeting at the airline rather than calling when their flight is cancelled. When they want to figure out how to set up a new app on their tablet, they turn to the ultimate crowdsourcing support tool: Google. And with all of it, they have come to expect rapid response times. While real-time support may not be realistic in the enterprise, the service desk will need to adapt to these expectations going forward.

By 2016, Paquet says, service desks in the enterprise will be taking a proactive rather than reactive approach to IT problems. More focus will be placed on training and educating users on their devices to improve productivity. When a problem does arise, users will be able to contact the service desk via their mobile device.

10. Virtual and software-defined networks (2014-2016)

Lagging somewhat behind the arrival of on-demand software in the enterprise, the virtualized network is the missing piece needed for a truly virtual data center. A software-defined network, or virtual network, increases agility in the data center by decoupling the network's control functions from the underlying hardware. You can manage the entire network without being restricted by individual application silos, and direct network traffic more efficiently based on the user, application, geographic load, and other factors.
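
As a rough illustration of what directing traffic "based on the user, application, geographic load, and other factors" might look like, here is a toy policy table in Python. It sketches the idea only; real software-defined networks use controllers and protocols such as OpenFlow, not anything this simple.

```python
# Toy illustration of policy-based traffic steering. Every rule, path
# name, and threshold here is invented for illustration.
POLICIES = [
    {"app": "video",  "max_load": 0.7, "path": "high-bandwidth-link"},
    {"app": "backup", "max_load": 1.0, "path": "bulk-transfer-link"},
]
DEFAULT_PATH = "best-effort-link"

def choose_path(app: str, region_load: float) -> str:
    """Pick a path from the policy table based on app and current load."""
    for rule in POLICIES:
        if rule["app"] == app and region_load <= rule["max_load"]:
            return rule["path"]
    return DEFAULT_PATH

print(choose_path("video", 0.5))   # -> high-bandwidth-link
print(choose_path("video", 0.9))   # -> best-effort-link (link too loaded)
print(choose_path("email", 0.2))   # -> best-effort-link (no matching rule)
```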
