Less like a data warehouse and more like an intelligence fulfillment center, hyperscale environments are reshaping how companies navigate the IT landscape. But do customers know what they are and why there is so much buzz around them?
What is Hyperscale?
The faster businesses must shift, pivot, and respond to change, the more flexible the IT architecture must become. However, an increasing hardware footprint, continuous adoption of new technologies, and growing volumes of Big Data can make on-demand scaling of digital capabilities unsustainable for any single business.
For many businesses, the answer to this growing challenge is the adoption of a hyperscale environment. The design harnesses converged networking and software-based control of traditional hardware resources – all deployed on a foundation of virtual machines. It also seamlessly provisions and adds computing speed, memory, connectivity, and storage capacity to one or more data sources that are part of a broader ecosystem of distributed, grid, or local computing.
Created primarily for a cloud landscape, hyperscale computing is becoming a must-have when building comprehensive IT architectures. It can turn data into intelligent assets the moment it is captured while also scaling and mapping digital capabilities instantly across a distributed storage system.
How is a Hyperscale Environment Supported?
A hyperscale environment is mostly a grid of small, affordable servers, each specialized for a specific task. Its high-availability design is the polar opposite of traditional virtualization, where several virtualized instances of applications run across vast landscapes. Hyperscale computing supports a broader spectrum of usage types, and additional applications can be defined.
Take, for example, two key elements of a hyperscale environment:
Software-defined networking technology enables dynamic, programmatically efficient network configuration and management. By improving network performance and monitoring, software-defined networking enables a cloud-like landscape that addresses the static, decentralized, and complex nature of traditional networks with flexibility and intuitive troubleshooting.
For example, software-defined networking may be used to centralize network intelligence in one network component by disassociating the forwarding of a data unit from the routing process. In this case, the routing process consists of one or more controllers that act as software-defined networking’s brain – where all intelligence is incorporated. However, centralizing network intelligence in this way can negatively impact security, scalability, and elasticity.
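The controller/forwarding split described above can be illustrated with a minimal sketch. Everything here – the class names, the flow-table model, the "send-to-controller" fallback – is invented for illustration and is not the API of any real SDN controller:

```python
# Sketch of SDN's control/forwarding split: switches only apply rules,
# while a centralized controller computes and installs them.

class Switch:
    """Forwarding element: matches traffic against rules pushed by the controller."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # destination -> output port

    def forward(self, dst):
        # The switch holds no routing logic of its own; unknown traffic
        # is punted to the controller for a decision.
        return self.flow_table.get(dst, "send-to-controller")

class Controller:
    """Centralized 'brain': computes routes and installs them on every switch."""
    def __init__(self):
        self.switches = []

    def register(self, switch):
        self.switches.append(switch)

    def install_route(self, dst, out_port):
        # One routing decision, programmatically pushed network-wide.
        for sw in self.switches:
            sw.flow_table[dst] = out_port

ctrl = Controller()
s1, s2 = Switch("s1"), Switch("s2")
ctrl.register(s1)
ctrl.register(s2)

print(s1.forward("10.0.0.5"))   # no rule yet -> "send-to-controller"
ctrl.install_route("10.0.0.5", 3)
print(s1.forward("10.0.0.5"))   # -> 3
print(s2.forward("10.0.0.5"))   # -> 3
```

The sketch also makes the trade-off visible: every switch depends on one controller, which is exactly the centralization that can strain security, scalability, and elasticity.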
Software-defined storage is computer data storage software for policy-based provisioning and data storage management, independent of the underlying hardware. The technology includes a form of storage virtualization to separate storage hardware from the software that manages it. It also enables an environment that can govern policies for features such as data de-duplication, replication, thin provisioning, snapshots, and backup.
Additionally, software-defined storage-related hardware may or may not have abstraction, pooling, or automation software of its own. It may be complemented by software such as:
- Virtual or global file systems when software-defined storage is implemented as software-only, along with commodity servers, with internal disks
- Storage virtualization or resource management if the software is layered over sophisticated large storage arrays
- Intelligent abstraction to automate protection and recovery if the policy and management functions include artificial intelligence
Software-defined storage can also be implemented through appliances over a traditional storage area network (SAN), deployed as network-attached storage (NAS), or applied as object-based storage.
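The policy-based provisioning described above can be sketched in a few lines. The policy names, fields, and pool model below are invented for illustration and do not correspond to any real software-defined storage product's API:

```python
# Sketch of policy-based storage provisioning: policies (replication,
# thin provisioning, snapshots) are applied independently of hardware.

POLICIES = {
    "gold":   {"replicas": 3, "thin_provisioning": True,  "snapshots": True},
    "bronze": {"replicas": 1, "thin_provisioning": False, "snapshots": False},
}

class StoragePool:
    """Abstracts heterogeneous hardware behind a single capacity pool."""
    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb
        self.volumes = []

    def provision(self, name, size_gb, policy_name):
        policy = POLICIES[policy_name]
        # Replication multiplies the raw capacity a volume consumes.
        raw_needed = size_gb * policy["replicas"]
        if raw_needed > self.capacity_gb:
            raise RuntimeError("insufficient capacity")
        self.capacity_gb -= raw_needed
        volume = {"name": name, "size_gb": size_gb, **policy}
        self.volumes.append(volume)
        return volume

pool = StoragePool(capacity_gb=1000)
vol = pool.provision("erp-data", 100, "gold")
print(vol["replicas"])    # 3 copies, dictated by the "gold" policy
print(pool.capacity_gb)   # 700 GB raw capacity remaining
```

The point is the separation of concerns: the caller states a policy, and the software layer decides how the underlying hardware honors it.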
Why is Interest in Hyperscale Providers Growing?
Companies that choose to adopt a hyperscale environment are often confronting several challenges. However, the main question is how a hyperscaler data center differs from one running on premises or through a traditional cloud provider. The answer? It is all about size and scale.
Most businesses tailor their data centers based on current needs and available budget. However, this approach leaves little room for addressing future growth with precision. Plus, there is little visibility into how to scale an environment without degrading the applications running on it. No one knows for sure whether hosted applications will perform better with 1,000 servers or 100,000.
On the other hand, a hyperscaler data center works best when designed for applications that exhibit similar CPU usage, memory consumption, and storage needs. If a hyperscaler data center is integrated tightly with an on-premises data center, for example, the deployment of new systems and environments is often swift and straightforward. Projects are also supported during the earliest stages and can jump-start the move from proof of concept to productive operation.
Systems that need to be quickly spun up and wound down are best placed within a hyperscaler data center. Those systems usually require few services and little attention. Yet, when peak performance is needed, some hyperscale providers allow applications to access additional resources on demand and then scale back down to “normal” performance levels.
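This scale-up-for-peak, scale-down-to-normal behavior can be sketched as a simple threshold rule. The thresholds and server limits below are invented for illustration; real hyperscale providers use far richer signals:

```python
# Toy sketch of threshold-based autoscaling: double capacity at peak
# load, shrink back toward a baseline when demand is quiet.

def autoscale(current_servers, cpu_utilization, min_servers=2, max_servers=100):
    """Return a new server count given average CPU utilization (0.0-1.0)."""
    if cpu_utilization > 0.80 and current_servers < max_servers:
        return min(current_servers * 2, max_servers)   # scale up for peak load
    if cpu_utilization < 0.20 and current_servers > min_servers:
        return max(current_servers // 2, min_servers)  # scale back to "normal"
    return current_servers                             # steady state

print(autoscale(4, 0.95))   # 8  (peak: double capacity)
print(autoscale(8, 0.10))   # 4  (quiet: shrink back)
print(autoscale(4, 0.50))   # 4  (steady state: no change)
```

A business running its own fixed-size data center has no equivalent lever; the elasticity is what makes placement in a hyperscaler data center attractive for bursty systems.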
Hyperscaler scenarios can also help spread access to the hyperscaler data center across many geographic locations. For many businesses that operate in a regulated market or must comply with regional data-management requirements, the trustworthiness of the hyperscaler hinges on whether laws and regulations are followed perfectly.
Is a Hyperscale Environment in Your Future?
When considering a path to digital transformation, adopting a hyperscale environment is just one of many possibilities to consider. Yet, it is an innovation that is worth watching as data management and application performance become increasingly critical to long-term business success.
Next in the “Enterprise Architecture and Landscape Strategy” series, consider a use case that can help you fully embrace the value a hyperscale environment can bring to your business.
Thomas Hauschildt is chief enterprise business architect at SAP.