
The number of large data centers operated by hyperscale companies has surpassed 1,000, now accounting for 41% of global data center capacity, according to new data from Synergy Research Group.

Over half of this hyperscale capacity is in owned data centers, with the rest in leased facilities. In contrast, non-hyperscale colocation centers hold 22% of the total capacity, leaving on-premise data centers with just 37%, a significant drop from nearly 60% six years ago.

Looking ahead, Synergy predicted that by 2029, hyperscale operators will dominate with over 60% of total data center capacity, while on-premise facilities will shrink to just 20%.


Despite a slow decline in colocation’s share of total capacity, colocation centers will continue to see steady growth.

Meanwhile, on-premise data centers will experience a sharp decrease in their share, dropping nearly three percentage points per year, although their actual capacity will remain relatively stable.

John Dinsdale, chief analyst and research director at Synergy Research Group, explained that the key factor driving the rapid growth of hyperscale data centers over traditional on-premise facilities is the huge increase in the use of cloud services.

The key benefits to enterprises of cloud services include flexibility, speed, rapid access to emerging technologies and lower cost of operations.

He noted that in 2012, enterprises spent twelve times as much on their own data center hardware and software as they did on cloud infrastructure services; today, they spend three times more on cloud services than they do on their own data center infrastructure.

Add to that the huge growth in SaaS and consumer-oriented digital services such as social networking, e-commerce and online gaming, and the result is the burgeoning growth in hyperscale data centers.

“Enterprises used to view data centers as being strategically important assets; now most view them as a pain in the rear,” Dinsdale said. “It is usually easier and better to either push workloads onto public clouds or to push data center hardware into offsite colocation facilities.”

He explained similar strategies are playing out pretty much everywhere around the globe, though at somewhat different speeds and sometimes with a slightly different focus.

“While there will always be an element of hybrid operations, the overarching theme is to push as many workloads as reasonable onto public clouds, and to minimize ongoing investments in on-premise data centers,” he said. 

Ari Weil, cloud evangelist at Akamai, said that when weighing the potential risks and benefits of relying more heavily on hyperscale and colocation data centers, there are very few benefits to depending on hyperscalers more than businesses already do.

“There is the sugar-high benefit of a robust ecosystem of products, advice and service providers to streamline new application development and scaling,” he said. “Heavily relying on hyperscalers, more than anything else, leads to concentration risk.”

He said the downsides include a lack of portability and inefficient architectures that are difficult to orchestrate and manage, which in turn drive up costs.

“But the more material threat lies in the fact that the centralized cloud providers have built their platforms with solutions that encourage service consumption and quickly lead to lock-in,” Weil said. 

This, in turn, increases the dependency on a single provider for uptime and availability, and magnifies the risk that a security vulnerability or compromise will extend to a company’s core business.

“In many cases, predatory partnership and marketplace financial models also lead to the hyperscalers directly competing with the very companies they purport to enable,” he cautioned. 

Weil said that with rapidly growing ecosystems of both AI models-as-a-service and open source model libraries such as Hugging Face, intensive in-house model training will decline at most enterprises.

He predicted priorities would shift to model operations, including anomaly and drift detection, and to lightweight, ongoing optimization and augmentation of models to power business-critical processes such as forecasting, churn modeling, document summarization, customer service optimization (chatbots) and other prediction-based workloads.

“Ultimately, we’re going to see a blend of hyperscale, on-premise and edge to address an organization’s specific needs and priorities and optimize their IT infrastructure,” Weil said. 
