

Published on 23 Dec 2022

Data Centre Wars and Ditching the x86



Updated: 9 May 2024

In my last article, I wrote about how hyperscale computing and the cloud are easing concerns that Moore's Law is reaching the limits of chip processing power.

So, how else can we ensure computing can keep up with our imaginations as we approach a new dawn in technology?

It starts from the heart; or in other words: the data centres themselves.

Currently, the data centre market is dominated by Intel and AMD, the market leaders in CPUs. To give you an idea, in 2020, this pair captured a whopping 92% market share within the data centre industry. However, according to a recent article, NVIDIA is expected to lead the data centre market by 2030. So, what’s going on?

NVIDIA's GPU Dominance

The quick answer is that NVIDIA is a GPU behemoth. In 1999, NVIDIA introduced the GPU, a chip designed to process large amounts of data in parallel. GPUs were initially developed to render more realistic video game graphics, but they have since become an important component of data centres.
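To make that distinction concrete, here is a minimal Python sketch, using NumPy as a stand-in for the data-parallel style of execution GPUs are built for. The array contents and names here are illustrative only, not taken from any NVIDIA example:

```python
import math

import numpy as np

# Serial, CPU-style processing: visit one element at a time.
data = [0.0, 0.5, 1.0, 1.5]
serial = [math.sin(x) * 2.0 for x in data]

# Data-parallel style: one operation expressed over the whole array at
# once - the programming model that GPU accelerators execute across
# thousands of cores simultaneously.
vectorized = np.sin(np.array(data)) * 2.0

# Both routes compute the same values.
print(np.allclose(serial, vectorized))  # prints True
```

On an actual GPU the same pattern appears in frameworks such as CUDA or CuPy, where the array-wide expression is dispatched to many parallel cores instead of being iterated in a loop.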

Specifically, NVIDIA GPUs have become the gold standard for accelerating workloads like analytics, artificial intelligence, and scientific computing. In fact, NVIDIA currently controls 90% of the market for supercomputer accelerators. The versatility of their GPUs allows NVIDIA to serve a diverse customer base and capture a significant share of the data centre market – a perfect recipe for success.

The Shift from CPUs to GPUs

In just two years, the market has begun to change drastically. Back in 2020, data centre budgets were still heavily skewed towards CPUs, with these chips comprising 83% of total spend on processors. Since then, explosive growth in AI has coincided with a downturn for Intel and AMD; a clear example can be seen in Intel's data centre and AI solutions, where revenue is down 16% year over year.

This kind of shift in data centre technology is a steady one, not a disruptive one, as replacing data centre hardware can be costly and time-consuming. As a result, older models of CPUs and GPUs will likely remain in use for a long time, even as newer models are introduced. Take NVIDIA's T4 GPU, for example: it remains one of AWS's most popular offerings, even though it is nearly 5 years old.

Analysts predict that the spend on data centre CPUs will drop from 83% to 40% over the next decade. In other words, by 2030, GPUs will not only be the dominant data centre accelerator - they will also be the dominant processor.

NVIDIA's Grace CPU and the Rise of ARM

NVIDIA’s recent announcement of its Grace CPU should also set off warning signs for AMD and Intel. The processor is built on ARM cores and, according to NVIDIA, delivers 10x the performance of today’s leading x86-based CPUs. To this end, analysts estimate that x86 market share will drop from 92% to 27% by 2030 as the technology becomes more readily available.

In the years ahead, NVIDIA should see strong demand as GPUs become the most prevalent data centre processor, and ARM chips become the most prevalent data centre CPU. It’s looking as though the company has set the precedent for data centre technology this decade; Intel and AMD will have to work hard to position themselves correctly and adapt to new technologies to maintain their leadership positions.

