AMD Set To Lose Nvidia As A Customer (NASDAQ:AMD)

When one thinks of Advanced Micro Devices (NASDAQ:AMD) and NVIDIA (NASDAQ:NVDA), one usually thinks “competitors”. This is most obvious with GPUs (Graphics Processing Units), which go into servers, PCs and laptops, most often to serve AI, gaming or HPC (High Performance Computing) workloads.

Less well known is the fact that NVIDIA is also an AMD customer. How can this be? Well, NVIDIA builds servers which, when processing AI or HPC workloads in the server room, have typically paired AMD EPYC CPUs with NVIDIA GPUs.

The CPU is needed because not all AI workloads run optimally on GPUs, and also because the CPU feeds the GPUs with data when workloads are, indeed, fit to run there.

Now, unlike when Intel (INTC) lost Apple (AAPL) as a customer for its CPUs, NVIDIA doesn’t look like a large AMD customer. Neither NVIDIA nor AMD lists the other company as a material supplier or customer. However, this loss is still relevant, because it is part of a broader trend (just as with Apple). Let me explain.

The ARM-x86 War

As I’ve written earlier (I, II), there’s a real chance that the x86 architecture, sold by Intel and AMD, is entering a life-and-death fight with CPUs based on the ARM architecture / instruction set.

These CPUs, which can either use core designs licensed from ARM itself, or use custom core designs that merely leverage the ARM instruction set, are direct competitors of, and substitutes for, x86-based CPUs. ARM-based CPUs are best known for dominating the smartphone market. However, they’re now also starting to appear in laptops, namely in Apple’s entire Mac line.

Moreover, ARM alternatives are now also starting to appear – in a competitive position – in the server room. A good example here is Amazon.com’s (AMZN) Graviton line, already in its third generation.

It’s in this context that this article fits. NVIDIA, by launching the NVIDIA Grace CPU, is taking yet another ARM-based CPU product into the server room. In this case, NVIDIA is aiming mostly at AI and HPC workloads, by combining its new Grace ARM-based CPUs with its own GPUs. Of course, in the previous generation NVIDIA combined its GPUs with AMD EPYC CPUs. It is in that sense that AMD is losing NVIDIA as a customer.

That said, the NVIDIA Grace CPU will also be available in a CPU-only server configuration, so AMD isn’t just set to lose NVIDIA as a customer, but also to gain it as a new server room competitor. The same problem, of course, applies to Intel.

What’s Special About NVIDIA’s Grace CPU?

Well, to start, it’s promised for H1 2023, so this isn’t a pie-in-the-sky competitor, but something set to arrive soon.

Secondly, it’s based on a new ARM server core just unveiled by ARM, the Neoverse V2. Hence, if NVIDIA is able to deploy these cores in H1 2023, other vendors are likely not far behind. ARM-based competition in the server room is thus set to intensify even beyond NVIDIA’s offering.

Also, NVIDIA likely chose to dump x86 for its own alternative because doing so offered a favorable trade-off on many levels: performance, cost, cost of operation and margins.

When it comes to performance, ARM-designed (not Apple-designed) ARM cores are still not quite on a par with x86 cores. As I wrote regarding Amazon.com’s Graviton3, what makes them immediately competitive in cloud environments is the pricing model. The large cloud providers rent virtual cores. One virtual core corresponds to 1 (of 2) threads on a physical x86 core, versus a full physical ARM core (which runs a single thread). This greatly skews things in favor of ARM.
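The arithmetic behind that skew can be sketched as follows (a hypothetical illustration under the stated assumption that x86 vCPUs map to SMT threads while ARM vCPUs map to whole cores; the instance size is made up, not actual cloud pricing):

```python
# Hypothetical sketch of the vCPU pricing skew described above.
# Assumption: on x86 with SMT, 1 vCPU = 1 of 2 threads on a physical core;
# on current ARM server cores (no SMT), 1 vCPU = 1 full physical core.

def physical_cores_for(vcpus: int, threads_per_core: int) -> float:
    """Physical cores backing a given number of rented vCPUs."""
    return vcpus / threads_per_core

vcpus_rented = 64  # made-up instance size, for illustration only

x86_cores = physical_cores_for(vcpus_rented, threads_per_core=2)  # 32.0
arm_cores = physical_cores_for(vcpus_rented, threads_per_core=1)  # 64.0

print(f"x86: {x86_cores:g} physical cores back {vcpus_rented} vCPUs")
print(f"ARM: {arm_cores:g} physical cores back {vcpus_rented} vCPUs")
```

In other words, at equal vCPU counts and prices, the ARM instance delivers twice the physical cores, so each ARM core only needs to beat half an x86 core (one SMT thread) to win on price/performance.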

However, NVIDIA isn’t renting cores. Hence, something else must have caught its attention beyond good promised integer performance. In my view, that something was the integration of the ARM cores with NVIDIA’s NVLink 4 fabric, making for tremendous performance (bandwidth) improvements when moving data between the GPU, CPU and memory.

As for costs, by designing its own chips and sending them straight to a third party for fabrication, NVIDIA eliminates an entire middleman, making the chips cheaper for NVIDIA.

On operating costs (for cloud customers), ARM offerings are typically more power efficient. Hence, again, they’re an easy sell as long as they aren’t terribly behind on other metrics. It’s also worth mentioning that NVIDIA plans to use low-power LPDDR5 DRAM with its Grace CPUs.

Finally, on margins: for NVIDIA as for Apple, if you’re not buying your chips from a chip vendor like Intel or AMD, and are instead designing them yourself, you’re keeping the chip vendor’s margin.

Conclusion

AMD losing a server room customer, NVIDIA, and gaining a server room competitor has a deeper meaning. It’s yet another step in the ARM-x86 war, in which ARM has started to make large inroads into the x86 market.

This has been seen in Amazon.com’s Graviton iterations for the cloud market. It has been seen in Apple’s wholesale replacement of x86 CPUs in its PCs and laptops. And it is now again being seen in NVIDIA replacing the x86 CPUs in its offerings with its own ARM CPUs.

In my view, ARM’s own designs have lately lost a bit of mojo, while the intense competition between AMD and Intel has produced comparable progress in the x86 world. Even so, ARM designs are now broadly comparable with x86 offerings, with Apple’s custom advantage being large enough for Apple to replace x86 wholesale.

Soon after Grace’s arrival (late 2023), another event is likely to shake the ARM-x86 war: The arrival of the first Nuvia-designed, ARM-based custom CPUs. These promise a large jump in performance and efficiency, and should lead to a renewed push by ARM as well.

Finally, on top of more ARM-based competitors dividing the computing pie, AMD and Intel may also be competing for a smaller pie, due to an impending economic slowdown.
