Why AI chip maker Nvidia can’t afford to rest on its laurels

AI isn’t going anywhere, and neither is Nvidia's competition
Nvidia is the clear winner of the AI boom. But it's under increasing threat from a slew of competitors, old and new

Generative AI is the definitive technology of the decade. On the hardware side of things, though, it’s Nvidia that has made the most of the AI obsession. Its graphics processing units (GPUs) are ideal for efficiently handling AI workloads. So much so that the chip maker rode the demand for its GPUs to become the world's most valuable tech company. And now its rivals want in on the action.

Nvidia is facing pressure not only from established rivals, including some of its biggest customers, but also from startups like Groq. Even as they continue to purchase silicon from Nvidia, virtually all of them are developing their own chips specifically for handling AI workloads.

At its Ignite conference back in November 2023, Microsoft announced the Microsoft Azure Maia 100 AI Accelerator. The chip is designed specifically to train large language models (LLMs). Microsoft says Maia is designed to “power some of the largest internal AI workloads running on Microsoft Azure”, the company’s cloud computing platform. Importantly, besides Azure, Microsoft is also working with OpenAI to optimize its hardware for the AI company’s software.

Read: Nasdaq hits two-year high on AI optimism as Nvidia leads the charge

Later in the same month, during its re:Invent 2023 conference, Amazon’s cloud computing subsidiary, Amazon Web Services (AWS), announced its new Trainium2 chip, also built specifically to train AI models. AWS has roped in Databricks and the Amazon-backed OpenAI competitor Anthropic to build models with Trainium2.

AI ecosystem

Both Amazon and Microsoft have made it abundantly clear that they aren’t looking at the new silicon as a replacement for existing suppliers (read: Nvidia). Their intent is to offer their customers another choice.

That’s how things will be for the time being, at least. But while they continue to spend heavily on Nvidia chips, they also continue to pour resources into enhancing their own portfolios of AI silicon.

And there’s a good reason for this. 

The AI boom led to a multi-fold increase in demand for GPUs, and supply hasn’t kept up, for a variety of reasons. So while it’ll take the likes of Microsoft, Amazon and AMD some time to catch up to Nvidia, they are all betting they can get a foot in the door, at least as long as demand outstrips supply. And with the tech industry’s love affair with AI unlikely to wear off anytime soon, that imbalance looks set to persist for quite a while.

Out of reach

But it’s not just about hardware, says Michael Ashley Schulman, partner and chief investment officer at Running Point Capital Advisors. Nvidia, he explains, dominates the market thanks to superior software and the extensive AI infrastructure already built around its products.

He says the company has a multifaceted approach built around its hardware that helps it sustain its leadership position, pointing to its recently announced GPU, which is backed by a new software strategy.

“The company launched Nvidia Inference Microservices (NIM), a suite of software designed to simplify deploying AI models, similar to Apple’s approach of integrating hardware and software,” says Schulman. “This suggests Nvidia’s ambition to become a platform player like Microsoft, Apple and Google.”

This resonates with Manoj Sukumaran, principal analyst at Omdia. “There is a reason why Nvidia is the most preferred silicon vendor for AI,” says Sukumaran. “They have created an ecosystem of software, optimized silicon, (and) interconnected technologies over the past several years.” 

He is particularly impressed with NVLink, Nvidia's high-speed GPU interconnect, which he says is unrivaled in the market. Although Compute Express Link (CXL) could possibly emerge as a challenger, Sukumaran says it isn’t yet mature enough.

“So if anyone is trying to make a custom AI silicon which can scale out like the Nvidia H100 or B100, they will have to develop silicon IPs in many of these areas,” says Sukumaran. “It is not an easy task and takes time.” 

Looking over its shoulder

Schulman believes Nvidia is under increasing pressure to maintain its AI chip dominance. 

“While they dominate the AI chip market now, companies like Intel are launching competitive products,” says Schulman. “Additionally, major tech companies like Meta and Google are developing their own custom chips and accelerators, seeking to lessen their dependence on Nvidia.”

However, Sukumaran argues Nvidia shouldn’t worry about custom silicon from hyperscalers (large cloud service providers like AWS, Microsoft and Google). He believes the hyperscalers' hardware would only be suitable for their own specialized use cases.

“But to make it appealing to the broad market, I don’t see any major competition to Nvidia currently other than AMD,” says Sukumaran.

He believes AMD could take some of Nvidia’s share in AI inference. However, AMD's lack of a scale-out interconnect like NVLink limits its reach. AMD’s answer to NVLink is Infinity Fabric, but Sukumaran says it currently connects only up to eight GPUs, which prevents AMD from targeting large AI training clusters.

“And the AMD software stack also needs a lot of improvements,” says Sukumaran. “So Nvidia has a clear advantage.”

Times change

While it might appear bulletproof now, there are chinks in Nvidia’s armor.

Schulman says that since Nvidia is a fabless design company, its chief near-term weakness could be its heavy reliance on Taiwan Semiconductor Manufacturing Company (TSMC) to manufacture its chips.

Read: AI should not be feared, governments must embrace it, says NVIDIA founder Jensen Huang

It’s no surprise, then, that Nvidia tried to acquire Arm, the British semiconductor company, back in 2020, making an eye-popping offer of $40 billion before regulators axed the deal. The company has been looking at options to lessen its reliance on TSMC.

Schulman points to Saudi Arabia’s recent commitment to create a $40 billion AI fund spearheaded by the Public Investment Fund (PIF). 

“(The investment) is an indicator of regional support and the long runway ahead for the next evolution of AI and how it will transform services, manufacturing, finance, and business,” says Schulman.

The stories on our website are intended for informational purposes only. Those with finance, investment, tax or legal content are not to be taken as financial advice or recommendation.