For the last three years, Nvidia (NVDA 2.82%) has been the most dominant company in the artificial intelligence (AI) landscape. The company's graphics processing units (GPUs) are the backbone on which generative AI is developed.
AI companies such as OpenAI, Oracle, and Meta Platforms, along with cloud platforms such as Microsoft Azure and Amazon Web Services (AWS), have collectively spent hundreds of billions of dollars clustering Nvidia GPUs inside data centers to build out their AI infrastructure.
While Advanced Micro Devices is largely perceived as Nvidia's chief rival in the AI chip market, a new threat is emerging: Alphabet (GOOGL 1.58%) (GOOG 1.49%). The internet giant is making waves in the semiconductor industry thanks to rising interest in its custom hardware, known as tensor processing units (TPUs).
Let's take a look at what Alphabet's entrance into the chip market means for Nvidia as investments in AI infrastructure continue to unfold. Should Nvidia investors be worried? Read on to find out.
What is the difference between a GPU and a TPU?
Nvidia's GPUs are versatile pieces of hardware. These chips are designed to run in parallel clusters and pair with the company's CUDA software platform. Together, Nvidia's ecosystem can be used to train large language models (LLMs) or help build more robust applications across AI robotics, autonomous driving, and quantum computing.
TPUs, by contrast, are much more specialized. Rather than being general-purpose hardware, Alphabet's TPUs are custom application-specific integrated circuits (ASICs) -- chips purpose-built for a narrow set of tasks. This makes TPUs highly efficient in specific workloads such as training and running deep learning models.
Alphabet's TPUs are seeing strong demand, but there's a catch
Throughout the AI revolution, one of Alphabet's biggest winners has been its cloud division, Google Cloud. In recent months, Google Cloud has won notable deals with OpenAI as well as a $10 billion contract with Meta Platforms.
What investors might not fully appreciate is that this custom hardware now gives Google an interesting selling point within its broader cloud ecosystem.
Notably, Apple used TPUs to train its Apple Intelligence models. Meanwhile, Anthropic announced plans to expand its usage of Google Cloud in a deal featuring up to 1 million TPUs. Lastly, rumors are swirling that Meta is considering complementing its existing reliance on Google Cloud by deploying TPU clusters in its own data centers.
On the surface, accelerating TPU demand from big tech might sound alarming for Nvidia investors. However, there are some finer details that smart investors shouldn't overlook.
While Anthropic's relationship with Google Cloud is significant, the company also has strong ties with Nvidia. Just weeks ago, Anthropic agreed to purchase $30 billion of compute capacity from Microsoft Azure, which runs heavily on Nvidia's GPUs.
On top of that, OpenAI recently struck a $38 billion deal with AWS to access Nvidia's new GB200 and GB300 chips.
Furthermore, while Nvidia does not reveal the specifics around its customer concentration, many analysts on Wall Street suspect that the hyperscalers -- including Alphabet -- are among its largest chip buyers.
These are important details to understand. While TPUs represent potentially transformative dynamics in the AI chip market, many of their users -- including Google itself -- actually complement this custom hardware with Nvidia's general-purpose GPUs. Against this backdrop, TPUs don't appear to be replacing GPUs at all.
AI chips are not a winner-take-all market
Management consulting firm McKinsey & Company is forecasting AI infrastructure to be a $7 trillion market by 2030, with roughly $5 trillion of this spend allocated toward AI workloads.
Given the hyperscalers are accelerating their capital expenditures (capex), I feel confident that demand for Nvidia's GPUs and accompanying data center services will remain robust for the foreseeable future.
It's possible -- even likely -- that rival accelerators from AMD, combined with custom ASICs such as TPUs, will eventually erode Nvidia's pricing power in the chip market.
Nevertheless, the introduction of TPUs is not a checkmate move by Google. If anything, complementing existing Nvidia infrastructure with custom chip designs reinforces how large of an opportunity AI infrastructure is becoming. In other words, AI chips are not a winner-take-all market.
For these reasons, I don't think Nvidia investors need to panic. The company remains the king of the chip realm and appears well positioned to thrive in the AI infrastructure era.