Amazon (AMZN 0.80%) CEO Andy Jassy offered plenty of valuable tidbits for investors in his second annual letter to shareholders. One of them concerned the hot semiconductor industry, the bedrock upon which all computing technologies are built. Amazon has been investing in its own in-house semiconductor designs for years, and it's increasingly focused on high-performance machine learning processors, which are used, among other things, to power the large language models (LLMs) that services like ChatGPT are built on.

Nvidia (NVDA -1.84%) stock has been booming recently based on optimism about its long-term revenue growth prospects from LLMs, as its graphics processing units (GPUs) are leading the charge in this department. But could Amazon's chip investments mean trouble for Nvidia? 

Jassy on what makes a good investment

Amazon Web Services (AWS) was a pioneer in the rapidly expanding cloud industry. Today, it's AWS that pays the bills at the Amazon empire and helps fund plenty of other business investments.

But how does Amazon decide to invest in a new venture like semiconductor design? As Jassy explained in the 2022 annual shareholder letter, the company asks four primary questions:

• If we were successful, could it be big and have a reasonable return on invested capital?
• Is the opportunity being well-served today?
• Do we have a differentiated approach?
• And, do we have competence in that area? And if not, can we acquire it quickly?

Apparently, the answer to each of those four questions was "yes" when AWS considered designing data center chips -- the computing hardware that powers the cloud. To kick-start its silicon dream, Amazon quietly acquired Israeli chip design start-up Annapurna Labs in 2015 for $350 million.

How Amazon's investment has paid off

Annapurna Labs has designed numerous chips for AWS, including its Graviton processors -- ARM-based alternatives to the CPUs from Intel and AMD. But what about computing accelerators such as Nvidia's GPUs, which are powering new AI services like ChatGPT?

That's where the AWS Trainium and Inferentia chips come in. Neither of these computing accelerators rivals Nvidia's latest-and-greatest designs in sheer computational power (nor do Alphabet's Google Cloud in-house chips). But raw power wasn't Amazon's primary goal when developing Trainium and Inferentia. Cost-effectiveness was the objective.

As their names imply, Trainium is geared toward training LLMs, the process of teaching a model how to behave using massive amounts of data. Inferentia is for inference, which is where the bulk of the computing work is done after an AI model is trained. Inference is how the trained model makes decisions based on what it has already learned (like when you ask ChatGPT a question and it comes back with an answer).
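For readers who want to see the distinction concretely, here's a minimal sketch of a training loop versus an inference call, written in generic PyTorch with a made-up toy model. It is an illustration of the two workloads only; in practice, AWS customers target Trainium and Inferentia through Amazon's Neuron SDK, which this sketch does not show.

```python
# A generic sketch of training vs. inference. The tiny model and random
# data here are hypothetical stand-ins, not Amazon's actual stack.
import torch
import torch.nn as nn

model = nn.Linear(16, 2)          # stand-in for a much larger language model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training (Trainium's job): repeatedly adjust weights from labeled data.
x = torch.randn(32, 16)           # a batch of fake inputs
y = torch.randint(0, 2, (32,))    # fake labels
for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()               # compute gradients
    optimizer.step()              # update weights

# Inference (Inferentia's job): the trained model answers new queries.
model.eval()
with torch.no_grad():             # no gradients needed, so each request is cheaper
    prediction = model(torch.randn(1, 16)).argmax(dim=1)
```

Training is a heavy, one-time (or occasional) expense; inference runs over and over, once for every user query, which is why it accounts for the bulk of the computing bill after a model ships.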

AWS is using Trainium and Inferentia for itself, but has also made its more cost-effective accelerators available to customers. Jassy said in the shareholder letter that common AI models trained with Trainium "are up to 140% faster" than similar GPU systems "at up to 70% lower cost." And as for AI inference, Jassy said its Inferentia chips have "saved companies like Amazon over a hundred million dollars in capital expense" since their introduction in 2019.
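Taken at face value, those two best-case figures compound. A quick back-of-envelope calculation, using a hypothetical GPU baseline normalized to 1, shows why the combination is compelling:

```python
# Back-of-envelope math on Jassy's best-case claims. "Up to 140% faster"
# implies up to 2.4x the throughput; "up to 70% lower cost" implies 0.3x
# the price. The baseline values are hypothetical, normalized to 1.
gpu_cost, gpu_speed = 1.00, 1.0          # normalized GPU baseline
trn_cost = gpu_cost * (1 - 0.70)         # 70% lower cost  -> 0.30
trn_speed = gpu_speed * (1 + 1.40)       # 140% faster     -> 2.4x

cost_per_unit_of_work = trn_cost / trn_speed
print(f"{cost_per_unit_of_work:.3f}")    # ~0.125, i.e., about 1/8th the baseline
```

In other words, at the top end of Jassy's claims, the cost per unit of training work would be roughly one-eighth of the GPU baseline -- a meaningful draw for cost-conscious cloud customers, even if real-world workloads land well short of the best case.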

Put simply, Amazon's $350 million investment in Annapurna back in 2015 looks like it will have an incredible long-term payoff for AWS, and for shareholders. 

Pushing Nvidia to do better?

Competition is a great thing because it keeps business leaders pushing their companies toward continuous improvement. Nvidia is going to make hay from its most cutting-edge GPUs geared for advanced AI, but it has lots of other chips it can keep improving too. For example, in March, it launched new L4 GPUs geared toward AI inference, complete with a software stack to help optimize various AI workloads and lower the total cost of ownership for cloud providers and customers.  

Indeed, while Amazon's AWS and other cloud providers stir the pot with announcements about their own silicon designs, AWS remains a major Nvidia customer. Its in-house chips currently fill a small niche within the cloud titan's operations.

Of course, rising competition from its fellow tech giants is a big risk for Nvidia. But it's far from defenseless. Plus, cloud computing and AI are still early in their adoption curves. As Jassy pointed out at the conclusion of his shareholder letter, though AWS revenue was $80 billion in 2022, "about 90% of Global IT spending" still goes to on-premises systems that have yet to migrate to the cloud.

In other words, though AWS has made quick progress designing chips in-house, there is plenty of new business to go around. Nvidia will be just fine.