It's been a good week for shareholders of Nvidia (NVDA 1.69%). The market has been worried about the powerhouse artificial intelligence (AI) company, but management assuaged many fears about its future opportunities and its ability to thrive despite regulatory issues that are weighing on its financial statements.
CEO Jensen Huang gave many positive updates, and they don't just bode well for Nvidia, but for the long-term potential of generative AI -- and more specifically, AI superstar Amazon (AMZN 0.80%).
The age of generative AI
The market wasn't sure what to expect from Nvidia's latest earnings release. Although the semiconductor giant is the dominant chipmaker in the industry, there have been a few developments that some believe undermine its investment thesis.
The market was rattled when Chinese AI platform DeepSeek emerged earlier this year, making people question whether Nvidia's high-priced chips were really necessary for generative AI development. And some chip shipments have been disrupted by regulatory efforts to keep the best U.S. equipment out of China.

Nvidia CEO Jensen Huang. Image source: Nvidia.
But performance has continued swimmingly at Nvidia, and it just released another fabulous round of earnings for the 2026 fiscal first quarter (ended April 27). Revenue increased 69% year over year to $44.1 billion, better than the $43.3 billion analysts were expecting, and adjusted earnings per share (EPS) were $0.96, better than the $0.93 target from analysts. Including a one-time charge related to chips it couldn't ship to China due to export restrictions, EPS was $0.81, still well ahead of the $0.60 reported in the prior-year quarter.
But it's the long-term outlook that keeps Nvidia's investing thesis strong. On the earnings call, Huang said: "We know that AI is this incredible technology that's going to transform every industry from, of course, the way we do software to healthcare and financial services to retail to, I guess, every industry, transportation, manufacturing. And we're at the beginning of that."
How Nvidia is ramping up
Nvidia is now addressing the demand that it has helped create. Its most advanced Blackwell chips are selling fast, and demand is high from large cloud computing clients like Amazon and Microsoft, the two largest global cloud providers.
Nvidia's chief financial officer, Colette Kress, said on the earnings call that Microsoft alone was already using tens of thousands of Blackwell GPUs, and Blackwell accounted for 70% of data center sales in the first quarter.
The company is already working on its next generation of GPUs, with each iteration more powerful, to handle the breakneck speed of AI development. Blackwell Ultra is already rolling out in the current quarter, and Nvidia is working on even more powerful chips that can handle the inference part of generative AI, where the greatest demand is right now.
How this helps Amazon
Huang's sentiments echoed what Amazon CEO Andy Jassy recently said about the future of generative AI:
I spent a fair bit of time thinking several years out. And while it may be hard for some to fathom a world where virtually every app has generative AI infused in it -- with inference being a core building block just like compute, storage, and database, and most companies having their own agents that accomplish various tasks and interact with one another -- this is the world we're thinking about all the time. And we continue to believe that this world will mostly be built on top of the cloud, with the largest portion of it on AWS.
Amazon Web Services (AWS) is the largest cloud services provider in the world, with 30% of the market, according to Statista. It inks new deals with high-profile clients all the time, including Adobe, Uber Technologies, and Cisco Systems in the first quarter.
Amazon is investing more than $100 billion in its AI business this year to be prepared to handle demand. Since most generative AI app development happens in the cloud, Amazon will benefit from that shift, as long as it's ready with best-in-class capabilities.
Jassy often describes what that world looks like: a three-layer system, with a bottom layer where developers build their own large language models (LLMs) for the most powerful generative AI platforms, a middle layer where clients use Amazon's LLMs, and a top layer where small businesses can use ready-made solutions for less intensive needs.
Amazon has thousands of features to meet every demand and budget. "Before this generation of AI, we thought AWS had the chance to ultimately be a multi-hundred-billion-dollar revenue run rate business," Jassy said. "We now think it could be even larger."
Amazon is developing its own cheaper chips for smaller clients seeking budget-friendly options, but it's also one of Nvidia's biggest customers, and it will maintain that relationship because it needs the chipmaker's best-in-class products to serve its own big clients.
Huang and Jassy are on the same page in envisioning what the future will look like. Nvidia's strong results and investments in its product line are good news for Jassy and Amazon as they build out infrastructure to benefit from the shift.