Data centers are the defining infrastructure investment of the artificial intelligence (AI) era. The largest technology companies are spending at levels that far exceed any previous build-out in the industry’s history, and that spending flows directly into the revenue of AI companies that supply chips, servers, and cloud capacity.
The results have not been equally distributed. Nvidia (NVDA -1.56%) has captured the dominant share of AI chip spending. Broadcom (AVGO -4.11%) has built the second-largest AI semiconductor business through custom chip deals. Intel (INTC +1.14%) is fighting to recover lost market share. Cloud operators running the infrastructure are planning capital expenditures of hundreds of billions of dollars in 2026 alone, and they may come under pressure to start showing a return on that investment.
How revenue stacks up among hardware suppliers and cloud operators can offer investors clues about which industries and companies are poised for growth.
How Do AI Companies Make Revenue From Data Centers?
Data center revenue can be measured across two groups of companies:
- AI chip and hardware suppliers, which make and sell the equipment that data centers run on.
- Cloud operators, which build and operate data centers and sell access to computing capacity.
Their revenue models are distinct, and the figures aren’t directly comparable.
Note: Revenue reflects each company's reported data center or AI hardware segment. Nvidia and Broadcom report on different fiscal-year calendars than the others. Cloud services revenue reflects fees charged to businesses for computing capacity and is not directly comparable to chip segment revenue. Sources: company earnings releases.
AI Chip and Hardware Suppliers: Revenue by Company
Five companies account for the bulk of AI chip and data center hardware revenue. Nvidia sits well above the rest. The others are competing for the remaining share of the fast-growing market.
Cloud Operator Data Center Revenue: AWS, Azure, and Google Cloud
Chip suppliers sell into the data center market, where demand is driven by hyperscalers. Amazon (AMZN -0.87%) Web Services (AWS), Microsoft (MSFT -1.57%) Azure, and Alphabet's (GOOG -0.58%) Google Cloud are the largest buyers of AI chips and the primary providers of AI computing capacity.
Their AI data center cloud revenue is not directly comparable to data center revenue from chip companies – it reflects what businesses pay to use infrastructure, not the cost to build it. The two figures capture revenue at different points in the AI supply chain.
Amazon Web Services
- AWS Q4 2025 cloud revenue was approximately $35.6 billion, up 24% year over year. Full-year 2025 cloud revenue was approximately $128.7 billion.
- AWS is the largest cloud provider by revenue and is a primary destination for enterprise AI workloads, with an annualized run rate above $140 billion.
- Amazon announced a $200 billion capital expenditure plan for 2026, the majority directed at AWS and AI infrastructure.
Microsoft Azure
- Microsoft's Intelligent Cloud segment reported $32.9 billion in revenue in Q2 FY2026, with Azure and other cloud services growing 39% year over year. Full-year FY2025 Intelligent Cloud revenue was approximately $106.3 billion.
- Microsoft's partnership with OpenAI gives it privileged access to frontier AI models and has accelerated enterprise adoption of Azure AI services.
- Capital expenditure hit $37.5 billion in a single quarter (Q2 FY2026), up 66% year over year, implying an annualized run rate of $150 billion.
Google Cloud
- Google Cloud Q4 2025 revenue was $17.7 billion, up 48% year over year, the fastest growth rate among the three major providers. Full-year 2025 cloud revenue was approximately $59 billion.
- Google Cloud's growth rate reflects accelerating enterprise adoption of AI services and the expansion of its Gemini model platform across business customers.
- Alphabet revised its 2025 capital expenditure guidance upward three times, reaching $91 billion to $93 billion, compared to $52.5 billion in 2024.
The capital expenditure plans across these three businesses function as a forward-demand signal for chip suppliers. The four largest hyperscalers – Amazon, Microsoft, Google, and Meta Platforms (META -3.77%) – are collectively expected to approach $600 billion in capital expenditure in 2026. Goldman Sachs projects that total hyperscaler capital expenditure from 2025 through 2027 will reach $1.15 trillion, more than double the $477 billion spent from 2022 through 2024.
What Is Next for Data Center Revenue?
Forward guidance points to continued growth in data center revenue. For example:
- Nvidia guided for total revenue of $78 billion for Q1 FY2027.
- Broadcom expects AI chip sales to approximately double year over year in the current quarter.
- AMD projects more than 60% annual data center segment growth over the next several years.
- The hyperscalers are collectively planning close to $600 billion in 2026 capital expenditure.
The biggest structural question for chip suppliers is whether hyperscalers will eventually build enough of their own silicon to reduce third-party GPU demand. Google has been developing its own Tensor Processing Units (TPUs) since 2016. Amazon has its Trainium training chip and Inferentia inference chip. Microsoft is developing its own AI accelerator. The incentive for hyperscalers is clear: reduce dependence on a single supplier and lower long-term costs by taking chip design in-house.
The constraint is equally clear: Nvidia's CUDA software platform has a decade-long head start, and most AI developers build on it by default. Even hyperscalers that have custom chips in production still buy large quantities of Nvidia GPUs to serve customers locked into CUDA-based workflows.
There are other risks to AI data center revenue worth tracking as well:
- AI model efficiency: Models are becoming more capable per dollar of compute. If efficiency gains outpace demand growth, infrastructure spending could moderate. Historically, cheaper compute has expanded total usage rather than replacing it, but that is not guaranteed.
- Export controls: U.S. restrictions on chip sales to China are already creating revenue headwinds for Nvidia and AMD.
- Capital expenditure monetization: Hyperscalers are spending at historically high levels relative to operating cash flows. If AI services revenue does not grow fast enough to justify the investment, capital expenditure plans could be revised downward, reducing demand for chip suppliers.
Sources
- AMD (2026). "Fourth Quarter and Full Year 2025 Financial Results."
- Alphabet (2026). "Fourth Quarter and Fiscal Year 2025 Results."
- Amazon (2026). "Fourth Quarter Results."
- Broadcom (2025). "Fourth Quarter and Fiscal Year 2025 Financial Results."
- Broadcom (2026). "First Quarter Fiscal Year 2026 Financial Results."
- Broadcom (2025). "Broadcom (AVGO) Q4 2025 Earnings Call Transcript."
- Goldman Sachs (2025). "AI: In a Bubble?"
- IBM (2026). "Fourth Quarter 2025 Results."
- IBM (2026). "Financial Reporting."
- Intel (2026). "Fourth Quarter and Full-Year 2025 Financial Results."
- Intel (2026). "Investor Relations."
- Microsoft (2026). "Cloud and AI Strength Drives Second Quarter Results."
- Nvidia (2026). "NVIDIA Announces Financial Results for Fourth Quarter and Fiscal 2026."
- Nvidia (2026). "Financial Reports."