Data Center Revenue: Nvidia Made More in One Quarter Than AMD, Intel, and IBM in a Year
Data centers are the defining infrastructure investment of the artificial intelligence (AI) era. The largest technology companies are spending at levels that far exceed any previous build-out in the industry’s history, and that spending flows directly into the revenue of AI companies that supply chips, servers, and cloud capacity.
The results have not been equally distributed. Nvidia (NVDA -0.74%) has captured the dominant share of AI chip spending. Broadcom (AVGO -1.19%) has built the second-largest AI semiconductor business through custom chip deals. Intel (INTC -3.75%) is fighting to recover lost market share. Cloud operators running the infrastructure are planning capital expenditures of hundreds of billions of dollars in 2026 alone, and they might be coming under pressure to start showing a return on that investment.
How revenue stacks up among hardware suppliers and cloud operators could offer investors clues about which industries and companies are poised for growth.
How Do AI Companies Make Revenue From Data Centers?
Data center revenue can be measured across two groups of companies: the chip and hardware suppliers that build the infrastructure, and the cloud operators that sell computing capacity on top of it.
Their revenue models are distinct, and the figures aren't directly comparable.
Revenue reflects each company’s reported data center or AI hardware segment. Nvidia and Broadcom figures are on different fiscal-year calendars than the others. Cloud services revenue reflects fees charged to businesses for computing capacity. This is not directly comparable to chip segment revenue. Sources: company earnings releases.
AI Chip and Hardware Suppliers: Revenue by Company
Five companies account for the bulk of AI chip and data center hardware revenue. Nvidia sits well above the rest. The others are competing for the remaining share of the fast-growing market.
Cloud Operator Data Center Revenue: AWS, Azure, and Google Cloud
Chip suppliers sell into the data center market, which is driven by hyperscalers. Amazon (AMZN +1.63%) Web Services (AWS), Microsoft (MSFT -0.13%) Azure, and Alphabet's (GOOG +1.61%) Google Cloud are the largest buyers of AI chips and the primary providers of AI computing capacity.
Their AI data center cloud revenue is not directly comparable to the data center revenue reported by chip companies: it reflects what businesses pay to use the infrastructure, not the cost to build it. The two figures measure revenue at different points of the same AI supply chain.
Amazon Web Services
Microsoft Azure
Google Cloud
The capital expenditure plans across these three businesses function as a forward-demand signal for chip suppliers. The four largest hyperscalers – Amazon, Microsoft, Google, and Meta Platforms (META -0.80%) – combined are expected to approach $600 billion in capital expenditure in 2026. Goldman Sachs projects that total hyperscaler capital expenditure from 2025 through 2027 will reach $1.15 trillion, more than double the $477 billion spent from 2022 through 2024.
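The "more than double" claim in the Goldman Sachs projection can be checked with simple division. The sketch below uses only the two figures cited above; the variable names are illustrative.

```python
# Verifying the hyperscaler capex growth multiple cited in the article.
# Figures (in $ billions) come from the Goldman Sachs projection quoted above.
capex_2022_2024 = 477    # actual hyperscaler capex, 2022 through 2024
capex_2025_2027 = 1150   # projected capex, 2025 through 2027 ($1.15 trillion)

multiple = capex_2025_2027 / capex_2022_2024
print(f"Projected capex is {multiple:.2f}x the prior three-year total")
# 1150 / 477 is roughly 2.4x, consistent with "more than double"
```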
What Is Next for Data Center Revenue?
Forward guidance points to continued growth in data center revenue. For example:
The biggest structural question for chip suppliers is whether hyperscalers will eventually build enough of their own silicon to reduce third-party GPU demand. Google has been developing its own Tensor Processing Units (TPUs) since 2016. Amazon has its Trainium training chip and Inferentia inference chip. Microsoft is developing its own AI accelerator. The incentive for hyperscalers is clear: reduce dependence on a single supplier and lower long-term costs by taking chip design in-house.
The constraint is equally clear: Nvidia’s CUDA software platform has a decade-long head start, and most AI developers build on it by default. Even hyperscalers that have custom chips in production still buy large quantities of Nvidia GPUs to serve customers locked into CUDA-based workflows.
There are other risks to AI data center revenue worth tracking as well:
FAQs
What is data center revenue?
How do data centers make money?
What is the biggest data center company?
What are data center stocks and AI data center stocks?
Sources
About the Author
Lyle Daly is a contributing Motley Fool stock market analyst covering information technology and cryptocurrency. Lyle has been a contributor at the financial services company since 2018. His work has been featured on USA Today, Yahoo Finance, MSN, Fox Business, and Nasdaq. Before joining The Motley Fool, he wrote for financial brands including Intuit.
Lyle Daly has positions in Alphabet, Broadcom, Meta Platforms, and Nvidia. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Amazon, Intel, Meta Platforms, Microsoft, and Nvidia. The Motley Fool recommends Broadcom. The Motley Fool has a disclosure policy.