There's no denying the success that NVIDIA Corporation (NVDA 6.18%) has produced over the last couple of years. In its most recent quarter, gaming revenue increased by 49%, and that wasn't even the headline. Data center revenue, the money the company earns largely from artificial intelligence (AI) workloads, increased a whopping 186% and grew to 21% of the company's total revenue.

NVIDIA's graphics processing units (GPUs) have provided the power necessary to train AI systems. Those advances have been driving the gravy train, and NVIDIA has been along for the ride. So it makes sense that the company would continue to focus on opportunities in the nascent field.

NVIDIA recently announced the NVIDIA GPU Cloud (NGC), a cloud platform that would provide developers with a comprehensive set of software tools to train their own AI systems. The company says "NGC will accelerate and simplify deep learning development by making it easier for developers to conduct deep learning training, experimentation, and deployment." Taking this step, however, may put NVIDIA on a collision course with the very companies that brought it the level of prosperity it enjoys today.


The NVIDIA DGX-1 AI supercomputer in a box. Image source: NVIDIA.

Only the biggest players need apply

Until recently, AI has been the domain of companies with the components necessary to make it work. First, it required massive data sets to train the system. Next, it needed raw, unbridled computing power, the type that none but the largest few companies could afford. Finally, it took the expertise necessary to develop the algorithms and the software models required for AI to learn.

To put this in perspective, only companies such as Alphabet Inc.'s (GOOGL 10.22%) (GOOG 9.96%) Google, Amazon.com, Inc. (AMZN 3.43%), and Microsoft Corporation (MSFT 1.82%) have had the resources necessary to develop these robust AI systems. They also happen to be some of NVIDIA's biggest customers, which use its GPUs not only in the realm of AI but also to power their cloud offerings. By offering these services, NVIDIA is venturing onto their cloudy turf.

It's important to point out that NVIDIA isn't building its own cloud infrastructure but will lease cloud space from these public providers.

Hey! You! Get off my cloud!

Cloud computing is big business, and growth rates are staggering. By enticing customers to come to it directly, NVIDIA would have them bypass some of its own biggest customers, cutting into those companies' profits.

Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform are the three biggest players in the cloud computing industry. Amazon is the only company to segment out its cloud revenue, with AWS banking $3.66 billion in the most recent quarter, a 43% increase over the prior-year quarter. AWS represents about 10% of Amazon's total revenue, but nearly 90% of its operating income. 

Microsoft bundles its Azure cloud revenue with server products and enterprise services into its intelligent cloud segment, which increased to $6.8 billion in the most recent quarter, an 11% year-over-year increase, though Microsoft indicated that Azure revenue itself grew 93%. That growth likely comes from a much smaller base, and without a disclosed dollar figure, it's impossible to quantify.


NVIDIA Tesla V100 data center GPU. Image source: NVIDIA.

Google Cloud Platform revenue is reported within Alphabet's other revenues segment, which amounted to $3.1 billion in the most recent quarter, a 49% increase over the prior-year quarter. A large portion of that growth is undoubtedly driven by Google's cloud-computing customers.

These numbers indicate just how much is at stake. A report by Synergy Research Group estimates that Amazon controls 33% of the worldwide cloud market, more than the next five competitors combined. It also reported that Microsoft and Google had achieved growth rates that exceeded 80% in their cloud service businesses. 

Turf war on Cloud Nine?

I don't expect there to be any sort of direct reprisals as the result of NVIDIA's decision, but there could be consequences. Google developed a tensor processing unit for use by its AI systems during the "inference" phase, the execution of the tasks for which they've been trained. Its latest version is also capable of the actual training of those systems, a job that previously required a GPU. Google is only one of a number of companies working on AI chips of their own, hoping to unseat NVIDIA as the king of the hill, so this push-back within the industry could represent the start of a trend.