Editor's Note: This is the first of a three-part series on the colossal changes taking place in the IT world.

NVIDIA (NASDAQ: NVDA) has become well known to investors in the past few years. The S&P 500's best-performing stock of 2017 has been featured in an endless number of articles and won numerous accolades for developing graphics processing units (GPUs) that helped usher in a new era of artificial intelligence.

If you've owned the stock for several years, you might be reading this article from a yacht docked at your own private island.

Let's take a look at how NVIDIA got to where it is today. 

An illustration of a processor labeled "GPU." Image Source: Getty Images.

The ascent of the GPU

Originally a hardware company that sold to gaming enthusiasts, NVIDIA took the global stage by making its GPUs genuinely useful to businesses far beyond gaming. In business-consultant speak, NVIDIA successfully "crossed the chasm" by selling to the mass market, or "transitioned from the early adopters to the early majority."

In techie speak, the company's general-purpose GPUs took over work traditionally handled by a computer's CPU, making those computers far more efficient for users. Central processing units (CPUs), which execute instructions in series, were seeing diminishing performance gains: their transistors had become incredibly small and couldn't shrink much further, sparking the popular catchphrase, "Moore's Law is dead."

But NVIDIA's GPUs ran workloads in parallel, dividing a job across thousands of cores and driving down the power required to process each piece of data. Intel, which built an empire over decades by vertically integrating to sell high-performance CPUs, suddenly found itself on the wrong side of computing innovation.
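The serial-versus-parallel difference can be sketched in a few lines of Python (a toy illustration of the idea, not NVIDIA's actual software): a serial routine touches each pixel one at a time, while a parallel version divides the same workload into chunks handled by several workers at once.

```python
from concurrent.futures import ThreadPoolExecutor

def brighten_serial(pixels, amount):
    # CPU-style: process each pixel one after another, in series
    return [min(p + amount, 255) for p in pixels]

def brighten_parallel(pixels, amount, workers=4):
    # GPU-style: split the same workload into chunks processed concurrently
    size = max(1, len(pixels) // workers)
    chunks = [pixels[i:i + size] for i in range(0, len(pixels), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(lambda c: brighten_serial(c, amount), chunks)
    # reassemble the chunks in their original order
    return [p for chunk in results for p in chunk]
```

A real GPU runs thousands of such workers directly in hardware; the point here is only that pixel work divides cleanly, which is why image and video data suit GPUs so well.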

NVIDIA's real turning point came in 2006 when it unveiled its CUDA ecosystem. Developers could now write code in familiar programming languages such as C and C++, and CUDA would map that software onto GPU hardware that could run the instructions far more efficiently.
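CUDA's core idea can be sketched in plain Python (a conceptual model only, not real CUDA code): the developer writes a small "kernel" function describing what one thread does to one array element, and the GPU launches that kernel across many threads at once. Here an ordinary loop stands in for the hardware.

```python
def add_kernel(thread_id, a, b, out):
    # In the CUDA model, each thread handles exactly one array element
    out[thread_id] = a[thread_id] + b[thread_id]

def launch(kernel, n_threads, *args):
    # A real GPU would run these thread bodies simultaneously in hardware;
    # this sequential loop only mimics the programming model
    for tid in range(n_threads):
        kernel(tid, *args)
```

Launching `add_kernel` over three threads adds two three-element arrays element by element; on a GPU, millions of elements are handled the same way with no change to the kernel.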

The applications this opened up revolutionized the industry. Anything that relied on processing images or video footage became a natural fit for NVIDIA's GPUs, which had no problem rendering the huge amounts of data involved. Software running on the GPU could even recognize features within images, such as edges or subtle variations in color.

This became known as inference: using a trained model to correctly identify new objects in its surroundings based on images it has seen before. One application was self-driving cars, which could recognize a stop sign and actually brake. Oncologists could now better detect cancerous growths in patients' diagnostic images and design more personalized treatments.
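The idea can be illustrated with a toy nearest-neighbor classifier in Python (a deliberately simplified sketch; real systems use deep neural networks running on GPUs): a new example gets the label of the closest example the system has already seen.

```python
def distance(a, b):
    # squared Euclidean distance between two feature vectors
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(features, seen_examples):
    # Label a new example with the label of the nearest previously
    # seen example -- the essence of inference
    best_label, _ = min(
        ((label, distance(features, known)) for known, label in seen_examples),
        key=lambda pair: pair[1],
    )
    return best_label
```

Given a handful of labeled feature vectors, a new vector close to the "stop sign" examples gets classified as a stop sign; the GPU's job in practice is running this kind of comparison at enormous scale.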

NVIDIA quickly became the king of inference. Companies had tons of images and needed the computing horsepower of GPUs to make sense of them.

This deluge of new applications brought a wave of new revenue for NVIDIA, which could price its GPUs at a premium since they had limited competition. The company's revenue and operating margin both doubled between 2015 and 2018. Its stock skyrocketed tenfold during the same time frame.

NVIDIA became one of the most recognized and most coveted stocks in the entire market. Short-sellers dared not bet against it, and its market potential seemed unlimited.

An artist's depiction of a neural network, the kind of model GPUs train and run for inference. Image Source: Getty Images.

The data center opportunity

One of the most lucrative opportunities for GPUs turned out to be the data center. Businesses were storing and processing a huge trove of their customer, financial, and operational information. Having the processing power to do that well allowed companies to carve out competitive advantages.

There was also an industrywide transition underway. Rather than building out their own data centers, companies could hire cloud-computing service providers such as Amazon Web Services or Microsoft Azure to handle the storage and processing requirements of their cloud-based apps. These server farms relied on NVIDIA processors.

The "cloud titans" like Amazon and Microsoft are heavily invested in deep learning and artificial intelligence (AI), so winning their business is extremely lucrative. There is simply an enormous amount of information to be digested and learned from. More information means more processors, which means more GPU sales. And the cloud vendors demand high performance and are flush with cash, so they are willing to pay a premium for hardware.

NVIDIA has benefited from the excitement surrounding this new opportunity. Its data center revenue has hit record highs, growing 58% year over year, and its GPU accelerators are deployed in the world's fastest supercomputers in the U.S., Europe, and Japan.


Investors should watch to see if NVIDIA's momentum in the data center will continue. This market will prove to be a key battleground for all computing-hardware providers.

What's next? In part two, we'll examine why NVIDIA's days of incredible growth may be over.