If you've been following NVIDIA (NVDA) for any length of time, you have probably figured out that the company's greatest strength is also its biggest threat: artificial intelligence (AI).

The company pioneered the graphics processing unit (GPU) that transformed pictures in video games from boxy monstrosities to near life-like images. NVIDIA's gaming business was the foundation on which the company was built, and it provided solid, if somewhat unpredictable, growth.

The equation changed in early 2017, when the company's data center business, driven by widespread adoption of AI, began producing triple-digit year-over-year growth. The same parallel processing power that made GPUs perfect for rendering images worked just as well for the data-intensive requirements of AI.

This ongoing success has attracted the attention of a multitude of would-be competitors, from start-ups to big tech, all hoping to develop a product that will counter the dominance of the GPU. However, at least one analyst doesn't think NVIDIA investors have anything to worry about.


NVIDIA GeForce TITAN X GPU. Image source: NVIDIA. 

Chips are just the beginning

I'll be the first to admit that I have voiced concerns before about NVIDIA's newfound growth and the challenges it faces in maintaining its dominance in the AI chip space. NVIDIA's stock has been on fire, up more than 500% in the last two years, and it trades at an extreme valuation, with a trailing-12-month price-to-earnings ratio of nearly 50.

Hans Mosesmann, an analyst with Rosenblatt Securities, has a different take based on NVIDIA's just-completed GPU Technology Conference. In an interview with Barron's, Mosesmann argues that NVIDIA is innovating so quickly in terms of the processors and companion software that it's "near impossible for people to keep up with." 

NVIDIA introduced a number of new products and platforms at its annual event, which is widely considered the industry's premier developer conference. During the keynote, NVIDIA CEO and founder Jensen Huang proclaimed, "We're not a chip company; we're a computing architecture and software company." It's this focus on not only the processors but also the entire AI ecosystem that is laying the foundation for the company's continued success.

Bigger, better, faster

Don't mistake Huang's pronouncement as indicating that the company is resting on its laurels when it comes to its groundbreaking GPUs. NVIDIA introduced the latest upgrade of its Tesla V100, which the company calls "the most advanced data center GPU ever built to accelerate AI, high-performance computing, and graphics." The upgrade boasts double the memory of its predecessor, making it even better for training deep-learning AI models.

The company also debuted the latest edition of its DGX, NVIDIA's "AI supercomputer in a box." The DGX-2 server combines 16 Tesla V100 GPUs and is designed specifically for AI applications. It has the processing power found in 300 servers taking up 15 racks of data center space -- but is 60 times smaller and 24 times more power-efficient. 

The DGX-2 has the processing power of 300 servers. Image source: NVIDIA.

Just try to keep up

Mosesmann said:

[NVIDIA goes] to the limits of what they can possibly do in terms of process and systems that integrate memory and clever switch technology and software, and they go at a pace that makes it impossible at this stage of the game for anyone to compete.

The results speak for themselves. NVIDIA's data center segment, which houses its AI growth, represented less than 7% of the company's overall revenue in fiscal 2016, but it now accounts for 21% of total sales -- and the business grew more than 100% year over year in its most recent quarter.

We're still in the early innings of AI, and with the rapid pace of development, somebody could still build a better mousetrap, as the saying goes. But if NVIDIA keeps up its breakneck pace of innovation and builds an ecosystem that becomes the gold standard, it's hard to argue that NVIDIA won't continue to dominate AI -- at least for the foreseeable future.