When it comes to hardware technology, NVIDIA (NVDA 3.71%) is at the leading edge of developments that are poised to reshape the world for at least the next decade. Building on its graphics processing unit (GPU) roots in high-end video games -- an early data-dense computing task that paved the way for breakthroughs in other areas -- NVIDIA is a busy chip designer and software developer these days.

Normally chock-full of news, the company's annual GPU Technology Conference aired on YouTube this year, with CEO Jensen Huang delivering the pre-recorded keynote from his kitchen due to COVID-19. It was especially packed with new releases and is worth watching -- particularly for investors who have followed this gaming company's transformation into a computing powerhouse of the future.

A new collaborative engine for creators

While the biggest announcements won't directly concern video game enthusiasts, it was appropriate that the event first unveiled a new real-time simulation and collaboration platform -- since video gaming and professional visualization sales still made up nearly 60% of NVIDIA's revenue in its last reported quarter.

Called Omniverse, the cloud-based service is built on Pixar's (part of Disney (NYSE:DIS)) Universal Scene Description framework for 3D content. The hub powers collaborative work and is designed to integrate with other creative applications. Partners include software providers for engineering and architectural design, video game production, and entertainment content creation. Using the RTX GPU lineup -- best known for powering AI-enhanced graphics and ray tracing -- creatives can edit and simulate computer-generated imagery remotely with their teams.

A robotic arm holding an NVIDIA chip. Image source: NVIDIA.

Data centers, AI, and the future internet

The bulk of the event focused on the business lines that fall under NVIDIA's data center segment -- which made up just under a third of sales at the end of 2019. Thanks to some recent moves and its ongoing research and development, the data center and cloud business holds the greatest potential for NVIDIA in the decade to come.

NVIDIA thinks the data center is the new basic unit of computing, and that thinking was behind its purchase of Mellanox, which specializes in networking hardware -- an equally important component of computing power. After all, while NVIDIA GPUs can process massive amounts of information very quickly, that processing speed gets bottlenecked and goes to waste if the network itself can't keep up.

By combining its technology with Mellanox's, NVIDIA has created a data center processor that it thinks will become a third basic computing chip of the future, alongside the CPU (for general-purpose computing) and the GPU (for specialized, data-intensive processing tasks). The new data center hardware powers software that maps genetic code (NVIDIA said it helped map the coronavirus genome), monitors data as it travels within a data center, and runs machine learning workloads.

Huang addressed two specific areas of focus in the world of AI and the incredibly complex software algorithms that power it: recommenders (the algorithms behind internet search and other predictions) and conversational AI.

Recommenders -- the systems used to predict what a user is looking for on the internet -- may sound like old software tech. They're not. AI-based recommendation, which draws on someone's previous choices to predict what they might like next, is a complex piece of software that is getting deployed in data centers. Some examples are in the realm of entertainment. If someone watches a movie, a streaming platform might recommend another one like it. If someone likes a song, the music streaming service might recommend a certain playlist. Basically, the internet is becoming a personalized predictive assistant.
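
To make the idea concrete, here is a minimal illustrative sketch of item-to-item recommendation using cosine similarity over a toy ratings matrix. This is not NVIDIA's Merlin code -- the data and the recommend_similar function are hypothetical stand-ins for the kind of math these systems run at enormous scale.

```python
import numpy as np

# Toy user-item ratings matrix (rows = users, columns = movies).
# Hypothetical data for illustration only.
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 0, 0],
    [0, 0, 5, 4],
    [1, 0, 4, 5],
], dtype=float)

def recommend_similar(item_index, k=2):
    """Return the k items most similar to item_index, using cosine similarity
    of their rating columns -- the core idea behind item-to-item recommenders."""
    items = ratings.T                        # one row per item
    target = items[item_index]
    norms = np.linalg.norm(items, axis=1) * np.linalg.norm(target)
    scores = items @ target / np.where(norms == 0, 1, norms)
    scores[item_index] = -np.inf             # never recommend the item itself
    return np.argsort(scores)[::-1][:k]

print(recommend_similar(0))  # the two movies most like movie 0 (here: [1 3])
```

Production recommenders work on billions of users and items rather than a 4x4 grid, which is exactly why this kind of matrix math ends up running on data center GPUs.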

To that end, NVIDIA built an application framework called Merlin to democratize this complicated data processing so any business can apply it to its own systems. With the amount of data floating around the web only getting bigger, NVIDIA thinks Merlin, the new A100 data center GPU -- built on Taiwan Semiconductor's (TSM 2.71%) 7-nanometer manufacturing process and delivering 1.6 terabytes per second of memory bandwidth -- and the networking hardware that supports them will continue to be in high demand.

On to conversational AI -- one of the most challenging machine learning applications, and one that is only just beginning to reach the stage where it can be deployed in a data center to start solving problems. NVIDIA created another application framework, called Jarvis, to analyze conversation, respond, and simultaneously animate a chatbot that imitates the real-life facial expressions made during speech. In just a few hundred milliseconds, conversation and the corresponding graphics can be processed and output. Much like Merlin, Jarvis lets an organization augment the algorithms with its own data to build a custom automated service -- for call centers, smart speakers, or video conferencing technology. Which raises the question: Was that really Jensen Huang talking, or was it a bot?
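
To picture what a framework like Jarvis orchestrates, here is a schematic sketch of a single conversational turn: speech recognition, language understanding, response generation, then speech and facial-animation synthesis, all inside a latency budget of a few hundred milliseconds. This is not the Jarvis API -- every function below is a hypothetical stand-in.

```python
import time

# Hypothetical stand-ins for the stages a conversational AI pipeline chains together.
def speech_to_text(audio: bytes) -> str:            # automatic speech recognition
    return "what's the weather tomorrow"

def understand(text: str) -> dict:                  # natural language understanding
    return {"intent": "weather_query", "when": "tomorrow"}

def generate_reply(meaning: dict) -> str:           # dialog / response generation
    return "Tomorrow looks sunny with a high of 72."

def synthesize(reply: str) -> tuple:                # text-to-speech plus facial animation frames
    return (b"<audio>", ["<frame>"] * 30)

def conversational_turn(audio: bytes, budget_ms: float = 300.0):
    """Run one user utterance through the whole pipeline and check the latency budget."""
    start = time.perf_counter()
    reply_audio, animation = synthesize(generate_reply(understand(speech_to_text(audio))))
    elapsed_ms = (time.perf_counter() - start) * 1000
    assert elapsed_ms < budget_ms, "too slow to feel like a real conversation"
    return reply_audio, animation, elapsed_ms

print(f"turn completed in {conversational_turn(b'<user audio>')[2]:.2f} ms")
```

In the real thing, each of those stages is a heavyweight neural network, which is why squeezing the whole round trip into a few hundred milliseconds takes data center GPUs.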

The future is "smart everything"

Lastly, there were some updates to NVIDIA's edge AI and robotics platform EGX and the autonomous vehicle platform AGX -- both of which are also mostly contained in the data center business segment.

Trillions of devices will eventually be operating around the world, connected to a network, and creating and processing data. Data centers will train the AI algorithms and then push that intelligence out to devices located around the globe -- the "smart everything" revolution, versus the smartphone revolution that started a little over a decade ago. Many of these devices and sensors will be on all the time, and data centers will need to be located close to the action in the field for quick response. The EGX platform was designed as a compact unit pairing an NVIDIA GPU with Mellanox networking hardware. Think of it as a mini cloud data center that can be embedded in almost any device.
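
A highly simplified sketch of that train-in-the-data-center, run-at-the-edge pattern is below. All of the names are hypothetical, and a real EGX deployment runs GPU-accelerated inference servers rather than a few lines of Python -- the point is just the division of labor.

```python
import json

def train_in_data_center(samples):
    """Stand-in for training: fit a trivial threshold 'model' on labeled sensor data."""
    positives = [value for value, label in samples if label == 1]
    return {"threshold": sum(positives) / len(positives)}

def push_to_edge(model):
    """Stand-in for deployment: serialize the model and ship it to the edge device."""
    return json.dumps(model)

class EdgeDevice:
    """A sensor node that runs inference locally, so it can react without a network round trip."""
    def __init__(self, packaged_model):
        self.model = json.loads(packaged_model)

    def infer(self, sensor_reading):
        return "alert" if sensor_reading > self.model["threshold"] else "ok"

model = train_in_data_center([(0.2, 0), (0.3, 0), (0.9, 1), (1.1, 1)])
device = EdgeDevice(push_to_edge(model))
print(device.infer(1.05))   # decided on-device, with no trip back to the cloud
```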

Factory automation, quality control, and distribution centers are early use cases for EGX. Huang mentioned a partnership with Ericsson (ERIC -0.76%) to power 5G network hardware, and BMW (BMWYY -1.23%) has chosen NVIDIA EGX to run its new auto manufacturing processes. As I've discussed in the past, the U.S. Postal Service is also using EGX to sort the mail. Between the cloud and the intelligent edge, the opportunity in the decades ahead is massive.

Last but not least was the AGX platform for autonomous cars, which is being streamlined and now uses the same Ampere chip architecture found in NVIDIA's other data center chip designs. NVIDIA's idea here is to think not just about the car, but about the automation of everything that moves -- from delivery robots to taxis to public transit shuttles. This is a massive industry encompassing trillions of dollars of spending each year. In my opinion, that's why big tech hasn't started making vehicles (or acquired an automaker), as many investors have speculated they might. The market is so large that controlling the most profitable part of the future of transportation and selling to all of the manufacturers just makes more economic sense. AGX chip designs and software stacks are open for all of NVIDIA's partners to use -- from vehicle manufacturers to parts makers to autonomous vehicle software start-ups.

Besides the massive and still fast-growing scale of NVIDIA's reach in the world of tech, there is one final point I'd make about this year's conference. NVIDIA is certainly still a GPU designer, one especially rooted in gaming. Over the years, however, its focus has shifted toward software development. The aim is to make developers' lives easier, freeing them to focus on innovation and on creating new uses for NVIDIA's wares rather than getting stuck writing time-intensive basic code. And the software NVIDIA is making available is complex and leading the charge on multiple technology fronts -- from healthcare to communications to manufacturing operations. More than just a basic semiconductor stock, this is an innovation engine creating massive opportunity for itself, and that's the reason it's a core holding in my portfolio for the long term.