NVIDIA (NASDAQ:NVDA) reported third-quarter fiscal 2020 results after the market closed last Thursday. The graphics processing unit (GPU) specialist's revenue fell 5% year over year to $3.01 billion, and earnings per share (EPS) adjusted for one-time items slipped 3% to $1.78.
Year-over-year declines were expected due to temporary issues in the company's gaming and data center businesses that began about a year ago. On the positive side, both the top and bottom lines were up solidly from the second quarter. Moreover, strong year-over-year growth is on track to resume in the fourth quarter.
Earnings releases only tell part of the story. NVIDIA's Q3 earnings call left me feeling even more confident that its stock is poised to be a big long-term tech stock winner from here. Here are three artificial intelligence (AI) topics from the call that you should know about.
1. AI inferencing revenue continues to grow briskly
From CEO Jensen Huang's remarks:
We had a strong Q3 in hyperscale data centers. ... [W]e shipped a record number of V100s and T4s, and for the very first time we shipped more T4s than V100. ... In fact, our inference business [revenue] is now a solid double-digit [percentage of data center revenue] and it doubled year over year. [Emphasis mine.]
T4s are NVIDIA's data center inference GPUs, and V100s are its training GPUs. Inferencing is the second step in the two-step process of deep learning, a burgeoning type of AI; it involves machines applying their training to new data. (Training is the first step.) NVIDIA's GPUs reign supreme in the market for AI training, but it's only been within the last two years that they've made inroads into inferencing, which has traditionally been dominated by CPUs.
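The two steps can be illustrated with a toy example. This is a minimal sketch in plain Python (real workloads run large neural networks on GPU frameworks, not a one-parameter model like this): training fits parameters to labeled data, and inferencing is a cheap forward pass that applies the frozen parameters to new data.

```python
# Toy illustration of deep learning's two steps: training, then inferencing.
# Step 1 (training): fit a single weight w so that y ≈ w * x on labeled data.
# Step 2 (inferencing): apply the frozen weight to data never seen in training.

def train(examples, lr=0.01, epochs=200):
    """Gradient descent on squared error for the tiny model y = w * x."""
    w = 0.0
    for _ in range(epochs):
        for x, y in examples:
            pred = w * x
            grad = 2 * (pred - y) * x  # derivative of (w*x - y)^2 w.r.t. w
            w -= lr * grad
    return w

def infer(w, x):
    """Inferencing: a single cheap forward pass with the trained weight."""
    return w * x

# The training data follows y = 2x, so the weight converges near 2.0.
w = train([(1, 2), (2, 4), (3, 6)])
print(round(infer(w, 10), 1))  # prints 20.0
```

Training is the expensive, compute-heavy step (historically NVIDIA's stronghold with the V100); inferencing is the high-volume step run on every user query, which is the market the T4 targets.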
Huang's statement is amazing when you consider that as recently as the second quarter of fiscal 2018 -- less than two years ago -- he said that "0% of our business [is] in inferencing." Now it comprises a "solid double-digit" percentage of a business that has an annual run rate of $2.9 billion.
2. Google's breakthrough BERT model is driving demand for NVIDIA's GPUs
From CFO Colette Kress' remarks:
Hyperscale [data center business] is being driven by conversational AI, the ability of computers to engage in human-like dialog, capturing context and providing intelligent responses. Google's breakthrough introduction of the BERT model, with its super-human levels of natural language understanding, is driving...demand for our GPUs on two fronts.
BERT (Bidirectional Encoder Representations from Transformers) is a natural-language processing (NLP) model. It was widely hailed as a breakthrough because it excels at understanding context, a weak point for traditional NLP models. So, for instance, just as you did, it should understand from the context of Kress' quote that she's talking about an NLP model, not some good-looking guy named Bert walking down a fashion runway.
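Why context matters can be shown with a toy sketch (this is an illustration of the general idea, not BERT itself, and the word vectors are made up). Older static word embeddings assign one fixed vector per word, so an ambiguous word like "bank" looks identical in every sentence; a contextual model produces a different representation depending on the surrounding words, crudely simulated here by blending in the neighbors' vectors.

```python
# Toy illustration of why context matters in NLP (not BERT itself).
# Hypothetical 2-d static word vectors, invented for this example:
STATIC = {
    "river": (1.0, 0.0),
    "money": (0.0, 1.0),
    "bank": (0.5, 0.5),
}

def contextual(word, sentence):
    """Crude contextual representation: blend the word's static vector
    with the average of all the sentence's word vectors."""
    vecs = [STATIC.get(w, (0.0, 0.0)) for w in sentence]
    avg = tuple(sum(c) / len(vecs) for c in zip(*vecs))
    return tuple(0.5 * b + 0.5 * a for b, a in zip(STATIC[word], avg))

river_bank = contextual("bank", ["river", "bank"])
money_bank = contextual("bank", ["money", "bank"])
print(river_bank == money_bank)  # prints False: "bank" differs by context
```

A static embedding would return the same `(0.5, 0.5)` for "bank" in both sentences; the contextual version separates the riverbank sense from the financial sense, which is the property (achieved by BERT with transformers at vastly greater scale) that made it a breakthrough.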
Alphabet's (NASDAQ:GOOG) (NASDAQ:GOOGL) Google has begun using BERT internally to handle searches. More importantly for NVIDIA, it open-sourced the English-language version of BERT in November 2018. As a result, many entities have begun using BERT to train their language-processing systems for question answering and other applications.
The "two fronts" Kress mentioned refer to AI training and inferencing.
3. CEO: "The Intelligent Edge will likely be the largest AI industry in the world."
While AI inferencing has traditionally been done in data centers, it's increasingly being performed "at the edge," which means by devices that are located where the data is being collected.
While I quoted Huang above, it was Kress who summed up NVIDIA's quarterly activities on the AI edge front:
[W]e announced a software-defined 5G [the fifth generation of cellular network tech] wireless RAN [radio access network] solution accelerated by GPUs in collaboration with Ericsson. [This] opens up the wireless RAN market to NVIDIA GPUs. It enables new AI applications as well as AR [augmented reality], VR [virtual reality], and gaming to be more accessible to the telco edge.
We announced the NVIDIA EGX Intelligent Edge Computing Platform. With an ecosystem of more than 100 technology companies worldwide, early adopters include Walmart, BMW, Procter & Gamble, Samsung Electronics, and [...] the cities of San Francisco and Las Vegas.
The company also announced a partnership with Microsoft on intelligent edge computing aimed at helping various industries "better manage and gain insights from the growing flood of data" they collect. Moreover, after the quarter ended, NVIDIA announced that the U.S. Postal Service is adopting its AI tech to help it process package data more quickly and accurately.
It's not too late to buy NVIDIA stock
It's reasonable to wonder if it could be too late to buy NVIDIA stock. After all, despite its tumble a year ago (from which it's rebounding nicely), this is a stock that has returned 995% over the last three years through Nov. 18, making the S&P 500's respectable 68.6% return look paltry.
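To put those cumulative three-year figures in annualized terms (my arithmetic on the returns quoted above, using the standard compound annual growth rate formula):

```python
# Convert a cumulative multi-year return into an annualized figure:
# CAGR = (1 + total_return)^(1/years) - 1

def cagr(total_return_pct, years):
    """Compound annual growth rate, in percent, from a cumulative return."""
    return ((1 + total_return_pct / 100) ** (1 / years) - 1) * 100

print(cagr(995, 3))   # NVIDIA's 995% over 3 years: roughly 122% per year
print(cagr(68.6, 3))  # S&P 500's 68.6% over 3 years: roughly 19% per year
```

In other words, NVIDIA compounded at more than six times the index's annual rate over that stretch.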
But all indications are that this stock party is far from over. The company is profiting from many growth trends that are only in the early innings: data center AI, driverless vehicles, gaming, and various AI edge applications. (There is some overlap in these categories.)