You may have heard of ChatGPT, the artificial intelligence-powered chatbot recently released to the public by start-up OpenAI. Released on Nov. 30, the shockingly good AI chatbot reached 1 million users within five days. In the wake of its release, college professors now fear a flood of AI-generated college essays, and software developers may be eyeing ChatGPT's code-generation capabilities with similar unease.

Microsoft (MSFT -1.27%) had already invested $1 billion in OpenAI back in 2019, and the cloud giant is now reportedly in talks to invest another $10 billion in the company, so it obviously sees big promise in this new, advanced AI engine. Last week, Microsoft released an OpenAI service on its Azure platform, which developers can now incorporate into their own software, and Microsoft itself is looking to infuse its current products, from Office to Bing, with ChatGPT's capabilities. Speaking at the World Economic Forum last week, Microsoft CEO Satya Nadella claimed AI would go "mainstream" in "months, not years."

With AI appearing to reach a threshold where it becomes table stakes in a wide range of enterprise and consumer applications, you can be sure every big tech company will now be investing heavily in AI to compete.

If the AI wars are beginning, that means certain "arms dealers" -- the semiconductor companies that make AI work -- should profit handsomely, perhaps even more so than Microsoft or other large tech platforms.

As the saying goes, it was the sellers of picks and shovels who got rich during the California gold rush of the 1800s, more so than the prospectors themselves. Applying that logic to AI, the following "picks and shovels" semiconductor stocks stand to benefit in the years ahead.

Nvidia

Artificial intelligence requires the massive parallel processing capabilities of graphics processing units (GPUs) to run, and the outright leader in GPUs is Nvidia (NVDA -10.01%). What's interesting and compelling is that ChatGPT doesn't even currently run on Nvidia's latest chip, the H100, based on its new "Hopper" architecture, which was released late last year. Rather, the current incarnation of ChatGPT runs on the older A100 chips, released two years ago.

If you think ChatGPT is impressive now, get ready. The H100, which began shipping in October, is projected to perform AI "training" functions at nine times the speed of the A100, and "inference" -- the act of an AI reacting to a question or other stimulus -- at 30 times the speed. In addition, the H100 promises 3.5 times better energy efficiency and three times lower total cost of ownership.

Nvidia is also unveiling its first CPU this year, named Grace, which is specifically designed to work in conjunction with Hopper and Nvidia's networking-oriented data processing units (DPUs). Together, these complete AI systems should enable ultra-fast networking and data movement within a system, in addition to blazing-fast GPU processing.

While Nvidia's gaming chip revenue plunged in the most recently reported third quarter because of the post-pandemic hangover in video game demand and the cryptocurrency bear market, its data center revenue was up an impressive 31%. With H100 "Hopper" chips now shipping and the AI wars kicking in, look for Nvidia to benefit in 2023. That's especially true after its roughly 50% decline in 2022 brought the stock down to a more reasonable valuation.

Taiwan Semiconductor Manufacturing and ASML Holding

The production of Nvidia's GPUs depends on two foreign companies: Taiwan Semiconductor Manufacturing (TSM -3.45%) and ASML Holding (ASML -3.32%). You might recognize TSMC as one of Warren Buffett's most recent stock picks, as the Oracle of Omaha bought $4 billion worth of its stock last summer.

First, consider ASML, which has a monopoly on a key technology used in producing leading-edge semiconductors: extreme ultraviolet (EUV) lithography. As leading-edge chipmakers moved to process nodes below 10 nanometers, the much finer resolution of EUV light became necessary to pattern their transistors. And because artificial intelligence applications require the most transistor-dense, power-efficient chips available, the burgeoning AI wars will put pressure on chip foundries to produce more and more leading-edge chips. That means more purchases of ASML's machines.


Of course, there are many more steps in creating a leading-edge chip than just lithography. Lithography must work in conjunction with masking equipment, etch and deposition machines, metrology and inspection machines, and advanced packaging in order to create these incredibly complex chips without any defects.

That's really, really difficult to do, and it's why Taiwan Semiconductor has a competitive moat of its own, thanks to its years of experience as the world's largest outsourced foundry. Years ago, TSMC surpassed Intel (INTC -2.40%) in producing leading-edge chips, and it only appears to be widening that lead, given its scale advantage and Intel's current difficulties stemming from exposure to the weak PC market. As of the fourth quarter, TSMC accounted for more than 56% of the global foundry market, with an even higher share at the leading edge.

On its recent earnings call with analysts, TSMC management pointed to its high-performance computing segment, which serves AI customers, as the reason for its optimism about a semiconductor market recovery in the second half of 2023.

So ASML and TSMC work in conjunction, and each is competitively advantaged in producing the most advanced chips required for AI processing. Value investors such as Buffett may gravitate toward the cheaper TSMC, which trades at only 13.7 times earnings. Meanwhile, growth investors may prefer ASML's technology monopoly, asset-light business model, and smoother growth prospects, which have earned it a higher valuation of 42 times earnings.

Micron Technology

Finally, all that data processing requires huge amounts of memory and storage, which should benefit memory producer Micron Technology (MU -4.61%).

Micron's results are currently in freefall, and its earnings will likely be negative for the next two quarters, as the historic plunge in PC sales, along with weakness in smartphones and consumer electronics, has overwhelmed the memory market's delicate supply-demand balance.

Still, Micron's stock has stayed remarkably resilient, holding around the levels it reached last June even as its results have gotten much worse. That's because the stock now trades just above Micron's book value, and the market tends to be forward-looking. Meanwhile, Micron is one of only three global players that produce DRAM at scale, and one of only five that produce NAND flash storage.

With limited competition, both Micron and peer SK Hynix have announced drastic spending cuts for 2023, which should help restore the supply-demand balance in the back half of the year. While leader Samsung said late last year it was looking to maintain its investment in memory chips despite the downturn, analysts now doubt it will do so, especially in the wake of its worse-than-expected fourth-quarter earnings preannouncement. In fact, on Jan. 16, Digitimes reported Samsung would cut some of its NAND production, perhaps giving in to the reality of the situation.

In addition, Micron opened up a technology lead over this limited competition last year. In just the past six months, Micron became the first memory maker to begin producing 1-beta DRAM chips and the first to produce 232-layer NAND flash chips. With a current technology lead and all market participants now appearing to cut back on production, Micron should benefit in the back half of 2023 and beyond, as demand for memory-intensive AI servers kicks into high gear.