The artificial-intelligence (AI) industry was jump-started nearly one year ago with the release of the chatbot ChatGPT.

That seemed to usher in a new era, one in which generative AI models built with billions or trillions of parameters would soon be embedded in every application.

We are still in the very early innings of this AI era, which means competition will be intense. That's why investors need to stay on top of the latest trends and keep abreast of who might win their share of this fast-growing pie.

Last week, two AI leaders made major announcements. Even better, both companies still trade at reasonable valuations, making them solid buys today.

Alphabet to invest in Character.AI

On Friday, Reuters reported that search and cloud giant Alphabet (GOOGL 0.37%) (GOOG 0.32%) is in talks to invest in AI startup Character.AI.

Alphabet has recently attracted some investor concern about its AI standing after Microsoft (MSFT 2.22%) invested significantly in OpenAI, the maker of ChatGPT. That big bet made Microsoft the exclusive cloud provider to OpenAI, and seemingly the current lead dog in the AI race.

That has led to fears that chatbots might take some usage away from Google's core search engine. But so far in 2023, Google Search has been performing well, with no noticeable market share loss to Microsoft's Bing.

In the cloud computing wars, however, Microsoft's perceived AI lead does appear to be translating into more wins for Microsoft Azure at the expense of Google Cloud. In the third quarter, Azure reaccelerated to 29% growth, while Google Cloud decelerated to 22% growth.

Of course, Google isn't standing still. Many developers and investors are eagerly awaiting the release of Google's upcoming multimodal large language model, Gemini, which should be out by the end of this year. On its recent conference call, Google CEO Sundar Pichai also noted that over half of generative AI startups are customers of Google Cloud.

One of those customers is Character.AI, which trains its models on Google's in-house-designed Tensor Processing Units (TPUs). Those are Alphabet's proprietary chips, which customers can use as an alternative to Nvidia's general-purpose GPUs.

Character.AI's chatbots can mimic celebrities such as Billie Eilish or even fictional anime characters, while also allowing users to customize their own chatbot personality. The "chatbot with a personality" approach mirrors the route taken by others, such as Grok, the chatbot from xAI, Elon Musk's AI company, whose personality is modeled on The Hitchhiker's Guide to the Galaxy. As chatbots become more pervasive, it appears some companies are banking on personality being a key differentiator.

The concept seems to be resonating with younger users: 60% of Character.AI's traffic comes from users aged 18 to 24, and the site drew 100 million unique visits within its first six months after launch.

It should be noted that Google isn't acquiring Character.AI outright; rather, it is considering participating in a funding round, likely alongside venture capital firms and likely in the form of convertible notes. Still, the move follows the pattern of large cloud companies investing in AI startups in exchange for some degree of exclusivity and long-term contracts to train and run their models on a specific cloud.

Alphabet trades at only around 20 times 2024 earnings estimates, and it's actually a bit cheaper than it looks. Its earnings are depressed by ongoing losses in its moonshot "Other Bets" portfolio, and the company holds a lot of excess cash.

Adjusting for those factors, Alphabet's core businesses trade closer to a market multiple, which seems too cheap for the world's dominant search engine, a cloud computing platform that just turned profitable, and a number of venture-like investments, such as the Character.AI bet, that could drive significant future growth.
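The "cheaper than it looks" adjustment can be sketched in a few lines. All dollar figures below are hypothetical placeholders, not Alphabet's actual financials; only the method (back out excess cash from the price, add back money-losing segment losses to earnings) follows the reasoning above.

```python
def adjusted_pe(market_cap, net_cash, earnings, segment_losses):
    """P/E after stripping excess cash from the price and adding
    back losses from a money-losing segment to earnings."""
    core_value = market_cap - net_cash          # pay only for the operating business
    core_earnings = earnings + segment_losses   # earnings without the segment drag
    return core_value / core_earnings

# Hypothetical example (all in $billions): $1,700B market cap, $100B net cash,
# $85B of earnings depressed by $4B of "Other Bets"-style losses.
headline = 1700 / 85                       # headline multiple: 20.0x
adjusted = adjusted_pe(1700, 100, 85, 4)   # adjusted multiple: ~18.0x
print(f"headline P/E: {headline:.1f}, adjusted P/E: {adjusted:.1f}")
```

With these made-up inputs, the same company that screens at 20 times earnings is really trading closer to 18 times its core business, which is the shape of the argument made for Alphabet above.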


Micron unveils super-fast high-capacity DRAM

In addition to Alphabet, Micron Technology (MU 2.11%) also scored a product win last week, unveiling a new 128 GB RDIMM DRAM module built on 32Gb DDR5 DRAM dies.
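As a sanity check on the capacity math: DRAM die densities are conventionally quoted in gigabits (Gb), so a 128 GB module can be assembled from 32Gb dies. The die count below is an illustrative assumption, not Micron's published module configuration.

```python
GBIT_PER_DIE = 32       # density of one DDR5 die, in gigabits
DIES_PER_MODULE = 32    # assumed die count on the module (illustrative)
BITS_PER_BYTE = 8

total_gbit = GBIT_PER_DIE * DIES_PER_MODULE   # 1024 Gbit on the module
total_gbyte = total_gbit // BITS_PER_BYTE     # convert gigabits to gigabytes
print(total_gbyte)  # 128
```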

Investors may find this confusing, since high-bandwidth memory (HBM) is often cited as the high-growth DRAM product needed for artificial intelligence training and inference. However, high-capacity DDR5 modules, or D5, actually make up the majority of DRAM used in AI servers today. Back on its June conference call, Micron management noted that 75% of the memory in today's AI servers was DDR5, not HBM.

That is not to say Micron is ignoring the HBM market. In fact, the company recently released specs for its new HBM product, which outperforms any HBM on the market today and should be ready for initial shipments in early 2024.

RDIMM stands for registered dual in-line memory module, a type of server memory module consisting of multiple DRAM dies with a register placed between the memory and the memory controller. That register buffers signals, improving signal integrity at lower power, which in turn enables higher-capacity modules, making RDIMMs well suited for data analysis and high-performance computing.

Micron's new D5 distinguishes itself from other memory in a couple of important ways. First, it's built on the 1-beta node; Micron was the first DRAM maker to reach production on 1-beta late last year. Second, Micron is using an alternative in-package interconnect to through-silicon vias (TSVs), which are the current industry standard for stacking DRAM dies.

As a result, Micron's new RDIMM module offers better specs than competing DRAM products on the market today: 45% improved bit density, 24% improved energy efficiency, 16% lower latency, and a 28% improvement in AI training performance relative to current alternatives.
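To make those quoted percentages concrete, here is how they translate against a normalized baseline of 100 for each metric. The baseline is arbitrary and chosen purely for illustration; only the percentage figures come from Micron's announcement as described above.

```python
# Stated relative improvements; a negative value means a reduction is the win.
improvements = {
    "bit_density": 0.45,         # 45% more bits per module
    "energy_efficiency": 0.24,   # 24% better energy efficiency
    "latency": -0.16,            # 16% lower latency
    "ai_training_perf": 0.28,    # 28% faster AI training
}

baseline = 100.0  # arbitrary normalized baseline for every metric
results = {metric: baseline * (1 + delta) for metric, delta in improvements.items()}

for metric, value in results.items():
    print(f"{metric}: {value:.0f} (baseline 100)")
```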

The overall memory industry is currently in a huge slump, because of the historic decline in PC and smartphone unit sales over the past 18 months following the pandemic. However, all memory producers have pulled back significantly on production, just as demand seems to be bottoming and turning up.

AI applications, while a small part of the memory business, are now growing very, very fast, as AI requires massive amounts of memory and storage. That could turbo-charge the next memory up-cycle, and Micron's leading technology could make it an even bigger winner among peers.