There's no question that Advanced Micro Devices (AMD) has failed to keep up with rival NVIDIA (NVDA) in the data-center GPU market. While AMD sells data-center GPUs of its own, aimed at accelerating a wide variety of workloads, estimates for NVIDIA's share of the data-center GPU market run as high as 95%.

NVIDIA's data-center GPUs are powerful, but its biggest advantage is software. It's the ecosystem that NVIDIA has built around its data-center GPUs that has ultimately made them the de facto standard. CUDA, NVIDIA's proprietary parallel-computing platform that runs only on its own GPUs, has been around for 16 years. NVIDIA has been laying the foundation of its data-center dominance for a long time.

AMD goes after the AI chip market

Data-center GPUs can be used for many types of workloads, but artificial intelligence is quickly becoming the most prominent. Training the large language models that make services like OpenAI's ChatGPT possible requires tens of thousands of ultra-powerful GPUs working in tandem to churn through enormous amounts of training data.

Just as it does in the broader data-center GPU market, NVIDIA is dominating the AI GPU market. Its data-center segment generated $4.3 billion of revenue in the most recent quarter, and NVIDIA expects total revenue to surge as it struggles to keep up with insatiable demand for its AI chips.

Right now, NVIDIA's GPUs are the path of least resistance for any company looking to train AI models for two reasons. First, NVIDIA's latest H100 AI chip is by far the most powerful option available. And second, the software ecosystem is mature.

But this doesn't mean that AMD can't make inroads in the AI chip market. Northland analyst Gus Richard believes that AMD will be able to ride the AI wave to higher GPU revenue thanks to its upcoming MI300 series AI chips and its open-source software push. Richard expects that NVIDIA will remain the overwhelming market leader, but he sees AI providing enough benefit to AMD to push the stock up to $150. That price target represents a 30% gain from the current price.

AMD plans to launch the Instinct MI300X, a GPU tailor-made for training advanced generative AI models, later this year. The chip will come with 192 GB of HBM3 memory to accommodate the largest AI models. AMD will also roll out the Instinct MI300A, which pairs Zen 4 CPU cores with a powerful GPU.
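
For a rough sense of why that memory capacity matters, the sketch below estimates how much GPU memory is needed just to hold a model's weights. It is a back-of-the-envelope illustration only: the parameter counts are hypothetical examples, and the assumption of 2 bytes per parameter (16-bit weights) ignores activations, optimizer state, and other overhead that real workloads also need.

```python
# Rough sketch: estimated GPU memory needed just to store model weights.
# Assumes 16-bit (2-byte) weights; real training and inference also need
# memory for activations, optimizer state, and caches, so these are lower bounds.

BYTES_PER_PARAM_FP16 = 2
GIB = 1024 ** 3

def weight_memory_gib(num_params: float) -> float:
    """Approximate memory (in GiB) to hold model weights in 16-bit precision."""
    return num_params * BYTES_PER_PARAM_FP16 / GIB

# Illustrative, hypothetical model sizes.
for name, params in [("7B model", 7e9), ("70B model", 70e9), ("175B model", 175e9)]:
    print(f"{name}: ~{weight_memory_gib(params):.0f} GiB of weights")
```

Under those assumptions, a 70-billion-parameter model needs roughly 130 GiB for weights alone, which helps explain why a single accelerator with 192 GB of memory is a selling point for the largest models.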

On top of announcing new chips, AMD revealed its ROCm software ecosystem for data-center accelerators. Unlike NVIDIA's CUDA, AMD's ROCm is an open platform capable of supporting multiple vendors and architectures. This could be appealing to those who don't want to be locked into NVIDIA's ecosystem, although time will tell whether ROCm can gain significant traction.
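
To illustrate the portability pitch in practical terms, here is a minimal sketch of framework-level code that is vendor-agnostic. It assumes a PyTorch installation built with either CUDA (for NVIDIA GPUs) or ROCm (for AMD GPUs); on ROCm builds, PyTorch exposes AMD GPUs through the same "cuda" device interface, so the code itself does not change.

```python
# Minimal sketch of vendor-agnostic GPU code via PyTorch.
# On an NVIDIA GPU this dispatches through CUDA; on an AMD GPU with a ROCm
# build of PyTorch, the same "cuda" device string dispatches through ROCm/HIP.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A small matrix multiply, identical regardless of which vendor's GPU is present.
a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
c = a @ b

print(f"Ran on: {device}, result shape: {tuple(c.shape)}")
```

Whether that framework-level portability is enough to pull developers away from CUDA's deeper tooling is the open question for ROCm.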

Is AMD stock a buy?

Analysts in general are optimistic about AMD stock, and AI represents an enormous potential growth opportunity for the company. But it's important to note that the stock is already quite expensive. Valued at $185 billion, the stock trades at 8 times the average analyst estimate for 2023 revenue and 40 times the estimate for adjusted earnings per share.
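
As a back-of-the-envelope sketch, those multiples can be turned into implied analyst estimates. The derived numbers below come purely from dividing the quoted market value by the quoted multiples; they are illustrations, not reported figures.

```python
# Back-of-the-envelope valuation math using only the figures cited above.
# Implied estimates are derived from the multiples, not reported directly.

market_cap = 185e9          # ~$185 billion market value
price_to_sales = 8          # ~8x the average 2023 revenue estimate
price_to_earnings = 40      # ~40x the 2023 adjusted earnings estimate

implied_2023_revenue = market_cap / price_to_sales          # ~$23 billion
implied_2023_adj_earnings = market_cap / price_to_earnings  # ~$4.6 billion

print(f"Implied 2023 revenue estimate: ~${implied_2023_revenue / 1e9:.0f} billion")
print(f"Implied 2023 adjusted earnings: ~${implied_2023_adj_earnings / 1e9:.1f} billion")
```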

The stock seems even more expensive given AMD's current growth challenges. Total revenue was down 9% year over year in the first quarter, and a steeper 20% plunge is expected in the second quarter. Profits are tumbling as well, driven lower by extremely weak demand for its PC CPUs.

AI chips could certainly help AMD return to growth, but the company likely has a long road ahead as it tries to chip away at NVIDIA's dominance. While AMD could turn out to be a solid investment, a lofty valuation makes the stock risky.