Nvidia's (NVDA -0.13%) dominance of the market for data center graphics processing units (GPUs) has supercharged the stock in 2023. Shares of the semiconductor giant have nearly tripled so far this year, which seems justified given how artificial intelligence (AI)-fueled demand is going to drive tremendous growth in the company's revenue and earnings.

Nvidia's rival Advanced Micro Devices (AMD -2.34%) has also clocked an impressive gain of 76% this year. However, AMD's rise has been driven largely by the broader rally in semiconductor stocks rather than by the company's financial performance. AMD has struggled on account of its reliance on the personal computer (PC) market, and it has been late to the AI market, where Nvidia has been the dominant force with a roughly 95% share of GPUs used for machine learning.

However, there is room for more than one player in the massive market for AI GPUs, which is why it is too early to count AMD out of this race. Encouragingly, AMD is already sampling its data center GPUs with customers, and at least one industry source believes those chips could be good enough to give customers a viable alternative to Nvidia's.

AMD could run Nvidia close in the AI accelerator market

Graphics cards are used in data centers to accelerate AI workloads, from training the large language models (LLMs) that power applications such as chatbots to inference, where the trained models are put to work making predictions or responding to user queries.

Nvidia has been the go-to supplier of these GPUs so far, as is evident from the company's huge market share in this space. That's not surprising, as the company's A100 and H100 data center GPUs are built on advanced 7-nanometer (nm) and 5 nm manufacturing nodes, respectively, allowing them to process huge amounts of data while remaining power efficient.

AMD's current-generation Instinct MI250 data center accelerators (the company's term for its data center graphics cards), on the other hand, are built on a 6 nm process. Each chip is also equipped with 128 gigabytes (GB) of the high-bandwidth memory (HBM) needed to tackle AI workloads.

AI software start-up MosaicML reports that the MI250 accelerator can deliver 80% of the performance of the competing Nvidia chip -- the A100. It is worth noting that ChatGPT was trained using an estimated 30,000 Nvidia A100 GPUs. More importantly, MosaicML believes that software optimization from AMD could help the MI250 match the A100's performance.

AMD pointed out on its May earnings conference call that the MI250 data center GPU is gaining traction among customers. The chip has been deployed in a supercomputer in Finland for training LLMs, with AMD pointing out that the MI250 and its third-generation Epyc server processors have been used in this supercomputer "to train the largest Finnish language model to date."

The company also added that "customer interest has increased significantly" in its next-generation Instinct MI300 GPUs for AI training and inference of large language models. AMD has set its sights on Nvidia's flagship H100 data center GPU with the MI300, equipping it with significantly more high-bandwidth memory (192 GB versus 80 GB in the H100). The MI300 also offers higher memory bandwidth: 5.2 terabytes per second versus the H100's 3.2 TB per second.

MosaicML's research also notes that AMD has made solid progress optimizing the software stack for its data center GPUs. Stronger hardware combined with that software expertise could help AMD's MI300 chips give Nvidia's data center GPUs stiff competition when they arrive later this year.

Is it a good time to buy AMD?

Next Move Strategy Consulting estimates that the market for AI chips could be worth $304 billion by 2030, up from just under $29 billion last year. The market is expected to clock annual growth of 29% through the end of the decade. So, it is still early days in the AI chip market, which is why AMD's delay in gaining traction in this space shouldn't be taken as a negative.
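For readers who want to see the compounding math behind projections like this, here is a quick sketch. The endpoints (roughly $29 billion in 2022 growing to $304 billion by 2030) are the article's; the computed rate is a simple point-to-point figure and may differ from the consulting firm's own methodology.

```python
# Back-of-the-envelope compound annual growth rate (CAGR) check,
# using the market-size endpoints cited in the article.
start_value = 29.0    # AI chip market size in 2022, $ billions
end_value = 304.0     # projected market size in 2030, $ billions
years = 2030 - 2022   # 8-year span

# CAGR formula: (end / start) ^ (1 / years) - 1
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied point-to-point CAGR: {cagr:.1%}")
```

The same formula works for any two revenue or market-size data points, which makes it a handy sanity check on growth projections investors encounter.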

If AMD can deliver powerful chips at aggressive price points, it won't be surprising to see customers lining up to buy them. However, investors would do better to wait for AMD's data center chips to actually make a dent in the market, as the company is currently struggling to grow revenue and earnings.

The chipmaker has guided for a 20% year-over-year decline in revenue in the second quarter of 2023 to $5.3 billion, thanks to weak PC sales. It also expects its adjusted gross margin to decline 4 percentage points year over year to 50% in the same quarter. Those numbers stand in stark contrast to Nvidia's: Its revenue is expected to jump a terrific 64% year over year in the ongoing quarter to $11 billion, and its adjusted earnings are expected to more than double.

With the PC market expected to remain under distress in 2023, AMD badly needs its data center chips to take off to justify the impressive stock market surge it has clocked this year. AMD currently trades at an expensive 609 times trailing earnings, a result of its weak recent quarterly performances and rising stock price. However, investors have seen how quickly Nvidia's fortunes have turned around thanks to AI, and AMD could replicate that turnaround if it manages to scale the performance of its chips to similar levels.

That's why investors would do well to keep an eye on AMD's second-quarter results, which are due within the next month, as potential signs of a turnaround owing to an increase in demand for its data center chips could give the stock's rally a nice shot in the arm.