When OpenAI released the ChatGPT chatbot about a year ago, it unleashed a new zeitgeist on an unsuspecting world. Artificial intelligence (AI) is still the talk of the town, of Wall Street, and of Silicon Valley, and that's especially true for the generative AI technology that powers systems like ChatGPT.

Investors soon figured out that Nvidia (NVDA 6.18%) provided the AI acceleration hardware that made it all possible, with spectacular results. Nvidia's stock price has more than tripled in 2023, lifted by actual sales of AI-specific processors and the expectation of continued dominance in this red-hot market segment.

But Nvidia shouldn't rest on its laurels. It isn't the only chip designer in the market, and certainly not the only one with an interest in that lucrative AI opportunity.

The latest challenger to enter the ring and dispute Nvidia's AI mastery is Samsung Electronics (SSNL.F -28.74%). The Korean tech titan has partnered with Naver (OTC: NHNC.F) -- an online entertainment giant from the same country -- to develop both hardware and software designed to match or beat the best tools available today.

Specifically, Samsung and Naver claim that their upcoming AI chip will be eight times as energy-efficient as Nvidia's H100 accelerator.

That's not the same thing as a straight-up performance record -- but a more power-efficient solution may actually pose an even greater threat to Nvidia's throne. Here's why.

The efficiency edge in AI computing

In the realm of high-performance AI computing, efficiency is key. Raw speed matters less than you might think, because you can always throw more hardware at the number-crunching problem.

The supercomputers that train ChatGPT-style AI systems are equipped with thousands of Nvidia A100 accelerators with nearly 7,000 processing cores each. The real challenge is to supply enough power to run these beasts and then cool down the resulting space heater. The OpenAI/Nvidia system draws 7.4 megawatts at full power, comparable to a packed cruise ship crossing the seas or a large steel mill.

Therefore, the AI giants are really looking for a power-sipping solution that can deliver better results per watt.

Samsung and Naver's claim of an AI chip that is eight times more energy-efficient than Nvidia's H100 could represent a paradigm shift. In a world increasingly conscious of energy consumption and cost, a more efficient chip doesn't just mean lower power bills; it means a smaller carbon footprint, a more compact physical installation, and the ability to deploy more powerful AI systems without prohibitive energy costs.
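To put that efficiency argument in rough dollar terms, here's a back-of-the-envelope sketch. Every figure below is an illustrative assumption (the electricity rate, the idea of running flat-out all year) except the 7.4-megawatt draw and the claimed 8x efficiency factor, which come from the article above; none of it reflects vendor pricing or real deployment math.

```python
# Back-of-the-envelope: what an 8x performance-per-watt edge could mean
# for the annual power bill of a fixed AI workload. All inputs are
# illustrative assumptions, not vendor specifications.

HOURS_PER_YEAR = 24 * 365     # assume the cluster runs flat-out all year
PRICE_PER_KWH = 0.10          # assumed industrial electricity rate, USD/kWh
WORKLOAD_KW = 7_400           # the 7.4-megawatt figure cited above, in kW
EFFICIENCY_GAIN = 8           # Samsung/Naver's claimed efficiency multiple

# Cost to run the same workload on the baseline hardware vs. hardware
# that does the same work on one-eighth the power.
baseline_cost = WORKLOAD_KW * HOURS_PER_YEAR * PRICE_PER_KWH
efficient_cost = baseline_cost / EFFICIENCY_GAIN

print(f"Baseline annual power cost:  ${baseline_cost:,.0f}")
print(f"At 8x efficiency:            ${efficient_cost:,.0f}")
print(f"Hypothetical annual savings: ${baseline_cost - efficient_cost:,.0f}")
```

Even with these toy numbers, the gap runs into millions of dollars per year for a single supercomputer-class installation, which is why performance per watt, not peak performance, is the metric the hyperscalers actually shop on.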

Nvidia's dominance challenged

Nvidia has long been the go-to provider for AI acceleration hardware, a fact reflected in its soaring stock price and market position. However, as Samsung and Naver step into the arena with their promise of a groundbreaking energy-efficient AI chip, Nvidia faces a new kind of competition. It's no longer just about who has the fastest chip on the market; it's about who can deliver performance in the most efficient way possible. And this time, Nvidia may not be the clear winner.

This is not just a two-horse race, either. Companies like AMD and Intel are also in the fray, each with its own AI chip solution. The AMD Instinct MI300 line comes with more memory and lower power requirements than the previous generation. Intel's Gaudi 3 solution focuses on faster raw performance and a next-generation network tying the processors together. Everyone has a unique master plan.

But those alternatives never claimed to blow Nvidia's power efficiency out of the water. Samsung and Naver's focus on low power requirements could set a new standard, compelling others to follow suit -- but still giving the Korean duo a mighty first-mover advantage. As AI technologies become increasingly integrated into various sectors, from healthcare to finance, the demand for efficient, powerful, and cost-effective AI computing will only grow.

What's next?

This is all theory so far. Benchmark tests and real-world installation results will come in 2024 and beyond, as Nvidia's rivals crank out mass-market volumes of their new AI chips. Investors are left doing informed guesswork on how closely each company's claims will match the final performance, from power draws and raw number-crunching performance to next-level connectivity and other potential game-changer ideas.

Will Samsung and Naver's AI efficiency-centric chip deliver on its promises? How will Nvidia and other competitors respond? Time will tell, but one thing is clear: The AI chip market is evolving, and with it, the landscape of artificial intelligence itself. The next few years will be crucial in determining the direction of this technology and its effect on the world.

I can't say for sure which company (or companies) will dominate the AI hardware market in the long run, but Samsung just joined the ranks of potential winners. If you didn't think of Samsung as a leading AI chip designer before, it's time to reconsider.