David Tepper is one of the most successful investment managers on Wall Street. His Appaloosa Management hedge fund has produced gross annualized returns of more than 28% since its inception in 1993. That far outpaces the S&P 500's annualized return over the last 32-plus years of about 10.6%.
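To see why that gap in annualized returns matters so much, it helps to compound both rates over the full period. The sketch below is purely illustrative (the starting amount is arbitrary, and the 28% figure is gross of fees, so actual investor results would be lower):

```python
# Illustrative only: compound a hypothetical $10,000 at each annualized
# rate over 32 years to show how wide the gap becomes.
years = 32
initial = 10_000

appaloosa = initial * (1 + 0.28) ** years   # ~28% gross annualized (pre-fee)
sp500 = initial * (1 + 0.106) ** years      # ~10.6% annualized

print(f"Appaloosa (gross): ${appaloosa:,.0f}")
print(f"S&P 500:           ${sp500:,.0f}")
```

At those rates, the hypothetical Appaloosa stake grows to tens of millions of dollars while the index stake grows to a few hundred thousand, which is why a spread of roughly 17 percentage points per year is so extraordinary over three decades.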
Tepper is best known for buying distressed debt from companies close to bankruptcy. In fact, Appaloosa was considered a junk bond investment boutique in the 1990s. That contrarian approach often extends to his stock portfolio as well.
That said, he's not so set on swimming against the current that he won't buy stocks that are part of an obvious trend like artificial intelligence (AI). AI stocks like Nvidia (NVDA) and Advanced Micro Devices (AMD) have soared in value over the last few years. And Tepper made quite a bit of money on those stocks.
But he's been selling them recently in favor of another AI chipmaker instead, possibly taking a bit of a contrarian stance against the two big GPU makers.

The essential infrastructure behind the AI revolution
Graphics processing units (GPUs) are computer chips or systems that have proven exceptionally adept at crunching all the data that goes into training and running large language models. GPUs are designed to do the types of calculations needed for training AI algorithms, and they can run the processes in parallel, making them far more efficient than a standard CPU, which uses serial processing.
Nvidia has long been a leader in GPUs, dating back to the days when they were used mostly for high-end visuals in gaming (hence the "graphics" in the name). As the processing needs of large language models grew exponentially, the company has seen incredible demand for its leading GPU systems.
Even after the strong growth in 2023 and 2024, Nvidia's data center revenue climbed another 73% year over year last quarter. With strong operating leverage, the company has seen its earnings zoom higher, and investors have rewarded it. It's now the most valuable company in the world by a substantial margin, worth over $4 trillion.
But AMD is starting to make progress in catching up to Nvidia. The company's MI400 chips coming next year could offer better price-performance than Nvidia's current Blackwell line of chips. While Nvidia will be on to its next-generation Vera Rubin platform by then, AMD is offering Nvidia's biggest customers a viable alternative, which could keep Nvidia's pricing from climbing substantially higher.
AMD's stock hasn't performed nearly as well as Nvidia's. After peaking in early 2024, the stock crashed more than 60% to its low in April this year. The first quarter could have been a great opportunity to buy the stock, especially for a contrarian investor looking to take a stance against Nvidia's continued dominance.
But Tepper sold his entire stake in AMD during the first quarter, a position first established in the second quarter of 2023. He also continued to cut his Nvidia stake, leaving him with just 3% of the shares he held for Appaloosa in mid-2023. Instead, he's betting on a different chipmaker that poses an increasing threat to the dominance of GPUs in AI data centers.
The AI chipmaker Tepper's buying instead
While GPUs are extremely flexible and capable of handling all sorts of tasks, many of the biggest companies developing leading-edge AI capabilities are working on custom-made silicon that can handle specific tasks far more efficiently than power-hungry GPUs. These application-specific integrated circuits, or ASICs, represent a significant threat to GPUs, as hyperscalers like Meta Platforms and Alphabet's Google design more advanced chips capable of handling AI training and inference.
The capabilities of ASICs are expanding. Meta says its custom chips, which started with machine learning algorithms and later moved on to AI inference, are now expanding to training large language models. Google trained its large language model Gemini on its own chip designs, and in April it released its first Tensor Processing Unit (TPU) designed specifically for AI inference.
The company helping Meta and Google design their ASICs is Broadcom (AVGO). On top of that ASIC business, Broadcom is also the leading networking chipmaker. Networking is an essential piece of AI data centers, as solid network performance ensures all the data gets to the expensive GPUs or ASICs quickly and efficiently. These businesses are spending billions on those chips, so they don't want them sitting idle any longer than necessary.
Broadcom also has an enterprise software business, led by virtual machine software VMware.
That is to say, Broadcom is a more diversified chipmaker than Nvidia or even AMD (which also has a strong CPU business). That may be why Tepper took a small stake in the company during the first quarter, as it's a leading competitor in AI chips while offering some downside protection with its VMware business.
Still, Broadcom stock is expensive. It trades for a forward earnings multiple close to 40. That's right in line with Nvidia and slightly less expensive than AMD. The company arguably holds more upside, though: given the potential efficiency gains of ASICs versus GPUs, custom silicon seems likely to capture more data center real estate over the long run.
But investors may have to settle for slower, steadier growth compared to Nvidia or AMD. As such, it's worth keeping an eye on Broadcom's stock to see if it falls back down to a more attractive price before following Tepper into the stock.