While investors' views on Alphabet (NASDAQ: GOOGL) (NASDAQ: GOOG) and its future as an artificial intelligence (AI) leader may seem to have changed overnight, the company has been working behind the scenes for more than a decade to put itself in its current position. It is only just starting to benefit from the edge it has developed in AI infrastructure and cloud computing, and that lead is likely to widen from here, making it the stock to own for the next phase of AI infrastructure.

Alphabet has actually been working on AI since 2011, when it founded a research lab called Google Brain. There it developed TensorFlow, its deep learning framework, which today is used for training large language models (LLMs) and running inference within Google Cloud. Meanwhile, it acquired British AI lab DeepMind in 2014. DeepMind, which merged with Google Brain in 2023, helped lay the building blocks and code for the company's Gemini LLM.
The company released TensorFlow as an open-source machine learning library in November 2015 and unveiled its tensor processing units (TPUs) the next year. These application-specific integrated circuits (ASICs) are custom chips designed specifically for machine learning and AI workloads, and they are optimized for the TensorFlow framework. The company initially used its TPUs to run its own internal workloads, then began renting them out to customers as part of Google Cloud's infrastructure-as-a-service offering in 2018.
Over the following years, Alphabet continually improved its chip designs. Its TPUs are now in their seventh generation, at a time when many competitors are only just recognizing the performance and cost benefits ASICs can bring to AI workloads and beginning to design their own. Meanwhile, no AI ASICs are as battle-tested as Alphabet's TPUs, which have powered its massive internal workloads (such as search and YouTube) and trained its own highly advanced Gemini models.
A structural advantage
With the best custom AI chips, together with one of the world's top foundational AI models, Alphabet has a huge advantage in the next phase of AI. No other competitor has this combination, which gives it a massive structural cost edge that creates a flywheel effect.
By training Gemini on TPUs, Alphabet gets more bang for its buck on capital expenditures (capex) than competitors, which largely rely on Nvidia's much more expensive graphics processing units (GPUs) to train their models. The better return on investment lets Alphabet plow more money into improving its TPUs and AI models, which in turn creates more customer demand for them. At the same time, its TPUs lower inference costs for both itself and its customers, making its overall operating model the most cost-competitive. The result is a virtuous cycle that competitors cannot match.
Meanwhile, having its own world-class AI model allows Alphabet to capture the entire AI revenue stream, unlike cloud computing rivals Amazon and Microsoft, which rely more heavily on third-party LLMs. The company's pending acquisition of leading cloud security company Wiz will only extend its ecosystem advantage.
The big winner in the next phase of AI infrastructure isn't going to be a pure-play chipmaker or cloud computing company; it's going to be a vertically integrated company that can optimize the entire AI model training and inference process. That company is Alphabet, which has a multigenerational head start with its custom AI chips and the deepest vertical integration. That makes the stock a long-term buy, even after its strong run this year.