Moore's Law has long been a guiding light underpinning technology innovation, but could this principle be rendered outdated as inflation continues to soar? In this segment of Backstage Pass, recorded on Dec. 14, 2021, Fool contributors Asit Sharma, Rachel Warren, and Demitri Kalogeropoulos discuss.  


Asit Sharma: If you've watched any episodes that we've had so far, you may have realized that I'm obsessed with Moore's Law, which is not a law but a prediction by Gordon Moore, who was a very prominent scientist and engineer at Intel. He is still around.

He made this prediction in 1965 that the number of transistors in an integrated circuit would double every two years, and this is really important.

Not so much because it's proven to be a law, but because it's proven to be something we can keep achieving for some time going forward. It means we can pack more transistors on chips, keep advancing the rate of technological change, and increase our computing power every year.
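To make the doubling concrete, here is a minimal sketch of the arithmetic behind Moore's prediction. The starting count and years are made-up example values for illustration, not figures from this discussion:

```python
# Moore's prediction: transistor counts double roughly every two years.
def projected_transistors(start_count: float, start_year: int, target_year: int) -> float:
    """Project a transistor count forward, assuming one doubling per two years."""
    doublings = (target_year - start_year) / 2
    return start_count * 2 ** doublings

# Example: a hypothetical chip with 1 billion transistors in 2010
# projects to 32 billion by 2020 (five doublings).
print(projected_transistors(1e9, 2010, 2020))  # 32000000000.0
```

The exponential compounding is the whole point: five doublings is a 32x gain, which is why even a short pause in the cadence matters so much to long-range technology forecasts.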

I tend to follow news stories about this. I noticed this week, as I have over the past few weeks, that it's really starting to heat up. There are a number of companies playing in this space. The usual suspects, like Intel and Apple and Taiwan Semiconductor, but even companies like Samsung and IBM are bringing their own technology to the fore.

Funnily enough, this first article is from phonearena.com. [laughs] It's called "Samsung and IBM Have a Plan to Keep Moore's Law Alive." Basically, this article talks about the limitations of chips where the chip itself meets metal, the resistance there, and how we run out of opportunities in 2D space even as we keep moving to smaller nanometer standards. If you think about those physical limitations, you have several potential solutions.

Some involve quantum mechanics, some involve really cool things that have to do with incredible precision laser technology, and I think ASML is at the forefront of those. But this is something novel I hadn't heard about before.

IBM and Samsung are thinking about stacking these 2D chips, going to the next dimension, as a way to get past the physical resistance, the laws of physics that prevent us from saying with certainty that we can keep Moore's Law alive much longer. I've seen various predictions that between these great companies, like ASML, which has this incredibly complex machine.

The high-end EUV machine, that's for making really intricate circuitry on a chip. We might be able to keep Moore's Law alive until 2023 or 2025. Not beyond that. I haven't seen much beyond that.

Although companies like Intel are saying that we could extend it even past that. I wanted to point this out not just because it's sitting in this obscure journal. The IEEE, the world's largest technical organization (I don't know what the three Es stand for), also had some articles out in the last couple of weeks.

One was called "AI Training Is Outpacing Moore's Law." This is about Nvidia, which we've mentioned before as a key player in the race to extend Moore's Law, using machine learning as it designs its own chip circuitry.

What I'm trying to communicate out of all this is that companies that have anything to do with end devices are starting to partner up. I mentioned Samsung. Google is also a big player in this space.

They are teaming up with the companies traditionally associated with the manufacture of chips, like Intel, like Nvidia. Even Apple is getting into this game, which I think I talked about a couple of weeks ago.

All to say, if we accept the proposition that this virtual revolution, or this virtual opportunity space, is going to be big over the next several years, we can't rest that on a premise of hype, for example that the metaverse is exploding.

If we don't have the computing power, a lot of the technologies that people are promoting just now won't pan out as seriously great investments over a 10-year period.

This is something I'm paying attention to simply because, for all the other investment theses I'm looking at to work out, computing power is going to need to keep getting better, supply chain constraints or not.

I should say, before I hand this back over for comment, that all of the supply chain constraints in the chip industry are starting to be very good for the U.S., because we have multiple companies, Intel and Taiwan Semiconductor being the two biggest, that are spending billions to put manufacturing capacity here in the U.S.

So we're not as reliant on chips coming from Asia. That's just a sidelight there. But thoughts and comments on this, guys, and then we'll move to the stocks.

Rachel Warren: I can jump in really quick. This is not an area I follow particularly closely, so I find it interesting to learn more about it. As you were mentioning supply chains and their correlation to Moore's Law, I found this interesting article on Yahoo! Finance from not that long ago with a quote by the co-founder of a company called DataTrek.

His point was that, for the first time, computers have actually contributed to inflation. Historically, that's not been an area that has touched inflation at all, but it is now because of the semiconductor shortage. The ongoing shortages in that area have actually made computers 8% more expensive than a year ago and have derailed Moore's Law to a certain extent.

It's interesting to see that other take on this. Moore's Law isn't something I've studied very closely, and I don't follow a lot of semiconductor stocks. But it's interesting because I know the point of that law is essentially that as technology products get better and faster, their production costs diminish.

But that isn't really a trend that we're seeing right now, so it's fascinating to follow. It will be interesting to see how these companies, as they team up, are able to maintain production costs without derailing long-term profit and revenue goals.

Demitri Kalogeropoulos: Yeah, that is interesting. I don't have a lot of engineering background in this space either. It's been a long time since I dropped out of the engineering major in college and switched to business. [laughs] But I do remember being introduced to that, and I'm pretty amazed that it's still working.

I'll just say, from a consumer standpoint, I recently upgraded my iPhone after five years. I had the iPhone 8, I think, and now I'm on the 13, and I'm blown away by the difference.

It looks about the same, but the computing power is just amazing, particularly the way it processes camera images and pictures and things like that. It's absolutely amazing to think that we can double that processing power, at least for another year.

You hear about Apple's new M1 chips and I know I've seen some reviews online about their latest lineup of Macs and MacBooks that have these chips and how they've just blown away all the previous speed tests and things like that.

It's really interesting to see that and be reminded of how far we've come in just a few years, and how we're at least on track to double that again next year. Hopefully they will keep finding ways to do it again.

Sharma: It's really interesting if I conflate your two points. Apple got into this game because it wanted more control over its own supply chain, because for them, the biggest headwind to sales is just not having enough capacity when they release a new model.

Some years ago, they decided to invest in their own chips and chip design. Of course, the manufacturing is still subcontracted. But now they're a contributor to Moore's Law, which again, for those of you who might have just joined, isn't really a law but a prediction that the industry rallies around.

I do find that interesting, Rachel. All of this is so theoretical; it's predicated on a ready supply of all the components. Even if you are Nvidia and you are putting together supercomputers, which are helping you advance Moore's Law, you yourself might be telling your engineering department that we'll delay those tests a bit because we're going to use all available capacity to service customers right now.

That slows down the progress toward this, so it's very interesting. And I know, for those of you who are avid Cathie Wood fans, they've for many years predicated a lot of their predictions about technology on this idea that production costs decrease and chips get more efficient.

So, when you throw a wrench in that, it really makes the law, or the prediction, become something a bit more brittle than everyone assumed it was. Fascinating, and this won't be the last time I talk about this.