POST OF THE DAY
Video & PC Games
In Reply To:
Sony's Revolutionary PS3 Chip


By LongHook
March 7, 2003

Posts selected for this feature rarely stand alone. They are usually a part of an ongoing thread, and are out of context when presented here. The material should be read in that light.

David,

Basically you're stating some things that were voiced by many people several years ago, around the time the X-Box was announced. I think Gabe Newell (Valve Software) even used the term "shot across the bow" in describing the PS2 and its threat to Windows.

At the time, most people blew off these concerns, and people like me continued to dump money into companies like NVDA, AMD, INTC, etc.

I fear a lot of investors in semi companies such as the above are caught up in rationalizing about "short term volatility", in the hope that one day those companies will re-emerge as dominant forces, pricing and margins will return to where they once were, and all will be good again.

Here's a hint: that's not going to happen.

The reason is simple -- it's not a case of companies competing against each other for the consumer dollars, it's a case of companies competing against the consumer.

This bears repeating -- the consumer's pricing tolerance, i.e. the elasticity (or lack thereof) of demand, is really the limiting factor today, NOT the relative pricing and performance of chips. And this elasticity is based on both absolute dollars and perceived value. Which means that even if the economy is rip-roaring again, we're not going to see $3000 PCs become the norm, because there's no value in such an expense.
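(A quick aside, and this is textbook economics rather than anything from the original thread: price elasticity of demand is usually written as E = (% change in quantity demanded) / (% change in price). When |E| > 1, demand is "elastic" -- raise the price and total revenue actually falls, because buyers walk away faster than the extra margin accumulates. That's exactly the regime I'm claiming CPUs and graphics boards are in above the commodity price point.)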

Whether AMD existed or not would make almost no difference to Intel, because the market will bear a certain price for CPUs, and that's it. If Intel were the monopoly it was in the late 80s, it could set prices however it wanted, and consumers wouldn't care -- they'd either not buy or simply hold onto older hardware, because there is almost no compelling reason to upgrade from a 1.8GHz P4 to a 3.06GHz P4.

In fact, there is almost no compelling reason for a typical consumer to upgrade from a P3/800 to anything faster today. I personally use a dual P3/933 for software development in the office; a P3/1000 for work and play at home; and an Athlon XP1800 for music composition. I own a dozen computers, and not a single one is a Pentium 4.

The ONLY time I've felt a need to upgrade is when I want to play a new game that won't run well -- at which point I just don't play that game -- or when I want a dual processor machine instead of a single processor machine because I want responsiveness.

Computers are pretty damned fast today, and enthusiasts like to keep saying "That's what we always say, but there's always that killer app". Folks, we've been waiting some three years for a killer app, and short of games, it hasn't shown up. As much of a pig as Windows and Office would like to be, the hardware can keep up.

When we went from 286 to 386, there was a huge difference in practical performance. From 386 to 486 to Pentium there was a fairly major leap. From Pentium to Pentium Pro the gains started to slow down, and it's been flattening ever since. Computers have become far faster than any typical consumer needs, except for high end games.

And it's not just the CPUs. I used to think that "Nvidia will become the Intel of graphics chips and take over the GPU world". And hey, you know, I might be right -- but it doesn't matter. Because Nvidia won't be able to get away with charging $300 for graphics boards forever, as much as they and their investors would like to think otherwise.

The $100 graphics board is more than good enough to play a typical PC game, and game developers know that if they exceed the system requirements of the mass market PC (and this really means i845G integrated graphics and worse) they'll get slaughtered by returns. So they'll keep their system requirements down by writing scalable software.
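To make "scalable software" concrete, here's a minimal sketch of the idea -- my own illustration in C++, with hypothetical names (GpuCaps, PickTier, the tier thresholds), not any real engine's or driver's API. The game probes the hardware once at startup and dials the detail up or down, so the same title runs on an i845G-class integrated part and on a $300 enthusiast board:

#include <cstdio>

// Hypothetical capability report -- stand-in for whatever the driver exposes.
struct GpuCaps {
    int  vramMB;        // onboard video memory
    bool hardwareTnL;   // hardware transform & lighting
    bool pixelShaders;  // programmable pixel pipeline
};

enum DetailTier { LOW, MEDIUM, HIGH };

// Map raw capabilities to a detail tier. An i845G-class integrated
// part (shared memory, no hardware T&L) lands in LOW.
DetailTier PickTier(const GpuCaps& c) {
    if (c.pixelShaders && c.vramMB >= 64) return HIGH;
    if (c.hardwareTnL && c.vramMB >= 32) return MEDIUM;
    return LOW;
}

void ApplySettings(DetailTier t) {
    switch (t) {
        case LOW:    std::puts("640x480, low-poly models, no shadows"); break;
        case MEDIUM: std::puts("1024x768, full models, stencil shadows"); break;
        case HIGH:   std::puts("1600x1200, high-res textures, per-pixel lighting"); break;
    }
}

int main() {
    GpuCaps integrated = {32, false, false};  // i845G-class baseline
    ApplySettings(PickTier(integrated));      // prints the LOW settings
    return 0;
}

The point being that only the HIGH branch ever exercises the expensive board -- the game has to ship and sell against the LOW one.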

To make matters worse, the increasing graphics power we're getting A.) isn't being leveraged and B.) isn't particularly noticeable to the typical player. We're no longer seeing first order improvements in technology; it's all second order stuff but at first order pricing -- and that's the definition of "outside the sweet spot".

On the AMD board I see this attitude of "when [some magical event occurs], AMD will be 'just like Intel'". But they're missing the greater point, which is that Intel itself won't be like Intel, and that Intel is destined for long term problems as well.

Thankfully these companies are often run by intelligent people, people a lot savvier than the dork individual investor =) Andy Grove, Craig Barrett and Bill Gates are not stupid. Neither is Michael Dell. Hector Ruiz and Jerry Sanders (AMD), on the other hand, are stupid -- something I'm finally coming around to. They're basically saying they want to compete with Intel in a low margin market with huge R&D expenditures, and that's what they'd be happy with. Smart strategy there, guys...NOT.

The smart companies have recognized that commoditization is going to hurt them in the long run. Intel, long ago, decided to branch out into communications and other areas that people were calling "money sinks", but the reality is that Intel had -- and has -- no choice. They can't charge whatever they want for CPUs anymore, so they have to find new areas where they can raise margins and volumes again, and where people are willing to pay for faster chips.

This is one reason Intel has been so gung-ho about getting things like voice recognition and whatnot out to the public -- these things eat CPU, and that's the only way Intel can start making mad margins like it did before. Barring such killer apps, Intel is screwed with a capital 'F', unless they manage to find a new niche to mine.

Microsoft recognized the threat of both open source and low cost consoles years ago, starting with WebTV and Java. Microsoft can read the writing on the wall, and that is exactly why they've started branching out into non-traditional areas such as PDAs and consoles. Microsoft knows that 10 years from now the PC as we know it today will cease to exist; instead we'll have some suite of devices far more similar to the TabletPC, the PDA and the console than to the ugly, loud and hot boxes being dumped on the public now.

Michael Dell has seen the same thing -- which is why Dell sells PDAs, racks, NAS, servers, etc. They know the regular PC business is getting commoditized to the point that they're hardly making money on their low end stuff, and they need to branch into higher margin, more forward-thinking territory if they wish to remain relevant. Gateway and HP are not the major threats to Dell's existence; it's consumer expectation that will end up driving a dagger through Dell.

Whether these companies pull it off remains to be seen. But I think we're going to look back on the millennium break point (1990-2010) as a huge inflection point, where computing ceased to be something that was nifty, neat and different and suddenly became something that was typical, average, taken for granted and inexpensive.

Ask cell phone manufacturers and ISPs what happens when the new technology becomes a commodity. Ask the RIAA what happens when consumers finally balk at paying absurd prices for commoditized stuff they can find elsewhere for cheap.

I see all this, and my own gaming company recognizes this and is trying hard to gather enough capital to move ahead of the tidal wave that's going to hit the traditional PC gaming sector.

-Hook

P.S. Thanks for that post, I've been meaning to dump this for a while, just wasn't expecting to do it here =)

