NVIDIA to Intel: Your Days Are Numbered

Apparently NVIDIA (Nasdaq: NVDA) CEO Jen-Hsun Huang has yet to learn the nuances of the phrase "underpromise and overdeliver." Speaking about the "GPU Computing Revolution" in his keynote at the Hot Chips 21 symposium, the ebullient Huang boasted that graphics processor (GPU) computing performance would increase to 570 times its current level over the next six years, while traditional central processors (CPUs) would "merely" triple their power in that time. The website TG Daily reports that Huang further claimed this leap in power could enable such futuristic technologies as real-time universal language translation and "advanced forms of augmented reality."

As Keanu would say: "Whoa"
So, should you throw out that dusty copy of Rosetta Stone (NYSE: RST) software and prepare for the coming technological utopia? I wouldn't just yet. While Huang has led NVIDIA brilliantly over his tenure as co-founder and CEO, he's a bit prone to hyperbole and grandiosity. Earlier this summer, he told The Wall Street Journal that for NVIDIA, "doing graphics only is a disservice to humanity." Huang is convinced NVIDIA's technology will power the future, leading to what he believes will be an "addressable market [that] could swell to $26 billion [annually]."

Here comes the rabble-rouser
You might have noticed NVIDIA in the news lately following Microsoft's (Nasdaq: MSFT) announcement of the new Zune HD, which features NVIDIA's computer-on-a-chip, Tegra. While the company is set to battle competitors such as Qualcomm (Nasdaq: QCOM), Texas Instruments (NYSE: TXN), and Freescale Semiconductor in the mobile phone space, Huang's comments once again bring into focus the larger prize NVIDIA has been eyeing: graphics processors as the central hub of everyday computing needs.

Huang's grandstanding on this topic has ruffled some feathers at central processor kingpin Intel (Nasdaq: INTC), which has tied its future to the idea of the central processor as the workhorse of modern computers. Straddling the middle ground is AMD (NYSE: AMD), which has been trying to combine the central and graphics processors in its new Fusion platform.

However, don't think Intel's ceding the graphics booty for AMD and NVIDIA to fight over. Sometime next year, Intel plans to roll out its Larrabee processor, which will be based on the x86 architecture common in central processors. It's a radically different approach to graphics, but Intel thinks the familiarity of Larrabee's architecture will make it easier for developers to build software around its platform. Whether that's the winning approach remains to be seen.

More avenues of growth
There's a lot of noise for investors to sort through here, but the important thing to remember is that more and more computing work is being offloaded to graphics processors. Examples of this movement abound, from the new OpenCL framework, which shares work between the central processor and the graphics processor more effectively, to industries such as oil exploration and finance adopting powerful graphics processors as their weapon of choice.

While the sheer magnitude of Huang's performance prediction may be more attention-grabbing than feasible, it's another shot fired in the central-processor-vs.-graphics-processor battle.

Foolish bottom line
Semiconductor investors along the value chain should take notice, and while bold predictions shouldn't be necessary to get their attention, the shift to a new kind of computing world has already begun. Watch this fight closely; Intel has a lot more to lose than it might let on.

Eric Bleeker owns shares of NVIDIA, but no other companies listed above. NVIDIA is a Motley Fool Stock Advisor pick. Intel and Microsoft are Inside Value selections. The Fool's disclosure policy has boldly predicted it will increase in power a thousandfold over the next three years -- look out, Jen-Hsun Huang!



Comments from our Foolish Readers


  • On September 02, 2009, at 1:02 PM, fl1180 wrote:

    Jen-Hsun Huang has been maniacal about this for a few years now. To get your existing code to talk efficiently to these graphics processors, you have to recompile all your existing code...not easy.

    If Intel couldn't convince people to do that with Itanium, then what makes anyone think Nvidia can? Intel's picking the x86 route because they've already learned their lesson.

  • On September 02, 2009, at 8:12 PM, pramanuj wrote:

    It will be interesting to see how Intel scores with x86 on its graphics processors. What it is actually doing by going the x86 route is to "slap" multiple CPU cores on a chip and hope that this gives enough parallelism to convert a sequential x86 design into a parallel computing unit.

    Wonder how that is going to work.

  • On September 03, 2009, at 12:38 PM, Kirkaiya wrote:

    While Intel has made very few large-scale "bad bets" in the past - the most obvious being the Itanium architecture in collaboration with HP - I think they have a pretty high chance of at least becoming a major player in the high-end GPU market.

    The Larrabee GPU is expected to ship with 32 or 48 cores - and keep in mind that Intel demoed an 80-core CPU (non-x86, research-only) two years ago. A 48-core Larrabee built on a 32 nm process would crank out 2 - 4 TFlops, depending on the speed of the cores. That's on par with the upcoming ATI Radeon 5800 series, expected this fall.

    So - while I don't think NVidia or AMD/ATI are going to have their butts handed to them - Disney/Pixar is already using pre-production Larrabee, and Intel is making a serious play for the GPU space.
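The 2 - 4 TFlops figure above can be sanity-checked with a back-of-the-envelope peak-throughput estimate: cores times SIMD lanes times FLOPs per lane per cycle times clock speed. This is only a sketch; the 16-lane single-precision vector width, fused multiply-add (2 FLOPs per lane per cycle), and 2.0 GHz clock are illustrative assumptions, not Intel-published specifications.

```python
def peak_tflops(cores, simd_lanes, flops_per_lane, clock_ghz):
    """Theoretical peak throughput in TFlops:
    cores x SIMD lanes x FLOPs per lane per cycle x clock (GHz) / 1000."""
    return cores * simd_lanes * flops_per_lane * clock_ghz / 1000.0

# Hypothetical 48-core Larrabee: 16-wide vector unit, fused multiply-add
# (2 FLOPs per lane per cycle), 2.0 GHz clock -- all assumed figures.
print(peak_tflops(48, 16, 2, 2.0))  # 3.072
```

Under those assumptions, 48 cores works out to roughly 3.1 TFlops and 32 cores to roughly 2.0 TFlops, both comfortably inside the commenter's 2 - 4 TFlops range.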

