Apple Deals NVIDIA a Crippling Blow


According to a CNET report today, NVIDIA (Nasdaq: NVDA) is losing whatever toehold it still has in Apple (Nasdaq: AAPL) systems. This will hurt.

Apple replaced NVIDIA graphics in most of its iMacs earlier this year, opting instead for chips by Advanced Micro Devices (NYSE: AMD). Now, sources indicate the next wave of low-end MacBooks will be powered by Intel (Nasdaq: INTC) processors with improved graphics capabilities built in, which would make an additional NVIDIA chip entirely superfluous.

Thin and light notebooks such as Apple's ultra-svelte designs don't have any room to waste on needless chips, so an all-in-one Intel solution would fit the bill perfectly. Previously, main processors have not come equipped with graphics-crunching features, and the built-in graphics in system chipsets have never been powerful enough to do the heavy lifting in multimedia-heavy use cases. Intel's Sandy Bridge architecture changes all that, making integrated graphics a real alternative for Apple.

AMD's Fusion products have even more graphics power than Intel's latest and greatest, but Apple has a long-standing agreement with Intel and is not likely to replace the very core of its computers with a rival supplier on short notice. You'll still find AMD graphics in most high-end Macs, though.

So Apple gets some more design headroom than before, and AMD is largely watching this commotion from the sidelines, unaffected. But if CNET's sources are correct, NVIDIA is losing a very attractive contract here, not only in terms of direct sales, but also in the prestige that comes with being a key Apple supplier.

NVIDIA is in the process of becoming a mobile processor powerhouse, assuming its hopes for the Tegra processor are realistic. There's also a push into big-iron server systems and supercomputing by way of the Tesla product line. Meanwhile, its bread-and-butter graphics technologies are in danger of becoming irrelevant. AMD has been wiping the floor with NVIDIA's graphics products recently, whether you're looking at performance, efficiency, or sales, and the OEM design wins are few and far between.

Can NVIDIA still save its once-dominant graphics products, or should the company just resign itself to a future in mobile and server computing instead? Discuss in the comments below.

Fool contributor Anders Bylund holds no position in any of the companies discussed here. Intel is a Motley Fool Inside Value pick. Apple and NVIDIA are Motley Fool Stock Advisor selections. The Fool owns shares of and has bought calls on Intel. Motley Fool Options has recommended buying calls on Intel. The Fool owns shares of Apple. Try any of our Foolish newsletter services free for 30 days. We Fools may not all hold the same opinions, but we all believe that considering a diverse range of insights makes us better investors. You can check out Anders' holdings and a concise bio if you like, and The Motley Fool is investors writing for investors.


Comments from our Foolish Readers

Help us keep this a respectfully Foolish area! This is a place for our readers to discuss, debate, and learn more about the Foolish investing topic you read about above. Help us keep it clean and safe. If you believe a comment is abusive or otherwise violates our Fool's Rules, please report it via the Report this Comment icon found on every comment.

  • On December 09, 2010, at 2:31 PM, doctorsynthesis wrote:

    This is not at all a "crippling blow"....the margins from chipsets in consumer computers are low already. On top of that, Nvidia just scored a host of new laptop deals with leading PC notebook makers to offset any loss in their notebook presence.

    I find the notion of the "prestige" associated with being an apple supplier just absurd! Moreover, the article's comments regarding AMD wiping the floor with Nvidia's graphics products is about 6 months outdated....Nvidia is now the leader once again....there is no point in citing an ancient article in a tech company write-up....get your facts updated before posting! And yes....Tegra 2 will be raining down in 2011!

  • On December 09, 2010, at 2:39 PM, austec wrote:

    There is no surprise that Apple went with Sandy Bridge, even though Sandy doesn't even have the proper technology to support DX11.

    Intel paid $200 million to IBM to delay/abandon an AMD product release

    Intel paid Dell $1BILLION in a single quarter to keep Dell from buying AMD processors

    Intel, through subpoenaed court documents, apparently bribed/coerced EVERY major OEM, as well as major tech retailers (such as MediaMarkt) to drop their plans for AMD products.

    So... You think Intel doesn't have a "special under-the-table" exclusivity and marketing "arrangement" with Apple? Think again!

  • On December 09, 2010, at 3:26 PM, gslusher wrote:

    "Thin and light notebooks such as Apple's ultra-svelte designs don't have any room to waste on needless chips, so an all-in-one Intel solution would fit the bill perfectly."

    Then, why, pray tell, does the MacBook Air have an NVIDIA graphics chip along with an older Core 2 Duo processor? A similar combination is in the 13" MacBook Pro, while the larger 15" and 17" MacBook Pros use Intel i5 or i7 chips PLUS an NVIDIA graphics chip.

    It turns out that the integrated chip actually requires MORE room--and more power--than the older separate chips, and gives less graphics processing capability. See:

  • On December 09, 2010, at 3:26 PM, TheBlindCat wrote:

    The rumor does not jibe well with Apple's most recent real-world behavior. Apple seems pretty well invested in open-cl in Snow Leopard and beyond, so much so that they opted for Nvidia in the new MacBook Air. We may see SandyBridge as an option in the low-end MacBook to throw Intel a bone, but the lack of GPU open-cl support makes me a little skeptical. More FUD?

    According to Anand:

    RE: OpenCL? by Anand Lal Shimpi on Friday, August 27, 2010

    Sandy Bridge's GPU does not support OpenCL. This is strictly a graphics play, Intel doesn't have an announced GPU compute strategy outside of what it's doing with Larrabee.

    Take care,


  • On December 09, 2010, at 3:39 PM, rav55 wrote:

    Yeah like nobody saw this coming.

  • On December 09, 2010, at 3:51 PM, TheBlindCat wrote:

    Arrgh. Not "open-cl" rather OpenCL.

  • On December 09, 2010, at 3:52 PM, rav55 wrote:

    "This is not at all a "crippling blow"....the margins from chipsets in consumer computers are low already. " What planet are you from????

    Apple didn't dump just Nvidia chipsets, they dumped Nvidia graphics! Sandy Bridge doesn't require a chipset or a graphics GPU.

    Nvidia just got dumped and that is huge. First Apple...

  • On December 09, 2010, at 3:56 PM, rav55 wrote:

    OpenCL??? That is irrelevant to x86.

    OpenCL simply allows the GPU to be used in high-performance computing.

    Stop throwing words and research what you are talking about.

  • On December 09, 2010, at 4:19 PM, TheBlindCat wrote:

    For what it is worth, I think it is more likely that Apple would go with AMD's Fusion chip over SandyBridge since AMD has always been a strong supporter of OpenCL.

    OpenCL is of course recently supported by Intel on its x86 chips, just not its current GPU.

    As the links provided by @gslusher state:

    "The 9400M also had the benefit of being CUDA-compatible, which was more important down the line when Apple rolled support for the CUDA-like OpenCL into Mac OS X 10.6."

    Which was more or less confirmed by Jobs himself:

    "As Steve Jobs recently explained in one of his increasingly frequent, succinct e-mails to customers: 'We chose killer graphics plus 10 hour battery life over a very small CPU speed increase. Users will see far more performance boost from the speedy graphics.'"

    And we all know who makes the best drivers for OpenGL and OpenCL, don't we?

  • On December 09, 2010, at 4:20 PM, doctorsynthesis wrote:

    Reply to "rav55":

    First off, a chipset refers to nvidia's graphics and associated controller that were mated to intel cpus prior to the introduction of the integrated memory controller in the core i-series....and the profit margins from this business make up the smallest portion of nvidia's revenue....understand this before you accuse others of what planet they're from.....secondly, apple is not dumping nvidia graphics....why on earth would they do that given their focus on professional graphics that is dominated by the end, any financial hit nvidia takes from losing apple notebook share in graphics will be a tiny drop in the bucket.....tegra will be nvidia's bread and butter of the future....clearly, rav55 is uninformed of facts and reality....and how businesses operate.....

  • On December 09, 2010, at 4:30 PM, TheBlindCat wrote:

    +1 @doctorsynthesis

    And a little more "research"

    Now a new technology in Mac OS X Snow Leopard called OpenCL takes the power of graphics processors and makes it available for general-purpose computing. No longer will graphics processors be limited to graphics-intensive applications such as games and 3D modeling. Instead, once developers begin to use OpenCL in their applications, you’ll experience greatly improved speed in a wide spectrum of applications.
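    For readers unfamiliar with what that quote means in practice: OpenCL expresses a job as a kernel function executed once per point of an index space, which is why data-parallel work maps so well onto a GPU's many cores. Here is a rough sketch of that execution model in Python — illustrative only; the function names below are stand-ins for the idea, not the real OpenCL API.

```python
# Illustrative sketch of OpenCL's data-parallel execution model.
# A "kernel" runs once per global ID; on a GPU these invocations
# execute in parallel across the compute units.

def vector_add_kernel(gid, a, b, out):
    """What a kernel like `out[i] = a[i] + b[i]` computes for a
    single work-item whose global id is `gid`."""
    out[gid] = a[gid] + b[gid]

def enqueue(kernel, global_size, *args):
    """Stand-in for enqueueing a kernel over an N-item index space
    (sequential here, concurrent on real hardware)."""
    for gid in range(global_size):
        kernel(gid, *args)

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * 4
enqueue(vector_add_kernel, 4, a, b, out)
print(out)  # [11.0, 22.0, 33.0, 44.0]
```

    On actual hardware the per-ID invocations run concurrently rather than in a loop, which is where the speedup described above comes from.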

  • On December 09, 2010, at 5:08 PM, TheBlindCat wrote:

    Gosh this "research" stuff is fun!

    OpenCL: What you need to know

    Graphics technology is at the center of Apple’s Snow Leopard efforts

    The following link encapsulates a good summary of the issues with some interesting comments:

    "One of the key underlying questions revolves around OpenCL, a software framework that can exploit a GPU's inherent ability to run certain applications much faster than a standard central processing unit, or CPU. OpenCL has been touted as enabling "developers to tap the vast gigaflops of computing power" in graphics processors, according to Apple's Web page highlighting features of the OS X "Snow Leopard" operating system. OpenCL, for example, can be used in Apple's iLife titles, such as iPhoto for scene parsing and face recognition.

    And OpenCL has been somewhat of a trump card for graphics chip supplier Nvidia, which already has support for the technology in its chips. Though Intel plans to support OpenCL natively in its processors and has released alpha drivers and a software development kit for OpenCL, that support, as stated publicly, is CPU-centric and still at a nascent stage of development. However, Intel is also working on OpenCL for the graphics part of Sandy Bridge, according to sources."

  • On December 09, 2010, at 5:15 PM, rav55 wrote:

    OpenCL is a software standard that allows GPUs to be used as CPUs, but NOT in PCs or laptops but rather with customised software for specific applications. In other words you can't run Windows on it. BUT you can run Apple OS applications.

    AND chipsets are NOT GPUs. They are the chips that ALLOW the cpu to communicate with the GPU. On the ATI bus you have a CPU connected to a chipset; you may also have a discrete gpu or motherboard-integrated gpu. A chipset is not a GPU.

    With Sandy Bridge Nvidia LOST ALL chipset business, BUT they do have a license to plug discrete GPUs into the Sandy Bridge bus for greater graphics performance.

    Nvidia used to produce chipsets for both AMD and Intel motherboards BUT not any more.

    Losing GPU business is HUGE, colossal, and is the start of the crumbling of Nvidia's foundation, their reason for being: discrete graphics.

    It's like McDonald's losing their hamburger (GPU) business just after they lost the fry (chipset) business.

  • On December 09, 2010, at 5:35 PM, doctorsynthesis wrote:

    You're all over the place. You're missing the whole point about how (potentially) losing this segment is not going to hurt nvidia at all in terms of their revenue. Secondly, apple is not shutting nvidia out of their products (namely desktops) as I already iterated in my previous post. I am not sure why you are so adamant on promoting this rumour of apple dumping nvidia from its notebook is a drop in the bucket, as I said already. I hope you do not make real investment plans based on such unsubstantiated conclusions and extrapolations. Read this article:

  • On December 09, 2010, at 5:54 PM, rav55 wrote:

    Yeah, the Barron's article is about CHIPSETS, and yeah, Nvidia noted that it was not developing CHIPSETS any longer.

    BUT we are not talking about APPLE dropping Nvidia CHIPSETS, which is a minor business. APPLE is dropping NVIDIA GRAPHICS.

    What is Nvidia???

    A GRAPHICS COMPANY trying to morph into a handset and netbook cpu company with the unused Tegra.

    It is not likely that Apple will use Nvidia in the next refresh of high end desktops due to "bumpgate" or the junk GPU's that Nvidia tried to pawn off on Apple a couple of years ago. See below.

    If you want to keep on buying Nvidia that's your business. I wrote these comments to correct 2 inaccuracies.

    1. Chipsets ARE NOT graphics. Chipsets are Motherboard Core logic chips that allow the cpu to "talk" to the GPU. They are pretty cheap too.

    2. And OpenCL doesn't really matter much in the broad scheme, as in six months AMD, Intel and Apple will all make use of that standard.

    Apple probably is still a bit miffed at Nvidia:

    "On October 9, 2008, Apple Inc. announced on a support page that some MacBook Pro notebook computers had exhibited faulty Nvidia GeForce 8600M GT graphics adapters.[34] The manufacture of affected computers took place between approximately May 2007 and September 2008. Apple also stated that it would repair affected MacBook Pros within three years of the original purchase date free of charge and also offered refunds to customers who had paid for repairs related to this issue."

    Nvidia basically told Apple to go pound silicon.

  • On December 09, 2010, at 5:55 PM, rav55 wrote:
  • On December 09, 2010, at 5:56 PM, TheBlindCat wrote:

    @rav - reading your commentary I am reminded of a conversation I once had with an employee.

    Several of us were on a construction site when this guy asked us "What's that for?"

    Several of us looked at him and our jaws just dropped. We said (in unison, and I guess the incredulity was pretty obvious in our tone)

    "It's a pile driver!"

    To which he responded ... "I know it's a pile driver, but what does it do?"

    Why was I reminded of that, um-mm, no reason in particular.

    However regarding your comment ...

    "OpenCL is a software standard that allows gpu to be used as cpu's but NOT in PC's or laptops but rather with customised software for specific applications. In other words you can't run Windows on it. BUT you can run Apple OS applications."

    1) OpenCL is not limited to GPUs.

    2) Yes OpenCL is relevant going forward in both PCs and (gasp) laptops.

    3) I buy a Mac to run Mac OS, I know some misguided people run Windows on Macs, but that's just silly (IMHO) and this post is about Apple wrt Intel, Nvidia and to a lesser extent AMD chips/chipsets.

    4) OpenCL not just for breakfast, er ... I mean supercomputing anymore.

    "There is widespread assumption through the industry that all the tier-1 mobile OEMs will soon produce products that use and support OpenCL. Consequently it's now a pretty much mandatory requirement for next-generation GPU IP to support OpenCL."

    5) Any algorithm that would benefit from the massively parallel processing that typically sits idle, even in a smart-phone or tablet device, will benefit from the adoption of OpenCL. Notice I said "algorithm", does not matter whether the algorithm is part of an application or an OS or whether the algorithm runs on Snow Leopard, Windows or even Android. I'll give you a fun one: Apple could decide to use facial recognition to unlock your iPhone instead of making you key in a pass-code. This would be an OS level call that would most assuredly benefit from the same massive parallelism that modern games require of today's GPUs.

  • On December 09, 2010, at 6:06 PM, TheBlindCat wrote:

    @rav55 - Fess up, you are really Charlie Demerjian in disguise, aren't you?

  • On December 09, 2010, at 6:08 PM, rav55 wrote:

    Exactly my point. In a few months OpenCL, which is an open standard, will be recognised by all cpu designers and integrated into their designs.

    Advantage: nobody. Disadvantage: anyone who designs silicon without it.

    How many years have you been waiting to recite your little construction story? LOL

  • On December 09, 2010, at 6:13 PM, TheBlindCat wrote:

    @rav55 - A long, long time ... thanks for that ;-)

  • On December 09, 2010, at 6:18 PM, rav55 wrote:

    What I am trying to understand is why there is such an optimistic outlook regarding Nvidia.

    They have nothing new besides the normal refresh of GeForce. Tegra is just another player in an already overcrowded niche. The major players being ARM, Intel and AMD.

    They offer no new technology and are being slowly squeezed out of their market wheelhouse: GPUs. And they have already conceded the chipset market. They used to be the best core logic provider, far faster than Intel or AMD. In fact Nvidia core logic made both Intel and AMD shine.

    The only real saving grace for them is the use of GPUs for supercomputing, but even that advantage will be eroded as the mid price point will no longer help subsidize the development costs of new GPU silicon. Volume sales are what keep GPUs cheap as, well, er, chips. Start losing that volume and the unit cost starts to rise, making gpus unattractive. The mid price point will be eliminated with Sandy Bridge and Fusion. Of course legacy boards will still be marketed for a few more years.

    So just why is the outlook for Nvidia so rosy?

  • On December 09, 2010, at 6:23 PM, TheBlindCat wrote:

    @rav55 - I realize that one day Intel will support OpenCL, just like one day they will make a GPU that is worth a rat's bottom, with parts and drivers that are competitive. The point I am trying to make is that Nvidia and AMD do BOTH today and will both do it even better tomorrow. I'll believe Intel can do it (without Nvidia's help) when I see it.

    Advantage: Nvidia with a wildcard spot for AMD.

  • On December 09, 2010, at 6:29 PM, TheBlindCat wrote:

    "What I am trying to understand is why there is such an optimistic outlook regarding Nvidia. "

    @rav55 - I'm exploring that very question in a series of blog postings over the next couple weeks. The next installment deals with software-related IP.

  • On December 09, 2010, at 7:56 PM, doctorsynthesis wrote:

    rav55: I can see your reasoning now that you've explained it in more detail than your original sly comment to my post (no worries....I didn't take it personally)!

    Coming back to your question, the reason most people (including myself, obviously) are optimistic about nvidia is the prospect in the mobile platform (notably, tablets). While it's true that there is huge competition (i.e. overcrowding) in this area, I think there is room for at least 3 players to dominate in the SoC arena. And nvidia is going to be the first out of the gates to introduce the cortex a9-based dual-core tegra 2 chipset....being first does make a huge impact (especially given nvidia's already established reputation in graphics over the last decade). More importantly, nvidia is clearly committed to this branch of their business (their CEO even admitted it). There is a focus shift into this area over graphics within nvidia, I believe.

    The parallel I draw is with apple when they (a small niche computer company at the time circa 2005-6) decided to enter the smartphone market with the iphone. Now look at apple....the iphone is their bread and butter! Thus tablets will be the next "smartphone" revolution so to speak and I think nvidia is positioned nicely to rule it. Their partnership with Google Android on tablets is a huge advantage for them as well. Android is the fastest growing mobile OS (faster than iOS) and will dominate the tablet landscape.

    And if you've seen the early reviews of the Tegra 2, you'll see how it blows past the competition:

  • On December 10, 2010, at 11:47 AM, rav55 wrote:

    "NVIDIA® Tegra™ 2 is the world’s most advanced MOBILE processor."

    This is from Nvidia's website.

    The emphasis is on MOBILE. Small cheap chips.

    And they are not X86. Tegra does not run Windows or Linux or MAC OS.

    Here is a comparison of tablet PCs, probably not too current but close enough to make the point that there are a lot of players making the same thing. But nobody is going to buy a server or desktop or laptop unless it is x86. That means NO TEGRA. Nvidia is not licensed to design x86 CPUs. You are limited to Android or similar OS.

    If you want to be content with a piece of the mobile market then they are great. But as the x86 market evolves with multi-core, graphics-enabled gpus that can reach down into that market niche, then the relevance of Tegra begins to decline.

    AMD Fusion can compete downwardly into tablets as well as upwardly into netbooks and laptops. Intel Atom is probably done as it is an antique architecture, but Intel being Intel will evolve that architecture also. And again downward into the tablet market.

    So the real attraction to hold Nvidia is the hope that Intel or Oracle or some other white knight will buy them out.

    In the mobile market it is all about battery life and graphics not clock cycles. As die sizes become smaller and the graphics capability of the big guns AMD and Intel become smaller and more efficient then the established mobile players will start to consolidate.

    I can't see Nvidia being a growth stock, but I do see it as an attractive acquisition, BUT the SEC is not likely to allow Intel to make that acquisition. I think that the Wall Street Journal made that point a few weeks ago.

  • On December 10, 2010, at 11:48 AM, rav55 wrote:
  • On December 10, 2010, at 11:56 AM, rav55 wrote:

    Oh yeah almost forgot!

    Tegra and Tegra 2 is not even a new architecture.

    Nvidia merely licensed the ARM core and built their own ARM cpu. All they did was reinvent ARM's wheel.

    So to sum up:

    Chipsets are done.

    Chipsets are not graphics.

    Apple dumps Nvidia graphics.

    Tegra and Tegra 2 is really ARM in a new dress.

    The outlook must be all acquisition.

  • On December 10, 2010, at 3:13 PM, renitentInv wrote:

    Well all this raging aside,

    The way I see it, nvidia graphics being dropped in the mobile platform isn't just "a drop in the bucket." I mean apple is a mobile computing company. The majority of their sales are not their Desktop systems with discrete graphics solutions.

    IMHO this isn't a HUGE problem for NVDA, but it certainly isn't a good omen, especially when their margins have been shrinking lately.

  • On December 10, 2010, at 4:52 PM, rav55 wrote:

    What else do they have to sell?

    Apple dumps them; who's next?

    Dell? Gateway? Nvidia pi**ed off many whitebox sellers with "bumpgate".

    Time to pay the piper.

  • On December 10, 2010, at 5:51 PM, TheBlindCat wrote:

    @rav55 - As someone once told me, Research, research, research (said with the same emphasis as "Marcia, Marcia, Marcia")!

    When you say something like "Tegra and Tegra 2 is really ARM in a new dress," that is akin to saying AMD is Intel in a new dress because they cross-license some technology. Now doesn't that sound silly?

    Like Apple, Samsung, Qualcomm, Marvell and a host of others, Nvidia licenses ARM's designs and builds SOCs combining them with unique graphics IP and other chippery.

    Also, most of the ARM development boards come with some sort of Linux distro, and Ubuntu runs just fine on ARM, thank you. I'd also bet money that somewhere in Cupertino, OS X is running on ARM, leaving Windows as the odd man out.

    Tegra 2 is the undisputed leader from a performance perspective for ARM-based SOCs and has been for nearly a year. Samsung Orion and TI's latest OMAP are a good match from a performance perspective, but they have just been released to the development community (I'd love to get my hands on a Panda board).

    By the time anyone else has a dual-core A9 out in the wild, Tegra 3 will likely be available.

    Also I hate to be the one to break it to you, but Intel and x86 are dead (at least in laptops and other mobile devices), or hadn't you heard (only half kidding)?

    You do know that the bulk of Apple's cash comes from iPhone and iPad (both ARM). That trend will only accelerate in the coming years. I cover this pretty extensively at

    "Intel is dead on the desktop"

  • On December 10, 2010, at 6:13 PM, rav55 wrote:

    Yeah sure ARM is pretty good for a smartphone.

    But is it x86??


    That means NO WINDOWS, NO MAC OS and NO LINUX.

    Tegra is just another piddling little cell phone or PDA cpu.

    So it is irrelevant.

    Like I said.

    Tegra is ARM in a new thong. Big deal.

  • On December 10, 2010, at 6:16 PM, rav55 wrote:

    It also means no Windows Server, no Autocad, no Microstation, no Mathcad, no Catia.

    The world runs on x86 applications.

    The world is designed with x86 applications.

    Try designing anything but your dog dish on your cell phone.

    That is it!

  • On December 10, 2010, at 6:25 PM, rav55 wrote:

    What film studio would create a film like AVATAR with an ARM core Android OS cpu???

    ARM or Tegra can't even process HD viewing on your 400 Hz LED widescreen TV, and you really expect that the PC world is going to drop x86 for a piddling half-baked OS like Android running on an anemic core logic chipset?

    Do you really think that ARM or Tegra can compete with a 3.0 GHz, 16-core Bulldozer running Radeon 5700 ON-DIE GRAPHICS??

    NOW? EVER?

    Intel is hiccupping over this and you really think Nvidia is going to breeze on into the PC market with licensed core logic?

  • On December 10, 2010, at 6:37 PM, rav55 wrote:

    ANDROID has less than .6% market share TOTAL worldwide all cpu's, all uses, the whole nine yards. Everything from cellphones to servers.

    Microsoft has almost 89%

    MAC OS almost 10%.

    Somewhere Linux is hanging out with ANDROID.

    With these numbers I'd rather invest in a good restaurant. At least I'd get a good meal and a couple of drinks before the doors closed.

  • On December 10, 2010, at 6:44 PM, rav55 wrote:



    Basically a compiler to run software.

    Nvidia licenses HARDWARE; ARM core logic has been licensed to Nvidia. Nvidia can't design a cpu.

    Nvidia has to ride in the backseat to the prom. ARM is driving.

    AMD designs their own architecture around a set of instructions on a standard bus: PCI.

    That is why they can run Windows and ARM cannot.

    You really do need to understand the difference.

  • On December 11, 2010, at 7:47 AM, TheBlindCat wrote:

    A good hockey player plays where the puck is. A great hockey player plays where the puck is going to be.

    Wayne Gretzky

    I might add that a bad hockey player skates to where the puck used to be.

    When investing that advice is equally appropriate.

    4th quarter numbers make it clear where the market is going:

    Apple Q4 2010 Unit Shipments by Product-

    Desktops: 1.24 million units

    Portables: 2.64 million units

    iPod: 9.05 million units

    iPhone: 14.1 million units

    iPad: 4.189 million units

    There were more iPads sold than there were Desktops and Laptops combined.
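    That claim checks out against the figures quoted above; a trivial arithmetic check, using the numbers as given:

```python
# Apple Q4 2010 unit shipments, in millions, as quoted above.
desktops = 1.24
portables = 2.64
ipads = 4.189

macs_total = desktops + portables  # all desktops and laptops combined: 3.88M
assert ipads > macs_total          # 4.189M iPads outsold 3.88M Macs
print(round(ipads - macs_total, 3))
```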

    Turns out most people don't need to animate a 3D movie or produce complex engineering drawings with AutoCAD. They seem to be spending more time on Facebook.

  • On December 12, 2010, at 5:29 PM, websterphreaky wrote:

    Writers for this site are idiots and obviously know NOTHING about the real world of PC's (even Macs are PC's ... they're just Dell clones, made on the same Chinese manufacturing lines).

    Get this straight once and for all Clueless Media Writers - APPLE DESIGNS AND MANUFACTURES NOTHING!! "Design" includes the Electronic Engineering, which Apple DOES NOT do. They "make" pretty pictures of the OUTSIDE of their PC Clones and that is ALL. (Note that the Mac desktop hasn't changed since the G5, either! The MacBook Pro is the same too)

    Apple has MADE NOTHING since the 1980s, the Apple ][e was the first, farmed out to Canada in 1979. Then off to Japan, Taiwan, Korea, Indonesia, Ireland and now ALL in Commie China's Sweatshops.

    Research it and stop BSing the public. Apple is NOT America's PC company. Apple is the Limousine Liberal company FIRST to OFF SHORE American Jobs.

  • On December 13, 2010, at 9:02 AM, TheBlindCat wrote:

    @websterphreaky - you might have a difficult time convincing the 46,600 Apple employees (primarily in the U.S.) that they don't "make" anything.

    I guess since a book's author and/or illustrator did not actually print the book or do the binding themselves, they too have contributed nothing?

  • On December 16, 2010, at 5:21 PM, Ironbob wrote:

    "Earth to Apple jockies, this is reality calling..."

    You still lag behind Wincomps by about 85%-90%.

    No one with a lick of computer sense would ever rely on "built-in" or "on-board" graphics for anything other than cruising a porn site or checking out Facebook.

    Graphics aficionados actually do use PCs overwhelmingly over iMac regardless of what the Appleheads want you to believe, and Nvidia makes the money off PCs, not Apple.

  • On December 19, 2010, at 12:44 AM, Accreator wrote:

    Why I am bullish on Nvidia:

    1) Tegra 2 smartphones LG optimus 2x, Motorola Olympus and possibly Samsung xxx are going to be released in the coming month(s). First demos of Optimus 2x (aka Star) have proven the product is hot. In fact, Nvidia came first with dual-core to the smartphone market and is really challenging Qualcomm dominance in the smartphone processor market.

    Many people will buy what comes first.

    Expect HTC, SonyEricsson to have a look at Tegra 2. I expect Tegra 3 will become even stronger in the market.

    2) A tsunami of Tablet PC with tegra 2 will hit the shelves in early 2011:

    - Motorola tablet with Honeycomb 3.0 (Google chose Tegra/Motorola as partners for the first release of Honeycomb)

    - Toshiba Folio 100, Viewsonic G, Asus Eee Tablet, Notion Ink (could be huge), Acer tablet, ICD gemini and vega, Samsung galaxy tab 2(!!), Compal tablet, MSI tablet, Quanta tablet, Foxconn tablet, Matala SMBA, Advent Vega, Mobii, eLocity, LuvPad AD100,...

    Some estimate that >50% of all ARM based tablets will be Tegra 2 based

    3) Bumpgate has passed, and Nvidia is again leading the pack in GPUs for gaming and for supercomputing. Don't forget that Sandy Bridge is worthless for high-end games (certainly good for playing Pac-Man :0)

    Now about chipsets: Intel's Otellini has been making major mistakes since he became the boss of Intel. He has bought companies with low synergy (Infineon's wireless unit and, even worse, McAfee), and he has weakened his desktop PC segment by pushing Nvidia out of the chipset business (very good for AMD, btw; expect a flood of AMD Fusion PCs in the second half of 2011). And worst of all, Intel is still failing in the smartphone segment. I do expect Intel to fall apart (yes, yes) in the coming 2-5 years, even though it sounds unbelievable today. The only way out for Intel is to start designing Cortex devices, "which is out of the question, because we are Intel, for God's sake." A little bit like Nokia going down the toilet in the smartphone segment because they don't want to make Android smartphones.

    So, the loss of the chipset segment certainly means some cash flow out of Nvidia's pocket, but on the other hand it is a good thing, forcing Nvidia to focus on the real market of today, tablets and smartphones, where the growth is.

    Q1-Q2 2011 is going to be huge for Nvidia. I expect NASDAQ:NVDA to hit the $20-$25 range in Q1 2011. Good shopping.

    PS: About x86. x86 and backward compatibility used to be the main advantage of the Intel-Windows paradigm, but today the new paradigm in the Google/ARM world is going to be low-power Cortex plus virtualization/cloud. All operating systems will be runnable on ARM devices in the near future.

  • Report this Comment On December 19, 2010, at 4:18 AM, LazeLaze wrote:

    Reply to "drsynthesis":

    So your argument is that NVIDIA will be unaffected by this decision because they'll still have a foothold in Apple's desktop market?

    Apple sells relatively few desktops. In fact, it recently shut down its server line because business was so poor and competition so fierce. Apple makes most of its money in the mobile and laptop markets, and that is exactly where NVIDIA has been hurt here. How much NVIDIA was hurt will depend on whether the higher-end mobile devices and laptops continue to be equipped with NVIDIA chips. Apple's desktop performance has barely any effect on anything...

  • Report this Comment On December 30, 2010, at 12:13 PM, GBIL wrote:

    The CPU will become increasingly less important with the adoption of NVIDIA's Fermi/CUDA architecture. With hundreds or even thousands of cores on the GPU, it will be possible to crunch in minutes what now takes hours or days.

    If all one is doing is word processing, spreadsheets, email, and browsing, their phone has way more power than they need. But otherwise, CUDA and its successors appear to be the future.
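    [Editor's note: GBIL's point is the data-parallel model CUDA is built on: one small "kernel" function runs once per array element, and on a GPU each of those runs lands on its own thread across thousands of cores. A minimal sketch of that structure, using the textbook SAXPY operation (y = a*x + y) written in plain Python rather than actual CUDA C, purely for illustration:]

    ```python
    # SAXPY (y = a*x + y), the classic introductory CUDA example.
    # On a GPU, each index i would be computed by its own thread;
    # here the loop stands in for that massively parallel launch.

    def saxpy_kernel(i, a, x, y):
        # Body of the "kernel": what one GPU thread computes for index i.
        y[i] = a * x[i] + y[i]

    def saxpy(a, x, y):
        # On a GPU this explicit loop disappears: all indices run concurrently.
        for i in range(len(x)):
            saxpy_kernel(i, a, x, y)
        return y

    x = [1.0, 2.0, 3.0]
    y = [10.0, 20.0, 30.0]
    print(saxpy(2.0, x, y))  # [12.0, 24.0, 36.0]
    ```

    [Because every element is independent, the work scales with core count — which is why a many-hundred-core GPU can collapse hours of this kind of number crunching into minutes, as the comment above argues.]
    
    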
