Apple's latest iMacs. Image credit: Apple. 

Last month, Apple (NASDAQ:AAPL) refreshed its iMac line. The company introduced a new 21.5-inch iMac with a Retina 4K display and refreshed its 27-inch iMacs (with Retina 5K displays) to include Intel's (NASDAQ:INTC) latest Skylake processors.

Interestingly enough, Apple eliminated the discrete GPU option on all of its 21.5-inch iMacs, opting instead to use the integrated graphics engine that ships as part of the Intel processors inside. The 27-inch iMac, however, still comes in configurations with stand-alone graphics processors supplied by Advanced Micro Devices (NASDAQ:AMD).

In an earlier column, I argued that Intel's integrated graphics engines may soon become good enough for Apple to drop the stand-alone graphics processor from its 15-inch Retina MacBook Pro.

Here I would like to lay out why I believe Apple could soon reduce its use of discrete graphics processors in its Retina 5K iMac product line, opting instead to transition at least some models, if not its entire Retina 5K product lineup, to graphics processors that come integrated directly with the main processor. 

Intel's upcoming Kaby Lake chip should be able to do it
Details of Intel's next-generation processor family, known as Kaby Lake, leaked recently. One of the features shown in the leaked slides is support for 5K displays, an improvement over Skylake, which supports "only" up to 4K displays.

Now, support for 5K displays is nice, but it is certainly not the only requirement for replacing a stand-alone graphics chip. The graphics processor itself needs enough horsepower to drive such a display while delivering good performance in the kinds of applications that buyers of such a system actually run.

Interestingly, a leak courtesy of a generally reliable source suggests that the highest-end configuration of Kaby Lake (Kaby Lake with GT4 graphics) will ship with two 128-megabyte blocks of embedded DRAM, or eDRAM.

Intel has included fast, on-package eDRAM chips with its highest-end processors for quite a while now, beginning with the Haswell generation of processors in 2013, in order to try to overcome the bandwidth limitations associated with dual-channel DDR3/DDR4 memory interfaces.

Some have speculated that Intel might be including two 128-megabyte blocks of eDRAM in the top Kaby Lake model (rather than the single block found in Haswell, Broadwell, and Skylake) in a bid to increase the effective bandwidth to and from the cache, improving overall performance. I think this explanation is reasonable.
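To put that bandwidth argument in rough perspective, here is a back-of-the-envelope comparison. The DDR4 figure follows from the interface math; the per-block eDRAM figure is based on Intel's Haswell-era disclosures (roughly 50 GB/s in each direction); the two-block number simply assumes, speculatively, that a second block doubles it:

```python
# Back-of-the-envelope peak memory bandwidth, in GB/s.
# The dual-eDRAM figure is speculative, mirroring the leak discussed above.

def dual_channel_ddr4(mt_per_s=2133, bytes_per_transfer=8, channels=2):
    """Peak bandwidth of a dual-channel DDR4 interface (64-bit channels)."""
    return channels * bytes_per_transfer * mt_per_s / 1000

EDRAM_PER_BLOCK = 50  # GB/s each direction, per Intel's Haswell-era figures

print(f"Dual-channel DDR4-2133:            {dual_channel_ddr4():.1f} GB/s")
print(f"One eDRAM block:                   {EDRAM_PER_BLOCK} GB/s per direction")
print(f"Two eDRAM blocks (speculative):    {2 * EDRAM_PER_BLOCK} GB/s per direction")
```

Even one eDRAM block comfortably outruns the main memory interface, which is why doubling it could plausibly translate into a meaningful graphics uplift.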

If this is the case, then this may point to a significant boost in 3D graphics performance in going from Skylake with GT4 graphics to Kaby Lake with GT4 graphics (and Skylake with GT4 graphics is already expected to deliver a large performance boost over the Broadwell chip with GT3 graphics that is currently shipping in the 21.5-inch 4K iMacs).

This performance increase might actually be enough to make such a chip a credible option to drive, at the very least, an entry-level 5K iMac (assuming Apple would still want to offer variants with stand-alone graphics).

A potential win/win for Intel/Apple
Intel has made it clear that it wants to capture as much of the PC platform bill of materials as it can, and Apple is sure to be interested in a solution that lowers overall system cost without sacrificing user experience.

I think a Kaby Lake chip with enough graphics horsepower to power a 5K iMac would be a win/win for both companies.

Intel would win because, instead of selling Apple a quad-core processor with its lowest graphics tier, it could offer the iDevice maker higher-end chips with faster graphics, improving its dollar content within the Retina 5K iMac models.

Apple would benefit in this case as, from a total bill of materials perspective, an integrated graphics solution is probably cheaper -- even if we factor in a premium for the Intel silicon with better graphics -- than using a stand-alone graphics solution (which requires separate cooling, memory, and so on).