Over the last several years, Intel (INTC 2.96%) has made it clear that it's focused on developing increasingly robust graphics solutions for integration with its processors. In just five years, the company has gone from shipping a barely passable integrated graphics solution with its Westmere family of processors to putting out some pretty respectable solutions, particularly with its Iris Pro integrated graphics engines.

The overarching motivation behind this large investment is simple: Intel wants to increase the average selling prices of its processors by integrating functionality into its chips that would otherwise require third-party, stand-alone devices.

With that in mind, integrating increasingly powerful graphics wouldn't make sense unless customers were willing to pay for it. Here's one big reason that Intel's PC customers may be willing to do just that in the coming years.

The 4K revolution is happening; 8K is on the horizon
At its recent developer forum in San Francisco, Intel hosted a presentation in which it discussed future trends in all-in-one desktop technologies. One area of focus, unsurprisingly, was display technologies.

Today, many all-in-one PCs ship with displays featuring a resolution of 1920-by-1080, or "full HD," and it's not hard to find some with 2560-by-1440, or "quad HD," displays. However, as you might have noticed, Apple (AAPL 1.30%) recently launched a 21.5-inch iMac with a 4K display (four times the pixels of 1920-by-1080) and has been shipping 27-inch iMacs with 5K displays for about a year now.

According to Intel, 4K displays will go "mainstream" (remember that Apple iMacs represent very high-end products) in 2016.

Source: Intel. 

Driving a 4K display is extremely graphics/media intensive, which is why Apple chose Intel's highest-end Broadwell processor with Iris Pro graphics to power the 4K iMac. A 5K display is even more demanding, which is probably why the 27-inch Retina 5K iMac still uses a stand-alone graphics processor rather than Intel's integrated graphics (though I think this could change with the launch of Intel's Kaby Lake chip in late 2016/early 2017).

However, even if Intel's current integrated graphics can handle 4K displays today and potentially 5K displays in the future, the next challenge -- at least according to Intel -- is the shift to 8K displays.

The performance requirements for 8K will skyrocket
The resolution that many people refer to when they talk about "8K" is 7680-by-4320 pixels. To put this into perspective, that's four times the number of pixels found in a 4K display and a whopping sixteen times the number of pixels in a 1920-by-1080 display.

That's a lot of pixels.
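If you want to sanity-check those multipliers, the arithmetic above can be worked out in a few lines (the resolutions are the standard ones; only full HD, quad HD, 4K, 5K, and 8K as discussed in this article are included):

```python
# Pixel counts for the display resolutions discussed above,
# expressed as multiples of a full HD (1920-by-1080) panel.

resolutions = {
    "full HD": (1920, 1080),
    "quad HD": (2560, 1440),
    "4K UHD": (3840, 2160),
    "5K": (5120, 2880),
    "8K UHD": (7680, 4320),
}

full_hd_pixels = 1920 * 1080  # 2,073,600 pixels

for name, (width, height) in resolutions.items():
    pixels = width * height
    ratio = pixels / full_hd_pixels
    print(f"{name}: {pixels:,} pixels ({ratio:.0f}x full HD)")
# 8K works out to 33,177,600 pixels -- 16x full HD and 4x 4K,
# exactly as described above.
```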

In order to drive those pixels, Intel will need to deliver substantial enhancements to its 3D graphics engine (so that, at the very least, casual games will run smoothly at 8K resolution), its media encode/decode engines, and its display engines.
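To get a rough sense of why the display-engine workload skyrockets, here is a back-of-the-envelope estimate of the uncompressed bandwidth needed just to push frames to the panel. The 60 Hz refresh rate and 24-bit color depth are my assumptions, not figures from Intel's presentation:

```python
# Rough, uncompressed bandwidth required to drive a display.
# Assumes 60 Hz refresh and 24 bits per pixel (8 bits per RGB channel);
# these assumptions are mine, not from Intel's slides.

def uncompressed_gbps(width: int, height: int,
                      refresh_hz: int = 60, bits_per_pixel: int = 24) -> float:
    """Bits per second for an uncompressed video signal, in gigabits."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"4K: {uncompressed_gbps(3840, 2160):.1f} Gbit/s")  # ~11.9 Gbit/s
print(f"5K: {uncompressed_gbps(5120, 2880):.1f} Gbit/s")  # ~21.2 Gbit/s
print(f"8K: {uncompressed_gbps(7680, 4320):.1f} Gbit/s")  # ~47.8 Gbit/s
```

Even before any 3D rendering or media decode happens, an 8K panel at these settings needs roughly four times the raw display bandwidth of a 4K panel.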

Intel's current processor family won't be enough (without a stand-alone graphics card), and even the next generation Kaby Lake processor family likely won't have the ability to drive 8K displays (leaked slides suggest it can handle 5K displays, though).

Enter Cannonlake -- the Intel processor designed for 8K?
Although Intel's late 2016-to-late 2017 processor family known as Kaby Lake likely won't be enough for 8K all-in-one systems, the company's next generation processor family known as Cannonlake might be able to do it.

With Cannonlake, Intel will be transitioning from its 14-nanometer manufacturing process to its 10-nanometer process. This means, in a nutshell, that the company will be able to pack in far more transistors (and ultimately functionality) into the chip than it could with its prior generation 14-nanometer technology.

And, of course, the company will be introducing a new graphics architecture as well.

According to Intel, 8K displays will start shipping in the 2018 timeframe, start ramping volume in 2019, and become mainstream by 2020. It would make sense for Cannonlake to bring support for 8K and for future processors (Icelake and beyond) to further improve graphics/media performance at that resolution.

Bottom line? Expect Intel to keep pushing on graphics
If 8K resolutions really do become "mainstream" by 2020, then Intel will have to keep investing heavily in its integrated graphics solutions, particularly if it wants system vendors to pay a premium for chips with its best integrated graphics rather than pairing cheaper chips with stand-alone graphics processors.