In late 2013, graphics chip company NVIDIA (NASDAQ:NVDA) introduced a technology called G-SYNC, a hardware solution that aims to fix, once and for all, the problem of GPUs and displays falling out of sync, rendering and refreshing at different rates and causing unpleasant visual side effects for gamers. Monitors with G-SYNC are currently on the market, and the technology has been demonstrated to vastly improve the quality of the PC gaming experience.
There's a catch, though. G-SYNC is a proprietary technology, and monitor manufacturers can only get the necessary hardware directly from NVIDIA. G-SYNC monitors have a significant price premium over standard monitors for this reason. In addition, G-SYNC works only with NVIDIA GPUs, leaving those with Advanced Micro Devices (NASDAQ:AMD) products unable to use the technology.
AMD has developed an alternative solution, called FreeSync, which relies on the DisplayPort 1.2a standard. Most newer graphics cards and monitors now feature a DisplayPort interface, meant to replace the old DVI and VGA interfaces, although support for G-SYNC-like functionality will be on a monitor-to-monitor basis.
There are no FreeSync monitors on the market yet, but they're expected to be released soon. Can NVIDIA's proprietary solution catch on, effectively locking customers into NVIDIA's ecosystem? Or will AMD's standards-based alternative render G-SYNC irrelevant?
What exactly does G-SYNC do?
GPUs are designed to render frames as quickly as possible, and the rate at which they do so can vary wildly depending on what's happening on-screen. Monitors, in contrast, are designed to refresh the screen at a fixed rate, typically 60 times per second, although monitors with higher refresh rates are available.
This poses a problem. If the GPU is allowed to update the image to be displayed while the monitor is actively refreshing the screen, the result is a visible tearing effect, where parts of different frames are displayed simultaneously. The solution for many years has been a software feature called vertical sync, which forces the GPU to wait until the monitor is ready to refresh the screen before updating the image. This eliminates the screen tearing, but because some frames will end up being displayed longer than intended when frame rates are low, stutter and lag can result.
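The timing math behind the vsync trade-off is simple to sketch. The model below is a simplified illustration, not NVIDIA's or AMD's actual implementation: it assumes a 60 Hz monitor and shows how vsync rounds every frame's on-screen time up to the next refresh boundary, so a frame that takes even slightly longer than 16.7 ms to render ends up displayed for a full 33.3 ms.

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.67 ms between refreshes

def vsync_display_time(render_ms):
    """Time a frame stays on screen under vertical sync.

    The GPU must wait for the next scheduled refresh, so display time
    is rounded up to a whole number of refresh intervals.
    """
    intervals = max(1, math.ceil(render_ms / REFRESH_INTERVAL_MS))
    return intervals * REFRESH_INTERVAL_MS

# A 14 ms frame fits in one refresh (~16.7 ms on screen), but a 20 ms
# frame is held for two refreshes (~33.3 ms) -- the visible stutter.
for render_ms in (14, 20):
    print(f"{render_ms} ms render -> {vsync_display_time(render_ms):.1f} ms on screen")
```

That jump from one refresh interval to two is why stutter appears as soon as frame rates dip below the monitor's refresh rate.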
G-SYNC and FreeSync allow the monitor itself to adjust its refresh rate to match the GPU, rather than forcing the GPU to wait as vertical sync does. This not only eliminates screen tearing but also greatly reduces stutter and lag. Reviews of G-SYNC have so far been glowing: a handful of G-SYNC monitors have been available for a few months, and more were announced at CES earlier this month. FreeSync monitors are expected to first launch in the early part of this year.
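In the same simplified terms, a variable-refresh monitor waits for the frame instead of the other way around: display time simply tracks render time, clamped to the panel's supported refresh range. The 30–144 Hz limits below are illustrative assumptions, not any specific monitor's specification.

```python
def adaptive_display_time(render_ms, min_ms=1000 / 144, max_ms=1000 / 30):
    """Time a frame stays on screen with a variable refresh rate.

    The monitor refreshes as soon as the frame is ready, so display
    time matches render time, clamped to the panel's refresh range
    (illustrative limits: 144 Hz max, 30 Hz min).
    """
    return min(max(render_ms, min_ms), max_ms)

# A 20 ms frame, which vertical sync would hold on screen for ~33.3 ms
# on a 60 Hz panel, is now shown for exactly 20 ms: no tearing, and
# far less stutter.
print(f"20 ms render -> {adaptive_display_time(20):.1f} ms on screen")
```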
Why is this important?
One thing that's clear from reading reviews of G-SYNC monitors is that variable refresh rate monitors are here to stay. This is one of the most dramatic improvements in display technology in quite some time, and it fixes a problem that has plagued PC gamers for years.
The only question remaining is whether NVIDIA's proprietary solution can survive AMD's standards-based alternative. NVIDIA does have some important advantages. First, it sells far more GPUs than AMD, controlling about 70% of the market during the third quarter. Second, G-SYNC monitors are already on the market, while FreeSync monitors don't yet have a firm release date.
The cost of NVIDIA's G-SYNC solution may be a problem, though. G-SYNC monitors are expensive, carrying a price premium measured in the hundreds of dollars. One G-SYNC monitor from AOC, for instance, costs about $200 more than the non-G-SYNC version. The feature seems to be aimed at the high-end PC gaming market, where gamers spend hundreds or even thousands of dollars on graphics cards and such a premium isn't a big deal. But G-SYNC won't become a mainstream feature until prices come down.
Because AMD's FreeSync isn't proprietary, FreeSync monitors should presumably carry a smaller premium than their G-SYNC counterparts. But there's a wrinkle in AMD's plan: the Adaptive-Sync feature FreeSync relies on is an optional part of the DisplayPort standard, meaning it will be up to monitor manufacturers to choose to implement it. As a result, there will likely be some sort of price premium over standard monitors, although prices, and indeed even firm release dates, for FreeSync monitors are still unknown.
NVIDIA has been trying to build an ecosystem around its graphics cards, and G-SYNC is another piece of that plan. The company's GeForce Experience software adds functionality to its GPUs, like the ability to easily stream gameplay to services like Twitch, and G-SYNC is another attempt to create meaningful switching costs for those considering upgrading to an AMD graphics card.
If NVIDIA can get enough of its high-end customers buying G-SYNC monitors, the company could increase its stranglehold on the high-end GPU market. AMD was forced to slash prices on many of its GPUs when NVIDIA introduced the GTX 970 and 980 a few months ago, and AMD won't have an answer in the form of new products until later this year.
Right now, the only option available is G-SYNC, and the longer that remains true, the longer NVIDIA has to lock customers into its ecosystem. NVIDIA has a first-mover advantage with this technology, and ultimately G-SYNC's success will depend on how widespread FreeSync support becomes. If a large number of new monitors support the standard, NVIDIA could be forced to support it as well, effectively killing G-SYNC. For now, though, NVIDIA has a big advantage, particularly in the high-end part of the market where the G-SYNC price premium matters less. This is where most of the profit in the industry comes from, so it's certainly not a bad advantage to have.