In 2013, graphics specialist NVIDIA (NASDAQ:NVDA) announced a technology that it dubbed G-SYNC. It allows a G-SYNC-capable computer monitor (working with a compatible NVIDIA GPU) to vary its refresh rate to match the rate at which a video game's frames are rendered (known as the frame rate), eliminating a visual artifact known as "tearing."
Those G-SYNC monitors each included a G-SYNC module -- a relatively expensive, complex piece of hardware that enabled the feature.
It wasn't long before the Video Electronics Standards Association (VESA) introduced the Adaptive-Sync standard, which can be implemented in standard display scaler chips that are much cheaper than NVIDIA's G-SYNC modules. Over the last several years, computer monitors that support VESA Adaptive-Sync have become ubiquitous -- and for years, NVIDIA didn't support the feature on its GPUs.
The fact that NVIDIA's GPUs required pricey G-SYNC monitors to deliver gaming with variable refresh rates, while competitor Advanced Micro Devices (NASDAQ:AMD) could deliver a similar feature with cheaper, more widely available displays, wasn't ideal for NVIDIA.
And more recently, chip giant Intel (NASDAQ:INTC) -- which plans to enter the discrete GPU market in 2020 and become another competitor for NVIDIA -- made it clear that it intends to support VESA Adaptive Sync with its future GPUs.
On Jan. 7, NVIDIA announced a move that should eliminate this particular competitive weakness.
NVIDIA announced that it will test monitors that support the VESA Adaptive Sync standard. The graphics specialist says that "[those] that pass our validation tests will be [called] G-SYNC Compatible and enabled by default in the GeForce driver."
To be clear, this doesn't appear to end NVIDIA's higher-end G-SYNC efforts. In fact, NVIDIA explicitly says that "[for] the best gaming experience, we recommend NVIDIA G-SYNC and G-SYNC Ultimate monitors..."
However, as a result of this move, variable-refresh-rate gaming on NVIDIA GPUs won't be limited to the pricier G-SYNC monitors. At the same time, there still appear to be reasons for less cost-conscious gamers to choose higher-end G-SYNC monitors.
Covering both ends of the market
Ultimately, this is the right move for the company. At the very high end of the market -- those who buy NVIDIA's priciest GPUs and are willing to splurge on displays -- the company should still be able to sell G-SYNC and G-SYNC Ultimate monitors.
But for gamers on budgets -- say, the kind who buy a $200 GeForce GTX 1060 or even something cheaper, like the GTX 1050 -- an expensive G-SYNC monitor simply doesn't make sense. Before this announcement by NVIDIA, these gamers might have been attracted to AMD's comparable offerings because they knew they could get variable-refresh-rate gaming at a much lower cost. But this competitive disadvantage for NVIDIA should now be eliminated.
Also keep in mind that NVIDIA appears to be limiting this feature to GeForce 10-series and GeForce 20-series GPUs -- not so coincidentally the GPU families that NVIDIA currently has for sale -- so this could serve as an added incentive for users of older GeForce cards to upgrade to new cards.