VSync, FreeSync, and G-Sync explained - what they are and why they matter for gaming

If you've ever looked at the video or graphics options in a PC game, you've likely noticed a toggle for a setting called VSync. Or if you've been shopping for the best gaming monitor, you may have seen the terms G-Sync and FreeSync used liberally in marketing copy. While copywriters and manufacturers may make a big deal out of these features, for the uninitiated those terms are just more confusing technical jargon (the sort we address in our big hardware glossary). 

So what do they actually mean? Do you need a monitor with G-Sync or FreeSync? And how much of a difference do they actually make? I'll take you through each term in depth, explain what it means, and sort out how much of the jargon is just empty marketing fluff, starting with the granddaddy of these technologies, VSync. 

An example of screen tearing

What is VSync?

VSync, or vertical synchronization, was designed primarily to address one central issue in display technology: screen tearing. Screen tearing, illustrated in the example image above, is a kind of visual artifact that makes it appear as though an image is divided across a horizontal line, or tear. Tearing is most commonly the result of a mismatch between the frames your hardware is outputting and the refresh rate of your display; if your gaming PC is sending the display 100 frames per second, but your monitor or TV only has a 60Hz refresh rate (the equivalent of 60 FPS), tearing occurs as the graphics card sends a new frame while the current frame is still being displayed. 
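You can see why the tear line moves around the screen with a toy calculation. This is a deliberately simplified model (real scanout and GPU timing are more complicated): a GPU running at 100 FPS delivers a frame every 10ms, while a 60Hz display takes about 16.7ms per refresh, so new frames routinely arrive partway through a scanout, and that arrival point is where the tear appears.

```python
# Toy model of tearing - NOT real GPU code, just the timing arithmetic.
REFRESH_MS = 1000 / 60   # one 60Hz scanout takes ~16.7ms
FRAME_MS = 1000 / 100    # a 100 FPS GPU finishes a frame every 10ms

def tear_positions(num_frames):
    """Return how far down the screen (0.0-1.0) each new frame lands."""
    positions = []
    for n in range(1, num_frames + 1):
        arrival = n * FRAME_MS                       # when frame n is ready
        progress = (arrival % REFRESH_MS) / REFRESH_MS
        positions.append(round(progress, 2))         # 0.0 = top of screen
    return positions

print(tear_positions(4))  # → [0.6, 0.2, 0.8, 0.4]
```

Because 100 and 60 don't divide evenly, each new frame lands at a different point in the scanout, which is why the tear line appears to crawl up or down the screen rather than staying put.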

VSync attempts to reduce or eliminate tearing by forcing your graphics hardware to match (or sync) with the refresh rate of your display. When frames are rendered and displayed at the same rate, tearing is much less likely, and this solution is a robust way to avoid tearing and other artifacting. VSync does introduce a few issues of its own, however, as a result of the way it throttles performance.

The first and most obvious issue is that frame rates will be limited to the refresh rate of your display, so while tearing may be significantly reduced you won't reach the performance ceiling of your GPU. VSync can also introduce input lag which, while generally fairly minor, can become problematic for genres that demand high levels of precision, like rhythm or fighting games. It can also cause issues when displaying videos or movies filmed at significantly slower frame rates, creating a juddering effect where people and objects shake or move strangely on screen. FreeSync attempts to address some of these limitations.
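The performance cost can be worse than a simple cap, too. As a rough sketch (assuming classic double buffering, and using hypothetical render times), a finished frame can only be shown at the next vertical blank, so a frame that takes even slightly longer than one 16.7ms refresh has to wait a full extra refresh, dropping its effective rate to 30 FPS rather than, say, 55:

```python
# Toy sketch of VSync with double buffering - hypothetical timings.
import math

REFRESH_MS = 1000 / 60  # a 60Hz display blanks every ~16.7ms

def effective_fps(render_ms):
    """Effective FPS once a frame waits for the next vertical blank."""
    refreshes_waited = math.ceil(render_ms / REFRESH_MS)
    return round(1000 / (refreshes_waited * REFRESH_MS))

print(effective_fps(10))  # 60 - renders faster than refresh, capped at 60
print(effective_fps(18))  # 30 - just misses a blank, waits for the next one
```

Techniques like triple buffering soften this stair-step effect, but the underlying constraint (frames can only appear on a fixed refresh schedule) is exactly what FreeSync and G-Sync were designed to remove.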

An example of FreeSync in action on a supported monitor

What is FreeSync?

Whereas VSync attempts to adjust the rate at which frames are transmitted to displays, AMD's FreeSync technology takes the opposite tack. It attempts to dynamically adjust the display's refresh rate to match the rate at which graphics hardware is outputting frames. Because the refresh rate is dynamic, FreeSync can keep up with changes in FPS during rendering on the fly - demanding sections of a game that cause the frame rate to dip drastically won't affect syncing. 

The way FreeSync achieves this is actually really simple, and kind of elegant. It just forces the monitor to continue displaying the current frame until a new one is received from the graphics hardware, so it's always displaying images at the same rate as they're being outputted (so long as the rate falls between 9 and 240 FPS, the supported refresh range for FreeSync displays). And because it's a hardware solution on the display side, there's no performance penalty for enabling FreeSync. AMD announced the second generation of FreeSync in 2017, the primary selling points of which were HDR support and low latency.
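The idea above can be boiled down to a few lines. This is a deliberately simplified model (real FreeSync implementations handle out-of-range frame rates in more sophisticated ways, so the clamping here is just an illustrative assumption): inside the supported range, the display's refresh rate simply tracks whatever the GPU is producing.

```python
# Toy model of variable refresh - simplified, not AMD's actual behavior.
MIN_HZ, MAX_HZ = 9, 240  # FreeSync's supported refresh range

def display_refresh_hz(gpu_fps):
    """Refresh rate a FreeSync display adopts for a given GPU frame rate."""
    # Inside the range, refresh rate == frame rate: the panel just holds
    # the current frame until the next one arrives. Outside it, this
    # sketch simply clamps to the edge of the range.
    return max(MIN_HZ, min(MAX_HZ, gpu_fps))

print(display_refresh_hz(48))   # 48 - the display matches the GPU exactly
print(display_refresh_hz(300))  # 240 - clamped to the top of the range
```

The key contrast with VSync: instead of the GPU waiting on the display's fixed schedule, the display waits on the GPU, so a dip to 48 FPS just becomes a 48Hz refresh with no tearing and no stair-step to 30.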

One of the major downsides to FreeSync is that, barring a fairly major workaround, it requires an AMD GPU to work. If you're using one of Nvidia's cards you're out of luck, and will instead need a G-Sync display.

What is G-Sync?

G-Sync works on a principle very similar to FreeSync, adapting the monitor's refresh rate to the frame rate your hardware is outputting. Unlike FreeSync, which is built on VESA's royalty-free DisplayPort Adaptive-Sync protocol (and also works over HDMI), G-Sync relies on a proprietary Nvidia module that must be built into a display for the tech to function. 

Like FreeSync, G-Sync requires a GPU from the same manufacturer to function, so you'll need an Nvidia card, specifically a GTX 650 Ti or higher. That said, at this year's CES Nvidia announced that it will be bringing G-Sync support to a limited selection of FreeSync monitors, and that it will let you test G-Sync on any FreeSync monitor, though any unsupported monitor "may work partly or not work at all" with G-Sync enabled. This is welcome news for a lot of people with PC builds centered on Nvidia GPUs, because FreeSync monitors are largely more affordable than their G-Sync counterparts. Unfortunately, as of this writing only 12 monitors are supported, though Nvidia has announced plans to test more monitors for the program. 

How important is sync for gaming, and do I need G-Sync or FreeSync?

Generally speaking, VSync is extremely important for gaming, not just for avoiding tearing but for ensuring an overall smoother experience. This is especially true if you're running gaming hardware that's outputting more frames than your display can handle. Whether or not you need to invest the extra dollars in FreeSync or G-Sync is a slightly thornier conversation, however.

Overall, a G-Sync- or FreeSync-equipped display is going to cost you more than an equivalent display that doesn't support either technology. This is especially true for G-Sync, largely because of the cost of including Nvidia's proprietary G-Sync module. Whether or not the additional investment is worth it comes down, in large part, to what sort of gamer you are and what kind of rig you game on.

If you play a lot of fighting games, rhythm games, or twitch shooters that require lightning reflexes and hardware that can keep pace with them, the input lag inherent in VSync solutions can be a serious issue. This is also true if you play practically any game at a high level of competition, where a few frames of difference can be the margin between victory and defeat. Playing with VSync enabled makes a display feel slightly more sluggish, while G-Sync and FreeSync feel like cleaner, smoother solutions.

On the other hand, the difference in input lag is going to be a scant few milliseconds under most conditions, so if you don't spend a lot of time playing those genres and aren't an eSports professional, there's a good chance you may never notice. It's also less important on older or lower-spec hardware - if your GPU is never spitting out more frames than your refresh rate can handle, you're not likely to experience a lot of tearing or other issues related to sync. It's also less of an issue on really high-spec cards - given the extreme FPS possible on very high-end hardware, you're not likely to notice a few lost frames here and there due to the performance hit from VSync. FreeSync and G-Sync are thus most important (and impactful) for mid- to upper-mid-range systems that will overproduce frames but will also be more sensitive to performance loss. 

For most users, the difference between VSync and G-Sync/FreeSync isn't going to be major enough to warrant a significant price markup. But if you have the money to spare, or you're the sort of early adopter or gamer that demands the very best hardware, G-Sync and FreeSync are definitely better solutions. 

Alan Bradley

Alan Bradley was once a Hardware Writer for GamesRadar and PC Gamer, specialising in PC hardware. But, Alan is now a freelance journalist. He has bylines at Rolling Stone, Gamasutra, Variety, and more.