Vertical Synchronization (V-Sync), Part 1:
G-SYNC? ADAPTIVE SYNC? FREESYNC?
What is V-Sync? Should you turn it on or off?
If you play video games, you may have encountered this option in the video settings. It’s usually called “vertical sync,” or “VSync” for short, and what it does isn’t immediately obvious.
So why is this option here, and what does it do? What forms does it take?
When your graphics processor renders a 3D scene, it processes frames as quickly as possible.
GPU - FPS (Frames per second)
Monitor - Hz (refresh rate)
Your screen is always trying to keep up with the frames your graphics processor is producing.
The maximum number of frames it can display per second is determined by its refresh rate, which is expressed as a frequency in hertz (Hz). V-Sync locks the ratio at 1:1, so a 60 Hz monitor can show up to 60 FPS.
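The 1:1 relationship above can be sketched in a few lines of Python. This is purely illustrative (the function names are my own, not any graphics API): with V-Sync on, displayed FPS is capped at the refresh rate, and each frame has a fixed time budget.

```python
# Illustrative sketch only: relating refresh rate (Hz) to the V-Sync frame
# cap and the per-frame time budget. Names are hypothetical, not from any API.

def vsync_fps_cap(refresh_hz: float) -> float:
    """With V-Sync on, displayed FPS cannot exceed the refresh rate (1:1)."""
    return refresh_hz

def frame_budget_ms(refresh_hz: float) -> float:
    """Time the GPU has to finish each frame before the next refresh."""
    return 1000.0 / refresh_hz

print(vsync_fps_cap(60))                  # a 60 Hz monitor shows at most 60 FPS
print(round(frame_budget_ms(60), 2))      # ~16.67 ms per frame at 60 Hz
```

So at 60 Hz the GPU has roughly 16.67 ms to deliver each frame; finishing faster than that is where the trouble described next begins.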
In short, if your in-game FPS is higher than your monitor’s native refresh rate (for example, 100 FPS in-game on a 60 Hz monitor), you will suffer from “screen tearing,” as shown in the picture below:
Picture courtesy of Tom’s Hardware Forum
Going back a few years, NVIDIA was the first to come up with Adaptive-VSync (not to be confused with VESA’s Adaptive-Sync). The whole idea was to eliminate stuttering and screen tearing.
Stuttering tends to occur when FPS is low, whereas screen tearing occurs when FPS is higher than the monitor’s refresh rate.
Adaptive-VSync is enabled through the NVIDIA Control Panel. When FPS is high, Adaptive-VSync automatically turns V-Sync on to eliminate tearing; when FPS is low, it turns V-Sync off to minimize stuttering.
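Adaptive-VSync’s decision rule can be sketched like this. This is a deliberate simplification of mine, not NVIDIA’s actual driver logic: V-Sync stays on only while the game can keep pace with the monitor.

```python
# Minimal sketch (my own simplification, not NVIDIA driver code) of the
# Adaptive-VSync decision rule described above.

def adaptive_vsync_on(current_fps: float, refresh_hz: float) -> bool:
    # At or above the refresh rate, tearing is the problem -> enable V-Sync.
    # Below it, V-Sync would add stutter -> disable it and let frames through.
    return current_fps >= refresh_hz

print(adaptive_vsync_on(100, 60))   # True: cap output to stop tearing
print(adaptive_vsync_on(45, 60))    # False: V-Sync off to avoid stutter
```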
So what happened to Adaptive-VSync?
Since input lag is still very common with traditional V-Sync, NVIDIA introduced the almighty G-SYNC, which eliminates stuttering, tearing, and the aforementioned V-Sync-related input lag.
G-SYNC is a proprietary adaptive sync technology developed by NVIDIA, aimed primarily at eliminating screen tearing and the need for software alternatives such as VSync.
G-SYNC eliminates screen tearing by allowing the display device (monitor) to adapt to the frame rate of the outputting device (graphics card or integrated graphics), rather than the other way around. Traditionally, the display could refresh halfway through a frame being output, resulting in screen tearing: two or more frames shown at once.
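The difference can be sketched with some made-up numbers (the function and figures below are hypothetical, not from any G-SYNC specification). On a fixed-refresh display, a frame that finishes mid-scanout gets torn; an adaptive display instead delays the refresh until the frame is complete.

```python
# Hypothetical illustration of fixed vs. adaptive refresh, with made-up numbers.

def finishes_mid_scanout(frame_done_ms: float, refresh_period_ms: float) -> bool:
    # On a fixed-refresh display, a frame not aligned to a refresh boundary
    # arrives while the previous frame is still being scanned out -> tearing
    # (unless V-Sync forces the GPU to wait, which adds lag).
    return frame_done_ms % refresh_period_ms != 0

# A frame finishing at 21 ms on a 60 Hz panel (~16.67 ms per refresh):
print(finishes_mid_scanout(21.0, 1000 / 60))   # True: swapped mid-scanout

# An adaptive display (G-SYNC-style) instead starts the refresh at 21 ms,
# when the frame is actually ready, so each scanout shows one whole frame.
```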
In order for a display to use G-SYNC, it must contain a proprietary G-SYNC module. This has drawn criticism: an NVIDIA-made module is required in the display for G-SYNC to function, and it works only with select NVIDIA GeForce graphics cards, starting from the Kepler microarchitecture (GTX 650 Ti and later).
Some time later, VESA (the Video Electronics Standards Association, the body that standardized DisplayPort) announced Adaptive-Sync (*not to be confused with NVIDIA’s Adaptive-VSync above), a royalty-free alternative to G-SYNC.
VESA announced Adaptive-Sync as a component of the DisplayPort 1.2a specification.
One of the adopters is AMD with FreeSync, a hardware–software solution that uses the DisplayPort Adaptive-Sync protocol to enable smooth, tear-free, low-latency gameplay. Besides AMD, Intel has also confirmed it intends to support VESA Adaptive-Sync in its future GPUs.
FreeSync is AMD’s dynamic refresh rate technology, first announced in 2014 to compete against NVIDIA’s proprietary G-SYNC.
It is implemented through embedded and external display panels that support HDMI or DisplayPort. FreeSync products are certified by AMD and comply with the VESA standard, so FreeSync can be regarded as an implementation of VESA Adaptive-Sync as well.
*Please note that VESA Adaptive-Sync is specified in DisplayPort only.
In 2017, AMD introduced FreeSync 2, which adds HDR support.