Does Freesync/GSync eliminate the need of a higher refresh rate monitor?

shepardh

Member
Jan 6, 2011
32
0
0
I saw this list a couple days ago:
http://www.guru3d.com/news-story/list-of-freesync-adaptive-sync-compatible-monitors.html

and noticed that currently the 4K displays (at least the FreeSync ones) do not support a 144Hz refresh rate, unlike some of the WQHD models.

I currently own a 120Hz monitor and was unable to go back to 60Hz afterwards.

My question is this: does G-Sync/FreeSync eliminate the need for a 120/144Hz screen, or will the higher refresh rate still make a huge difference?
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
G-Sync/FreeSync make slight changes in framerate pretty much unnoticeable, but they are certainly no replacement for a higher refresh rate.
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
Fair question. I think it's too early to tell, since AFAIK no 60Hz variable refresh monitor has yet been professionally reviewed.

As I understand G-Sync, for instance, it only works when the framerate is at least 30fps. On a 144Hz monitor like the ROG Swift, this means you have a 30-144fps range within which G-Sync can operate. If the monitor is only 60Hz, however (like the 4K monitors announced so far with variable refresh), and if 30fps remains the lower framerate 'threshold,' then G-Sync/FreeSync only has a small 30-60fps window within which to work.
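
To put rough numbers on those windows (just back-of-the-envelope frame-time arithmetic, not figures from any review or spec sheet), here's a quick sketch of the range of frame times a variable refresh display could actually sync to:

```python
# Rough frame-time arithmetic for the VRR windows discussed above.
# A variable-refresh monitor can only sync frames whose render time
# falls inside its supported refresh range.

def frame_time_ms(fps):
    """Milliseconds per frame at a given framerate."""
    return 1000.0 / fps

for low, high in [(30, 144), (30, 60)]:
    print(f"{low}-{high}Hz window: frame times of "
          f"{frame_time_ms(high):.1f}ms to {frame_time_ms(low):.1f}ms")

# 30-144Hz window: frame times of 6.9ms to 33.3ms
# 30-60Hz window: frame times of 16.7ms to 33.3ms
```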

Moreover, high refresh rates on a monitor can contribute to lower motion blur, especially if coupled with a blur-reduction technology like ULMB. Variable refresh technologies address tearing and stutter, but not motion blur.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
G-Sync does not produce higher refresh rates, so no, it doesn't replace them. If all you wanted from a higher refresh rate is smooth gameplay, then it might help replace its benefits.


Moreover, high refresh rates on a monitor can contribute to lower motion blur, especially if coupled with a blur-reduction technology like ULMB. Variable refresh technologies address tearing and stutter, but not motion blur.

Higher refresh rates play no part in reducing motion blur. It just happens that high refresh monitors tend to use technology that reduces motion blur, as it is more needed at those refresh rates. ULMB is also only available at those refresh rates.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
No.

It can make lower refresh rates more tolerable and even out the experience, but it cannot replicate the experience of a higher-refresh display, such as 90, 120, or 144Hz.
 

QuantumPion

Diamond Member
Jun 27, 2005
6,010
1
76
If you cap your FPS, then G-Sync will introduce input lag the same as regular V-Sync. On a 144Hz monitor this isn't as problematic, since most games won't render at that high an fps, but at 60Hz it definitely will be, and most games do not have the option to set an fps limit.
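
For reference, an fps limit is just padding each frame out to a fixed frame time. A minimal sketch of the idea (the 58fps target is a hypothetical example value, nothing more):

```python
import time

TARGET_FPS = 58                    # hypothetical cap, e.g. a bit under a 60Hz refresh
FRAME_BUDGET = 1.0 / TARGET_FPS    # seconds each frame is allowed to take

def run_one_frame():
    # stand-in for the game's simulation + rendering work
    time.sleep(0.005)

for _ in range(10):                # a handful of frames, just for illustration
    start = time.perf_counter()
    run_one_frame()
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)       # pad the frame out to the target frame time
```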
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
If all you wanted from a higher refresh rate is smooth gameplay, then it might help replace its benefits.

That's a common misconception. Yes, Adaptive Sync (the underlying tech behind both AMD's and NV's offerings) does make gaming smoother, but it cannot "replace the benefits" of having a high-refresh panel. You'll get less stutter/lag, but on a 60Hz monitor you'll still be playing on a much slower monitor than you would be on a 144Hz one, and it will show, especially in fast-paced games.

This is why most G-Sync/FreeSync monitors have had high refresh rates too: it's the combination of the two that gives you the best of both worlds.
 

sheh

Senior member
Jul 25, 2005
247
8
81
I think it might help some, because it reduces latency as there's no need to wait for the "next refresh".

Higher refresh rates play no part in reducing motion blur.
It does, because part of the problem is the eyes' persistence of vision (or whatever you'd call it). That's why, for example, motion interpolation on TVs is supposed to help with blur: each static image appears for a shorter while before it changes. It's also why OLED, where the pixels are much quicker than LCD, still suffers from motion blur.
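
A rough way to put numbers on that (my own toy model, assuming the usual rule of thumb that perceived smear is roughly eye-tracking speed times how long each frame is held on screen):

```python
# Toy estimate of sample-and-hold motion blur: smear ~= tracking speed * hold time.
# The 1000 px/s pan speed is just an example value, not a measurement.

def blur_px(speed_px_per_s, refresh_hz):
    """Approximate smear width in pixels while the eye tracks a moving object."""
    hold_time_s = 1.0 / refresh_hz  # full-persistence display: frame shown for the whole period
    return speed_px_per_s * hold_time_s

speed = 1000  # px/s example pan speed
for hz in (60, 120, 144):
    print(f"{hz}Hz sample-and-hold: ~{blur_px(speed, hz):.1f} px of smear")

# 60Hz sample-and-hold: ~16.7 px of smear
# 120Hz sample-and-hold: ~8.3 px of smear
# 144Hz sample-and-hold: ~6.9 px of smear
```

Of course that only applies if the content actually changes every refresh, which is the point of the TV interpolation example above.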
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
That's a common misconception. Yes, Adaptive Sync (the underlying tech behind both AMD's and NV's offerings) does make gaming smoother, but it cannot "replace the benefits" of having a high-refresh panel. You'll get less stutter/lag, but on a 60Hz monitor you'll still be playing on a much slower monitor than you would be on a 144Hz one, and it will show, especially in fast-paced games.

This is why most G-Sync/FreeSync monitors have had high refresh rates too: it's the combination of the two that gives you the best of both worlds.
I said it would "help" replace its benefits. I did not mean to say it would replace them completely. It is better at producing smoothness in some ways, and not as good in others.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I think it might help some, because it reduces latency as there's no need to wait for the "next refresh".

It does, because part of the problem is the eyes' persistence of vision (or whatever you'd call it). That's why, for example, motion interpolation on TVs is supposed to help with blur: each static image appears for a shorter while before it changes. It's also why OLED, where the pixels are much quicker than LCD, still suffers from motion blur.

That is why I said ULMB modes are offered at high refresh rates, but high refresh rates on their own don't help. ULMB is what reduces the persistence.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It's not about strobing, but about less time spent showing each frame before switching to a different one.

I don't remember where I read it, but I think there is some related info here:
http://blogs.valvesoftware.com/abrash/why-virtual-isnt-real-to-your-brain-judder/
http://blogs.valvesoftware.com/abrash/down-the-vr-rabbit-hole-fixing-judder/

Reading through all the areas where they mention persistence, they are clearly stating it is all about the time a pixel is lit. Reducing it requires a blank/black period. With normal LCD tech, pixels remain lit 100% of the time, so a higher refresh rate doesn't change the persistence. The strobing is what reduces it, by having a period of time that allows the image to dissipate from your view.

At least that is what I read. Perhaps you can show me what I missed. If there is a difference, it would be very small.

While the wording clearly states that LCD tech has full persistence and that refresh rates have no impact on that, they do mention that judder is reduced by higher refresh rates. Their definition of judder seems to be a mix of things, including smearing, which I assume includes some ghosting.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Short answer: no, it does not. If I had to pick either G-Sync or a high refresh rate, I'd pick the high refresh rate every time. While G-Sync is nice at lower framerates, if you have the GPU power to match your refresh rate, tearing isn't really a big factor, but the added smoothness is FANTASTIC.
 

sheh

Senior member
Jul 25, 2005
247
8
81
With normal LCD tech, pixels remain lit 100% of the time, so a higher refresh rate doesn't change the persistence. The strobing is what reduces it, by having a period of time that allows the image to dissipate from your view.
If you change the image content quicker, a pixel shows the same thing for a shorter time.

Real life doesn't have strobing, yet there's no smearing, because the view changes continuously as you move your eyes.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
If you change the image content quicker, a pixel shows the same thing for a shorter time.

Real life doesn't have strobing, yet there's no smearing, because the view changes continuously as you move your eyes.

Because of LCD's lack of strobing, even if the refresh happens twice as often, the image still remains; it just might move a couple of pixels over. The links you gave specifically mentioned that higher refresh rates have no effect on persistence. It is judder that is improved, though their definition of judder included a lot of things, so I'm not 100% sure whether they were saying ghosting is improved or not.
 

sheh

Senior member
Jul 25, 2005
247
8
81
An image that has moved a few pixels means something different is cast on your retina, more similar to real life's infinite temporal resolution.

See, for example, here:
http://www.mpi-inf.mpg.de/~pdidyk/question_of_time.pdf

The human visual system is tuned to stabilize moving objects which is achieved by so-called smooth pursuit eye motion that tracks moving objects. Most modern displays (so-called hold-type displays) present the moving objects in discrete positions, while they are continuously tracked by our eyes. This discrepancy leads to the perceived blur. The easiest way of reducing blur is to shorten the time duration of each frame.

The simplest solution is black data insertion (BDI). ... A more sophisticated version of this method is backlight flashing (BF) ... the backlight of the display is used to limit the duration time of each frame. ... the results are limited by possible artifacts such as flickering, brightness reduction or color desaturation. ... the commonly used solution in TV-sets is frame rate doubling by interpolation (FRT) ... additional frames are created by interpolation between original frames, based on computed optical flow.

Or another paper:
http://resources.mpi-inf.mpg.de/3DTemporalUpsampling/3DTemporalUpsampling.pdf
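
Tying that back to the toy blur model from earlier in the thread: shortening the frame duration just shrinks the effective hold time, whether by strobing the backlight or by updating the content more often. A sketch with example values only (the duty cycle and pan speed are mine, not from the papers):

```python
# Same toy model: smear ~= tracking speed * time the frame is actually visible.
# Strobing / black data insertion shortens that visible time to a fraction
# (the duty cycle) of the refresh period.

def blur_px(speed_px_per_s, refresh_hz, duty_cycle=1.0):
    visible_time_s = duty_cycle / refresh_hz
    return speed_px_per_s * visible_time_s

speed = 1000  # px/s example pan speed
print(f"60Hz, full persistence : ~{blur_px(speed, 60):.1f} px")
print(f"120Hz, full persistence: ~{blur_px(speed, 120):.1f} px")
print(f"120Hz, 25% strobe      : ~{blur_px(speed, 120, 0.25):.1f} px")

# 60Hz, full persistence : ~16.7 px
# 120Hz, full persistence: ~8.3 px
# 120Hz, 25% strobe      : ~2.1 px
```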
 