Question CES 2019 - The beginning of the end for gsync?


pj-

Senior member
May 5, 2015
481
249
116
On Jensen's stream a few minutes ago he spent 10 minutes talking around the fact that they are going to start supporting some freesync monitors at the driver level next week as "gsync compatible".

He made a point of saying that only 12 of the 400 they tested so far met their requirements. Not sure if it will be an "at your own risk" thing, or if it will only be supported for the specific models that pass their testing.

Edit: Apparently it can be enabled on any freesync monitor.
https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/
 
  • Like
Reactions: Krteq

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
I agree. If you can pull 120-144hz steady, adaptive sync isn't as big a deal. 100hz and below it's huge in my opinion. For some reason tearing just drives my brain nuts, even more so now that I'm used to gsync.

It all hinges on the titles and settings you can reach.

For esports and simpler titles at lower settings, especially at 1080p, sustaining 120 or 144hz is fairly doable if paired with a great CPU.

AAA titles though? Even a 2080 Ti suffers significant drops at 3440x1440 Ultra, and that's where G-Sync is way way way better. I've used both, and will never go back to a non-VRR display. I don't play CS, competitive Overwatch, or COD MP though. I dropped a 144Hz 16:9 display to go 100Hz 35" G-Sync and couldn't be happier. Tried 4K 60 and even locked, it was simply awful to play comparatively, like running in muddy sand vs a nice track.
 

Cableman

Member
Dec 6, 2017
78
73
91
I spent a few months researching monitors and decided that I want a 27" 1440p 144hz Gsync display. Coming from a 24" 1080p 60hz monitor. I got the Asus PG279QZ for Christmas. It was my first experience with a high refresh monitor, as well as VRR. I had very high expectations. The smoothness was great, but I would not call it a "wow" experience. Certainly nice to have, but nothing life changing.

However, the monitor had horrible color uniformity with the top third of the screen displaying a yellow band instead of white. I returned it and decided to try a 27" 4k 60hz model and see what I think. I got the LG 27UK650. The visual fidelity is amazing and a definite improvement from 1440p. Some people say that they can't tell the difference in resolution at that size. For me it was a "wow" moment. The visual quality is much better than the Asus, not even close. Do I wish it was also high refresh? Absolutely, that would be an improvement. But I definitely prefer 4k 60Hz over 1440p 144Hz. And I am happy that soon I'll be able to use the Freesync functionality with my Nvidia GPU. One day when there are good 4k 144Hz monitors and GPUs that can drive them, I'll probably think about upgrading. But I am not going down to 1440p, it's not worth the tradeoff for me.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I agree. If you can pull 120-144hz steady, adaptive sync isn't as big a deal. 100hz and below it's huge in my opinion. For some reason tearing just drives my brain nuts, even more so now that I'm used to gsync.

With a 144Hz monitor it doesn't have to be that steady.

People are so often comparing FS/GS to Vsync on 60 Hz monitors, where even if you dip to 58 FPS you are immediately at 30 FPS, which makes the dips glaring.

On a 144Hz monitor the dips are not so pronounced, and would go unnoticed by many people who freak out about the 60 FPS to 30 FPS dips on a 60Hz monitor.
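
As a rough illustration of the dip behaviour described above, here is a minimal sketch (my own simplification, not from the post) of double-buffered Vsync, where each rendered frame is held for a whole number of refresh intervals, so the displayed rate snaps to refresh/1, refresh/2, refresh/3, and so on:

```python
import math

def vsync_fps(refresh_hz: float, render_fps: float) -> float:
    # Simplified double-buffered Vsync: each frame occupies a whole number
    # of refresh intervals, so output snaps to refresh / n for integer n.
    intervals = math.ceil(refresh_hz / render_fps)
    return refresh_hz / intervals

print(vsync_fps(60, 58))    # 30.0 -> a tiny dip below 60 FPS is glaring
print(vsync_fps(144, 130))  # 72.0 -> the same kind of dip lands on a much higher step
print(vsync_fps(144, 70))   # 48.0
```

Real frame pacing is messier than this, but it shows why the 60Hz case feels so much worse than the 144Hz one.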

Vsync off is also something people used for years, and it suddenly became intolerable once the GS/FS hype started. If you really must have the lowest lag and fastest response, Vsync off is still the fastest, lowest-lag option available, as FS/GS add more input lag than just running with Vsync off.
 
  • Like
Reactions: Thrashard

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
FS is definitely worth it, but it depends on your setup. I went from a 23" 1080p 120Hz monitor with strobing to a 27" 1440p 144Hz monitor with Freesync. I lost some FPS by going up in resolution, but FS takes care of any dips that might happen into a range where I would notice tearing. Everything is butter smooth all the time with FS. You really don't realize how good it is until you play on it. 1080p with a good CPU/GPU doesn't need it because you will be pushing high enough FPS at all times. It's when the resolution goes up OR you have a weaker GPU/CPU that FS is indispensable.
 

ibex333

Diamond Member
Mar 26, 2005
4,086
119
106
Lol what?

I went from a really nice 12-bit IPS 2560x1440 60Hz display to a 3440x1440 G-Sync display and it was a gargantuan improvement. RPGs, FPS, hell, even the smoothness of moving things around the desktop was far better.

I also tried RX580 with a mid-range 1440p Freesync and it was pretty good, though you have to lower some details to make sure it doesn't dip below FS range or it gets ugly again.

Going back to 60hz with dips is agonizing. Locked 60 is fine, but still inferior.

Freesync and Gsync are not close to a scam. VRR is the best improvement in display tech in ages. However, one could say that Gsync module displays are overpriced, and/or that some Freesync displays are a bit janky. But scam? No. Might not be a match for you personally though, and that's fine.


Just because you and a few other people are sensitive to lower FPS doesn't mean everyone is. I used a 120Hz monitor for a while and saw no difference whatsoever. Literally zero.

Saying that gsync/freesync makes everything universally better across the board is just silly.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
I have to wonder whether all new high-end monitors will move to the HDR version, or whether FreeSync can do VRR with HDR as well. If nothing else, we'll probably see a lot better options among mid-range gaming monitors.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Welps, this announcement sure did make my next monitor upgrade easier...well, easier on the wallet. The G-Sync tax was definitely a barrier. Thankfully I got my monitor for 1/3 of the price :D

But when looking to upgrade my wife's FreeSync monitor to a G-Sync monitor, sticker shock kept her from letting me pull the trigger. So I'll probably upgrade her monitor first this time around.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
That certification program is definitely smoke and mirrors. Until they publish real facts as to what they measure, it's garbage.

I personally viewed the Nixeus VUE Freesync monitor, 30-144hz refresh rate, works flawlessly, I can't tell the difference between it and my Dell Gsync (re: motion clarity) yet somehow it doesn't pass. Maybe because its 1080p and not 1440? Pretty dumb and not transparent at all.

Still, very good that you can still use it just by flipping a toggle
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
That certification program is definitely smoke and mirrors. Until they publish real facts as to what they measure, it's garbage.

I personally viewed the Nixeus VUE Freesync monitor, 30-144hz refresh rate, works flawlessly, I can't tell the difference between it and my Dell Gsync (re: motion clarity) yet somehow it doesn't pass. Maybe because its 1080p and not 1440? Pretty dumb and not transparent at all.

Still, very good that you can still use it just by flipping a toggle

Yeah, NVIDIA is being NVIDIA on this one. I don't have data to make a solid judgement, but based off of my limited research when finding monitors for friends and family, at least half of Freesync monitors are junk, but even then that would put at least 100 monitors in the "acceptable" category, not just 12.
 

Cableman

Member
Dec 6, 2017
78
73
91
Nvidia makes it sound like virtually all Freesync monitors are trash. I had a Gsync monitor (Asus PG279QZ) for a few days and that was trash. I now have a Freesync monitor (LG 27UK650) that is not among the Gsync certified ones and the difference in quality is tremendous - the Freesync one makes the Gsync monitor look worse than trash. There are trashy Freesync monitors, but there are certainly trashy Gsync monitors too so I wouldn't just assume that anything that Nvidia slaps a Gsync sticker on is miraculously better. Just give me the toggle to enable Freesync and keep your certification.
 
  • Like
Reactions: krumme

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Yeah, NVIDIA is being NVIDIA on this one. I don't have data to make a solid judgement, but based off of my limited research when finding monitors for friends and family, at least half of Freesync monitors are junk, but even then that would put at least 100 monitors in the "acceptable" category, not just 12.

If you don't believe NVidia's testing, you are free to manually enable it. It isn't like they are blocking you anymore.
 

arandomguy

Senior member
Sep 3, 2013
556
183
116
Yeah, NVIDIA is being NVIDIA on this one. I don't have data to make a solid judgement, but based off of my limited research when finding monitors for friends and family, at least half of Freesync monitors are junk, but even then that would put at least 100 monitors in the "acceptable" category, not just 12.

Realistically, unless they've been testing for something like a year, it's unlikely they can actually test 400+ displays in any meaningful way.

One of Nvidia's criteria for G-Sync Compatible is full range VRR, which would mean the max refresh must be at least 2.4x the min refresh to support frame multiplying. This eliminates the majority of existing VRR displays.

This is rather easy to test for, if you even need to test it at all, since you could just eliminate displays based on panel specs. The rest of the visual tests are almost certainly more resource-consuming and unlikely to have been done for 400+ displays in the time frame that is likely involved here.
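
The spec-sheet part of that filter is trivial to express. A minimal sketch, assuming only the 2.4:1 figure Nvidia quotes (the function name and example values are illustrative):

```python
def meets_gsync_compatible_range(min_hz: float, max_hz: float, ratio: float = 2.4) -> bool:
    # Nvidia's quoted range criterion: max refresh must be at least 2.4x the min refresh.
    return max_hz >= ratio * min_hz

print(meets_gsync_compatible_range(48, 75))   # False - a typical 48-75Hz FreeSync panel
print(meets_gsync_compatible_range(30, 144))  # True
print(meets_gsync_compatible_range(60, 144))  # True  (144 / 60 = 2.4 exactly)
```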

If you don't believe NVidia's testing, you are free to manually enable it. It isn't like they are blocking you anymore.

It may not be that simple; we'll need to see the reviews with respect to what Nvidia's philosophy is with VRR and how they handle cases outside of it.

I'd be willing to bet that Nvidia believes that VRR must be full range. So how will they choose to handle monitors that aren't capable of the full range, such as his LG 27UK650?
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
Nvidia makes it sound like virtually all Freesync monitors are trash.

To be fair, most Freesync monitors are trash but I don't believe for a second that only 3% are good. I've had a few Freesync monitors that were very nice, especially given the price disparity between Freesync and Gsync.

We'll know soon enough which Freesync monitors actually work well under the new driver.

I'm excited for this because I have a ton of friends who own NVIDIA GPUs and want to make the jump to 1440p 144Hz with adaptive sync, and 1440p 144Hz G-Sync monitors are just damned expensive.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I'd be willing to bet that Nvidia believes that VRR must be full range. So how will they choose to handle monitors that aren't capable of the full range, such as his LG 27UK650?

I don't think it handles them any differently; it does what every VRR implementation does when you hit its bounds: it acts like Vsync. But probably without the equivalent of LFC.

For something like LFC to work, you need some range (though I think you would only need 2.0x, not the 2.4 or 2.5 that NVidia and AMD use).

For LFC, imagine you have one of the ubiquitous 48-75Hz FreeSync monitors.

If you fall to 44 FPS, LFC would want to show each frame twice at double the rate to get back into the monitor's range, but the monitor can't do 88Hz. So below the lower bound it is forced to work like Vsync and essentially gives you 24 FPS by doubling frames at the minimum refresh rate.

AMD users without LFC report judder and tearing below the FreeSync range, and I bet that is all NVidia will do as well.

Certified G-Sync will do the equivalent of LFC.
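
A minimal sketch of the LFC idea described above (illustrative only; the exact multiplier policies in AMD's and Nvidia's drivers are not public and certainly differ in detail):

```python
def lfc_refresh(fps: float, vrr_min: float, vrr_max: float):
    # Low framerate compensation: when the game frame rate falls below the VRR
    # window, repeat each frame n times so the panel runs at fps * n inside
    # [vrr_min, vrr_max]. Returns None when no multiple fits, i.e. the
    # "acts like Vsync below the floor" case described above.
    if vrr_min <= fps <= vrr_max:
        return fps  # already inside the window, no frame repetition needed
    n = 2
    while fps * n <= vrr_max:
        if fps * n >= vrr_min:
            return fps * n
        n += 1
    return None

print(lfc_refresh(40, 48, 144))  # 80   -> each frame shown twice, panel stays in range
print(lfc_refresh(44, 48, 75))   # None -> 88Hz exceeds the 75Hz max, so it falls back
                                 #         to Vsync-like behaviour (~24 FPS in practice)
```

This is also why the range needs to be at least 2x (2.4-2.5x with some margin, as noted above) for frame multiplying to always have somewhere to land.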
 

KentState

Diamond Member
Oct 19, 2001
8,397
393
126
That certification program is definitely smoke and mirrors. Until they publish real facts as to what they measure, it's garbage.

I personally viewed the Nixeus VUE Freesync monitor, 30-144hz refresh rate, works flawlessly, I can't tell the difference between it and my Dell Gsync (re: motion clarity) yet somehow it doesn't pass. Maybe because its 1080p and not 1440? Pretty dumb and not transparent at all.

Still, very good that you can still use it just by flipping a toggle

It's been reported which criteria Nvidia uses and how they compare to AMD's. Look at the Gamers Nexus video for a summary of what they have looked at so far. The biggest factors are the adaptive sync range and color/contrast uniformity. The problem with a large number of monitors is that the range is so small that frame duplication pushes above the maximum refresh rate of the device.
 

arandomguy

Senior member
Sep 3, 2013
556
183
116
To be fair, most Freesync monitors are trash but I don't believe for a second that only 3% are good. I've had a few Freesync monitors that were very nice, especially given the price disparity between Freesync and Gsync.

12 out of 400 tested doesn't mean that only 3% of all monitors are good.

If you go to AMD's own list of FreeSync monitors:

https://www.amd.com/en/products/freesync-monitors

They list 584 total displays. If you filter out displays that do not support LFC (AMD's mechanism for full range VRR), that already removes 358 displays; those automatically fail Nvidia's criteria. And AMD's list isn't fully accurate, either: it lists some displays as supporting LFC that obviously don't have the range for it. So you're basically looking at almost 400 fails already without any actual extensive testing needed. Nvidia also has another requirement, which is that VRR is enabled by default; quite a few monitors are not and require it to be toggled on via the OSD.

validate that the monitor can operate in VRR at any game frame rate by supporting a VRR range of at least 2.4:1 (e.g. 60Hz-144Hz), and offer the gamer a seamless experience by enabling VRR by default.

So basically you can easily fail a huge chunk of existing displays with those two easy-to-gauge criteria. After that, you're looking at more extensive testing needed for the rest:

monitor does not show blanking, pulsing, flickering, ghosting or other artifacts during VRR gaming.

which is why I bet only 12 have passed so far; they haven't had the time/resources to really test everything yet beyond the previously mentioned easy fail cases. And those remaining cases are going to be hard to test for because of the need to cover a wide variety of edge cases. This is why you often see conflicting reports about certain FreeSync displays having issues like flickering that affect some users but not others.

https://www.nvidia.com/en-us/geforce/news/g-sync-ces-2019-announcements/
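
To make the two cheap spec-sheet checks above concrete, a minimal sketch (the monitor entries are hypothetical examples, not taken from AMD's list):

```python
# Hypothetical entries for illustration; not real models from AMD's list.
monitors = [
    {"model": "Example 48-75Hz panel",  "vrr_min": 48, "vrr_max": 75,  "vrr_default_on": True},
    {"model": "Example 30-144Hz panel", "vrr_min": 30, "vrr_max": 144, "vrr_default_on": True},
    {"model": "Example 40-144Hz panel", "vrr_min": 40, "vrr_max": 144, "vrr_default_on": False},
]

def passes_cheap_criteria(m: dict, ratio: float = 2.4) -> bool:
    # The two checks quoted above: a VRR range of at least 2.4:1, and VRR on by default.
    # Everything else (blanking, flicker, ghosting) needs hands-on testing.
    return m["vrr_max"] >= ratio * m["vrr_min"] and m["vrr_default_on"]

print([m["model"] for m in monitors if passes_cheap_criteria(m)])
# ['Example 30-144Hz panel'] - only this one would even reach the visual tests
```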

I don't think it handles them any differently; it does what every VRR implementation does when you hit its bounds: it acts like Vsync. But probably without the equivalent of LFC.

For something like LFC to work, you need some range (though I think you would only need 2.0x, not the 2.4 or 2.5 that NVidia and AMD use).

For LFC, imagine you have one of the ubiquitous 48-75Hz FreeSync monitors.

If you fall to 44 FPS, LFC would want to show each frame twice at double the rate to get back into the monitor's range, but the monitor can't do 88Hz. So below the lower bound it is forced to work like Vsync and essentially gives you 24 FPS by doubling frames at the minimum refresh rate.

AMD users without LFC report judder and tearing below the FreeSync range, and I bet that is all NVidia will do as well.

Certified G-Sync will do the equivalent of LFC.

We'll have to see, but I have a feeling the result may not be quite the same, due to how Nvidia and AMD interpret VRR support and how the standard isn't really all that standard. To bring up a similar situation: I believe Nvidia and AMD had differing interpretations of how to handle overscan/underscan on TVs (not sure if this is still the case). Neither was really right or wrong, but the default behavior of the two was different.
 
  • Like
Reactions: VirtualLarry

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
It's been reported which criteria Nvidia uses and how they compare to AMD's. Look at the Gamers Nexus video for a summary of what they have looked at so far. The biggest factors are the adaptive sync range and color/contrast uniformity. The problem with a large number of monitors is that the range is so small that frame duplication pushes above the maximum refresh rate of the device.

So you have to dig into some random video by some random website to figure out what arbitrary specs they chose that constitute a "pass"? That's not transparent whatsoever.
 

arandomguy

Senior member
Sep 3, 2013
556
183
116
So you have to dig into some random video by some random website to figure out what arbitrary specs they chose that constitute a "pass"? That's not transparent whatsoever.


https://www.nvidia.com/en-us/geforce/news/g-sync-ces-2019-announcements/

G-SYNC Compatible testing validates that the monitor does not show blanking, pulsing, flickering, ghosting or other artifacts during VRR gaming. They also validate that the monitor can operate in VRR at any game frame rate by supporting a VRR range of at least 2.4:1 (e.g. 60Hz-144Hz), and offer the gamer a seamless experience by enabling VRR by default.
 

KentState

Diamond Member
Oct 19, 2001
8,397
393
126
So you have to dig into some random video by some random website to figure out what arbitrary specs they chose that constitute a "pass"? That's not transparent whatsoever.

I was pointing you to a place to get a good summary. It's on Nvidia's site and many others, but I can't do the work of actually looking at the provided material for you.
 

Krteq

Senior member
May 22, 2015
991
671
136
Driver 417.71 with Adaptive-Sync support is up now

How to enable G-Sync on Adaptive-Sync monitors that are not "G-Sync Certified":
  • Connect the monitor to your GeForce RTX 20-Series or GeForce GTX 10-Series graphics card using a DisplayPort cable
  • Enable the Variable Refresh Rate functionality of your display by using the monitor's controls and On-Screen Display
  • Open the NVIDIA Control Panel from the bottom right of Windows
  • Expand the "Display" section
  • Click on “Set up G-SYNC”
  • Tick the “Enable G-SYNC, G-SYNC Compatible” box
  • Tick the “Enable settings for the selected display model” box
  • Click “Apply” on the bottom right
  • If the above isn't available, or isn't working, you may need to go to "Manage 3D Settings", click the "Global" tab, scroll down to "Monitor Technology", select "G-SYNC Compatible" in the drop down, and then click "Apply"
  • Additionally, you may need to go to "Change Resolution" on the left nav and apply a higher refresh rate, or different resolution
 
  • Like
Reactions: Elfear and Cableman

nOOky

Platinum Member
Aug 17, 2004
2,826
1,846
136
I don't have an Nvidia GPU, but it is nice to know that the option to use my existing monitor with adaptive sync is there. In fact, I just ordered the LG 34GK950-F and should have it when I get home; it's a Freesync 2 monitor, not on the list of 12, but it should work. I don't have any plans to change my GPU out currently, but who knows what next year will bring.

Interesting "test" from HardOCP about G-Sync versus Freesync 2 if you haven't read it:

https://www.hardocp.com/article/2018/03/30/amd_radeon_freesync_2_vs_nvidia_gsync