Question CES 2019 - The beginning of the end for gsync?


pj-

Senior member
May 5, 2015
481
249
116
On Jensen's stream a few minutes ago he spent 10 minutes talking around the fact that they are going to start supporting some freesync monitors at the driver level next week as "gsync compatible".

He made a point of saying that only 12 of the 400 they tested so far met their requirements. Not sure if it will be an "at your own risk" thing, or if it will only be supported for the specific models that pass their testing.

Edit: Apparently it can be enabled on any freesync monitor.
https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/
 
Last edited:
  • Like
Reactions: Krteq

Cableman

Member
Dec 6, 2017
78
73
91
With the new update, variable refresh rate works great with my Freesync monitor. However, enabling Gsync leads to a 15-20 fps drop. Games start off at normal fps, then after a few minutes drop 15-20 fps and stay there no matter the settings or what's on screen. Disabling Gsync reverts it back to normal. Any ideas?
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I don't have an nvidia GPU, but it is nice to know that the option to use my existing monitor is there using adaptive sync. In fact I just ordered the LG 34GK950-F and should have it when I get home, it's a Freesync 2 monitor, and not on the list of 12, but it should work. I don't have any plans to change my GPU out currently, but who knows what next year will bring.

Interesting "test" from HardOCP about G-Sync versus Freesync 2 if you haven't read it:

https://www.hardocp.com/article/2018/03/30/amd_radeon_freesync_2_vs_nvidia_gsync

I take everything from HardOCP with a huge grain of salt. Years ago they were having a spat with AMD, and seemed to spend all their time trashing AMD at every chance, then they made up with AMD and now spend all their time praising AMD and trashing NVidia. :rolleyes:
 

nOOky

Platinum Member
Aug 17, 2004
2,830
1,851
136
I take everything from every tech site with a grain of salt, this site included. But I've seen other sites comparing gsync to freesync, and the results are similar. I can't find a site I trust that says "omfg one is way better than the other," because the technologies in actual practice are probably very similar, and everybody's system is different enough that you have to take what is said with a grain of salt.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I take everything from HardOCP with a huge grain of salt. Years ago they were having a spat with AMD, and seemed to spend all their time trashing AMD at every chance, then they made up with AMD and now spend all their time praising AMD and trashing NVidia. :rolleyes:

Funny part is, when I was deeply ingrained in camp red, HardOCP was almost banned here by poster request. It wasn't until the great "change" that it finally became accepted again. Countless times their reviews/metrics would be called out because they weren't apples-to-apples comparisons, or they weren't using canned benchmarks. You can almost pin the shift in opinion around here to when Kyle was at an AMD presentation.

Hell, even Steam Surveys had a place here for discussion. Now good luck starting a thread on it.

This place has gone downhill so fast. I miss a lot of the old posters. Oh well.
 

Mopetar

Diamond Member
Jan 31, 2011
7,827
5,971
136
All sites editorialize to some degree, but just ignore their articles or recommendations and go for the benchmarks. Unless you believe that they're intentionally gimping certain products (which is far more serious than simply disliking a company due to some ongoing spat), just let the numbers speak for themselves. Even if they use a biased set of applications or games, the results are still real. Just make sure to consider benchmarks from multiple sites or sources and you won't end up getting fooled or misleading yourself.
 
  • Like
Reactions: VirtualLarry

Cableman

Member
Dec 6, 2017
78
73
91
I kept testing the Gsync implementation with my Freesync monitor. I am consistently getting 20 fps less in every game I try with Gsync on vs Gsync off. I am not sure if this will be something that they'll iron out with future updates, but I'll be playing without Gsync for now.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Techspot had 7 Freesync monitors on hand to test:
https://www.techspot.com/article/1779-freesync-and-nvidia-geforce/

One of them didn't work because it only has HDMI, and so far NVidia Gsync compatibility only works over DisplayPort.

The other 6 worked exactly as they do with AMD cards. Monitors that had the range for it used LFC; monitors lacking the range didn't. Exactly the same outcome as connecting them to AMD cards.
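
For anyone who hasn't run into the term, LFC (Low Framerate Compensation) just means the driver repeats frames when the game drops below the panel's minimum VRR rate, so the refresh rate stays inside the supported window. A rough sketch of the idea, as I understand it (my own simplification in Python, not AMD's or Nvidia's actual algorithm; the 48-100Hz window is made up for illustration):

[CODE]
def lfc_refresh_hz(render_fps, vrr_min=48.0, vrr_max=100.0):
    """Pick the panel refresh rate for one rendered frame.

    vrr_min/vrr_max are an example VRR window, not a real monitor spec.
    Inside the window the panel simply follows the GPU. Below the window,
    repeat each frame n times so that n * render_fps lands back inside
    the window; that repetition is the basic idea behind LFC.
    Returns (refresh_hz, repeats_per_frame).
    """
    if render_fps >= vrr_min:
        return min(render_fps, vrr_max), 1
    repeats = 2
    while render_fps * repeats < vrr_min:
        repeats += 1
    return render_fps * repeats, repeats


for fps in (90, 60, 45, 30, 20):
    hz, n = lfc_refresh_hz(fps)
    print(f"{fps} fps -> panel refreshes at {hz:.0f} Hz ({n}x per frame)")
[/CODE]

That's also why the range matters: roughly speaking, the max needs to be at least 2x the min or there's no frame multiple that lands back inside the window, which is exactly the "monitors lacking the range didn't" case above.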
 

Cableman

Member
Dec 6, 2017
78
73
91
Probably because gsync reduces GPU load if it's faster than the monitor can refresh?
It's not that either - what I observed was at 4K, where some games went from ~55 fps to ~35 fps. It's frustrating because at 55 fps I would be in the sweet spot for Freesync, but at 35 fps it's almost unplayable, and it's below the Freesync range so there's no point keeping it on.
 
Jul 24, 2017
93
25
61
One of them didn't work because it only has HDMI, and so far NVidia Gsync compatibility only works over DisplayPort.

At least for me, this is the big issue going forward, as I'm a couch gamer and I want to plug my PC into a TV. I know that 4K TVs with HDMI 2.1 and VRR support are supposed to be releasing...sometime...in the near future. If Nvidia will support adaptive sync over HDMI 2.1 then I plan to stick with Nvidia the next time I upgrade my display and my GPU. Otherwise I'll be switching to AMD.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
I don't have an nvidia GPU, but it is nice to know that the option to use my existing monitor is there using adaptive sync. In fact I just ordered the LG 34GK950-F and should have it when I get home, it's a Freesync 2 monitor, and not on the list of 12, but it should work. I don't have any plans to change my GPU out currently, but who knows what next year will bring.

Interesting "test" from HardOCP about G-Sync versus Freesync 2 if you haven't read it:

https://www.hardocp.com/article/2018/03/30/amd_radeon_freesync_2_vs_nvidia_gsync

I like HOCP most of the time, but sometimes their tests make no sense. I followed the link back to the original test with Doom, and it was kind of laughable.

First, Doom is not hard to keep at a high refresh rate even on a mid-range GPU, but they were using a Vega 64 and a 1080 Ti with 100Hz panels. Both undoubtedly were sitting at max the entire time. Even a non-FS or non-GS display would have done great at a reasonable refresh rate, and an outright 144Hz panel would have won lol. That, and motion blur was enabled, which just makes it a bit of a joke. Having the screen smear when you turn basically breaks the point of VRR.

The other thing is that GS and good FS displays don't really show their value in situations where you comfortably sit around max refresh. They show their value by maintaining relatively smooth results even when the framerate tanks a bit. Say like AC Origins and Odyssey, where you can get swings from 50-120 pretty regularly depending on where you are and what's happening. A VRR display with poor range will suffer here, while a good one will hold up.

If you run less demanding stuff and esports stuff (CSS, Overwatch, MOBA), then you do NOT need to spend a fortune on a VRR display. Just look for a good display with high peak refresh.
 
  • Like
Reactions: Elfear

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
At least for me, this is the big issue going forward, as I'm a couch gamer and I want to plug my PC into a TV. I know that 4K TVs with HDMI 2.1 and VRR support are supposed to be releasing...sometime...in the near future. If Nvidia will support adaptive sync over HDMI 2.1 then I plan to stick with Nvidia the next time I upgrade my display and my GPU. Otherwise I'll be switching to AMD.

I fully expect NVidia will support HDMI VRR when it becomes official. Currently VRR over HDMI is an AMD vendor extension, not a standard HDMI feature.
 

Hitman928

Diamond Member
Apr 15, 2012
5,235
7,775
136
I fully expect NVidia will support HDMI VRR when it becomes official. Currently VRR over HDMI is an AMD vendor extension, not a standard HDMI feature.

Didn't HDMI 2.1 make adaptive sync part of the standard? I don't think there are any 2.1 displays out yet, but I believe moving forward it is a standard feature of HDMI.
 
  • Like
Reactions: VirtualLarry

beetlejuicetv

Junior Member
Jan 18, 2019
9
0
6
I have a Samsung C32HG70, and enabling Freesync through the G-Sync compatible setting worked great for about 30 minutes. There are massive bugs at this point, though my monitor isn't on the "approved" list.
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
Really, both G-sync and Freesync are a scam. There's no screen tearing or any issues whatsoever without them. I use a regular monitor with my 1080GTX and there is no screen tearing. All games look fine at any fps above 30.

To be fair, I don't play any FPS games at all. RTS and RPG only. But I've never seen this "screen tearing" everyone talks about.

Until I bought the ASUS 27" Gsync monitor, I always had to turn on vertical sync to stop tearing. Honestly, this monitor is uber nice, well worth the money imo. But it was also really expensive for a 27". So I see the allure of freesync becoming the norm. 650 bucks for a 27" is not something the majority of users will buy into.
 

ibex333

Diamond Member
Mar 26, 2005
4,087
119
106
I used the nVidia pendulum demo yesterday (it shows you in real time how freesync/gsync work in comparison to vsync and to nothing at all).

So with everything OFF, yes, I noticed screen tearing, but I don't get why it annoys people so much. Is it definitely something one is better without? Absolutely. But to pay $600 to smooth out some tearing? If you have the money to burn, god bless!

Then I compared G-Sync/Freesync (my monitor has freesync, even though it's only 70Hz) to Vsync. Guess what... NO DIFFERENCE. Both eliminated screen tearing completely.

So what's the point in GSync/Freesync when Vsync was already doing that job just fine?!
 

Hitman928

Diamond Member
Apr 15, 2012
5,235
7,775
136
I used the nVidia pendulum demo yesterday (it shows you in real time how freesync/gsync work in comparison to vsync and to nothing at all).

So with everything OFF, yes, I noticed screen tearing, but I don't get why it annoys people so much. Is it definitely something one is better without? Absolutely. But to pay $600 to smooth out some tearing? If you have the money to burn, god bless!

Then I compared G-Sync/Freesync (my monitor has freesync, even though it's only 70Hz) to Vsync. Guess what... NO DIFFERENCE. Both eliminated screen tearing completely.

So what's the point in GSync/Freesync when Vsync was already doing that job just fine?!

The purpose is that Vsync only works at your monitor's refresh rate (70 Hz in your case). If you're playing a game and your fps fluctuates from 50 fps to 80 fps, then when you turn Vsync on, your graphics card will output a maximum of 70 fps even when it could put out 80 fps so that it can sync with the monitor refresh rate. When the fps drops below 70 fps you'll either drop to 35 fps (1/2 of 70) to maintain the tear free experience or you'll have tearing while outputting 50 fps - 68 fps.

With freesync / gsync, you'll have tear-free gaming from 50 fps to 70 fps, where you'll cap if you want, or you can let it go and get tearing above that rate. So people can enjoy smooth, tear-free gaming without having to turn settings down to try and keep their fps above their monitor's refresh rate. Some people are more sensitive to tearing than others, so its value is obviously subjective.

In other words, cap the fps on the pendulum demo to 55 fps and see what happens with freesync on vs off and vsync on vs off.

Also, gsync monitors don't cost $600 just because they have gsync; it's because those monitors are also large and have high refresh rates. Gsync does come with a hefty fee (~$150), but it's not the whole sum. That's why people are excited about Nvidia starting to work with Freesync: now you can (if it actually does work) buy a $450 monitor and enjoy the same specs and tear-free gaming as a $600 gsync monitor. You also have (potentially) the option to buy in at much cheaper price points for more limited-spec monitors that still offer at least some Freesync range.
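
To put rough numbers on it, here's a toy model I knocked together (it assumes a plain double-buffered Vsync that waits for the next refresh, and a made-up 48-70Hz Freesync window; real drivers, triple buffering, etc. complicate this):

[CODE]
def displayed_hz(render_fps, refresh_hz=70.0, mode="vsync",
                 vrr_min=48.0, vrr_max=70.0):
    """Rough model of the rate the display actually shows.

    vsync: a finished frame is held until a refresh boundary, so the
           effective rate snaps to refresh_hz / n (70, 35, 23.3, ...).
    vrr:   inside the Freesync window the panel follows the GPU; above
           it you cap (or tear); below it you're back to tearing or
           stutter unless the monitor supports LFC.
    The window values are example numbers, not a real monitor spec.
    """
    if mode == "vsync":
        n = 1
        while refresh_hz / n > render_fps:
            n += 1
        return refresh_hz / n
    if render_fps > vrr_max:
        return vrr_max          # capped (or tearing, if left uncapped)
    return render_fps           # in (or below) the window


for fps in (80, 69, 55, 40):
    print(f"GPU at {fps} fps: vsync shows {displayed_hz(fps):.1f} Hz, "
          f"VRR shows {displayed_hz(fps, mode='vrr'):.1f} Hz")
[/CODE]

Run that and you can see Vsync snapping to 70/35 while VRR just follows the GPU inside the window, which is the whole point.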
 
  • Like
Reactions: Arkaign

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126

And the NIXEUS VUE 24 I've seen meets all of these criteria, and yet still is somehow not "certified." Everyone who is disagreeing with me on this is missing the point. Subjective "we think it's good enough" criteria are a lot of garbage. It must be objective, or it's just a nonsense marketing program (pay to play) with no real value to consumers.

It's just cover to save face for having dumped on FreeSync monitors in the past, when plenty of them work just fine despite how hard Nvidia marketed against them.
 
Last edited:

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Then I compared G-Sync/Freesync (my monitor has freesync, even though it's only 70Hz) to Vsync. Guess what... NO DIFFERENCE. Both eliminated screen tearing completely.

So what's the point in GSync/Freesync when Vsync was already doing that job just fine?!

VRR also eliminates stutter when your frame rate drops while synced. If you are Vsynced at 70Hz and your FPS drops below 70 FPS, it will immediately drop all the way to 35 FPS, because the only option is to wait one more full frame.

I can't run the demo, but I would assume you can drop the frame rate to below 70fps and see that happen.
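
The frame-time arithmetic, if it helps (my own numbers, same simplified double-buffered Vsync assumption, 70Hz panel):

[CODE]
import math

refresh_hz = 70.0
refresh_interval_ms = 1000.0 / refresh_hz   # ~14.29 ms per refresh
frame_time_ms = 15.0                        # GPU just misses the budget (~66 fps)

# Double-buffered vsync: the finished frame waits for the next refresh boundary.
vsync_ms = math.ceil(frame_time_ms / refresh_interval_ms) * refresh_interval_ms
print(f"vsync: new image every {vsync_ms:.1f} ms -> {1000 / vsync_ms:.0f} fps")

# VRR: the panel refreshes as soon as the frame is ready.
print(f"VRR:   new image every {frame_time_ms:.1f} ms -> {1000 / frame_time_ms:.0f} fps")
[/CODE]

A 15 ms frame just misses the ~14.3 ms budget, so Vsync holds it until the second refresh (~28.6 ms, i.e. 35 FPS), while VRR would have shown it at 15 ms (~67 FPS).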
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
The purpose is that Vsync only works at your monitor's refresh rate (70 Hz in your case). If you're playing a game and your fps fluctuates from 50 fps to 80 fps, then when you turn Vsync on, your graphics card will output a maximum of 70 fps even when it could put out 80 fps so that it can sync with the monitor refresh rate. When the fps drops below 70 fps you'll either drop to 35 fps (1/2 of 70) to maintain the tear free experience or you'll have tearing while outputting 50 fps - 68 fps.

With freesync / gsync, you'll have tear-free gaming from 50 fps to 70 fps, where you'll cap if you want, or you can let it go and get tearing above that rate. So people can enjoy smooth, tear-free gaming without having to turn settings down to try and keep their fps above their monitor's refresh rate. Some people are more sensitive to tearing than others, so its value is obviously subjective.

In other words, cap the fps on the pendulum demo to 55 fps and see what happens with freesync on vs off and vsync on vs off.

Also, gsync monitors don't cost $600 just because they have gsync; it's because those monitors are also large and have high refresh rates. Gsync does come with a hefty fee (~$150), but it's not the whole sum. That's why people are excited about Nvidia starting to work with Freesync: now you can (if it actually does work) buy a $450 monitor and enjoy the same specs and tear-free gaming as a $600 gsync monitor. You also have (potentially) the option to buy in at much cheaper price points for more limited-spec monitors that still offer at least some Freesync range.

This is good info.

I would like to advocate for people to really dive deep on any VRR display selection, as the details of it are crucially important.

Foremost should be the user examining what they play, at what settings, and with what frame variance they experience. I've seen some terribly incomplete comparisons, and they do a grave injustice to the buyers when they don't go into a wider analysis of scenarios. The one testing Doom 2016 was particularly laughable.

Following this, there is the massive gap in quality between cheap Freesync displays and higher performing ones. However, how much the effective VRR range matters depends heavily on game choice and settings. For example, similar to the flawed Doom comparison, if one plays say only CSGO and Overwatch, and can maintain a basically locked 75, 100, 120, or whatever their max refresh is, then the difference between a $350 Freesync and a $550 Gsync display is nil. Otoh, if someone is running AAA titles that see a fairly drastic and fluctuating gap between 40-100+, as is common in things like the new Assassin's Creed titles, FFIV etc, then you see a tremendous improvement by selecting either a Gsync display or a very wide range Freesync 2 model.

Above all, we must make clear that the devil is in the details. Not all VRR is created equal.
 
  • Like
Reactions: godihatework

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
I take everything from HardOCP with a huge grain of salt. Years ago they were having a spat with AMD, and seemed to spend all their time trashing AMD at every chance, then they made up with AMD and now spend all their time praising AMD and trashing NVidia. :rolleyes:

You only have to read the article to confirm you are right. The Freesync 2 monitor had HDR enabled while the gsync one didn't, so it's not simply a comparison between freesync and g-sync. In fact, gamers more or less rated the systems equal, meaning you could spin it as HDR not being worth it at all.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
And the NIXEUS VUE 24 I've seen meets all of these criteria, and yet still is somehow not "certified." Everyone who is disagreeing with me on this is missing the point. Subjective "we think it's good enough" criteria are a lot of garbage. It must be objective, or it's just a nonsense marketing program (pay to play) with no real value to consumers.

It's just cover to save face for having dumped on FreeSync monitors in the past, when plenty of them work just fine despite how hard Nvidia marketed against them.
To be clear, you don't know the exact testing criteria, and you don't know the details of how the monitor did - you just know the general testing aims and the general specs of the monitor. I am sure the manufacturer will have got a report card saying exactly what passed and what failed.

It doesn't really matter much for existing monitors anyway - if you own one then you can turn on freesync and it works as well as it works. What's more important is the next set of monitor releases - finally we'll have a way of knowing if it's a quality freesync product. Any manufacturer who's put some effort into their freesync implementation will make it pass the Nvidia certification. Any that just wanted the feature as a tick box but didn't bother putting the effort in to make it work well won't pass.

That's a win for everyone whether you use AMD or Nvidia.
 
  • Like
Reactions: godihatework