Question CES 2019 - The beginning of the end for gsync?

pj-

Senior member
May 5, 2015
371
9
91
#1
On Jensen's stream a few minutes ago he spent 10 minutes talking around the fact that they are going to start supporting some freesync monitors at the driver level next week as "gsync compatible".

He made a point of saying that only 12 of the 400 they tested so far met their requirements. Not sure if it will be an "at your own risk" thing, or if it will only be supported for the specific models that pass their testing.

Edit: Apparently it can be enabled on any freesync monitor.
https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/
 
Oct 27, 2006
19,709
197
106
#46
Really, both G-sync and Freesync are a scam. There's no screen tearing or any issues whatsoever without them. I use a regular monitor with my GTX 1080 and there is no screen tearing. All games look fine at any fps above 30.

To be fair, I don't play any FPS games at all. RTS and RPG only. But I've never seen this "screen tearing" everyone talks about.
Lol what?

I went from a really nice 12-bit IPS 2560x1440 60hz display to a 3440x1440 Gsync display and it was a gargantuan improvement. RPGs, FPS, hell, even the smoothness of moving things around the desktop was far better.

I also tried an RX 580 with a mid-range 1440p Freesync monitor and it was pretty good, though you have to lower some details to make sure it doesn't dip below the FS range or it gets ugly again.

Going back to 60hz with dips is agonizing. Locked 60 is fine, but still inferior.

Freesync and Gsync are not close to a scam. VRR is the best improvement in display tech in ages. However, one could say that Gsync module displays are overpriced, and/or that some Freesync displays are a bit janky. But scam? No. Might not be a match for you personally though, and that's fine.
 

ozzy702

Senior member
Nov 1, 2011
953
177
136
#47
Really, both G-sync and Freesync are a scam. There's no screen tearing or any issues whatsoever without them. I use a regular monitor with my GTX 1080 and there is no screen tearing. All games look fine at any fps above 30.

To be fair, I don't play any FPS games at all. RTS and RPG only. But I've never seen this "screen tearing" everyone talks about.
You couldn't be more wrong, but maybe it's not an issue for the games you play and your personal sensitivity to it. I almost exclusively play first person shooters, and screen tearing ruins the experience. Going from a 60hz vanilla 1440p monitor to a 144hz Gsync 1440p monitor was probably the single greatest improvement I've ever seen from a tech purchase. Screen tearing is obnoxious, and before adaptive sync I'd always tailor my settings so that I could pull a steady 60fps and use vsync to avoid the tearing, but we know what that means in terms of latency, etc.

I can't go back to vanilla 60hz, I tried on a friend's machine and it was horrible.
 

Cableman

Junior Member
Dec 6, 2017
23
5
41
#52
I spent a few months researching monitors and decided that I wanted a 27" 1440p 144hz Gsync display, coming from a 24" 1080p 60hz monitor. I got the Asus PG279QZ for Christmas. It was my first experience with a high refresh monitor, as well as VRR. I had very high expectations. The smoothness was great, but I would not call it a "wow" experience. Certainly nice to have, but nothing life changing.

However, the monitor had horrible color uniformity with the top third of the screen displaying a yellow band instead of white. I returned it and decided to try a 27" 4k 60hz model and see what I think. I got the LG 27UK650. The visual fidelity is amazing and a definite improvement from 1440p. Some people say that they can't tell the difference in resolution at that size. For me it was a "wow" moment. The visual quality is much better than the Asus, not even close. Do I wish it was also high refresh? Absolutely, that would be an improvement. But I definitely prefer 4k 60Hz over 1440p 144Hz. And I am happy that soon I'll be able to use the Freesync functionality with my Nvidia GPU. One day when there are good 4k 144Hz monitors and GPUs that can drive them, I'll probably think about upgrading. But I am not going down to 1440p, it's not worth the tradeoff for me.
 

railven

Diamond Member
Mar 25, 2010
6,477
152
126
#79
I take everything from HardOCP with a huge grain of salt. Years ago they were having a spat with AMD, and seemed to spend all their time trashing AMD at every chance, then they made up with AMD and now spend all their time praising AMD and trashing NVidia. :rolleyes:
Funny part is, when I was deeply ingrained in camp red, HardOCP was almost banned here by poster request. It wasn't until the great "change" that it finally became accepted again. Countless times their reviews/metrics would be called out because they weren't apples-to-apples comparisons, or they weren't using canned benchmarks. You can almost pin the shift in opinion around here to when Kyle was at an AMD presentation.

Hell, even Steam Surveys had a place here for discussion. Now good luck starting a thread on it.

This place has gone downhill so fast. I miss a lot of the old posters. Oh well.
 
Jun 8, 2003
14,129
163
126
#2
He said 400 async monitors have been tested and the driver is coming Jan. 29th. That's what I thought he said.
 

pj-

Senior member
May 5, 2015
371
9
91
#3
He said 400 async monitors have been tested and the driver is coming Jan. 29th. That's what I thought he said.
400 were tested and 12 "passed". I think passing is irrelevant if the option is enabled for all of them.
 

ThatBuzzkiller

Senior member
Nov 14, 2014
822
7
91
#4
With this new information, G-sync has been effectively deprecated ...

There was a lot of blowback against G-sync becoming the ubiquitous standard from just about every direction: from display manufacturers (few options to be had) to hardware vendors (Intel and AMD weren't going to adopt it) and other parties such as Microsoft, but most of all, the groups dictating display output standards, like the HDMI Forum and VESA, weren't interested in standardizing G-sync either ...

Now that some of the generic adaptive refresh rate displays are deemed G-sync "compatible", I wonder what this means for their upcoming G-sync capable BFGDs: whether they'll still be released or be cancelled, along with their G-sync integration program plans in the future ...
 

Harry_Wild

Senior member
Dec 14, 2012
334
30
91
#5
NEC was big on this type of PC monitor, I think, back when I purchased a huge CRT, but I've lost touch with what purpose it serves in 4K IPS land!
 

krumme

Diamond Member
Oct 9, 2009
5,755
143
136
#6
This is a time to be happy because someone died.

Thank you JHH for killing this anti-competitive monster that busted us for years.

It was a disgusting monopoly feature that hurt us all and put shame on nv.

To all gsync supporters: good you were cheated and the money taken from you as punishment.

May you learn from this punishment. Now shame yourselves. And come back as enlightened consumers.

Thank you.
 

NeoLuxembourg

Senior member
Oct 10, 2013
681
18
106
#7
If you listen carefully, you can hear the sound of some AMD people adding slides to the Keynote presentation about how this ...

As a 1080 Ti owner, I'm happy, but I hope they do not limit the driver to those 12 monitors!

Edit:

Coming next week, this is changing. On January 15th, NVIDIA will be releasing a new driver that enables VESA Adaptive Sync support on GeForce GTX 10 and GeForce RTX 20 series (i.e. Pascal and newer) cards. There will be a bit of gatekeeping involved on NVIDIA’s part – it won’t be enabled automatically for most monitors – but the option will be there to enable variable refresh (or at least try to enable it) for all VESA Adaptive Sync monitors. If a monitor supports the technology – be it labeled VESA Adaptive Sync or AMD FreeSync – then NVIDIA’s cards can finally take advantage of their variable refresh features. Full stop.
Source: https://www.anandtech.com/show/1379...-adaptive-sync-with-gsync-compatible-branding
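
For anyone who wants to check what range their monitor actually advertises before the driver lands: as far as I understand, the min/max vertical refresh that adaptive sync over DisplayPort relies on is reported in the monitor's EDID, in the Display Range Limits descriptor. Here's a rough Linux-only sketch of reading it; the sysfs path is the standard DRM location, but ignoring extension blocks and the EDID 1.4 offset flags is a simplification on my part, and this is just my own illustration of where the numbers live, not anything to do with Nvidia's driver.

```python
# Rough sketch: pull the vertical refresh range out of a monitor's EDID on
# Linux. Purely illustrative; real drivers also look at extension blocks and
# the EDID 1.4 offset flags, which are ignored here for simplicity.
import glob

def vertical_range(edid: bytes):
    """Return (min_hz, max_hz) from the Display Range Limits descriptor
    (tag 0xFD) in the EDID base block, or None if it isn't present."""
    if len(edid) < 128:
        return None
    for off in (54, 72, 90, 108):            # the four 18-byte descriptor slots
        d = edid[off:off + 18]
        # Display descriptors start with 00 00 00, then a tag byte.
        if d[0] == 0 and d[1] == 0 and d[2] == 0 and d[3] == 0xFD:
            return d[5], d[6]                 # min / max vertical rate in Hz
    return None

if __name__ == "__main__":
    for path in glob.glob("/sys/class/drm/card*-*/edid"):
        with open(path, "rb") as f:
            edid = f.read()
        rng = vertical_range(edid)
        if rng:
            print(f"{path}: {rng[0]}-{rng[1]} Hz")
```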
 

Reinvented

Senior member
Oct 5, 2005
486
3
91
#8
It's honestly about time. This will be amazing.
 

Dribble

Golden Member
Aug 9, 2005
1,684
114
126
#9
400 were tested and 12 "passed". I think passing is irrelevant if the option is enabled for all of them.
What it tells you is that only 12 out of 400 freesync displays work properly. Something AMD has never been able to do - they are happy to label pretty well anything as freesync, which is rubbish. No quality standards for freesync is part of the reason why gsync did so well.
 

halc

Junior Member
Jan 7, 2019
1
0
11
#10
Nope.

1) G-Sync "basic" via driver update for any generic Adaptive Sync (FreeSync, VRR, etc.) display, for RTX 20 and GTX 10 series cards only. Only 14 monitors are nVidia-certified for this, but you can enable it for any Adaptive Sync display at your own peril.

2) G-Sync Ultimate with a hardware nVidia module will still be required, along with an nVidia display card (generation not specified), for the FULL G-Sync Ultimate Brand Experience (TM). Along with the G-Sync Ultimate tax on the monitor hardware itself.

Now, what this means in practice remains to be seen.

However, it's more like "embrace and extend" for nVidia on G-Sync, instead of giving up on "for profit" G-Sync.

They gave "G-Sync" v1 for free to two generations of cards, but simultaneously launched G-Sync Ultimate (not available for free via drivers) for HDR monitors at the higher end.
 

beginner99

Diamond Member
Jun 2, 2009
4,107
173
126
#11
Well, at least a tiny bright side from NV. This makes their cards at least an option again if AMD continues to not deliver. I guess the jacked-up prices of the RTX series have been taking their toll on sales already.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
227
96
#12
This was the best announcement from the keynote. I thought that with HDMI 2.1 on new 2019 displays, NVIDIA would be forced to support it or look like they were stifling industry standards.

We will have to wait and see how well it works, but this does look like the beginning of the end of the gsync tax on monitors.
 

NTMBK

Diamond Member
Nov 14, 2011
8,300
280
126
#13
This is a time to be happy because someone died.

Thank you JHH for killing this anti-competitive monster that busted us for years.

It was a disgusting monopoly feature that hurt us all and put shame on nv.

To all gsync supporters: good you were cheated and the money taken from you as punishment.

May you learn from this punishment. Now shame yourselves. And come back as enlightened consumers.

Thank you.
Jesus, lighten up.
 

Despoiler

Golden Member
Nov 10, 2007
1,828
78
136
#15
With this new information, G-sync has been effectively deprecated ...

There was a lot of blowback against G-sync becoming the ubiquitous standard from just about every direction: from display manufacturers (few options to be had) to hardware vendors (Intel and AMD weren't going to adopt it) and other parties such as Microsoft, but most of all, the groups dictating display output standards, like the HDMI Forum and VESA, weren't interested in standardizing G-sync either ...

Now that some of the generic adaptive refresh rate displays are deemed G-sync "compatible", I wonder what this means for their upcoming G-sync capable BFGDs: whether they'll still be released or be cancelled, along with their G-sync integration program plans in the future ...

Well ya, there was blowback because GSync flies in the face of standards that were already established on how monitors and GPUs interact. I'll give Nvidia credit for coming up with a novel use for the already invented and standardized variable vblank function that was part of eDP, but they chose to Frankenstein a product into existence and go proprietary. Frankenstein is a tragedy, and so is Nvidia. GSync failed just like NForce failed. Nvidia cannot out-muscle several well-established companies in their own industry. The nail in the coffin is Intel's upcoming next-gen GPUs, which are going to support adaptive sync. Just like I said, Intel would be the decision maker.
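
For anyone wondering how that variable vblank trick actually turns into a variable refresh rate: the pixel clock and horizontal timing stay fixed, and the driver just pads extra blank lines after the active image until the next frame is ready. Here's a rough sketch of the arithmetic; the panel timings are made-up example numbers, not any real monitor's, and this is purely my own illustration rather than anything from Nvidia's or AMD's code.

```python
# Rough illustration of how stretching the vertical blanking interval (vblank)
# delays the next refresh. All timing values below are hypothetical.

PIXEL_CLOCK_HZ = 586_000_000   # assumed pixel clock
H_TOTAL = 2720                 # pixels per line, active + horizontal blanking
V_TOTAL_MIN = 1481             # lines per frame at the panel's fastest refresh
V_TOTAL_MAX = 7200             # most padding lines the panel tolerates

LINE_TIME_S = H_TOTAL / PIXEL_CLOCK_HZ      # time to scan out one line

def refresh_interval(v_total: int) -> float:
    """Frame time in seconds for a given total line count."""
    return v_total * LINE_TIME_S

def v_total_for_frame_time(target_s: float) -> int:
    """Total line count that holds the panel until the GPU's next frame is
    ready, clamped to what the panel supports."""
    lines = round(target_s / LINE_TIME_S)
    return max(V_TOTAL_MIN, min(V_TOTAL_MAX, lines))

if __name__ == "__main__":
    fastest = 1 / refresh_interval(V_TOTAL_MIN)
    slowest = 1 / refresh_interval(V_TOTAL_MAX)
    print(f"VRR range with these timings: {slowest:.1f}-{fastest:.1f} Hz")

    # A frame that took 13.3 ms to render (about 75 fps):
    vt = v_total_for_frame_time(0.0133)
    print(f"v_total for a 13.3 ms frame: {vt} lines -> {1/refresh_interval(vt):.1f} Hz")
```

Below the panel's minimum rate the trick runs out of blanking lines, which is where low framerate compensation comes in, as discussed further down the thread.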
 

pj-

Senior member
May 5, 2015
371
9
91
#16
What it tells you is that only 12 out of 400 freesync displays work properly. Something AMD has never been able to do - they are happy to label pretty well anything as freesync, which is rubbish. No quality standards for freesync is part of the reason why gsync did so well.
No, all it tells you is that nvidia says only 12 work properly. They want to save face by claiming that almost no normal VRR monitor can match up to gsync's $100+ of secret sauce.

This is great, and it seemed inevitable with VRR being added to TVs and consoles.
 

ewite12

Junior Member
Oct 9, 2015
12
0
41
#17
Interestingly, with Nvidia's crazy-$$ cards coming out, I was "thinking" of switching to AMD once there was a 1080 Ti-level card, just so that I could benefit from my monitor.
 

linkgoron

Golden Member
Mar 9, 2005
1,876
79
106
#18
Probably the best thing Nvidia has done lately. It opens the door for people with Fury X/older GCN cards and Freesync monitors who didn't want to lose VRR and didn't have anything significant to upgrade to. Should've happened sooner IMO, but I suppose that HDMI 2.1 and more TVs supporting VRR probably forced their hand.
 

Dribble

Golden Member
Aug 9, 2005
1,684
114
126
#19
No, all it tells you is that nvidia says only 12 work properly. They want to save face by claiming that almost no normal VRR monitor can match up to gsync's $100+ of secret sauce.

This is great, and it seemed inevitable with VRR being added to TVs and consoles.
Well, I'm guessing they'd instantly reject any monitor that doesn't have a low of 30hz and low framerate compensation, which is what gsync has always required and, tbh, what freesync *should* have required. That probably rejects nearly all freesync monitors right away.

That's the problem with the freesync label - you just need to support some very basic level of VRR; nothing mandates that it actually works properly. Even AMD abused this when they had that bundle including a freesync monitor which, when reviewers looked at it, only worked properly in the 80-100hz range, and hence was basically useless for VRR.
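
To put a number on why that range matters, here's a quick back-of-the-envelope sketch (my own illustration, using the usual rule of thumb rather than anything official): low framerate compensation only has room to work when the maximum refresh is at least about double the minimum, because below the minimum the driver has to show each frame two or more times and the multiplied rate still has to land inside the VRR window. A 30-144hz panel passes that test; an 80-100hz panel can't.

```python
# Back-of-the-envelope LFC check. The ranges below are examples, not
# measurements of any particular monitor.

def supports_lfc(min_hz: float, max_hz: float) -> bool:
    """Rule of thumb: frame doubling needs max refresh >= 2x min refresh."""
    return max_hz >= 2 * min_hz

def effective_refresh(fps: float, min_hz: float, max_hz: float) -> float:
    """Refresh rate the driver would actually run the panel at for a given
    frame rate: inside the window track it directly, below it repeat frames."""
    if fps > max_hz:
        return max_hz                       # capped at the panel maximum
    if fps >= min_hz:
        return fps                          # normal VRR operation
    if not supports_lfc(min_hz, max_hz):
        return max_hz                       # no LFC: back to fixed refresh (tear or vsync)
    n = 2
    while fps * n < min_hz:                 # repeat each frame until it's back in range
        n += 1
    return min(fps * n, max_hz)

for name, lo, hi in [("30-144 Hz panel", 30, 144), ("80-100 Hz panel", 80, 100)]:
    print(f"{name}: LFC possible = {supports_lfc(lo, hi)}")
    for fps in (25, 48, 90, 160):
        print(f"  {fps} fps -> panel refreshes at {effective_refresh(fps, lo, hi):.0f} Hz")
```

On the 80-100hz example, anything under 80 fps drops straight out of the VRR window, which is why that bundle monitor was basically useless.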
 

KeithP

Diamond Member
Jun 15, 2000
5,445
27
91
#20
This is a time to be happy because someone died.

Thank you JHH for killing this anti-competitive monster that busted us for years.

It was a disgusting monopoly feature that hurt us all and put shame on nv.

To all gsync supporters: good you were cheated and the money taken from you as punishment.

May you learn from this punishment. Now shame yourselves. And come back as enlightened consumers.

Thank you.
The first time in a long time that I've literally LOLed at a post… thanks! :D

-KeithP
 

Krteq

Senior member
May 22, 2015
733
55
136
#21
Interesting...

Two identical Free/Adaptive-Sync panels:
AOC G2590FX (24"; TN; 1920x1080; 30-144Hz; DP, HDMI; LFC) - passed the "G-Sync Compatible" test
AOC G2590PX (24"; TN; 1920x1080; 30-144Hz; DP, HDMI; LFC; USB passthrough) - did NOT pass the "G-Sync Compatible" test

Just another nV trap
 

Reinvented

Senior member
Oct 5, 2005
486
3
91
#22
Interesting...

Two identical Free/Adaptive-Sync panels:
AOC G2590FX (24"; TN; 1920x1080; 30-144Hz; DP, HDMI; LFC) - passed the "G-Sync Compatible" test
AOC G2590PX (24"; TN; 1920x1080; 30-144Hz; DP, HDMI; LFC; USB passthrough) - did NOT pass the "G-Sync Compatible" test

Just another nV trap
I have an XF240H, and it's very similar to the XFA240. The XFA240 passed, mine did not. So, is this going to mean that I'm going to get terrible screen tearing and stuff with Nvidia's Adaptive Sync?
 
