Question CES 2019 - The beginning of the end for gsync?

pj-

Senior member
May 5, 2015
366
7
61
#1
On Jensen's stream a few minutes ago, he spent 10 minutes talking around the fact that they are going to start supporting some FreeSync monitors at the driver level next week as "G-Sync Compatible".

He made a point of saying that only 12 of the 400 they tested so far met their requirements. Not sure if it will be an "at your own risk" thing, or if it will only be supported for the specific models that pass their testing.

Edit: Apparently it can be enabled on any freesync monitor.
https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/
 
Last edited:
Jun 8, 2003
13,987
42
126
#2
He said 400 async monitors have been tested and the driver is coming Jan.29th. That's what I thought he said.
 

pj-

Senior member
May 5, 2015
366
7
61
#3
He said 400 async monitors have been tested and the driver is coming Jan.29th. That's what I thought he said.
400 were tested and 12 "passed". I think passing is irrelevant if the option is enabled for all of them
 

ThatBuzzkiller

Senior member
Nov 14, 2014
802
2
91
#4
With this new information, G-Sync has been effectively deprecated ...

There was a lot of blowback against G-Sync becoming the ubiquitous standard from just about every direction: from display manufacturers (few options to be had), to hardware vendors (Intel and AMD were never going to adopt it), to other parties such as Microsoft. Most of all, the bodies dictating display output standards, like the HDMI Forum and VESA, weren't interested in standardizing G-Sync either ...

Now that some generic adaptive-refresh displays are deemed G-Sync "compatible", I wonder what this means for their upcoming G-Sync-capable BFGDs: whether they'll still be released, or cancelled along with their future G-Sync integration plans ...
 

Harry_Wild

Senior member
Dec 14, 2012
312
23
81
#5
NEC was big on this type of PC monitor, I think? But I haven't kept up since I purchased a huge CRT, so I've lost touch with what purpose it serves in 4K IPS land!
 

krumme

Diamond Member
Oct 9, 2009
5,644
45
136
#6
This is a time to be happy because someone died.

Thank you JHH for killing this anti-competitive monster that busted us for years.

It was a disgusting monopoly feature that hurt us all and put shame on NV.

To all G-Sync supporters: good, you were cheated and had your money taken as punishment.

May you learn from this punishment. Now shame yourselves, and come back as enlightened consumers.

Thank you.
 

NeoLuxembourg

Senior member
Oct 10, 2013
672
10
106
#7
If you listen carefully, you can hear the sound of some AMD people adding slides to the Keynote presentation about how this ...

As a 1080 Ti owner, I'm happy, but I hope they don't limit the driver to those 12 monitors!

Edit:

Coming next week, this is changing. On January 15th, NVIDIA will be releasing a new driver that enables VESA Adaptive Sync support on GeForce GTX 10 and GeForce RTX 20 series (i.e. Pascal and newer) cards. There will be a bit of gatekeeping involved on NVIDIA’s part – it won’t be enabled automatically for most monitors – but the option will be there to enable variable refresh (or at least try to enable it) for all VESA Adaptive Sync monitors. If a monitor supports the technology – be it labeled VESA Adaptive Sync or AMD FreeSync – then NVIDIA’s cards can finally take advantage of their variable refresh features. Full stop.
Source: https://www.anandtech.com/show/1379...-adaptive-sync-with-gsync-compatible-branding
 
Last edited:

Reinvented

Senior member
Oct 5, 2005
486
3
91
#8
It's honestly about time. This will be amazing.
 

Dribble

Golden Member
Aug 9, 2005
1,584
43
106
#9
400 were tested and 12 "passed". I think passing is irrelevant if the option is enabled for all of them
What it tells you is that only 12 out of 400 FreeSync displays work properly. That's something AMD has never been able to enforce - they are happy to slap the FreeSync label on pretty well anything, which is rubbish. The lack of quality standards for FreeSync is part of the reason why G-Sync did so well.
 

halc

Junior Member
Jan 7, 2019
1
0
6
#10
Nope.

1) "Basic" G-Sync via a driver update for any generic Adaptive-Sync (FreeSync, VRR, etc.) display, on RTX 20 and GTX 10 series cards only. Only 12 monitors are nVidia-certified for this, but you can enable it for any Adaptive-Sync display at your own peril.

2) G-Sync Ultimate, with a hardware nVidia module, will still be required, along with an nVidia display card (generation not specified), for the FULL G-Sync Ultimate Brand Experience (TM). Along with the G-Sync Ultimate tax on the monitor hardware itself.

Now, what this means in practice remains to be seen.

However, it's more like "embrace and extend" for nVidia on G-Sync, rather than giving up on for-profit G-Sync.

They gave "G-Sync" v1 for free to two generations of cards, but simultaneously launched G-Sync Ultimate (not free via drivers) for HDR monitors at the higher end.
 

beginner99

Diamond Member
Jun 2, 2009
3,925
70
126
#11
Well, at least a tiny bright side from NV. This makes their cards at least an option again if AMD continues to not deliver. I guess the jacked-up prices of the RTX series have been taking their toll on sales already.
 

PeterScott

Platinum Member
Jul 7, 2017
2,517
93
96
#12
This was the best announcement from the keynote. I thought that with HDMI 2.1 on new 2019 displays, NVIDIA would be forced to support it or look like they were stifling industry standards.

We will have to wait and see how well it works, but this does look like the beginning of the end of the G-Sync tax on monitors.
 

NTMBK

Diamond Member
Nov 14, 2011
8,111
77
126
#13
This is a time to be happy because someone died.

Thank you JHH for killing this anti-competitive monster that busted us for years.

It was a disgusting monopoly feature that hurt us all and put shame on NV.

To all G-Sync supporters: good, you were cheated and had your money taken as punishment.

May you learn from this punishment. Now shame yourselves, and come back as enlightened consumers.

Thank you.
Jesus, lighten up.
 

Despoiler

Golden Member
Nov 10, 2007
1,799
42
136
#15
With this new information, G-Sync has been effectively deprecated ...

There was a lot of blowback against G-Sync becoming the ubiquitous standard from just about every direction: from display manufacturers (few options to be had), to hardware vendors (Intel and AMD were never going to adopt it), to other parties such as Microsoft. Most of all, the bodies dictating display output standards, like the HDMI Forum and VESA, weren't interested in standardizing G-Sync either ...

Now that some generic adaptive-refresh displays are deemed G-Sync "compatible", I wonder what this means for their upcoming G-Sync-capable BFGDs: whether they'll still be released, or cancelled along with their future G-Sync integration plans ...

Well, yeah, there was blowback because G-Sync flies in the face of standards that were already established for how monitors and GPUs interact. I'll give Nvidia credit for coming up with a novel use for the already-invented and standardized variable VBLANK function that was part of eDP, but they chose to Frankenstein a product into existence and go proprietary. Frankenstein is a tragedy, and so is Nvidia. G-Sync failed just like nForce failed. Nvidia cannot out-muscle several well-established companies in their own industry. The nail in the coffin is Intel's upcoming next-gen GPUs, which are going to support adaptive sync. Just like I said, Intel would be the decision maker.
 

pj-

Senior member
May 5, 2015
366
7
61
#16
What it tells you is that only 12 out of 400 FreeSync displays work properly. That's something AMD has never been able to enforce - they are happy to slap the FreeSync label on pretty well anything, which is rubbish. The lack of quality standards for FreeSync is part of the reason why G-Sync did so well.
No, all it tells you is that Nvidia says only 12 work properly. They want to save face by claiming that almost no ordinary VRR monitor can match up to G-Sync's $100+ of secret sauce.

This is great, and it seemed inevitable with VRR being added to TVs and consoles.
 

ewite12

Junior Member
Oct 9, 2015
10
0
41
#17
Interestingly, with Nvidia's crazy-priced cards coming out, I was thinking of switching to AMD once there was a 1080 Ti-level card, just so I could benefit from my monitor.
 

linkgoron

Golden Member
Mar 9, 2005
1,840
43
106
#18
Probably the best thing Nvidia have done lately. It opens the door for people with a Fury X or older GCN card and a FreeSync monitor who didn't want to lose VRR and didn't have anything significant to upgrade to. Should've happened sooner IMO, but I suppose HDMI 2.1 and more TVs supporting VRR probably forced their hand.
 

Dribble

Golden Member
Aug 9, 2005
1,584
43
106
#19
No, all it tells you is that Nvidia says only 12 work properly. They want to save face by claiming that almost no ordinary VRR monitor can match up to G-Sync's $100+ of secret sauce.

This is great, and it seemed inevitable with VRR being added to TVs and consoles.
Well, I'm guessing they'd instantly reject any monitor that doesn't have a low of 30 Hz plus low framerate compensation, which is what G-Sync has always required and, tbh, what FreeSync *should* have required. That probably rejects nearly all FreeSync monitors right away.

That's the problem with the FreeSync label - you only need to support some very basic level of VRR; nothing mandates that it actually works properly. Even AMD abused this when they ran a bundle including a FreeSync monitor which, when reviewers looked at it, only worked properly in the 80-100 Hz range, and hence was basically useless for VRR.
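The low framerate compensation requirement mentioned above can be made concrete: LFC works by showing each frame multiple times so the panel keeps refreshing inside its supported VRR window, which only has a solution for every frame rate when the window spans at least 2:1. A minimal sketch of the idea (hypothetical numbers; the actual driver logic is more involved):

```python
# Sketch of low framerate compensation (LFC). When the game's frame rate
# drops below the panel's minimum VRR refresh, the driver can show each
# frame N times so the panel still refreshes inside its supported range.
# This only works for every frame rate if vrr_max >= 2 * vrr_min.

def lfc_refresh(fps, vrr_min, vrr_max):
    """Effective panel refresh rate after LFC frame multiplication."""
    if fps >= vrr_min:
        return fps  # already inside the VRR window, nothing to do
    multiplier = 1
    while fps * (multiplier + 1) <= vrr_max:
        multiplier += 1  # repeat each frame one more time
    return fps * multiplier

# A 30-144 Hz panel has plenty of headroom:
print(lfc_refresh(20, 30, 144))   # 20 fps -> each frame shown 7x -> 140 Hz

# The 80-100 Hz range criticised above can't do LFC at all:
print(lfc_refresh(79, 80, 100))   # 2x would be 158 Hz, out of range -> stuck at 79
```

The 2:1 rule is why narrow-range monitors can't offer LFC, and plausibly part of why so few of the 400 tested monitors passed Nvidia's certification.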
 

KeithP

Diamond Member
Jun 15, 2000
5,406
9
91
#20
This is a time to be happy because someone died.

Thank you JHH for killing this anti-competitive monster that busted us for years.

It was a disgusting monopoly feature that hurt us all and put shame on NV.

To all G-Sync supporters: good, you were cheated and had your money taken as punishment.

May you learn from this punishment. Now shame yourselves, and come back as enlightened consumers.

Thank you.
The first time in a long time that I literally LOL at a post…thanks! :D

-KeithP
 

Krteq

Senior member
May 22, 2015
683
12
106
#21
Interesting...

Two nearly identical Free/Adaptive-Sync panels:
AOC G2590FX (24"; TN; 1920x1080; 30-144 Hz; DP, HDMI; LFC) - passed the "G-Sync Compatible" test
AOC G2590PX (24"; TN; 1920x1080; 30-144 Hz; DP, HDMI; LFC; USB passthrough) - did NOT pass the "G-Sync Compatible" test

Just another nV trap
 

Reinvented

Senior member
Oct 5, 2005
486
3
91
#22
Interesting...

Two nearly identical Free/Adaptive-Sync panels:
AOC G2590FX (24"; TN; 1920x1080; 30-144 Hz; DP, HDMI; LFC) - passed the "G-Sync Compatible" test
AOC G2590PX (24"; TN; 1920x1080; 30-144 Hz; DP, HDMI; LFC; USB passthrough) - did NOT pass the "G-Sync Compatible" test

Just another nV trap
I have an XF240H, and it's very similar to the XFA240. The XFA240 passed; mine did not. So, does this mean I'm going to get terrible screen tearing and stuff with Nvidia's adaptive sync?
 

Kenmitch

Diamond Member
Oct 10, 1999
7,119
43
126
#23
Desperate times ahead for the falling... failing.
 

Hitman928

Golden Member
Apr 15, 2012
1,602
61
136
#24
I have an XF240H, and it's very similar to the XFA240. The XFA240 passed; mine did not. So, does this mean I'm going to get terrible screen tearing and stuff with Nvidia's adaptive sync?
We don't know for sure, but it seems like Nvidia might be applying some pretty arbitrary and meaningless criteria to determine what passes and what fails. In other words, more than likely many of the "failed" monitors - probably most of them - will be just fine and work just as well as the ones that passed.
 

ozzy702

Senior member
Nov 1, 2011
776
104
136
#25
We don't know for sure, but it seems like Nvidia might be applying some pretty arbitrary and meaningless criteria to determine what passes and what fails. In other words, more than likely many of the "failed" monitors - probably most of them - will be just fine and work just as well as the ones that passed.
My guess is that all the FreeSync monitors that didn't offer a large enough range to work within were a "fail". How large a range is anyone's guess, but most FreeSync monitors are garbage, so it's not surprising that NVIDIA claims very few "pass". Even still, I'd have to think that way more than 12 out of 400 would be acceptable.

I have a few AOC G2450PF monitors that I'm hoping work well.
 

