G-sync... something?

bystander36

Diamond Member
Apr 1, 2013
If the guy was correct that the G-sync module was not needed, then why on earth did AMD need to go out and get VESA to change the standards so its adaptive-sync could work?
 

geoxile

Senior member
Sep 23, 2014
If the guy was correct that the G-sync module was not needed, then why on earth did AMD need to go out and get VESA to change the standards so its adaptive-sync could work?

Adaptive sync is actually an eDP (mostly used on mobile platforms) standard feature. Intel has already been using it to control refresh rate (switching between fixed refresh rates) to save power. AMD pushed to bring it over to DP for use on desktops.
 

Paul98

Diamond Member
Jan 31, 2010
Guess we will see if people try out the modded drivers for their laptops.
 

bystander36

Diamond Member
Apr 1, 2013
Adaptive sync is actually an eDP (mostly used on mobile platforms) standard feature. Intel has already been using it to control refresh rate (switching between fixed refresh rates) to save power. AMD pushed to bring it over to DP for use on desktops.

So AMD didn't have to push for a new standard with VESA, and wait for new monitors with these new features for their Freesync? Why is AMD requiring the newest DP standards for their Freesync implementation?
 

jackstar7

Lifer
Jun 26, 2009
So AMD didn't have to push for a new standard with VESA, and wait for new monitors with these new features for their Freesync? Why is AMD requiring the newest DP standards for their Freesync implementation?

No, they did. Again this existed on laptop screens, but not desktop monitors. The new VESA standard (which is still optional for DP 1.2a and forward) makes the sync option available.

Or so is my understanding.
 

Paul98

Diamond Member
Jan 31, 2010
No, they did. Again this existed on laptop screens, but not desktop monitors. The new VESA standard (which is still optional for DP 1.2a and forward) makes the sync option available.

Or so is my understanding.

My understanding also, and it was shown when AMD first started talking about FreeSync with their windmill demo.
 

bystander36

Diamond Member
Apr 1, 2013
They thought they could do it with existing standards, and found it wasn't good enough, and then pushed for new standards. If it was simply a matter of getting hardware manufacturers to use the existing standards, then they didn't have to make new ones.

The previous standards only let them set a refresh rate within a given range. The new standard lets them tell the monitor to start a refresh and hold until it is told to start a new one. What previously existed, if they had gotten it working, would have resulted in a lot of latency; the new method doesn't.
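To make that difference concrete, here is a rough sketch of the worst-case wait each model imposes on a finished frame. The function names are hypothetical, for illustration only, and not any real driver API:

```python
# Rough sketch of the two refresh models described above (illustrative only).

def worst_case_wait_fixed_ms(refresh_hz):
    """Fixed-rate model: the GPU can only pick a rate within a range.
    A frame that finishes just after a refresh has started must wait
    almost a full refresh interval before it can be scanned out."""
    return 1000.0 / refresh_hz  # worst-case added latency in ms

def worst_case_wait_vrr_ms():
    """Hold-until-told model: the monitor holds the current image and
    begins a new refresh the moment the GPU signals one, so a finished
    frame can be scanned out immediately."""
    return 0.0

# At 60 Hz the fixed-rate model can add ~16.7 ms; the hold model adds none.
```

This is of course an idealization (it ignores scanout time itself), but it captures why "pick a rate in advance" and "start a refresh on demand" behave so differently.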
 

jackstar7

Lifer
Jun 26, 2009
They thought they could do it with existing standards, and found it wasn't good enough, and then pushed for new standards. If it was simply a matter of getting hardware manufacturers to use the existing standards, then they didn't have to make new ones.

The previous standards only let them set a refresh rate within a given range. The new standard lets them tell the monitor to start a refresh and hold until it is told to start a new one. What previously existed, if they had gotten it working, would have resulted in a lot of latency; the new method doesn't.

It's still an optional part of the standard. And someone always needs to be pushing new advances forward in the standards or all we'd get is bandwidth improvements for higher resolutions. But because AMD saw what nV was doing and their engineers saw a means to accomplish the same goal without extra hardware, they pushed for that... or at least that's how it appears to me.

I'm not sure where your information about latency comes from. Can you source that for me so I can get smarter about this topic in general?
 

bystander36

Diamond Member
Apr 1, 2013
It's still an optional part of the standard. And someone always needs to be pushing new advances forward in the standards or all we'd get is bandwidth improvements for higher resolutions. But because AMD saw what nV was doing and their engineers saw a means to accomplish the same goal without extra hardware, they pushed for that... or at least that's how it appears to me.

I'm not sure where your information about latency comes from. Can you source that for me so I can get smarter about this topic in general?

Since G-sync came out before these new optional standards, they came up with their own solution, one that required a chip on the monitor, that also possibly had stuff to make up for their hardware limitations.

The point was, G-sync needed the chips to function and AMD needed to update VESA standards. If it is a conspiracy that G-sync needed the chip in the monitors, why did AMD also need new hardware changes?

As far as the latency issue goes, we have to piece some of it together from their own preview and what they said later. 1) The first preview required them to use triple buffering. They could only set the refresh rate ahead of time, which would mean their triple buffering was there as a look-ahead system, which causes a frame of latency. 2) Their new VESA specs added the ability to start a refresh and have it hold until it is told to refresh again. That was new and not there previously.

Since AMD had to get new VESA standards to make it work, why then would anyone think Nvidia didn't need anything new?
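For what it's worth, the size of that look-ahead penalty is easy to estimate. These are back-of-the-envelope numbers, not measurements from either vendor:

```python
# Back-of-the-envelope: each extra buffered (look-ahead) frame delays
# display by one full frame time. Illustrative only; real pipelines vary.

def added_latency_ms(fps, extra_frames=1):
    """Latency added by queuing extra_frames frames ahead of display."""
    frame_time_ms = 1000.0 / fps
    return extra_frames * frame_time_ms

# One frame of look-ahead at 60 fps costs roughly 16.7 ms on top of
# everything else in the pipeline; at 144 fps it is still roughly 6.9 ms.
```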
 

flexy

Diamond Member
Sep 28, 2001
He has a modded driver download on his site, so proving anything shouldn't be a problem?
 

garagisti

Senior member
Aug 7, 2007
Since G-sync came out before these new optional standards, they came up with their own solution, one that required a chip on the monitor, that also possibly had stuff to make up for their hardware limitations.

The point was, G-sync needed the chips to function and AMD needed to update VESA standards. If it is a conspiracy that G-sync needed the chip in the monitors, why did AMD also need new hardware changes?

As far as the latency issue goes, we have to piece some of it together from their own preview and what they said later. 1) The first preview required them to use triple buffering. They could only set the refresh rate ahead of time, which would mean their triple buffering was there as a look-ahead system, which causes a frame of latency. 2) Their new VESA specs added the ability to start a refresh and have it hold until it is told to refresh again. That was new and not there previously.

Since AMD had to get new VESA standards to make it work, why then would anyone think Nvidia didn't need anything new?
For what it's worth, it was already there on mobile and both AMD and Intel had this working, but it was geared towards power savings. Mind, the technology in broad terms was already part of existing standards. IIRC, the standards to bring this to desktop were already being worked on, but reportedly Nvidia decided to go it alone. Standards in a group take time as consensus develops, and that gave Nvidia an opportunity to create a niche. VESA isn't just AMD, Intel and Nvidia...
 

96Firebird

Diamond Member
Nov 8, 2010
For what it's worth, it was already there on mobile and both AMD and Intel had this working, but it was geared towards power savings. Mind, the technology in broad terms was already part of existing standards. IIRC, the standards to bring this to desktop were already being worked on, but reportedly Nvidia decided to go it alone. Standards in a group take time as consensus develops, and that gave Nvidia an opportunity to create a niche. VESA isn't just AMD, Intel and Nvidia...

PSR is not the same thing as adaptive sync/G-Sync. PSR is for static images, not for changing the refresh rate on every refresh of the screen. And I'm not sure it was in development for desktop screens; I think the closest example was Korean panels that had eDP connectors internally, so the same panel could be used for desktop or mobile. When that was discovered, we had people who claimed adaptive sync would work on these panels with a simple firmware update. That never happened...
 

bystander36

Diamond Member
Apr 1, 2013
For what it's worth, it was already there on mobile and both AMD and Intel had this working, but it was geared towards power savings. Mind, the technology in broad terms was already part of existing standards. IIRC, the standards to bring this to desktop were already being worked on, but reportedly Nvidia decided to go it alone. Standards in a group take time as consensus develops, and that gave Nvidia an opportunity to create a niche. VESA isn't just AMD, Intel and Nvidia...

So all 3 companies are lying and in some conspiracy together? Seriously, they did not have the same tech on laptops. They had a power saving tech that was similar. They could lower the refresh rate, but you did not read my post very well, as I explained the difference. Being able to reduce the refresh rate is not the same as being able to tell the monitor exactly when to start a given refresh.
 

garagisti

Senior member
Aug 7, 2007
So all 3 companies are lying and in some conspiracy together? Seriously, they did not have the same tech on laptops. They had a power saving tech that was similar. They could lower the refresh rate, but you did not read my post very well, as I explained the difference. Being able to reduce the refresh rate is not the same as being able to tell the monitor exactly when to start a given refresh.
I did mention "broad terms" and that the technology was geared towards power savings. For what it's worth, it was done with eDP on mobile, and DP 1.2a is the minimum required on the desktop side, as I suppose you know, and standards take time to develop. I don't know where you infer that I suggested any company lied. G-Sync was introduced when nothing like it existed on the desktop market. Variable refresh did exist on mobile, once again, geared towards power savings.
 

therealnickdanger

Senior member
Oct 26, 2005
I'm pretty sure AMD's first demo was an eDP 1.3 (not the same as DP 1.3) notebook where they hacked the panel self-refresh (PSR) technology to imitate G-Sync. That makes me think that any eDP 1.3 notebook screen could replicate the functionality, theoretically. How much of that demo was hardware dependent versus software dependent is anyone's guess.
 

Ichigo

Platinum Member
Sep 1, 2005
No point in arguing about it until a few trusted sources test that modded driver. Even if it's not exactly the same performance as G-Sync, there's a certain threshold that, if met, makes paying an extra $300 for a G-Sync monitor start to look unreasonable.
 

garagisti

Senior member
Aug 7, 2007
No point in arguing about it until a few trusted sources test that modded driver. Even if it's not exactly the same performance as G-Sync, there's a certain threshold that, if met, makes paying an extra $300 for a G-Sync monitor start to look unreasonable.
Technically DP 1.2 will have limitations as to the refresh rates it supports, but you could theoretically have software push certain refresh rates compatible with those already supported. I think 144 Hz is not supported by DP 1.2, only up to 120. It is just a guess, but it may be why they waited until 1.2a came along to bring variable refresh rates forward.
 

kasakka

Senior member
Mar 16, 2013
Technically DP 1.2 will have limitations as to the refresh rates it supports, but you could theoretically have software push certain refresh rates compatible with those already supported. I think 144 Hz is not supported by DP 1.2, only up to 120. It is just a guess, but it may be why they waited until 1.2a came along to bring variable refresh rates forward.

DP 1.2 supports 144 Hz just fine, as witnessed with G-Sync displays.

I think the guy with the modded drivers is bullshitting, or possibly even spreading malware. Nvidia wouldn't have needed to develop the G-Sync module in the first place if G-Sync on any DP display were as simple as a driver update; they could have pushed it as a big feature on their cards anyway. Even FreeSync requires different components in the display.
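A quick bandwidth estimate backs up the 144 Hz point. These are approximate figures: blanking overhead is ignored and 8b/10b encoding is assumed, so treat it as a sanity check rather than an exact link budget:

```python
# Back-of-the-envelope check that DP 1.2 (HBR2, 4 lanes) has headroom
# for 2560x1440 @ 144 Hz. Approximate: ignores blanking intervals.

LANE_RATE_GBPS = 5.4       # HBR2 per-lane raw rate
LANES = 4
ENCODING_EFFICIENCY = 0.8  # 8b/10b encoding

def link_capacity_gbps():
    """Effective payload capacity of the full link, in Gbps."""
    return LANE_RATE_GBPS * LANES * ENCODING_EFFICIENCY  # 17.28 Gbps

def video_rate_gbps(width, height, hz, bpp=24):
    """Raw pixel data rate for the active video area, in Gbps."""
    return width * height * hz * bpp / 1e9

# 2560x1440 @ 144 Hz, 24 bpp needs ~12.74 Gbps, well under ~17.28 Gbps,
# so the link rate itself was never the obstacle to 144 Hz over DP 1.2.
```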
 

Creig

Diamond Member
Oct 9, 1999
That's crazy. I read through the author's post and couldn't really make heads or tails of it and blew it off as just fishing for page hits. But it really might actually function? Since the user with the laptop had a 980M, does that mean that these hacked drivers could possibly enable nvidia owners to utilize G-sync on FreeSync monitors?

I don't get it. Nvidia said it wasn't possible for G-sync to operate without the module because the GPU itself didn't contain the necessary functions.