[gamenab] NVIDIA G-Sync working on laptop without G-Sync module

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
http://gamenab.net/2015/01/26/truth...using-vesa-adaptive-sync-technology-freesync/


A leaked driver allowed this guy to get G-Sync working on a laptop that does not have a G-Sync module. I'm assuming they are using an adaptive refresh rate method similar to what AMD did with their first FreeSync demo on a laptop.

Obviously Nvidia's current discrete desktop GPUs can't leverage this because they lack DP 1.2a support, but future cards may support it and label their adaptive sync capability as G-Sync.

The highlight is that they are doing adaptive refresh without the G-Sync module that was supposedly necessary... :sneaky:
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I'd like to see this tested, as I'm betting it has a frame's worth of latency, unless AMD's original demos and claims were misguided.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Watching that video. The differences they note are that below 25 fps you get flickering, which you do not get on G-Sync screens with a module. I believe G-Sync on standalone monitors disables itself at 25 or 20 fps and vsync takes over, and the same happens if you exceed the monitor's refresh rate. They also saw the screen blank out at times. There is also flickering under certain conditions on screens with the G-Sync module, like the ROG Swift, which they mentioned and which I've read about at OCN.

From that video it really just sounds like a beta that isn't fully tuned. From what we have heard of the CES demos of Adaptive-Sync screens using AMD's FreeSync, they behaved similarly: there was a minimum framerate below which vsync would take over. With some tuning Nvidia can do the same thing, so long as they put out a video card that supports DP 1.2a.
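To make that minimum/maximum behavior concrete, here is a rough sketch of the kind of fallback logic being described. The numbers and function are mine (the 25 Hz floor comes from the video, the 75 Hz ceiling is just an assumed panel limit); this is illustrative only, not how Nvidia's or AMD's drivers actually work:

```python
# Illustrative sketch only -- not actual driver logic from Nvidia or AMD.
# Assumed panel limits (the 25 Hz floor is from the video, 75 Hz is made up).
PANEL_MIN_HZ = 25
PANEL_MAX_HZ = 75

def choose_refresh(frame_time_ms: float) -> tuple[str, float]:
    """Pick a presentation mode for one frame based on how long it took to render."""
    instantaneous_hz = 1000.0 / frame_time_ms
    if instantaneous_hz > PANEL_MAX_HZ:
        # Rendering faster than the panel can refresh: cap at max, i.e. behave like vsync.
        return ("vsync-capped", PANEL_MAX_HZ)
    if instantaneous_hz < PANEL_MIN_HZ:
        # Below the panel's minimum: fall back to fixed-rate vsync. A module (or
        # smarter driver logic) could instead redraw the last frame to stay in range.
        return ("vsync-fallback", PANEL_MIN_HZ)
    # Inside the variable refresh window: refresh exactly when the frame is ready.
    return ("variable", instantaneous_hz)

# A 50 ms frame (20 fps) drops out of the window and hits the fallback path,
# while a 16.7 ms frame (~60 fps) gets a matching variable refresh.
print(choose_refresh(50.0))   # ('vsync-fallback', 25)
print(choose_refresh(16.7))   # ('variable', ~59.9)
```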

No wonder they do not want to support adaptive sync on desktop displays; no more $200 module to sell...
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Well, Nvidia said SLI couldn't function without their bridge chip. They said PhysX couldn't function properly with an AMD card in the system. And now they said G-sync couldn't function without the module.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Creig said:
Well, Nvidia said SLI couldn't function without their bridge chip. They said PhysX couldn't function properly with an AMD card in the system. And now they said G-sync couldn't function without the module.

True. The question is whether they will lock this down via drivers, like they did with running GPU PhysX on an Nvidia card alongside an AMD card. History points to yes, so they can keep selling $200 G-Sync modules to sit in your monitor.

http://www.overclock.net/t/1538208/nvidia-g-sync-free-on-dp-1-2-monitors/100#post_23478999

Several users at OCN have it working as well. It apparently only works with the specific LG IPS panel used in those models of laptops, which has a recently designed scaler. It must be one of the scalers that supports adaptive sync, and because it's in a laptop using eDP for the monitor connection, they can circumvent the need for the DP 1.2a support you require on the desktop.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
They also said they have no plans to support FreeSync... I guess we can take them at their word, right? Right?!

I feel like it is obvious that, in the end, they will support the VESA Adaptive Sync standard. They will likely never refer to it as FreeSync, which doesn't matter.
Right now, they are definitely in PR spin mode, trying to milk the market for G-Sync as long as they can. They accept industry standards in the end, unless they can actually prove they can out-engineer them. That VESA's standard will have a far larger market share is surely what guarantees Nvidia will either patch in support down the road or, if the current hardware cannot support it, add it to next-gen cards. They might even sneak it into the next cards (did they already?) or the generation after, then later say, "oh, here's a patch, you're welcome!" With G-Sync holding the dominant market position (for now; that won't last long with VESA adopting a standard), Nvidia will try to ride it for what they can.

Just as I suspect the NVLink technology won't go too far in the consumer space, but might have some lasting appeal in the compute/supercomputer world. As it requires an entirely new interface on the motherboard, I just don't think they'll ever see success with it on the desktop, and likely won't even try to push it. They might try to incorporate it as a new SLI bridge, or merge what they can of the NVLink protocol into XDMA-style logical point-to-point links over PCIe as opposed to hardwired point-to-point links.
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Since the laptop GPUs are based on the desktop GPUs, I'm wondering if this means the driver could be modified to enable G-sync on the upcoming Adaptive-Sync capable monitors for desktop users. Talk about a one-two punch to Nvidia's credibility if this turns out to be true! First the GTX 970 memory flaw, now this.

Looks like Nvidia is having a very bad week.
 

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
Grooveriding said:
True. The question is whether they will lock this down via drivers, like they did with running GPU PhysX on an Nvidia card alongside an AMD card. History points to yes, so they can keep selling $200 G-Sync modules to sit in your monitor.

http://www.overclock.net/t/1538208/nvidia-g-sync-free-on-dp-1-2-monitors/100#post_23478999

Several users at OCN have it working as well. It apparently only works with the specific LG IPS panel used in those models of laptops, which has a recently designed scaler. It must be one of the scalers that supports adaptive sync, and because it's in a laptop using eDP for the monitor connection, they can circumvent the need for the DP 1.2a support you require on the desktop.


PCPer mentioned that these earlier monitors like the Swift pre-date DP 1.2a scalers and therefore need the G-Sync module. In addition, the module supposedly has some hidden functionality that NVIDIA has yet to reveal. But I'd imagine they will take Adaptive-Sync, lock it down to NVIDIA-only hardware and call it G-Sync. So at the end of the day, nothing will likely change. Keep in mind this only worked because NVIDIA gave OEMs like ASUS an alpha driver with this functionality specifically coded in; it wouldn't be hard for them to lock it down in the final release.

Creig said:
Since the laptop GPUs are based on the desktop GPUs, I'm wondering if this means the driver could be modified to enable G-sync on the upcoming Adaptive-Sync capable monitors for desktop users. Talk about a one-two punch to Nvidia's credibility if this turns out to be true! First the GTX 970 memory flaw, now this.

Looks like Nvidia is having a very bad week.

Even if you could modify the INF with the device ID of a desktop GPU, it would presumably be useless, since future drivers wouldn't have that ability.
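For anyone curious what that kind of INF mod even looks like, here is a purely hypothetical sketch: a script that duplicates a mobile hardware-ID line with a desktop one so the installer would bind the same driver section to it. The file name and both device IDs below are placeholders I made up, not anything pulled from the leaked driver, and a modded INF would break driver signature checks anyway:

```python
# Purely illustrative: shows the kind of textual INF edit being talked about.
# The file name and device IDs are made-up placeholders, not the leaked driver's
# actual contents, and Windows driver signing would reject a modded INF regardless.
from pathlib import Path

INF_PATH = Path("nv_dispi.inf")            # hypothetical display driver INF
MOBILE_DEV_ID = "PCI\\VEN_10DE&DEV_AAAA"   # placeholder mobile GPU hardware ID
DESKTOP_DEV_ID = "PCI\\VEN_10DE&DEV_BBBB"  # placeholder desktop GPU hardware ID

def add_desktop_id(inf_text: str) -> str:
    """Duplicate every line that matches the mobile hardware ID, swapping in the
    desktop ID, so the desktop card would pick up the same install section."""
    out_lines = []
    for line in inf_text.splitlines():
        out_lines.append(line)
        if MOBILE_DEV_ID in line:
            out_lines.append(line.replace(MOBILE_DEV_ID, DESKTOP_DEV_ID))
    return "\n".join(out_lines)

if __name__ == "__main__":
    # Overwrites the INF in place; back up the original first if actually trying this.
    INF_PATH.write_text(add_desktop_id(INF_PATH.read_text()))
```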
 

jackstar7

Lifer
Jun 26, 2009
11,679
1,944
126
I'm confused, has anyone got this working on a regular old DP 1.2 desktop display?

No. That's not one of the options.

eDP laptops are the primary thing able to take advantage right now. And from what I was reading, it possibly requires a 980M.
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Nvidia is starting to look like the Ubisoft of the video card industry. The amount of greed and deception is just astounding.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
5150Joker said:
PCPer mentioned that these earlier monitors like the Swift pre-date DP 1.2a scalers and therefore need the G-Sync module. In addition, the module supposedly has some hidden functionality that NVIDIA has yet to reveal. But I'd imagine they will take Adaptive-Sync, lock it down to NVIDIA-only hardware and call it G-Sync. So at the end of the day, nothing will likely change. Keep in mind this only worked because NVIDIA gave OEMs like ASUS an alpha driver with this functionality specifically coded in; it wouldn't be hard for them to lock it down in the final release.
Agreed. If NVIDIA can support DP Adaptive Sync, then I suspect that they intend to keep it as a Value Added Feature, similar to PhysX, 3D Vision/3DTV, and SLI. In which case they'll probably go the SLI route on this one: G-Sync via DPAS will only be enabled in licensed devices, where the manufacturer has paid NVIDIA royalties for a key that will be burnt into the EDID or BIOS. Which would be unfortunate, but it is consistent with how NVIDIA handles these kinds of technologies.

On the plus side, though, this means that the physical G-Sync modules are a limited-duration solution, and everyone would be standardizing on DPAS. That makes device manufacturing easier for everyone, and makes it more likely that monitors and scalers will support DPAS.
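Just to illustrate the kind of gating being speculated about here, a driver-side check could be as simple as scanning the display's EDID blocks for a vendor blob before enabling the feature. Everything below is invented for the sake of the example (the tag bytes, the function names); the only real parts are the EDID block size and checksum rule:

```python
# Hypothetical sketch of an EDID "license key" check -- nothing here is NVIDIA's
# actual scheme; the marker bytes are invented for illustration.
HYPOTHETICAL_LICENSE_TAG = bytes.fromhex("4E564C4B")  # made-up 4-byte marker

def edid_block_ok(block: bytes) -> bool:
    """Each 128-byte EDID block must sum to 0 mod 256 (standard EDID checksum)."""
    return len(block) == 128 and sum(block) % 256 == 0

def has_license_key(edid: bytes) -> bool:
    """Scan the base block and any extension blocks for the (invented) marker."""
    blocks = [edid[i:i + 128] for i in range(0, len(edid), 128)]
    if not blocks or not all(edid_block_ok(b) for b in blocks):
        return False
    return any(HYPOTHETICAL_LICENSE_TAG in b for b in blocks)

# A driver could then gate the feature on this plus whatever else it requires:
# enable_vrr = supports_dpas(gpu) and has_license_key(read_edid(display))
# (supports_dpas/read_edid are placeholders for driver internals.)
```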
 

garagisti

Senior member
Aug 7, 2007
592
7
81
ViRGE said:
Agreed. If NVIDIA can support DP Adaptive Sync, then I suspect that they intend to keep it as a Value Added Feature, similar to PhysX, 3D Vision/3DTV, and SLI. In which case they'll probably go the SLI route on this one: G-Sync via DPAS will only be enabled in licensed devices, where the manufacturer has paid NVIDIA royalties for a key that will be burnt into the EDID or BIOS. Which would be unfortunate, but it is consistent with how NVIDIA handles these kinds of technologies.

On the plus side, though, this means that the physical G-Sync modules are a limited-duration solution, and everyone would be standardizing on DPAS. That makes device manufacturing easier for everyone, and makes it more likely that monitors and scalers will support DPAS.
But what about the money from that sweet green logo? I think they may still keep charging as long as customers keep queuing up.

Although it would be hilarious if, as initial reports say, there's not much between the free and the paid refresh solutions. The free one will actually have more monitors.

Also, since you may know more about this: I thought Nvidia was using scalers in their modules because their GPUs, at the time of the G-Sync launch, had something to sort out with the scalers in their cards. It's been so long since I read up on it, but is that correct?
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
garagisti said:
Also, since you may know more about this: I thought Nvidia was using scalers in their modules because their GPUs, at the time of the G-Sync launch, had something to sort out with the scalers in their cards. It's been so long since I read up on it, but is that correct?
The rumor was that Kepler was incapable of supporting DPAS, and that the G-Sync module was effectively a combination of a scaler and additional hardware to handle the variable timing. However, no one was able to prove it, so it remains a rumor.

It is interesting, though, that all of the testing done thus far has been on Maxwell (900M) GPUs, which have upgraded display controllers.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
ViRGE said:
The rumor was that Kepler was incapable of supporting DPAS, and that the G-Sync module was effectively a combination of a scaler and additional hardware to handle the variable timing. However, no one was able to prove it, so it remains a rumor.

It is interesting, though, that all of the testing done thus far has been on Maxwell (900M) GPUs, which have upgraded display controllers.

I read about it at the time in quite some detail, but there was no 'rumour'; it was just that the 290s had better scalers. Now I can't even remember where I read it. Thanks for helping with this.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Yeah, some of his claims were off. What I'd like to see is someone try the driver with an adaptive sync desktop monitor. I doubt it will work because of lack of DP 1.2a support, but it would be interesting. Maybe we will see support for DP 1.2a on the GM200 geforce card whenever it releases later this year.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Grooveriding said:
Yeah, some of his claims were off. What I'd like to see is someone try the driver with an adaptive sync desktop monitor. I doubt it will work because of lack of DP 1.2a support, but it would be interesting. Maybe we will see support for DP 1.2a on the GM200 geforce card whenever it releases later this year.

What was the deal with the R9 cards? They launched with DP 1.2, based on everything I can find. Then they got support for FreeSync, which utilizes the DPAS standard, so... it should mean they have DP 1.2a ports then, yes?
 

jackstar7

Lifer
Jun 26, 2009
11,679
1,944
126
destrekor said:
What was the deal with the R9 cards? They launched with DP 1.2, based on everything I can find. Then they got support for FreeSync, which utilizes the DPAS standard, so... it should mean they have DP 1.2a ports then, yes?

I believe they have limited Freesync support, not full.
 

Hitman928

Diamond Member
Apr 15, 2012
5,320
7,995
136
The R9 285, 290 and 290X support full FreeSync (as well as the 260X, I believe). The rest of the R9 and R7 products support FreeSync, but not in games, just for video playback.