G-Sync low refresh mode?

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,021
136
Sounds like if your FPS drops low enough, the monitor locks at a minimum refresh rate, but wouldn't it be possible to simply double/triple the refresh and display each frame 2-3 times?

Say you're running at 18FPS: instead of tearing and migraines locked at 30Hz (or whatever the minimum is), why not triple it and set the refresh to 54Hz?

I'm not sure how much detail is out there on this, but I figure I'll ask.

Personally I don't think G-Sync should be on any monitor under ~120Hz, and anything under 60FPS should get doubled while anything under 40FPS should automatically be tripled. Maybe we could have a few options, like a 90Hz mode that sets those thresholds to 45/30. If we had a 60Hz monitor/mode, then at least we could double anything under 30FPS and triple anything under 20FPS.
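To make the idea concrete, here's a quick Python sketch of the thresholds I'm proposing for a 120Hz panel (the function and structure are just my own illustration, not how any real hardware works):

```python
# Sketch of the proposed 120Hz scheme: triple under 40 FPS, double under 60 FPS.
def scanouts_per_frame(fps: float) -> int:
    """How many times each rendered frame should be scanned out."""
    if fps < 40:
        return 3
    if fps < 60:
        return 2
    return 1

for fps in (18, 45, 75):
    n = scanouts_per_frame(fps)
    print(f"{fps} FPS: show each frame {n}x -> effective refresh {fps * n} Hz")
# 18 FPS: show each frame 3x -> effective refresh 54 Hz
# 45 FPS: show each frame 2x -> effective refresh 90 Hz
# 75 FPS: show each frame 1x -> effective refresh 75 Hz
```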

Having monitors work in conjunction with the GPU sounds good, but not at the expense of refresh rates. I can't stand low refresh rates... G-Sync has potential, but no way in heck am I going to use it if I'm getting less than 60Hz on my monitor.

I'm still waiting for a superior option to my CRT: high res (and high DPI), high refresh, no lag, great color, and solid blacks. I hate tearing, but between vsync and my CRT's variable refresh and resolution options, I never have a problem and never have to go below 72Hz.

So my ideal monitor: 24" 120Hz 2560x1600 w/ G-Sync and a 60Hz minimum refresh.
I'd be willing to sacrifice a little on the color/input lag and I'd be willing to pay up to $800.

I think there should be 27-30" 120Hz "4k" options at ~$1200.

Knock $200 off for G-Sync and $200 for 120Hz, so $400 for a "regular" 2560x1600 24".

Anybody think there's more than a 1% chance of this happening in the next ~5 years?

Since the chip is built into the monitor, what would stop AMD from using G-Sync? I absolutely think this should be a standard option for every GPU vendor; if Nvidia tries to vendor-lock this, I am going to murder them.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Showing the same image multiple times on an LCD changes nothing, so I'm not sure why you would want to double or triple it instead of just keeping the original frame on screen.

Nothing stops AMD from supporting it. They just need to manipulate the VBLANK signal over DisplayPort as well.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
That would be possible, and it may even be what happens. They mentioned they'd enforce a minimum of 30Hz, but that doesn't rule out enforcing it the way you describe (though I doubt it).

That said, I don't think you recognize the benefits of G-Sync, and how it makes a high refresh rate unnecessary on an LCD when the FPS isn't high.

High refresh rates on LCDs serve only a few purposes, since the panels hold their image and don't flicker, unless you're talking about Lightboost.

A low refresh rate only hurts by delaying the display of completed frames. If there's no completed frame waiting, it doesn't matter, unless you drop below 15Hz, which causes some brightness variance.

With the monitor syncing to the GPU's completed frames, there is no reason, other than Lightboost or sub-15 FPS situations, to update the image on the display; doing so would give no benefit whatsoever. Nothing changes on screen in those cases, so there's nothing to see.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
Nothing stops AMD from supporting it. They just need to manipulate the VBLANK signal over DisplayPort as well.

AMD will have to pay Nvidia a licensing fee, according to Linus on his latest WAN show.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
G-Sync is no worse than vsync below 30 fps; it's just not always better. After an image has been displayed for 33.3 ms (30 fps), the monitor has to refresh it, which on a 120Hz monitor takes about 8 ms, blocking a new frame from going out at the ideal moment if it arrives just after that refresh starts. But that minor 8 ms stutter is nothing compared to frames that are already being held for 55.5 ms on average (at 18 fps), so the stutter caused by the forced refresh is minimal.

I do think that for particular content, like movies, the player will want to show the same frame twice in a row to push the refresh up from 24 to 48Hz and avoid this effect, since the stutter would otherwise be similar to 3:2 pulldown judder.
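To put rough numbers on both points (my own arithmetic, not anything NVIDIA has published):

```python
# Timing model for a 120Hz G-Sync panel with a 30 fps refresh floor.
SCANOUT_MS = 1000 / 120        # ~8.3 ms to redraw the whole screen
FORCED_REFRESH_MS = 1000 / 30  # an image older than ~33.3 ms gets re-scanned

for fps in (18, 24):
    frame_ms = 1000 / fps
    # Worst case: the new frame arrives just as a forced refresh begins,
    # so it waits out the remainder of that scan-out.
    print(f"{fps} fps: frame time {frame_ms:.1f} ms, "
          f"worst-case extra wait {SCANOUT_MS:.1f} ms "
          f"({100 * SCANOUT_MS / frame_ms:.0f}% of a frame)")

# Doubling 24 fps film to 48 Hz keeps every interval at ~20.8 ms,
# safely under the 33.3 ms cap, so forced refreshes never collide.
print(f"24 fps doubled: {1000 / 48:.1f} ms between scans (cap {FORCED_REFRESH_MS:.1f} ms)")
```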
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
With G-Sync, for the first time there will be a difference between how quickly the monitor refreshes and how often it refreshes. The lowest G-Sync will go is refreshing at least 30 times per second (every 33.3 ms); any less and the colors of the pixels can get wacky. That being said, each refresh should still take only about 8 ms, because the panels will be 120Hz capable.

LCDs don't cause migraines at low refresh rates; only CRTs do. An LCD displays its colors continuously, while on a CRT the scanning beam that refreshes the image is also the light source (it is not a backlight-based technology). This shouldn't be a worry.

Showing the same image multiple times on an LCD changes nothing, so I'm not sure why you would want to double or triple it instead of just keeping the original frame on screen.

If the screen doesn't refresh at all, the colors of the pixels will start to drift. Think of it like DRAM, where charge is continually added to the cells to keep the data in storage: if the cells aren't "refreshed" by resetting their charge to the correct value, they forget what value they should hold and start to drift.
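A toy model of that refresh rule (purely illustrative; the actual G-Sync module logic is not public):

```python
# If no new frame arrives within ~33.3 ms, re-scan the old one so the
# liquid-crystal cells don't drift -- the same idea as a DRAM refresh cycle.
MAX_HOLD_MS = 1000 / 30

def panel_action(ms_since_last_scan: float, new_frame_ready: bool) -> str:
    if new_frame_ready:
        return "scan out the new frame"
    if ms_since_last_scan >= MAX_HOLD_MS:
        return "re-scan the previous frame (self-refresh)"
    return "hold (keep showing the last frame)"

print(panel_action(10.0, False))  # hold (keep showing the last frame)
print(panel_action(35.0, False))  # re-scan the previous frame (self-refresh)
print(panel_action(20.0, True))   # scan out the new frame
```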
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Showing the same image multiple times on an LCD changes nothing, so I'm not sure why you would want to double or triple it instead of just keeping the original frame on screen.

Nothing stops AMD from supporting it. They just need to manipulate the VBLANK signal over DisplayPort as well.

So Nvidia was lying when they said there was hardware in Kepler needed for G-Sync?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
So Nvidia was lying when they said there was hardware in Kepler needed for G-Sync?

They were not lying, but they also weren't talking about the future.

They require a DisplayPort connection, and Kepler is the first series of theirs to offer that.

It has also been said they hope to get this working with AMD and Intel in the future. They may expect royalties, but we don't know what has been planned in that regard.