[PCPER] NVIDIA G-Sync


Teizo

Golden Member
Oct 28, 2010
Something I would like to know is whether 30fps with G-Sync is just as smooth as 60fps with G-Sync, and so on up the scale... Traditional thinking is the higher the better, but it seems as though this is hinting at a paradigm shift, perhaps.

The reason I say that is that for those with Kepler GPUs... next GPU cycle, rather than get a Maxwell, you could just get a G-Sync monitor, keep your present GPU, and save some on upgrade costs.

I hope the Tiger Direct in Raleigh gets one so I can go there and see it in person.
 

Keysplayr

Elite Member
Jan 16, 2003
No, it is complicated for us - it's not as if this feature is something you get when you purchase an NVIDIA GPU - you also need to buy a specific monitor with the chip preinstalled (currently not available) or mod your current (specific model) monitor. That to me is the part of the package that isn't all that practical :hmm:

Now don't get me wrong - I'm sure the Asus VG248QE is a popular monitor... but right now, with that being the only monitor they're releasing the kit for, just how many people will this feature even be accessible to? I guess we'll have to see how many monitors are available with it preinstalled next year.

I didn't get you wrong. I understand what you're saying, but I think you're overthinking it. Most people who want this tech will need Kepler or newer GPUs - GTX 650 Ti or better - and a G-Sync-ready monitor. Most people will just buy a whole new monitor rather than mod the rare-bird monitor they may already have. We don't even know how many kits will be made available, or for how many monitors. Nvidia doesn't even know how far they'll go with that. They are probably putting most of their focus on the partners who make the monitors, to have them incorporate G-Sync or at least make their monitors upgradeable to it.
 

Keysplayr

Elite Member
Jan 16, 2003
Something I would like to know is whether 30fps with G-Sync is just as smooth as 60fps with G-Sync, and so on up the scale... Traditional thinking is the higher the better, but it seems as though this is hinting at a paradigm shift, perhaps.

The reason I say that is that for those with Kepler GPUs... next GPU cycle, rather than get a Maxwell, you could just get a G-Sync monitor, keep your present GPU, and save some on upgrade costs.

I hope the Tiger Direct in Raleigh gets one so I can go there and see it in person.

I don't think so. As always, the closer you get to 30, the more you might notice flutter - the point where the human eye can sense the transition from one frame to the next because its progression is slow enough. G-Sync is also supposed to shut off at under 30fps and enable standard V-Sync. It also does this if the fps reaches the upper limit of the monitor's refresh rate capability.
But yes, I think I would notice a flutter at 30fps in most games. Some games are perfectly fine at 30, while others are really bad.
 

bystander36

Diamond Member
Apr 1, 2013
Something I would like to know is whether 30fps with G-Sync is just as smooth as 60fps with G-Sync, and so on up the scale... Traditional thinking is the higher the better, but it seems as though this is hinting at a paradigm shift, perhaps.

The reason I say that is that for those with Kepler GPUs... next GPU cycle, rather than get a Maxwell, you could just get a G-Sync monitor, keep your present GPU, and save some on upgrade costs.

I hope the Tiger Direct in Raleigh gets one so I can go there and see it in person.

It's not a simple answer.

60 FPS with v-sync on at 60Hz is very smooth, but it adds an extra frame of latency.

30 FPS with G-Sync (assuming no dips below) will have about the same latency, but half as many frames: each frame takes twice as long to render as in the 60 FPS v-sync setup, but it suffers no additional buffering latency.

So 60 FPS at 60Hz with v-sync would look better, as it does have double the frames, but its latency isn't any better.

Now at some point between 30 and 60 FPS with G-Sync, you may indeed start to notice superior performance. Your latency will decrease in comparison to 60 FPS with v-sync, and it may still appear smooth, though perhaps not quite as smooth.

For me, and I imagine at least some others, latency is what makes a game feel smooth, more so than consistent frames. This may let me play at lower FPS without getting nauseated, as I do now below 80 FPS.
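
To put rough numbers on the argument above, here's a toy model (my own sketch: it ignores scanout time and just assumes v-sync at 60Hz holds each finished frame for one extra refresh, while G-Sync displays a frame as soon as it's rendered):

```python
# Toy latency model: v-sync adds one buffered refresh on top of the
# render time; G-Sync shows the frame as soon as it is rendered.
def vsync_latency_ms(fps, refresh_hz=60):
    return 1000.0 / fps + 1000.0 / refresh_hz  # render + buffered refresh

def gsync_latency_ms(fps):
    return 1000.0 / fps  # render only

print(vsync_latency_ms(60))  # ~33.3ms at 60 FPS with v-sync
print(gsync_latency_ms(30))  # ~33.3ms at 30 FPS with G-Sync: same latency
print(gsync_latency_ms(45))  # ~22.2ms: already less latency than 60 FPS v-sync
```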
 

omeds

Senior member
Dec 14, 2011
G-Sync will be sex on 144Hz displays in games like BF4. FPS fluctuating between 80 and 144 will be perfectly smooth without tearing, I would imagine.
 

blackened23

Diamond Member
Jul 26, 2011
Aside from that, the most exciting prospect to me is never having to deal with input lag ever. That's going to be huge for competitive FPS games... (not sure if a competitive pro scene exists for FPS anymore, seems to be all RTS/MOBA lately)
 

omeds

Senior member
Dec 14, 2011
There is also a "2D LightBoost" mode IIRC, which shouldn't have the IQ drawbacks of the current hack.
 

BrightCandle

Diamond Member
Mar 15, 2007
Gsync is still going to end up using double buffering, because it needs to render into one buffer while it scans out another.

At exactly 60 fps there isn't a difference in latency: both take 16ms to render, then 16ms of scanout. If you want to know what G-Sync would feel like, turn the graphics settings down so that it's consistently 60 fps all the time (or 120 fps, etc.) and play with v-sync on. That is basically what G-Sync is going to be like, except it also feels that way at lower fps when the rate is varying.

45 fps is a more interesting scenario:
There are two cases with v-sync triple buffering:
a) The frame misses its moment, so it is delayed to 33.3ms before it starts to scan out, taking a total of 33.3 + 16.6 = 49.9ms.
b) The frame times out perfectly and takes 24.9 + 16.6 = 41.5ms.

It alternates between these two timings, causing a stuttering effect.

With v-sync double buffering you end up with exactly 33.3ms + 16.6ms all the time; the frame rate drops to a consistent 30 fps. Of course, it doesn't always work out this way if the rendering time varies, which it typically does.

With G-Sync it's a smooth and consistent 24.9ms + 16.6ms = 41.5ms. Without the need to quantise to refresh boundaries, the frame rate can be delivered consistently.

You can take any frame rate between the perfect ones (60/30/20) and find that the frame rate stutters in a pattern with v-sync; there isn't any choice about that with v-sync. We wouldn't need G-Sync if games could run a perfect 60/120/whatever. But they don't. They vary quite a lot, or they sacrifice graphical detail. We have all played smooth games, and they sacrifice a lot graphically to be super smooth. With G-Sync we can have 45 fps be as smooth as it should be, without v-sync artifacts making it less smooth than it should be.

G-Sync also ends up no worse than v-sync in terms of latency. At 45 fps, v-sync swaps the buffer and scans out in 41.5ms, just like G-Sync does. The difference is that it's aligned, so there are no tear lines. Tear lines in themselves make the game less smooth, because the image isn't complete: different parts of the same image are actually displayed across at least two frames, which, depending on where and how your eyes look at the image, isn't smooth either.

So the end result is that G-Sync is superior to both v-sync on and off. It eliminates tearing, and the stuttering caused by v-sync at frame rates other than perfect divisors of the 60Hz/120Hz refresh rate. It's just plain better in all regards.
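
A quick simulation of that quantisation pattern (a sketch only: it assumes a constant 22.2ms render time at 45 fps, a 60Hz scan with triple buffering so the GPU never blocks, and it ignores scanout time):

```python
import math

RENDER_MS = 1000.0 / 45  # constant 22.2ms render time (45 fps)
VSYNC_MS = 1000.0 / 60   # 16.7ms refresh interval at 60Hz

def present_times(frames, quantise):
    """Times at which each new frame becomes visible."""
    times = []
    for i in range(1, frames + 1):
        done = i * RENDER_MS
        if quantise:  # v-sync: wait for the next vertical blank
            done = math.ceil(done / VSYNC_MS) * VSYNC_MS
        times.append(done)
    return times

def intervals(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print(intervals(present_times(6, quantise=True)))   # [16.7, 16.7, 33.3, ...] uneven: stutter
print(intervals(present_times(6, quantise=False)))  # [22.2, 22.2, 22.2, ...] steady: G-Sync
```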
 

bystander36

Diamond Member
Apr 1, 2013
Gsync is still going to end up using double buffering, because it needs to render into one buffer while it scans out another.

At exactly 60 fps there isn't a difference in latency: both take 16ms to render, then 16ms of scanout. If you want to know what G-Sync would feel like, turn the graphics settings down so that it's consistently 60 fps all the time (or 120 fps, etc.) and play with v-sync on. That is basically what G-Sync is going to be like, except it also feels that way at lower fps when the rate is varying.

Actually, that is not true - at least the part where G-Sync will feel like v-sync at a solid 60 FPS. That is because v-sync uses triple buffering in most cases (you don't have a choice in most DX games). The DX v-sync option will force you to wait an additional frame: DX v-sync forces the display to show the oldest complete image in the 3 buffers, even if there is a complete newer image waiting. This is where all the input lag comes from with v-sync.

If the game doesn't use triple buffering, then another, equally big problem occurs: the GPU stops rendering when it finishes a frame and the vertical blanking interval has not occurred yet. This results in a massive FPS drop if you cannot maintain 60Hz, because if the GPU cannot render a frame within the 16ms allotted, it has to wait for the 33ms boundary to roll over before the next frame starts, so you end up frequently, if not always, waiting 33ms per frame. Double buffering would be like G-Sync when you can maintain 60 FPS, but unfortunately this is not an option for most games, as most DX games force triple buffering on with no way to turn it off.

G-Sync will still be better at 60 FPS in the majority of games, due to the triple-buffer latency issue.
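
A sketch of that double-buffer quantisation (my own illustration, not any actual renderer): a finished frame that misses a 60Hz vblank stalls the GPU until the next one, so frame time rounds up to a multiple of the refresh interval.

```python
import math

def effective_frame_ms(render_ms, refresh_hz=60):
    # Double-buffered v-sync: the finished frame waits for the next
    # vblank and the GPU stalls until then, so the effective frame
    # time rounds up to a whole number of refresh intervals.
    refresh_ms = 1000.0 / refresh_hz
    return math.ceil(render_ms / refresh_ms) * refresh_ms

print(effective_frame_ms(16.0))  # 16.7ms -> holds 60 FPS
print(effective_frame_ms(18.0))  # 33.3ms -> drops straight to 30 FPS
```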
 

lilxskull

Member
Jul 24, 2011
I hope G-Sync can be incorporated into existing monitors. I also hope they don't raise prices on monitors that have it.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
Can't wait for this; it's just a shame it's going to require a new monitor. Still, the 120Hz TN I have now, dedicated to gaming, can be replaced fairly cheaply - maybe 144Hz next time round.

After seeing how well LightBoost works integrated into existing monitor tech on my BenQ XL2420T, I'm actually excited to see what Nvidia can do to improve the experience. I avoid v-sync like the plague; I just cannot stand the latency it adds. This is a great solution to an age-old problem.
 

3DVagabond

Lifer
Aug 10, 2009
I hope G-Sync can be incorporated into existing monitors. I also hope they don't raise prices on monitors that have it.

They'll only offer upgrades for old monitors if it's profitable and there's demand. As far as it not raising the price of the monitors goes: only if nVidia wants to give it away for free, and that's just not like nVidia.
 

Keysplayr

Elite Member
Jan 16, 2003
I hope G-Sync can be incorporated into existing monitors. I also hope they don't raise prices on monitors that have it.

Very, very few monitors will have this upgrade capability, as Nvidia doesn't plan on extensive numbers of kits for various monitors. At least initially.
And in the beginning, of course, there is going to be a noticeable price premium on G-Sync-equipped monitors - figure 100 dollars over what the monitor would cost without G-Sync. As always, early adopters will pay the full monty, and over time the price will come down substantially as production costs drop.
 

Keysplayr

Elite Member
Jan 16, 2003
They'll only offer upgrades for old monitors if it's profitable and there's demand. As far as it not raising the price of the monitors goes: only if nVidia wants to give it away for free, and that's just not like nVidia.

It's not like any company to do that. Be accurate now. ;)
 

railven

Diamond Member
Mar 25, 2010
Interesting, Railven. You interpret ALL of this as them having zero interest in this technology. You could have mentioned these people aren't your conventional gamers from the very beginning, perhaps? Or did you want to rile me up first?
And it's not arrogance, Railven. It's called being misled by you.
Any gaming enthusiast who claims they aren't interested in a technology that eliminates stutter, tearing, and lag - please introduce me to them.
I don't care what monitor they are currently using. The potential is there for them to use it even if they CAN'T right now.
Anything else you've failed to mention, I'd appreciate knowing now.
And I can guarantee you that these 8 people, no matter what screen they are using, get stutter, tearing, or lag if V-sync is used. Still don't think they might be interested? Why don't you bring them around so we can chat.

And there is that arrogance again. You've made the statement that you can't think of any reason for anyone not to be interested in this technology. That is a bold, sweeping statement that blanket-covers all gamers regardless of their setup. I'm pointing out that there is a set of gamers who don't use desktop monitors and have no intention of switching over to desktop monitors, and talking to them (as I am interested in the technology) has led to dead-end conversations, with them showing no interest unless this technology will support their projectors and HDTVs.

I didn't mislead anyone; I'm not making claims that all gamers, regardless of setup, would love to have this technology. Perhaps if Nvidia announces support for DLPs, plasmas, and projectors, then I can carry this conversation further, but since they've only shown a 24" desktop monitor - well.

Perhaps once the potential is there, they'll come around, but for now, as far as what they've said to me goes - they don't care. It'd be easier if you joined our forum instead of making them all join here; you could represent NV (they'll love you - again, they're all NV fans). Let me know and I'll PM you the forum address.

EDIT: On the subject of picture quality, I'll give you $100 if you can convince any of them that there is a monitor that produces a better image than their setups. I've gone 30+ pages arguing with some of them about how 1600p is better than 1080p, but in the end it's "the blacks on my Kuro are reference, and this TV is almost 5 years old and nothing has come close to beating it." Again, these guys aren't your desk-jockey nerds; they pour thousands of dollars into their home theaters, and the last thing they want is a desktop monitor (but you'll now single them out as not being gamers, so as not to backtrack on your initial statement; one has SLI Titans on his 1080p Kuro Elite... go figure).
 

Mark Rejhon

Senior member
Dec 13, 2012
Gsync is still going to end up using double buffering, because it needs to render into one buffer while it scans out another.
Only on the GPU side (as usual).

On the monitor side, the 768MB of memory likely isn't for framebuffer delay, but for the "color processing" they mentioned - which I interpret to mean LUTs and historical (past) framebuffers, used to calculate overdrive-compensated/rate-compensated LCD refreshes.

During normal situations (frame rendering intervals between 1/30sec and 1/144sec), G-SYNC works like this:
1. GPU finishes generating frame into back buffer.
2. Direct3D Present() call triggers immediate delivery to the monitor.
3. Frame delivery occurs at full dotclock (144Hz), regardless of current frame rate / refresh rate.
4. While GPU is delivering the frame to the monitor, the monitor is painting the refresh simultaneously (in real time, as the data comes over the cable).
5. The refresh completes in 1/144sec.
6. The whole refresh is immediately visible to human eyes about 1/144sec after the Direct3D Present() call. (plus ~2ms for pixel transition time)

Below 30fps, a repeat refresh occurs.
Above 144fps, frame delivery is delayed. (Technically, you could also do "proper" triple buffering at this point, to reduce lag.)
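
A minimal sketch of that clamping behaviour as I read it (my interpretation of the description above, not NVIDIA's actual implementation):

```python
MIN_INTERVAL_MS = 1000.0 / 144  # panel's fastest refresh (6.9ms)
MAX_INTERVAL_MS = 1000.0 / 30   # longest the panel can hold a frame (33.3ms)

def next_refresh_interval_ms(frame_interval_ms):
    if frame_interval_ms > MAX_INTERVAL_MS:
        return MAX_INTERVAL_MS   # below 30fps: repeat the previous refresh
    if frame_interval_ms < MIN_INTERVAL_MS:
        return MIN_INTERVAL_MS   # above 144fps: delay frame delivery
    return frame_interval_ms     # otherwise: refresh when the frame is ready
```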

Current 120Hz/144Hz ASUS/BENQ monitors already do realtime painting of the refresh directly off the cable. See my high-speed video proof (non-LightBoost portion). The display is refreshed while the data is coming off the cable. Plus, I measured input lag of about 2.8ms for the top edge of the screen in 120Hz non-LightBoost mode on my VG278H, measured by oscilloscope+photodiode as well as by my prototype Blur Busters Input Lag Tester (which agrees with the oscilloscope+photodiode); that 2.8ms is probably due to the ASUS VG278H's 2ms of pixel transition. My XL2411T in non-LightBoost mode is similar (~3ms for the top edge, ~7ms for the center, ~11ms for the bottom edge).

G-SYNC is going to continue to do exactly the same thing (realtime painting of the panel, off the cable), but eliminate the fixed refreshing intervals. Meaning, you can have any refresh rate, but with frame-delivery times of 1/144sec (cable transmission and realtime panel refresh cycle of 1/144sec). In fact, it can lower the latency of a fixed refresh rate, due to the accelerated frame delivery and top-to-bottom panel refresh cycle (a 60Hz refresh scanned out in 1/144sec). This will even reduce input lag for 60Hz emulators.

The powerful FPGA, the 768MB of memory, and everything else is needed because of the realtime unbuffered processing (or perhaps a single scanline buffer, ~10 microseconds) required to paint the panel in realtime without color flicker/modulations at variable refresh rates, including complex variable-refresh-compensated overdrive algorithms. Remember that 60Hz vs 120Hz vs 144Hz produce different colors on your monitor - poorer colors at 144Hz - so keeping all variable refresh rates at exactly the same color (less than 0.25% error, to avoid color flicker during rapid 30fps->144fps->30fps->144fps modulations), while keeping fast response, avoiding flicker caused by refresh-rate transitions, and avoiding interaction problems with FRC and/or inversion and variable-refresh overdrive algorithms, all seems to be a major mathematical engineering feat. And doing all of this without degrading averaged color quality too much. Some good Ph.D.s probably went into G-SYNC. Nvidia said "color processing", and that's an accurate, albeit simplified, statement.
 

BrightCandle

Diamond Member
Mar 15, 2007
That is very interesting. I had already assumed that the latest monitors did off-the-wire drawing, but it's good to get confirmation of that, as a lot of older LCD monitors buffered one frame while scanning out another (double buffering).

In the future it might be possible to scan out the pixels of the GPU buffer as and when they arrive, to further reduce delay. It might limit how the manufacturer produces the final pixels, but it would avoid the extra latency of waiting for the frame to finish and then spending 16ms on scanout. The problem, I think, is that ~15.9ms is still going to be spent rendering the image while no pixels are being produced; I suspect it's not realistic to have the card produce the first pixels earlier so that they arrive spread throughout the 16ms rendering period. Still, it would help the VR guys out quite a bit.
 

Mark Rejhon

Senior member
Dec 13, 2012
Guru3D is reporting that monitors with G-Sync will come in up to 177Hz and 4K versions, and the timeframe was Q1 2014 (they said 2013 in their article, but it's pretty clear that is a typo).
177Hz is exactly the maximum a single DP channel can carry at 1920x1080.

1920 x 1080 x 24-bit x 177 (Hz) = 8.8 Gbits/sec
That's exactly half the DisplayPort 1.2 maximum bandwidth of 17.6 Gbits/sec.
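
A quick check of that arithmetic in Python:

```python
bits_per_frame = 1920 * 1080 * 24       # 24-bit colour per 1080p frame
gbits_per_sec = bits_per_frame * 177 / 1e9  # at 177Hz
print(gbits_per_sec)                    # ~8.8 Gbit/s, half of 17.6 Gbit/s
```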

Since they said 1Q2014, with the VG248QE due during 1Q2014, plus the DisplayPort math above, I suspect 177Hz non-strobed might even come to the ASUS VG248QE.
The panel in the VG248QE can already refresh quicker than 144Hz, because it needs accelerated refreshes during LightBoost mode (source) to create the blanking interval. The 1ms panel itself could theoretically be clocked to 240Hz if using both DisplayPort channels, but that would go into overclocking territory. I wonder if the G-SYNC firmware theoretically allows the use of a 960x1080 240Hz tiled mode. Color quality would be worse than at 144Hz, but it would be fairly low-persistence (~4.1ms) without strobing - a bit more persistence than LightBoost=100% at 100Hz (~2.9ms persistence), but getting very close. Albeit you would need an insane GPU to run 240fps@240Hz non-strobed.

Not confirmed, of course.
 

imaheadcase

Diamond Member
May 9, 2005
Mark Rejhon, do you think it's likely that the newer BenQ XL2420TE 144Hz monitors will be able to get the G-Sync upgrade like the Asus? They technically use the same types of boards, right?
 

jackstar7

Lifer
Jun 26, 2009
Are you insane :D ? The target market is ALL gamers. Go ahead. Try and convince me that you do not want this. I'll believe you. Trust me. :sneaky:

I'll stick with LightBoost strobing over G-Sync.

It's a nice idea, just not what I wanted to see next (authorized LB strobing, so it doesn't have to be hacked by users).
 

Mark Rejhon

Senior member
Dec 13, 2012
Mark Rejhon, do you think it's likely that the newer BenQ XL2420TE 144Hz monitors will be able to get the G-Sync upgrade like the Asus? They technically use the same types of boards, right?
I'm honestly not sure. The G-SYNC boards require custom firmware upgrades and reportedly add custom menus to the OSD. The ASUS and BENQ use different menus, so it's risky to try; it may not be plug-and-play interchangeable. I'd imagine BENQ will announce something at some point.
 

Mark Rejhon

Senior member
Dec 13, 2012
16ms of scan out.
The scanout at 144Hz is 6.9ms. Since scanout remains constant during variable-framerate delivery, you get 6.9ms scanout even at 60fps@60Hz: frame delivery and scanout stay at the full 1/144sec speed (or 177Hz speed, 5.6ms frame-delivery time) regardless of the current variable refresh rate (= the rate of delivery of frames).

Now if you use DisplayPort 2.0, you could speed up delivery of 1920x1080p frames to just 2.8 milliseconds! The rate of delivery of frames doesn't necessarily have to match G-SYNC's maximum refresh rate. 1920x1080 at 354Hz is the theoretical maximum refresh rate of DisplayPort 2.0, but no panels can really support that today. You could even do it over DisplayPort 1.2 if you tiled two channels (960x1080 @ 354Hz side-by-side). Theoretically, of course.

However, that doesn't stop future graphics cards from using the maximum dotclock to speed up the delivery of individual refreshes to the monitor, even when the display can't refresh that fast. The monitor can buffer it while immediately beginning a slower scanout. Now you've got a 120Hz refresh with less input lag, because of a 1/240sec frame delivery time (or 1/354sec frame delivery time). In theory, of course.

The concept of decoupling frame-delivery time (over DP) from refresh time, even at fixed refresh rates, is probably going to be an interesting trend for G-SYNC moving forward, for certain fixed-refresh-rate applications. G-SYNC will already reduce emulator input lag by about 10ms, even at a fixed refresh rate of 60fps@60Hz, since the 60Hz refresh can be delivered/scanned in only 1/144sec (6.9ms versus 16.7ms).
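
The delivery-time arithmetic above, in one place (a sketch: delivery time is simply the inverse of the link's maximum frame rate, independent of the panel's current refresh rate):

```python
def delivery_ms(max_link_rate_hz):
    # Time to push one 1080p frame down the cable at the link's full rate.
    return 1000.0 / max_link_rate_hz

print(round(delivery_ms(144), 1))  # 6.9ms - current G-SYNC dotclock
print(round(delivery_ms(177), 1))  # 5.6ms - single DP 1.2 channel
print(round(delivery_ms(354), 1))  # 2.8ms - full DP 1.2 bandwidth at 1080p
```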

This is getting mathematically interesting.
 