[PCPER] NVidia G-sync


Creig

Diamond Member
Oct 9, 1999
5,171
13
81
First thing: why would Nvidia allow AMD to use their tech?
Of course they won't. They're Nvidia.

Same with AMD: would they allow Mantle to be used on Nvidia cards? The answer is no.
Completely different scenario. Mantle only works because GCN hardware is in the Xbox One, the PS4 and AMD PC video cards. It can't possibly function on Nvidia hardware because it's all low-level API calls.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I can't judge NVidia yet as it's too early, but if they "PhysX" this technology I will be annoyed.

PhysX may be licensed --- I doubt G-Sync will be in the short term, based on Tom Petersen's statements!

I'm curious whether the 3D Vision mode of G-Sync improves upon crosstalk!
 

Gryz

Golden Member
Aug 28, 2010
1,551
203
106
The keyword here is: patent.

If nVidia has a patent that covers G-Sync, that changes everything, especially if it's a patent on some detail without which you can't make this work at all.

But I doubt they have a patent. G-Sync is probably just smart engineering of an idea that has been around for a while. nVidia is simply the first to actually build it, and there is probably nothing to stop others from building similar solutions.

If G-Sync becomes popular, you'll see a standard emerge quickly, either inside HDMI or as a DisplayPort or DVI extension that does the same thing. And then 2-3 years from now, all video cards and monitors will incorporate it.

That is, unless there are patents involved.
But nobody asked nVidia that simple question. Weird.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Hopefully they won't be able to get the manufacturing costs down, the monitors will be prohibitively expensive, and uptake will be slow and low, thus forcing nVidia to open this up :biggrin:
 

Braxos

Member
May 24, 2013
126
0
76
As I see it, G-Sync came about because of 4K and the low horsepower of current and next year's GPUs, which won't come anywhere near producing 30 fps video on a 4K monitor, let alone a game.
Display resolution improved too fast for them, jumping from 1080p to four times that, while the cards still struggle to perform well on a single 1440p or 1600p monitor without the help of a second or third card. So think about what you would need to power a 4K monitor in a game like Crysis at proper settings, with the option to somehow go to ultra settings too; that should be impossible without paying more for the cards than for the monitor itself, which is counterproductive.

You can't close that huge gap in graphics power without losing huge amounts of money on research, since it would mean jumping 2-3 generations of GPU chips and the money spent on that research would never come back. So instead they fill the gap by removing the issues you can see and notice, which lets them sell graphics cards that don't have the power to drive games at good settings with good results to the average Joe who spends $1k-2k on a rig with a monitor, not $1k on a single graphics card.

Compared to 15 years ago, the cards are priced twice as high, and they still can't provide playable ultra settings without 2, 3 or 4 of them. OK, back then you had shorter cycles (6-9 months instead of 12-16 months), but the improvement per generation was bigger than today's.
 

Aithos

Member
Oct 9, 2013
86
0
0
And when you do give them the money and buy the new monitor (or modify your old one), keep in mind that you won't be able to change to AMD without changing your monitor. Even if you wanted an AMD card, you'd have to figure in the additional cost of changing your monitor. That will almost always lock you into staying with nVidia, stifling competition. If it's unnecessary, I think it stinks and is purely anti-competitive.

What's worse is that people won't care and will go with the flow. Imagine if your car maker made it so that if you installed a different brand of oil filter or battery, your car wouldn't run, and they did it for no reason other than to force you to buy their proprietary products.

Why would you have to change monitors by switching to AMD? Where has anyone said that a G-Sync monitor is REQUIRED? You, along with anyone solidly in a single camp, are blowing things out of proportion and painting a situation as black and white when really it's a shade of grey.

First of all, this is an additional PCB that you mod into a monitor. It isn't a permanent change to the monitor itself, so you can install it in monitors you already own, like the Asus VG248QE. If you run an AMD card you just won't be able to use G-Sync; that isn't anti-competitive at all.

Second of all, everyone tries to make something proprietary and then license it to their competitors. Look at cell phones: lawsuits all over the place over "touch functions" that anyone with half a brain would realize were a logical step for EVERYONE. nVidia has even said they would be open to licensing, so if AMD ends up not supporting this it is 100% on AMD.

Finally, if you run a 120Hz monitor at 120 frames per second this entire concept is pretty much moot, because at that framerate the visual negatives created by low framerates are virtually eliminated anyway. The point of this is that you can get better visuals at ANY framerate; a rough illustration follows below. The more exciting development is the inclusion of LightBoost support without having to use a driver hack.
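To make that concrete, here is a back-of-the-envelope sketch of frame pacing (my own illustration with made-up numbers, not anything from nVidia or the article): at 45 fps on a fixed 60Hz panel with vsync, frames sit on screen for uneven lengths of time, while a display that refreshes whenever a frame is ready shows every frame for the same duration.

    import math

    def onscreen_times_fixed(fps, refresh_hz, n=8):
        """With vsync on a fixed-refresh panel, a new frame can only be picked up at
        the next refresh boundary, so on-screen durations come out uneven."""
        frame_ms = 1000.0 / fps
        refresh_ms = 1000.0 / refresh_hz
        pickups = [math.ceil(i * frame_ms / refresh_ms) * refresh_ms for i in range(1, n + 2)]
        return [round(b - a, 1) for a, b in zip(pickups, pickups[1:])]

    def onscreen_times_variable(fps, n=8):
        """With a variable-refresh display, every frame is shown for the same time."""
        return [round(1000.0 / fps, 1)] * n

    print(onscreen_times_fixed(45, 60))   # [16.7, 16.7, 33.3, ...] -> uneven pacing, i.e. judder
    print(onscreen_times_variable(45))    # [22.2, 22.2, 22.2, ...] -> even pacing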
 

smackababy

Lifer
Oct 30, 2008
27,024
79
86
I am wondering, since it can match the refresh rate to the frame rate, whether for a higher refresh rate monitor (120Hz+) you could have G-Sync double the refresh rate instead? Say, if you're only able to achieve 45 FPS, rather than run at 45Hz, run at 90Hz?
 

Aithos

Member
Oct 9, 2013
86
0
0
I am wondering, since it can match the refresh rate to the frame rate, whether for a higher refresh rate monitor (120Hz+) you could have G-Sync double the refresh rate instead? Say, if you're only able to achieve 45 FPS, rather than run at 45Hz, run at 90Hz?

No. What you're talking about is frame interpolation, and it's a major no-no for both picture quality and gaming performance. That's what LCD TVs do when they claim 120 or 240Hz: they aren't refreshing at 120Hz from a source outputting 120 fps, they are inserting frames based on a complex algorithm that uses existing frames to "guess" what would come between them (see the sketch at the end of this post).

It is horribly unnatural and, honestly, just a way to mask how poor LCD panel technology in TVs was as recently as a few years ago. Most professional calibrators turn that feature off if you pay them to calibrate your LCD.

I personally would NEVER want a monitor to predictively generate frames; picture quality aside, the processing involved would introduce lag into your game as well. That's half the reason IPS panels are bad for gaming: the extra processing that goes into displaying an image on an IPS panel generates input/processing lag.

G-Sync's entire purpose is to keep the monitor's refresh in sync with what your graphics card outputs, not to smooth over framerate issues. It will still be limited to what your hardware can put out.
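For what it's worth, here is a toy sketch of what that "guessing" amounts to in its crudest form, just a weighted blend of the two real frames around the gap (my own illustration; real TVs use motion estimation, but either way the in-between frame is synthesised rather than rendered by the game):

    # Toy illustration only: the simplest possible frame interpolation is a blend of
    # the two real frames on either side of the gap. The result is a guessed frame
    # that the game never actually rendered.
    def interpolate(frame_a, frame_b, t=0.5):
        """frame_a/frame_b are flat lists of pixel values (0-255); t is the blend position."""
        return [round((1 - t) * a + t * b) for a, b in zip(frame_a, frame_b)]

    prev_frame = [10, 50, 200, 255]             # pretend these are 4-pixel frames
    next_frame = [30, 90, 100, 255]
    print(interpolate(prev_frame, next_frame))  # [20, 70, 150, 255] -- a made-up frame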
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I'm curious whether the 3D Vision mode of G-Sync improves upon crosstalk!

A number of current LightBoost monitors have basically no crosstalk (e.g. the BenQ XL2411T). The challenge now is to improve colours and contrast levels.
 

smackababy

Lifer
Oct 30, 2008
27,024
79
86
I don't think you understood what I was saying. I have a 120Hz monitor. Even if I am not running at 120 FPS, capping at 60 FPS still keeps things smooth.

I am not talking about frame interpolation, but rather, instead of having a low monitor refresh rate, having it sync to a multiple of the frame rate. If I were running at 25 FPS, a 25Hz refresh rate could cause eye strain, whereas a 50Hz refresh rate wouldn't. The screen just wouldn't get a new frame on every refresh; a rough sketch of the idea follows below.
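To spell that suggestion out (this is my own sketch of the idea, not a feature Nvidia has described): the display could run at the highest integer multiple of the frame rate that the panel supports and simply repeat each frame, so a 45 FPS game on a 120Hz panel would be scanned out twice per frame at 90Hz.

    # Sketch of the suggestion above (not a described Nvidia feature): refresh at an
    # integer multiple of the frame rate that fits the panel's range, repeating each
    # frame rather than interpolating new ones.
    def choose_refresh(fps, panel_min_hz=30, panel_max_hz=120):
        repeats = max(1, int(panel_max_hz // fps))   # how many times each frame is shown
        refresh = fps * repeats
        if refresh < panel_min_hz:
            raise ValueError("frame rate too low for this panel even with repeats")
        return refresh, repeats

    print(choose_refresh(45))   # (90, 2): each frame shown twice, panel runs at 90Hz
    print(choose_refresh(25))   # (100, 4): 25 FPS shown at 100Hz with 4 repeats per frame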
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106

Not all ghosting is crosstalk when dealing with 3D Vision. Much of the ghosting you see is light bleeding through the darkened lenses. They don't black out completely, and light does get through when there are bright spots on the screen. There is still room to improve the glasses; I think that is the biggest problem at the moment with regard to ghosting.
 

mingsoup

Golden Member
May 17, 2006
1,295
2
81
Sounds really awesome. According to the Eurogamer article I read, this could make gaming at sub-60 fps significantly more enjoyable?

If gaming at 30 fps feels as good as gaming at 60 fps, I will go Nvidia in a heartbeat and get a new monitor. Oh, and upgrade half as often?
There's got to be a catch. I suppose it's latency... but I've never run into latency issues in my time gaming. I've seen it come up in, say, LCD console gaming, but I've never noticed it myself.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
This patent has nothing to do with G-Sync. It's actually a method for comparing two frames, one to the next, to save power when the screen contents aren't changing much, for example on the Windows desktop.

You can already see this in effect on the Windows 7 desktop if you like: run Fraps, have it monitor the desktop (DWM mode), and you will see that the frame counter doesn't move unless you do something to change the contents of the screen.

I am not a lawyer or a patent expert, but this doesn't look like a patent that would block AMD from implementing G-Sync.
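A minimal sketch of what that kind of scheme amounts to (my own illustration based on the description above, not the patent's actual mechanism): checksum each frame and only push it to the panel when the contents have changed, which saves power on a static desktop but has nothing to do with gaming.

    # Minimal illustration of the idea described above (not the patent's actual
    # mechanism): hash each frame and skip the refresh when nothing has changed.
    import zlib

    last_crc = None

    def should_refresh(frame_bytes):
        """Return True only when the frame differs from the previous one."""
        global last_crc
        crc = zlib.crc32(frame_bytes)
        if crc == last_crc:
            return False           # static content: skip the refresh, save power
        last_crc = crc
        return True

    print(should_refresh(b"desktop"))     # True  -- first frame always goes out
    print(should_refresh(b"desktop"))     # False -- nothing changed
    print(should_refresh(b"desktop v2"))  # True  -- contents changed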
 

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
This patent has nothing to do with G-Sync. It's actually a method for comparing two frames, one to the next, to save power when the screen contents aren't changing much, for example on the Windows desktop.

You can already see this in effect on the Windows 7 desktop if you like: run Fraps, have it monitor the desktop (DWM mode), and you will see that the frame counter doesn't move unless you do something to change the contents of the screen.

I am not a lawyer or a patent expert, but this doesn't look like a patent that would block AMD from implementing G-Sync.

From what I saw, there's a separate patent for what you're referring to: http://patft.uspto.gov/netacgi/nph-...h&OS=nvidia+AND+refresh&RS=nvidia+AND+refresh

Both mention power saving, but the one above is more specific. I still think the other patent could be used for G-Sync, or who knows, maybe both apply. Has anyone seen similar patents from AMD?
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
In particular, that patent describes a method that uses a CRC to compare images for changes and exposes that information to the OS so it can change the refresh rate dynamically. It's simply not relevant to the way G-Sync works.

G-Sync, from the GPU side, is simply about outputting to the cable as soon as possible, whether that's when the frame is ready or at the earliest moment the screen can refresh again; a rough sketch follows below. There is no screen diff/CRC going on, nor would it make sense to introduce that kind of latency into a realtime application like a game. I don't see a link at all; the patent is very clear about what it is and what it is used for. I think the website that posted this is just spreading FUD, as a cursory glance would have told them this patent isn't involved.
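To spell out that contrast, here is a rough simplification of the behaviour described above (my own sketch, not Nvidia's implementation): each frame is sent the moment it's ready, delayed only if the panel is still inside its minimum refresh interval, and no frame comparison is involved anywhere.

    # Rough simplification of the behaviour described above (not Nvidia's actual
    # implementation): send each frame as soon as it's ready, but never sooner than
    # the panel's minimum time between refreshes.
    import time

    MIN_REFRESH_INTERVAL = 1.0 / 144   # fastest the panel can refresh (assumed 144Hz cap)
    last_scanout = 0.0

    def send_to_panel(frame):
        """Stand-in for the actual scanout over the display cable."""
        pass

    def present(frame):
        """Scan the frame out immediately, or at the earliest moment the panel allows."""
        global last_scanout
        now = time.monotonic()
        earliest = last_scanout + MIN_REFRESH_INTERVAL
        if now < earliest:
            time.sleep(earliest - now)   # wait only as long as the panel forces us to
        last_scanout = time.monotonic()
        send_to_panel(frame)

    present(b"frame 1")   # goes out immediately
    present(b"frame 2")   # waits ~7 ms because the panel just refreshed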
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
It's not a patent; they just license it. Meaning if you want to pay for it you can, and if it's popular enough it's cheaper in the long run for AMD or whoever to pay the license fee than to come up with their own version. I would guess they'll only keep it to themselves for a short time to make some money, then let others use it.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
It's not a patent; they just license it. Meaning if you want to pay for it you can, and if it's popular enough it's cheaper in the long run for AMD or whoever to pay the license fee than to come up with their own version. I would guess they'll only keep it to themselves for a short time to make some money, then let others use it.

Licensing what? I don't think AMD will sell G-Sync modules. If AMD makes their drivers so their cards work with G-Sync, I can't see anything to license. I don't think you can license compatibility with a particular piece of hardware.