R9 290X or Kepler...the GSYNC dilemma


Atreidin

Senior member
Mar 31, 2011
464
27
86
It isn't like Nvidia has no reason to allow other manufacturers' cards to use Gsync. They are still making money on each Gsync monitor sold.

Also the monitor manufacturers might put pressure on Nvidia for Gsync to be allowed on other cards, if it isn't already. If AMD and Intel video can conceivably use it in the future, that opens up their market, possibly sells more monitors, and Nvidia makes a profit from each one.

The percentage of consumers who care enough not only to buy a new monitor for this feature, but to pay $100 more than for one without it, is already tiny. Opening it up would help it catch on and might be better for Nvidia's bottom line than keeping it proprietary. I'm sure they'll do whatever they think will net them more money.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
I'm sure it won't be better quality. It's the performance that matters.
It will. They decided to improve the performance and the overall quality as well. This is the pilot project, so it has to be huge for gamers and for the industry.
 

shepardh

Member
Jan 6, 2011
32
0
0
First of all - of course I know it requires a GSYNC-supported monitor.

My current monitor is a 120 Hz TN display (LG W2343D), and I will surely replace it for the sake of GSYNC (if I decide to go Nvidia).

I'm mostly an FPS gamer, and as such I prefer a faster TN display over a superior but slower IPS display.

I'm sure my current setup can handle BF4 on Medium to High details as the drivers get better.

So if I understand right, your advice is "if you don't need the performance right now, wait a bit till 20nm comes out"?

What is the estimate atm, Q1 2014?
 

R0H1T

Platinum Member
Jan 12, 2013
2,582
162
106
First of all - of course I know it requires a GSYNC-supported monitor.

My current monitor is a 120 Hz TN display (LG W2343D), and I will surely replace it for the sake of GSYNC (if I decide to go Nvidia).

I'm mostly an FPS gamer, and as such I prefer a faster TN display over a superior but slower IPS display.

I'm sure my current setup can handle BF4 on Medium to High details as the drivers get better.

So if I understand right, your advice is "if you don't need the performance right now, wait a bit till 20nm comes out"?

What is the estimate atm, Q1 2014?
If you want a quick performance boost with Mantle for BF4, then get the 290X. However, GSYNC and 20nm Maxwell don't have a firm timeline, so realistically speaking you'll have to wait 2 or 3 quarters for them.
 

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
First of all - of course I know it requires a GSYNC-supported monitor.

My current monitor is a 120 Hz TN display (LG W2343D), and I will surely replace it for the sake of GSYNC (if I decide to go Nvidia).

I'm mostly an FPS gamer, and as such I prefer a faster TN display over a superior but slower IPS display.

I'm sure my current setup can handle BF4 on Medium to High details as the drivers get better.

So if I understand right, your advice is "if you don't need the performance right now, wait a bit till 20nm comes out"?

What is the estimate atm, Q1 2014?


Based on what you said above, AMD is not an option at all. Go G-SYNC or go home. I have a $1000 IPS display that I plan to make my secondary once 27" G-SYNC panels are available in 2014. I think G-SYNC is a huge game changer and puts NVIDIA cards in a different league vs AMD. Put it this way: a single 780 Ti with G-SYNC will give, say, 40 fps in a really intensive game, but it'll be perfectly smooth with no frame tearing or stutter, vs 290X Crossfire that may get 70 fps but have tearing and stutter. The choice is pretty easy, especially for an FPS player. VSYNC isn't even a consideration given the input lag it induces, and triple buffering is only slightly better. If I were AMD, I'd be panicking, because this feature will change the landscape for gaming and VR (if NVIDIA licenses it, which they probably will).
 

Owls

Senior member
Feb 22, 2006
735
0
76
Based on what you said above, AMD is not an option at all. Go G-SYNC or go home. I have a $1000 IPS display that I plan to make my secondary once 27" G-SYNC panels are available in 2014. I think G-SYNC is a huge game changer and puts NVIDIA cards in a different league vs AMD. Put it this way: a single 780 Ti with G-SYNC will give, say, 40 fps in a really intensive game, but it'll be perfectly smooth with no frame tearing or stutter, vs 290X Crossfire that may get 70 fps but have tearing and stutter. The choice is pretty easy, especially for an FPS player. VSYNC isn't even a consideration given the input lag it induces, and triple buffering is only slightly better. If I were AMD, I'd be panicking, because this feature will change the landscape for gaming and VR (if NVIDIA licenses it, which they probably will).

I think you should wait for reviews before making these lofty assumptions. I have 3x GTX 680s, and I don't really deal with tearing if I adjust my settings; honestly, at 1600p there is little worry. Gsync sounds great for the vast majority of people running at 1080p.
 

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
I think you should wait for reviews before making these lofty assumptions. I have 3x GTX 680s, and I don't really deal with tearing if I adjust my settings; honestly, at 1600p there is little worry. Gsync sounds great for the vast majority of people running at 1080p.

G-SYNC is great for people running at any modern resolution. Your 3x 680s are weaker than my dual Titans, and I still get massive tearing. So you must have special 680s, or your "adjustments" are turning vsync on, which is hardly a solution.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I think you should wait for reviews before making these lofty assumptions. I have 3x GTX 680s, and I don't really deal with tearing if I adjust my settings; honestly, at 1600p there is little worry. Gsync sounds great for the vast majority of people running at 1080p.

I assume you use v-sync on everything, maxed at 60 FPS.

Unfortunately, in a lot of games, if not most, DirectX plus triple buffering costs you a full frame's worth of latency. This may be OK, but it's not ideal. I suppose you could use an FPS limiter at 59 FPS to get rid of that latency, though that always gives one frame of stutter per second; perhaps not a huge stutter problem (see the sketch at the end of this post).

G-sync, on the other hand, should allow higher settings and still give smooth, low-latency gaming. Though you still may not want to let it drop much below 60, having a few dips is less of an issue. I personally find myself sacrificing settings to make sure I'm always over 60 FPS in 3D, or 70 FPS in 2D (80 is ideal, but a few moments in the 70s are OK). I have to live with taking frequent breaks in 3D due to nausea from latency (which is why I go for 80+ in 2D).

So the main advantage should be more leeway on FPS dips and lower latency while maintaining zero tearing. If you never use v-sync anyway, then the main advantage is the same latency but without tearing, which may still look smoother, since every update is a full image rather than partial images, with no additional latency.
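To make the 59 FPS limiter idea concrete, here is a minimal, hypothetical sketch of a software frame cap and the arithmetic behind the once-a-second hitch. The names (run_frame, render_frame) and the numbers are illustrative assumptions only, not any particular game's or driver's actual API.

Code:
import time

REFRESH_HZ = 60              # fixed refresh rate of a typical non-G-Sync display
CAP_FPS = 59                 # cap just under the refresh rate, as described above
FRAME_BUDGET = 1.0 / CAP_FPS

def run_frame(render_frame):
    """Render one frame, then sleep off whatever is left of the frame budget."""
    start = time.perf_counter()
    render_frame()                          # stand-in for the game's actual render call
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)  # hold the output near 59 FPS

# The arithmetic behind the "one frame of stutter a second": the panel refreshes
# 60 times per second but only receives 59 new frames, so roughly once per
# second a refresh has to repeat the previous frame.
repeats_per_second = REFRESH_HZ - CAP_FPS   # = 1 repeated frame each second
print(repeats_per_second)

The point of the sketch is only the trade-off: the cap avoids the queued-frame latency of v-sync with triple buffering, at the cost of one repeated frame per second, whereas a variable-refresh display avoids both.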
 

Owls

Senior member
Feb 22, 2006
735
0
76
3 680s are on par with two Titans, if not better, if they are OC'd even slightly. And at 1600p with everything turned on I rarely run into tearing. Quite frankly, if you are running dual Titans on anything less than an i7 at 1600p, you are underutilizing them.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
3 680s are on par with two Titans, if not better, if they are OC'd even slightly. And at 1600p with everything turned on I rarely run into tearing. Quite frankly, if you are running dual Titans on anything less than an i7 at 1600p, you are underutilizing them.
I'm sure it's not bad, but it is not better.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_SLI/22.html
[Relative performance chart at 2560x1600 from the linked TechPowerUp review]


It isn't always faster, but Titan SLI is certainly more consistent, with fewer problems.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
This gsync release is a major dilemma for a lot of people. We are starting to see the first 4K monitors appear at crazy prices, we have the 290X competing very well against the 780, and in the background we have gsync, which should completely eradicate tearing and the stuttering it causes, vsync stuttering, and a lot of input latency. It's such a big jump that getting an AMD card really does look like it will leave you behind on what is the most important change in computer graphics for half a decade, right up there with compute shaders.

But at the same time, none of the monitor manufacturers have released any details except Asus, and they will have literally one homebrewable monitor available for the Nvidia module. You can get the monitor now but not the module. We have zero release dates for anything else. Nvidia can say Q1 all they like, but that is 2 months away, and BenQ and co haven't even said anything about the product lineup they are planning. It's most likely more like 5 months at least, and quite possibly a year, before we have a gsync monitor we would buy. On that timescale we will be seeing 20nm process GPUs appearing anyway, and they will be another big jump in performance, putting the current high end's minimal improvements to shame.

In general I think now is a bad time to buy a monitor and a GPU. It's late in the 28nm process, which makes it a bad time to get a card, as it will be replaced very soon by something much faster, and monitors for PC gaming will likely be forever changed within half a year. In both cases you might not be keeping anything you buy today for very long anyway. My advice is usually to buy the best thing you can get today and not worry about it, but these are really expensive items that the future is currently promising to obsolete rather quickly; I can't help but think purchasing any GPU or monitor now is going to be less value for money than it ought to be.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I think you should wait for reviews before making these lofty assumptions. I have 3x GTX 680s, and I don't really deal with tearing if I adjust my settings; honestly, at 1600p there is little worry. Gsync sounds great for the vast majority of people running at 1080p.

It does sound like the real deal, because every author from a reputable tech site who has seen this tech first hand has said the same thing.

Just read Scott's first-hand experience over at TR:

http://techreport.com/blog/25542/a-few-thoughts-on-nvidia-g-sync

Like brightcandle suggests, this seems to be one of the most important changes to computer graphics in recent times. Throwing more framerate at the problem, plus vsync (adaptive or not), plus triple buffering has never been able to solve these problems 100%, until now.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The problem is waiting for GSync. Is it going to be vendor-locked to only nVidia? How long before we can only get it on "lower IQ" monitors? Are we only going to have a couple of models from one vendor? If it is vendor-locked, will there be a hack? (I think this is a distinct possibility.) Remember, it's only been announced and shown as a tech demo. It's a little early to plan anything around.

It's like telling people to buy AMD because of Mantle. It might provide an awesome performance boost in virtually every new title on all of the major AAA game engines. Then again, it could be 5%-10% better and be in only a couple of games. I'm not going to recommend someone spend their money on something that I can't give them more info on.
 


Owls

Senior member
Feb 22, 2006
735
0
76
It does sound like the real deal, because every author from a reputable tech site who has seen this tech first hand has said the same thing.

Just read Scott's first-hand experience over at TR:

http://techreport.com/blog/25542/a-few-thoughts-on-nvidia-g-sync

Like brightcandle suggests, this seems to be one of the most important changes to computer graphics in recent times. Throwing more framerate at the problem, plus vsync (adaptive or not), plus triple buffering has never been able to solve these problems 100%, until now.

I think it's fine and dandy that he thinks gsync is awesome. Personally, I'd rather wait to actually see it released and make a final judgement call. Also, having a monitor that is locked to one vendor in order to achieve this won't work. What if AMD comes out with something better and cheaper that may not even need this type of ASIC in the monitor? The point is, people keep monitors for several years, while graphics cards get changed almost like underwear.

I can live with a little tearing if it means not having to ditch 2x $1k 1600p monitors that are perfectly fine.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
G-SYNC will have to make a pretty big difference to justify being locked into a specific compatible display type, a GPU type, and the extra cost associated with it.

Chances are I will not touch it unless it becomes a standard. There is no reason it cannot be, as it uses existing DisplayPort functionality for the communication.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
G-sync is big... huge, even, but I'll wait a bit. It wasn't too long ago that I bought a monitor, and I'm not dropping that kind of coin until it matures a bit.

Once we start seeing IPS-compatible monitors I'll think about it.
It should be a very noticeable improvement though, and I expect it will deliver an experience as if you had at least 10-20% higher FPS on a non-G-sync setup. I expect that feature alone to give you an experience as if you bought the next card up the line... if not two higher. Since monitors stick around for much longer than cards, it should also be well worth a ~$100 premium.
 