[TechSpot] Gaming at 4K: GTX 970 SLI vs. AMD R9 290 Crossfire

Page 2 - AnandTech community forums
Sep 27, 2014
92
0
0
Great that competition is bringing this to the gamer. That's why I always say it's best to buy AMD/NV GPUs when they are both locked head-to-head. :thumbsup:

I also forgot to mention that as far as 4K gaming performance goes, we haven't moved much from 1 year ago. We really need those next gen 390X/GM200 cards. For someone like you KaRLiToS there is still no viable upgrade path.

I agree 110%. This is the most frustrating thing: there has been no real change in 4K performance for a year. This again illustrates why the 900 series has been so disappointing for me, especially since you can pick up the 290X now for far less. These price cuts, coupled with the general shortage of Maxwell to begin with, may actually keep AMD in the fight.

Are the 980s in SLI worth it for 4K gaming? I would say no; they're close, but no cigar. Bring on the 390X and GM200 and I will be more impressed.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
I also forgot to mention that as far as 4K gaming performance goes, we haven't moved much from 1 year ago.

So true man.


We really need those next gen 390X/GM200 cards. For someone like you KaRLiToS there is still no viable upgrade path.

You are also right, no way I'm upgrading to 4 x GTX 980 just to gain 10 positions in the 3dmark hall of fame.

We'll see who delivers the best technology with next-gen cards (waiting for 20nm or 16nm), but I have faith in AMD. When I had issues with my rig, I always had a representative helping me through direct contact. It didn't always work and I had to contribute too, but I always ended up with a bug-free rig.

Just check this thread; I pushed so hard for that issue and they finally fixed it, even though only 5-10 of us had it.

http://www.overclock.net/t/1295809/...rash-hang-freeze-galore-what-to-do-fixed/0_30


People say bad things about AMD drivers but honestly, they are top notch right now. Most of the people who say AMD drivers are bad have Nvidia cards. On the other hand, I had 2 x GTX 780 and 2 x GTX 680 and they were bug-free also. (I hate adaptive vsync.)
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
People say bad things about AMD drivers but honestly, they are top notch right now. Most of the people that say AMD drivers are bad have Nvidia cards. On the other side, I had 2 x GTX 780 and 2 x GTX 680 and they were bug free also. (I hate adaptive Vsync) .
lol what's to hate about adaptive vsync? it's regular vsync if you are at your monitor's refresh rate, and then it's simply no vsync if you drop below. if you can't maintain your monitor's refresh rate then your other option is tearing the whole time with vsync off, or choppiness at lower framerates with it on.
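The three behaviors being argued about here (classic vsync, adaptive vsync, and the half-refresh mode) can be sketched in a few lines. This is only an illustrative model of the logic described in the thread, not NVIDIA driver code, and the function name is made up:

```python
REFRESH_HZ = 60  # monitor refresh rate assumed for the example

def choose_sync(mode, fps):
    """Model of the sync behaviors discussed in the thread.

    mode: 'vsync', 'adaptive', or 'adaptive_half'
    fps:  the frame rate the GPU can currently sustain
    Returns (displayed fps, tearing?).
    """
    if mode == 'vsync':
        # Classic vsync: frames snap to refresh-rate divisors (60 -> 30 -> 20...),
        # which is the "locked at 30" choppiness KaRLiToS complains about.
        shown = next(REFRESH_HZ // n for n in (1, 2, 3, 4, 6)
                     if REFRESH_HZ // n <= fps)
        return shown, False
    if mode == 'adaptive':
        if fps >= REFRESH_HZ:
            return REFRESH_HZ, False  # synced at the refresh rate
        return fps, True              # vsync disengages: tearing, no 30 fps lock
    if mode == 'adaptive_half':
        # Half-refresh: caps at 30 on a 60Hz panel (or 60 on a 120Hz one)
        return min(fps, REFRESH_HZ // 2), False
    raise ValueError(mode)
```

So at a sustained 45 FPS, classic vsync locks to 30, adaptive shows 45 with tearing, and half-refresh caps at 30 regardless, which is why the half-refresh setting feels wrong when left on by accident.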
 
Sep 27, 2014
92
0
0

Xfire 290X is a really good value for the money now that the price cuts have come into effect.

Speaking of Nvidia drivers, the whole dropped-frames issue with Maxwell is a bit disconcerting.

That said, I am just hating life because I have the upgrade itch, but I have a 780 and no real viable single-GPU upgrades that are worth the money. I have been tempted by the $350 780 Tis floating around, but ugh, it just doesn't seem worth the upgrade. If I had an SLI/Xfire mobo I would probably just get 2x 780 Ti or 2x 290X and call it a day, but sadly I have a single-GPU mobo.
 
Last edited:

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
lol whats to hate about adaptive vsync? its regular vsync if you are at your monitor's refresh rate and then its simply no vsync if you drop below. if you cant maintain your monitor's refresh rate then your other option is tearing the whole time with vsync off or choppiness at lower framerates with it on.

I didn't like it, I didn't say it was bad, I just said I didn't like it.

But I didn't explain it correctly. There is one option that is fine, but the Adaptive (half the refresh rate) setting is just bad; it shouldn't exist. I don't like it when the FPS drops below 60 and gets locked at 30. But outside of that, I have 100% pure fun with those cards.

I don't know why you took my comment personally.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I didn't like it, I didn't say it was bad, I just said I didn't like it.

But I didn't explain correctly, there is a option that is fine, but the Adaptive (on) or (half the refresh rate) is just bad it shouldn't exist.

I don't know why you took my comment personally.
what makes you think I took it personally? I just don't understand why you hate a feature that has no real drawbacks compared to the alternative of using or not using vsync. and really, that half refresh rate option on a 60Hz monitor is just for games that are capped at 30 FPS anyway, or where you can't get over 30 FPS.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I didn't like it, I didn't say it was bad, I just said I didn't like it.

But I didn't explain correctly, there is a option that is fine, but the Adaptive (on) or (half the refresh rate) is just bad it shouldn't exist. I don't like when the FPS drop below 60 fps and go down to 30fps and gets locked there. But outside of that, I have 100% pure fun with those cards.

I don't know why you took my comment personnally.

I don't think you really explained why you dislike adaptive v-sync, and what option should exist in its place?

If your FPS drops below 60, it doesn't drop to 30 with adaptive v-sync; it just turns off v-sync.

Granted, I don't use it myself, as I just turn off v-sync and leave it at that. With a 120Hz monitor I don't usually see 120 FPS anyway, but I am curious why you'd hate it.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
what makes you think I took in personally? I just dont understand why you hate a feature that has no real drawbacks compared to the alternative of using or not using vsync. and really that half refresh rate option on a 60hz is just for games that are capped at 30 fps anyway or that you cant get over 30 fps.

Ok then I liked it. :rolleyes:


I don't think you really explained why you dislike adaptive v-sync, and what option should exist in its place?

If your FPS drop below 60, it doesn't drop to 30 with adaptive V-sync, it just turns off v-sync.

Granted, I don't use it myself, as I just turn of v-sync and leave it at that, as with a 120hz monitor, I don't usually see 120 FPS anyways, but I am curious why you'd hate it.

I think it was set to half the refresh rate by default and I had to change it. I don't remember honestly, but it's the only thing I disliked. Not that bad, no?


Please stop attacking me for that. :oops:
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I wasn't attacking you, only wanted info. It appears as though you just didn't have it set properly. Your initial description just didn't make sense and I wanted clarification.

Btw, the best use of the half refresh rate is for 120Hz monitors. The idea is to create a consistent experience with the same FPS with v-sync on, if that's needed. I've found it useful with Skyrim on a 120Hz monitor: it drops me to 60 FPS and maintains v-sync to prevent tearing or the physics bug.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
Ok then I liked it. :rolleyes:




I think it was Half the refresh rate by default and I had to modify it, I don't remember honestly, but it's the only thing I disliked. Not that bad, no?


Please stop attacking me for that. :oops:
wtf are you rolling your eyes about? funny that you are so paranoid about others taking it personally or attacking you, and then you act like that.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
I wasn't attacking you, only wanted info. I appears as though you just didn't have it set properly. Your initial description just didn't make sense and I wanted clarification.

Btw, the best use of the half refresh rate is for 120hz monitors. The idea is to create a consistent experience with the same FPS with V-sync on, if that is of need. I've found it useful with Skyrim and a 120hz monitor. It drops me to 60 FPS and maintains v-sync to prevent tearing or the physics bug.

Yes, but as soon as I set it correctly it was fine. I used those GTX 780s especially to play Far Cry 3, because at the time AMD was crap in that game. It was an awesome experience.

wtf are you rolling your eyes about? funny you are so paranoid about others taking it personally or attacking you then you act like that.

Toyota, I like you and don't want to fight with you. Let's stop and move on. I didn't mean to be a jerk with the rolleyes emoticon.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Yes but as soon as I set it correctly it was fine. I used those GTX 780 especially to play FarCry 3 because at the time AMD was crap in that game. It was awsome experience.



Toyota, I like you and don't want to fight with you. Lets stop and move on. I didn't want to be a jerk with the rolleyes emoticon.

emoticons are hard :(

that aside is there any word/news on the 980ti?
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
Thanks! We have to give credit to R9 290/290X for forcing NV to price 970 so aggressively. It goes both ways. My point is now a gamer can get R9 290 CF setup + "Free" 512GB Crucial MX100 or step up from a Core i5 4690K to a Core i7 4790K and get a new case/CPU cooler considering there is a $170-200 price difference between R9 290 CF and 970 SLI. The choice isn't as clear now in favour of the 970 SLI as it was 1 month ago. Great that competition is bringing this to the gamer. That's why I always say it's best to buy AMD/NV GPUs when they are both locked head-to-head. :thumbsup:

I also forgot to mention that as far as 4K gaming performance goes, we haven't moved much from 1 year ago. We really need those next gen 390X/GM200 cards. For someone like you KaRLiToS there is still no viable upgrade path.

Funny thing, RussianSensation. Love my 2 R9 290s in CF, watercooled with EK blocks and a 3930K. :p
 
Last edited:

Sohaltang

Senior member
Apr 13, 2013
854
0
0
Still waiting for an explanation of why there are no dinky-looking 7970 GHz or 290 cards.

Or why they are not used in laptops. And why a 970 package weighs 1kg while a 290 weighs 2.5kg. Small things like that...

I mean, the first has better power characteristics, and according to RS the 290 is also far from "hot".

@toyota
Oh I understand clearly what you are saying. There is clear and well defined correlation between Peak and Average.
Except there isn't.


Mixing up pounds and kg? A 2.5kg 290? None is even half that. http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-3.html
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Wow, did they get the worst GTX 970s they could get their hands on to overclock? 1304 is paltry!

As to the premise, I don't see this as any sort of vindication on AMD's part. 4K gaming will not be fully mastered until we're on 20nm, and as it stands today, I don't consider 4K viable at all for serious gaming.
 
Sep 27, 2014
92
0
0
Wow, did they get the worst GTX 970s they could get their hands on to overclock? 1304 is paltry!

As to the premise, I don't see this as any sort of vindication on AMD's part. 4K gaming will not be fully mastered until we're on 20nm, and as it is today, I don't consider 4K to be viable at all for serious gaming..

I agree it's not viable for serious gaming at this time, but it IS very close. I disagree on needing 20nm. If the GM200 specs are correct, that should be able to handle it nicely, and that's on 28nm.

Edit: also, aren't the "20nm" GPUs really just 16nm FinFET?
 
Last edited:
Feb 19, 2009
10,457
10
76
This is good, very competitive from both sides. Those who value one ecosystem or another will be willing to pay a premium for it, it seems NV users are happy to pay a lot more for similar performance, while AMD users are not. That says a lot about consumer behavior.

The 970 is a bang on perfect execution from NV, sweet spot pricing that obsoletes everything else and forced major price drops on AMD's lineup which were already great performance/$ for a long time.

@Carfax83
4K gaming is just fine and dandy with 2x R290. Smooth experience all round, even with 2x MSAA.
 
Last edited:
Sep 27, 2014
92
0
0
This is good, very competitive from both sides. Those who value one ecosystem or another will be willing to pay a premium for it, it seems NV users are happy to pay a lot more for similar performance, while AMD users are not. That says a lot about consumer behavior.

The 970 is a bang on perfect execution from NV, sweet spot pricing that obsoletes everything else and forced major price drops on AMD's lineup which were already great performance/$ for a long time.

@Carfax83
4K gaming is just fine and dandy with 2x R290. Smooth experience all round, even with 2x MSAA.

I suppose he meant single GPU, but with the price drops 2x R290X is a very attractive proposition.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I disagree on needing 20nm. If the GM200 specs are correct that should be able to handle it nicely and thats on 28nm.

I didn't say 4K gaming "needed" 20nm, just that it wouldn't be mastered until 20nm.

@Carfax83
4K gaming is just fine and dandy with 2x R290. Smooth experience all round, even with 2x MSAA.

I guess it depends on what you're willing to sacrifice. Look at the Crysis 3 benchmark for instance. Both the GTX 970 and R9 290 in SLI and Xfire respectively struggle to get more than 30 FPS in minimums when the IQ is turned up to very high with SMAA 2x.

Basically, 4K gaming is only viable right now because we're still transitioning to current gen only games. All of the games they tested were released on last gen consoles as well, which limited the amount of detail and eye candy the developers could give us.

When the PS4, Xbox One and PC only titles start coming out, it's going to be a whole other ball game for those that want 4K.. :biggrin:
 
Sep 27, 2014
92
0
0
I didn't say 4K gaming "needed" 20nm, just that it wouldn't be mastered until 20nm.



I guess it depends on what you're willing to sacrifice. Look at the Crysis 3 benchmark for instance. Both the GTX 970 and R9 290 in SLI and Xfire respectively struggle to get more than 30 FPS in minimums when the IQ is turned up to very high with SMAA 2x.

Basically, 4K gaming is only viable right now because we're still transitioning to current gen only games. All of the games they tested were released on last gen consoles as well, which limited the amount of detail and eye candy the developers could give us.

When the PS4, Xbox One and PC only titles start coming out, it's going to be a whole other ball game for those that want 4K.. :biggrin:

I still think GM200 has a good chance; the 390X doesn't count since that's supposedly 20nm, I guess. You bring up a good point with next-gen PC titles. It will be interesting to see how The Witcher 3 (I know it's also a console release) in 4K is handled by the current gen (whatever is out in Feb), since it looks like CD Projekt RED is really going the extra mile to make it look gorgeous on PC.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Wow, did they get the worst GTX 970s they could get their hands on to overclock? 1304 is paltry!

That's a very good overclock. TechSpot always lists the core clock for NV overclocking, because NV's boost clock is a moving maximum that depends on the game. 1304MHz core on the Gainward vs. 1318MHz on the Gigabyte Windforce means both cards are hitting 1.4-1.45GHz boost in games:
http://www.techspot.com/review/885-nvidia-geforce-gtx-970-gtx-980/page8.html

As to the premise, I don't see this as any sort of vindication on AMD's part. 4K gaming will not be fully mastered until we're on 20nm, and as it is today, I don't consider 4K to be viable at all for serious gaming..

Even if we are talking about 1440p/1600p gaming, the value of 290s cannot be beaten:

Gigabyte Windforce G1 970 SLI = $740
MSI Gaming 970s = $700

Vs.

HIS R9 290s = $460
MSI Gaming R9 290s = $550
XFX R9 290Xs = $600 and will level any 980.

Most reviewers never used after-market 290s; they used a stock 947MHz reference 290 against NV's after-market 970s like the Gaming or EVGA SC. What kind of useful comparison is that?

History repeats itself: now you can buy almost two 290Xs for the price of a single 980 and beat it by 40-50% in games. Same story as unlocked 6950s vs. the 580, and 7950s vs. the 680.

Ironically, it is now the 970 which needs a game bundle or price drops, because at $330-370 it's waaaay overpriced against a $230-250 290. As far as the 980 goes, it should really be $429-$449.
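For a rough feel of the price gap laid out above, here's a quick sketch using the dual-card prices quoted in the post. Treating the setups as roughly comparable in performance is my simplifying assumption for illustration, not a benchmark result:

```python
# Street prices for dual-card setups quoted in the post (USD)
setups = {
    "Gigabyte Windforce G1 970 SLI": 740,
    "MSI Gaming 970 SLI":            700,
    "HIS R9 290 CF":                 460,
    "MSI Gaming R9 290 CF":          550,
    "XFX R9 290X CF":                600,
}

# Show each setup's price premium over the cheapest option.
cheapest = min(setups.values())
for name, price in sorted(setups.items(), key=lambda kv: kv[1]):
    premium = 100 * (price - cheapest) / cheapest
    print(f"{name:32s} ${price}  (+{premium:.0f}% over cheapest)")
```

On these numbers the cheapest 970 SLI pair carries roughly a 50-60% price premium over the cheapest 290 Crossfire pair, which is the gap the post is arguing about.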
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I mean the first has better pwr characteristics and according to RS 290 is also far from "hot".

The card's temperature is not only a function of the ASIC's power usage, but also of the cooler's ability to dissipate heat. After-market GTX 480s ran rather cool, and I always pointed that out to people who insisted that the GTX 480 ran at 91-93°C, as if all of them were such failures.

[Chart: after-market GTX 480 temperatures (amp_temps.png)]


The same can be said of after-market R9 290s. There is no need to make generalizations as if all R9 290s run hot.

After-market R9 290/X cards can run as low as 68°C and as high as 85°C, but all far below the 94-95°C of reference cards.
http://www.tomshardware.com/reviews/r9-290x-lightning-performance-review,3782-7.html

With the large price disparity between the after-market R9 290 and the GTX 970/980, it may be more cost-effective to buy the 290s as a stop-gap setup and wait until prices on the newer generation drop, or at least see what GM200/390X brings. Buying a 980 at $550+ is simply history repeating itself (GTX 780 Ti $699 --> $350-360 twelve months later; GTX 780 $650 --> $270 1.5 years later). If you've got $$$ to burn, sure, why not buy the best all the time. However, if there is a smarter way to upgrade, getting 80-85% of the performance while putting money aside for the next upgrade, why not discuss that option? If you feel the PhysX, DSR, and CUDA ecosystem is worth the $150-200 premium, sure, get NV. For pure gaming bang for the buck, the 970 and 980 need price drops for brand-agnostic GPU buyers.

I know I would pick R9 290s + i7 over 970 SLI + i5 any day because those GPUs will be upgraded in 2-2.5 years anyway, so why overspend $150-200 now for 3-5% more performance to save what $35 in electricity over 2 years? ^_^
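The "roughly $35 in electricity over 2 years" figure checks out under plausible assumptions. The wattage delta, gaming hours, and electricity rate below are my own guesses for illustration, not numbers from the post:

```python
# Rough check of the electricity-cost argument. All inputs are assumptions:
extra_watts   = 210      # approx. extra draw of 290 CF vs. 970 SLI under load
hours_per_day = 2        # average daily gaming time
days          = 730      # two years
rate_per_kwh  = 0.12     # USD per kWh, typical US residential rate at the time

# Energy is power (kW) x time (h); cost is energy x rate.
extra_kwh = extra_watts / 1000 * hours_per_day * days
cost = extra_kwh * rate_per_kwh
print(f"Extra energy: {extra_kwh:.0f} kWh -> ${cost:.2f} over two years")
```

With these inputs the difference lands near $37 over two years, small next to a $150-200 purchase-price gap; doubling the daily hours or the rate would double it, so heavy gamers in high-rate regions close more of the gap.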

DX12 isn't a selling point since it requires Windows 10 and DX12 games, and those won't be out until late next year, which means there will be a card for $550 that beats the 980 anyway. Right now NV can price the 980 at $550 because it's competing against the year-old Hawaii architecture. Once AMD releases its true response to the 980, I can't imagine NV being able to sell the 980 at $550. Just like most of us recommended waiting until the GTX 680 dropped for prices on 7970s to settle, it's the same advice for the 980, since it has no direct competition right now, which allows NV to inflate prices as high as $630 for a Windforce 980 - an absurd amount for a mid-range Maxwell when the 780 Ti is $370.
 
Last edited: