GTX 980 or Fury for 1440p?


3DVagabond

Lifer
Aug 10, 2009
At least get your facts straight before you accuse TPU of bias... :thumbsdown:

In the TPU review, there are eight AMD Gaming Evolved titles:

1) Alien Isolation

2) Tomb Raider

3) Crysis 3

4) Bioshock Infinite

5) Civilization Beyond Earth

6) Dragon Age Inquisition

7) GTA V (Co-sponsored with NVidia)

8) Battlefield 4

For NVidia's Gameworks, there are eight as well:

1) AC Unity

2) Batman Arkham Origins

3) Far Cry 4

4) Witcher 3

5) Metro Last Light

6) Project Cars

7) Watch Dogs

8) GTA V (Co-sponsored with AMD)

9) CoD Advanced Warfare (Checked, and this isn't a GW title. CoD Ghosts is, but not Advanced Warfare.)

Yeah, that's so many :rolleyes: Anyway, whether a game has Gameworks or GamingEvolved does not determine final performance or give the sponsor an advantage or their competitor a disadvantage. There are gameworks titles where AMD is ahead of NVidia, and GamingEvolved titles where NVidia is leading AMD..



NVidia are constantly tweaking and optimizing their drivers. A review using a driver that is two or three revisions out of date is understandable, but TPU uses drivers that are months old. The driver they used in that review came out in June.

You just keep posting worthless comparisons until someone challenges them... Well, I'm sure you'll just reply by repeating it again, but here goes.

You actually want to compare GE titles with GW titles for brand bias? That in and of itself is pure fantasy.

In the skewed result games I posted, the 970 beat the Fury X! In Project Cars it wins by double digits, both FPS and average. Find me GE games in W1zzard's suite where the 390 beats the 980 Ti, which would be the equivalent. Then you can claim that GE titles are unfairly skewed like GW.
 

Carfax83

Diamond Member
Nov 1, 2010
A game being a GE game means nothing much beyond that AMD helped optimize the game. AMD doesn't do the same things nvidia does. Currently their involvement does not seem to result in unfairly skewed framerates.

You really are quite misinformed. GE and GW are basically the same type of programs with the same intent, only GW is much more successful.

Both programs try to aid developers when it comes to optimizing their games for their hardware, and both programs try to aid developers when it comes to implementing vendor/PC specific technologies into their games..

The notion that GW is somehow unfair to AMD flies in the face of reality and fact. The only real performance advantage that manifests due to GW or GE are that the vendor has a head start when it comes to optimizing their drivers for a particular game.

PC games do not optimize for specific architectures due to using abstract APIs..
 

3DVagabond

Lifer
Aug 10, 2009
I would get the Fury Tri-X, one of the best air-cooled cards ever made.

Cool and quiet was so last year. ;)

You will hear no nVidia supporters wanting to talk noise when comparing Fury cards. It no longer matters. You also won't see them wanting to use a reference card ever. They know the old Titan cooler is hot and throttles. They won't talk about it though. And if anyone else does they'll play the custom cooler card unless you bring up Hawaii. Then, even a year and a half later they'll tell you it's hot and loud.
 

Carfax83

Diamond Member
Nov 1, 2010
In the skewed result games I posted, the 970 beat the Fury X! In Project Cars it wins by double digits, both FPS and average. Find me GE games in W1zzard's suite where the 390 beats the 980 Ti, which would be the equivalent. Then you can claim that GE titles are unfairly skewed like GW.

LOL, now this is funny. You didn't even bother to check whether those examples you posted were in fact GW titles.

Out of the three you posted, only Project Cars is GW, and even that is suspect because it doesn't use any GW-specific technologies from what I can tell, unlike The Witcher 3 for example. And the reason for NVidia's massive performance lead in Project Cars is likely the game's use of DX11 multithreading, a feature which AMD has never supported.
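To make the "DX11 multithreading" point concrete, here is a minimal, hypothetical C++ sketch (not from the thread, and not Project Cars' actual code) of D3D11 deferred contexts, the feature being referred to. The `DriverCommandLists` capability bit is what separated the vendors' DX11 drivers at the time; when it is FALSE the runtime falls back to emulating command lists on the immediate context.

```cpp
// Illustrative sketch only: D3D11 deferred contexts ("DX11 multithreading").
// Error handling and actual draw calls omitted.
#include <d3d11.h>

void RecordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    // Ask whether the driver natively supports command lists. NVidia's DX11
    // driver reported TRUE; AMD's did not at the time of this thread, so the
    // runtime emulated command lists on the immediate context instead.
    D3D11_FEATURE_DATA_THREADING caps = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING, &caps, sizeof(caps));
    BOOL nativeCmdLists = caps.DriverCommandLists;

    // A worker thread records state changes and draw calls into a deferred
    // context...
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);
    // ... issue Draw() calls on `deferred` here ...

    ID3D11CommandList* cmdList = nullptr;
    deferred->FinishCommandList(FALSE, &cmdList);

    // ...and the render thread replays the recorded work.
    immediate->ExecuteCommandList(cmdList, FALSE);

    cmdList->Release();
    deferred->Release();
    (void)nativeCmdLists;
}
```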

And Wolfenstein TNO isn't GW, and neither is WoW. Some games just favor some hardware more than others. PC gaming has always been like that.
 

cmdrdredd

Lifer
Dec 12, 2001
Cool and quiet was so last year. ;)



You will hear no nVidia supporters wanting to talk noise when comparing Fury cards. It no longer matters. You also won't see them wanting to use a reference card ever. They know the old Titan cooler is hot and throttles. They won't talk about it though. And if anyone else does they'll play the custom cooler card unless you bring up Hawaii. Then, even a year and a half later they'll tell you it's hot and loud.


I don't care about noise levels personally. If the performance is right it has to be pretty loud for me to care about that. I've always been like that because I play my games with the sound up so I don't hear it.
 

DooKey

Golden Member
Nov 9, 2005
OP, the only advice I can give you is don't listen to the video card company fans. There are several offering you advice and it's not reliable because they have LOVE for their preferred brand. They are very obvious in this thread.
 

Eymar

Golden Member
Aug 30, 2001
This is probably sound advice I will follow. What I should have said when I made the post is that I will naturally wait for a good price (a sale or whatever), hence why I asked which card to get if prices are the same. I plan to make the purchase between tomorrow and Black Friday.

Ah, you should be in good shape in the next few months. There should be better deals soon, as hopefully the Fury Nano launches at a good price for its performance, which should be in the <$500 range. That should cause price drops for the 390X and 980, and possibly the Fury. If not, there should be price cuts for those cards come Black Friday. For 1440p I don't think you could go wrong with either the 390X or the 980, so I would go with price and/or any other specific requirements (power, size, heat, etc.) you may have.
 

Headfoot

Diamond Member
Feb 28, 2008
Fury and 980 are both bad cards compared to the XFX DD 290x at $250 and aftermarket 980 Tis.

If you're ok with lower power, get 290x.
If you want higher power, get 980 Ti aftermarket.
 

Azix

Golden Member
Apr 18, 2014
You really are quite misinformed. GE and GW are basically the same type of programs with the same intent, only GW is much more successful.

Both programs try to aid developers when it comes to optimizing their games for their hardware, and both programs try to aid developers when it comes to implementing vendor/PC specific technologies into their games..

The notion that GW is somehow unfair to AMD flies in the face of reality and fact. The only real performance advantage that manifests due to GW or GE are that the vendor has a head start when it comes to optimizing their drivers for a particular game.

PC games do not optimize for specific architectures due to using abstract APIs..

I did not say "for their hardware". Nor did I say vendor specific technologies. That is what gameworks does. It really can't even be the same because if AMD puts something in a game there is a good chance the code is openly available.

Because the code is open it in turn means that improvements AMD contributes are effective for everybody. There is also no lockout of nvidia.

Ultimately we get a situation where GE titles do not result in the same effects as GW titles, thus those charts being shown here need to be looked at in that light.
 

cmdrdredd

Lifer
Dec 12, 2001
I did not say "for their hardware". Nor did I say vendor specific technologies. That is what gameworks does. It really can't even be the same because if AMD puts something in a game there is a good chance the code is openly available.



Because the code is open it in turn means that improvements AMD contributes are effective for everybody. There is also no lockout of nvidia.



Ultimately we get a situation where GE titles do not result in the same effects as GW titles, thus those charts being shown here need to be looked at in that light.


Except we have examples like Dirt showdown that changed the lighting to use direct compute at a time when nvidia cards weren't good at direct compute performance. The lighting did not look better and wasn't any more impressive than the previous game that did not use direct compute. That was a GE title. That's not really different than adding hair works or something. It can be used on any hardware but has an advantage in performance on certain hardware. Like it or not, it has been done by both camps.
 

Azix

Golden Member
Apr 18, 2014
Except we have examples like Dirt showdown that changed the lighting to use direct compute at a time when nvidia cards weren't good at direct compute performance. The lighting did not look better and wasn't any more impressive than the previous game that did not use direct compute. That was a GE title. That's not really different than adding hair works or something. It can be used on any hardware but has an advantage in performance on certain hardware. Like it or not, it has been done by both camps.

http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled/6

DiRT: Showdown is something of a divisive game for benchmarking. The game's advanced lighting system, while not developed by AMD, does implement a lot of the key concepts they popularized with their Leo forward lighting tech demo. As a result performance with that lighting system turned on has been known to greatly favor AMD cards. With that said, since we're looking at high-end cards there's really little reason not to be testing with it turned on since even a slow card can keep up. That said, this is why we also test DiRT with advanced lighting both on and off starting at 1920x1080 Ultra.

The developers thought there was some benefit to be had, and it happened to show some weaknesses in hardware. In The Witcher 3, for example, the developers didn't even write the code. Nor did the excess tessellation have a benefit.

Also...

http://www.rage3d.com/articles/gaming/codemaster_dirt_showdown_tech_review/index.php?p=2

As of a recent patch, two more have been added: Advanced Lighting and Global Illumination. The former you may recognize from AMD's "Leo" or "Forward+" demo; it offers "genuine" dynamic lighting with the assistance of DirectCompute instead of hacking it with 2D glows. Global Illumination also utilizes DirectCompute, though in this case, it's to intelligently simulate reflected light on all surfaces in a given scene for a more attractive and realistic look.

In practice, Advanced Lighting makes a significant difference, noticeably brightening up and "fleshing out" any given scene. You might say what Ambient Occlusion offers with shadows, Advanced Lighting offers with lighting. At times, though, it seems to go too far, making parts of your vehicle, for example, so bright you can't fully see them.

[Comparison screenshots from the Rage3D article: dirt9.jpg, dirt6.jpg]


Not an insignificant effect
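For context, here is a rough, hypothetical C++ sketch (not Codemasters' actual code) of how a Forward+ style renderer like the one described in the quotes above dispatches a DirectCompute light-culling pass. The shader and buffer names are made up for illustration; the point is simply that this kind of compute-heavy pass is what the "advanced lighting favors AMD" argument is about.

```cpp
// Illustrative sketch only: a Forward+ style tiled light-culling dispatch.
// One compute thread group culls the scene's lights for each screen tile;
// the forward shading pass then reads each tile's light list.
#include <d3d11.h>

void CullLightsPerTile(ID3D11DeviceContext* ctx,
                       ID3D11ComputeShader* lightCullCS,          // hypothetical shader
                       ID3D11ShaderResourceView* sceneLights,     // all light data
                       ID3D11UnorderedAccessView* tileLightLists, // per-tile output
                       UINT screenWidth, UINT screenHeight)
{
    const UINT kTileSize = 16; // one thread group per 16x16 pixel tile

    ctx->CSSetShader(lightCullCS, nullptr, 0);
    ctx->CSSetShaderResources(0, 1, &sceneLights);
    ctx->CSSetUnorderedAccessViews(0, 1, &tileLightLists, nullptr);

    const UINT tilesX = (screenWidth  + kTileSize - 1) / kTileSize;
    const UINT tilesY = (screenHeight + kTileSize - 1) / kTileSize;
    ctx->Dispatch(tilesX, tilesY, 1);

    // Unbind the UAV so the forward shading pass can read the tile lists.
    ID3D11UnorderedAccessView* nullUAV = nullptr;
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUAV, nullptr);
}
```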
 

Carfax83

Diamond Member
Nov 1, 2010
Because the code is open it in turn means that improvements AMD contributes are effective for everybody. There is also no lockout of nvidia.

If AMD wants to make their gaming technologies open source, more power to them. But NVidia has no obligation whatsoever to follow suit.

A good example of this is Mantle. AMD spent millions of dollars to develop it and said it would be open sourced, but it never came to be......until DX12 stole the limelight, the podium and the whole damn show.

Now Mantle is going to be the foundation of Vulkan, an open sourced API.. So basically AMD wasted their money for nothing..

Ultimately we get a situation where GE titles do not result in the same effects as GW titles, thus those charts being shown here need to be looked at in that light.

And what light is that? Just because your GPU vendor isn't winning, you feel compelled to come up with excuses as to its performance (or lack thereof) without even understanding the full picture.

I said it on the previous page, and I'll say it again. GE or GW has no bearing on the final performance of a game on a vendor's hardware.. We've seen it time and time again.

Metro 2033 and Metro Last Light ran better on AMD hardware when they first came out despite being GW titles. Far Cry 3 was faster on AMD hardware when it first came out, but after NVidia started optimizing its drivers, they gained the edge.

The only performance benefit that GW and GE provide, are early access to the code so that IHVs can start optimizing their drivers for the game.
 

boozzer

Golden Member
Jan 12, 2012
OP, the only advice I can give you is don't listen to the video card company fans. There are several offering you advice and it's not reliable because they have LOVE for their preferred brand. They are very obvious in this thread.
It is kinda messed up, right? A guy looking for advice on his new purchase, and he has to wade through the BS just to find a bit of genuine advice. :colbert: So messed up.
 

RampantAndroid

Diamond Member
Jun 27, 2004
http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled/6



The developers thought there was some benefit to be had, and it happened to show some weaknesses in hardware. In The Witcher 3, for example, the developers didn't even write the code. Nor did the excess tessellation have a benefit.

Also...

http://www.rage3d.com/articles/gaming/codemaster_dirt_showdown_tech_review/index.php?p=2



[Comparison screenshots from the Rage3D article: dirt9.jpg, dirt6.jpg]


Not an insignificant effect

Uh, you realize that the shadows on the right side...well, those screenshots are from two DIFFERENT locations. Look forward in the top screenshot - you actually do see shadows of people, if that's what you're suggesting changed.
 

Azix

Golden Member
Apr 18, 2014
It is kinda messed up, right? A guy looking for advice on his new purchase, and he has to wade through the BS just to find a bit of genuine advice. :colbert: So messed up.

The point is to flesh out those views and see which the OP sees as most valid. I am sure he has gotten some good information here. I mean, what advice would you give that is significantly different from the folks you are bashing?

No point arguing the GW vs GE thing here (tons of strange comments made eg. "Now Mantle is going to be the foundation of Vulkan, an open sourced API.. So basically AMD wasted their money for nothing.." but w/e).

The point for the OP is to know whether his games favor nvidia or not, and whether he thinks it's a good idea to risk a weaker card on the hope that nvidia will keep doing whatever it is they're doing, and doing it in the games he wants to play.
 

3DVagabond

Lifer
Aug 10, 2009
Except we have examples like Dirt showdown that changed the lighting to use direct compute at a time when nvidia cards weren't good at direct compute performance. The lighting did not look better and wasn't any more impressive than the previous game that did not use direct compute. That was a GE title. That's not really different than adding hair works or something. It can be used on any hardware but has an advantage in performance on certain hardware. Like it or not, it has been done by both camps.

What does this have to do with TPU's review?

If AMD wants to make their gaming technologies open source, more power to them. But NVidia has no obligation whatsoever to follow suit.

A good example of this is Mantle. AMD spent millions of dollars to develop it and said it would be open sourced, but it never came to be......until DX12 stole the limelight, the podium and the whole damn show.

Now Mantle is going to be the foundation of Vulkan, an open sourced API.. So basically AMD wasted their money for nothing..



And what light is that? Just because your GPU vendor isn't winning, you feel compelled to come up with excuses as to its performance (or lack thereof) without even understanding the full picture.

I said it on the previous page, and I'll say it again. GE or GW has no bearing on the final performance of a game on a vendor's hardware.. We've seen it time and time again.

Metro 2033 and Metro Last Light ran better on AMD hardware when they first came out despite being GW titles. Far Cry 3 was faster on AMD hardware when it first came out, but after NVidia started optimizing its drivers, they gained the edge.

The only performance benefit that GW and GE provide, are early access to the code so that IHVs can start optimizing their drivers for the game.

Mantle? What does this have to do with TPU's review? That is what we're talking about.

I don't care about noise levels personally. If the performance is right it has to be pretty loud for me to care about that. I've always been like that because I play my games with the sound up so I don't hear it.

What about throttling? I assume you care about that?
 

tential

Diamond Member
May 13, 2008
Given the OP's willingness to spend to purchase a Fury, if he's willing to get a monitor with it, then I say get the 4K Wasabi Mango monitor with FreeSync and unlock your Fury toward a Fury X (as much as possible anyway; partial unlocks work too). Then I'd say it's a landslide for the Wasabi, as the price is quite cheap at $680 for a REAL sized monitor you can use at 4K.
http://www.ebay.com/itm/Perfect-Pix...UHD-Monitor-/121697112519?hash=item1c55b691c7
The Fury will handle 4K (you might have to turn down a few settings), and you'll be perfectly ready for whatever future upgrade card you want. I'm biased toward larger displays/higher resolutions, but given what 1440p FreeSync monitors cost (I can't find them cheap, but if someone can, please correct this), the incremental cost of getting a monitor that will last you isn't bad. It'll probably be pretty easy to sell on Craigslist too as a budget 4K "TV" to someone, lol, if you don't want it anymore.

They have a model up to 65 inches... I'm super excited now for freesync, AMD may have just become my best friend ever by putting freesync into large panels. Fury X is very much a choice again for me if I can use freesync on a 65 inch 4k display. Nvidia does NOT have that option.
 

you2

Diamond Member
Apr 2, 2002
Is the AMD or Nvidia driver more stable (i.e., no crashing)? Also, what is this Fury Nano, and what's its release date? Last but not least, the MSI 390 G8 looks good (short enough to fit my case); the only negative I've seen with the 390 compared to the 980 and Fury is that it runs bloody hot.

I do want to stress that 1440p is about as high as I will go with regard to resolution until there is a 24 inch high resolution screen. My general view (for gaming) is that a 24 inch 1920x1200 is preferred, but I'm happy with my 27 inch. I will not be going larger, though I could see a 27 inch 1600p monitor if one was available at a good price.
 

tential

Diamond Member
May 13, 2008
Is the AMD or Nvidia driver more stable (i.e., no crashing)? Also, what is this Fury Nano, and what's its release date? Last but not least, the MSI 390 G8 looks good (short enough to fit my case); the only negative I've seen with the 390 compared to the 980 and Fury is that it runs bloody hot.

I do want to stress that 1440p is about as high as I will go with regard to resolution until there is a 24 inch high resolution screen. My general view (for gaming) is that a 24 inch 1920x1200 is preferred, but I'm happy with my 27 inch. I will not be going larger, though I could see a 27 inch 1600p monitor if one was available at a good price.

As an HD7950 owner, I have 0 issues with drivers crashing usually.
The MSI R9 390?
https://www.youtube.com/watch?v=k9cKZiJw6Pk
I'd watch this YouTube video if you want (it seems YouTube videos are the only thing people will watch these days).
https://www.youtube.com/watch?v=udXCusTnRsY

Another review.

The MSI R9 390 has gotten GREAT reviews.

Also, I'd like to remind you that many reviews you're seeing compare these cards to STOCK GTX 970s/980s, which makes the GTX 970/980/980 Ti look quiet. The cards you actually will purchase will be LOUDER because those cards all have factory OCs and are being pushed harder.

I wish I could find a specific R9 390 vs GTX 970 AIB temp chart but you may find it within those review videos.

http://www.bjorn3d.com/2015/06/msi-r9-390-gaming-8g-amd-300-series-with-a-custom-kick-from-msi/4/

As the R9 390 is the PERFORMANCE PER DOLLAR KING, there is a REASON it gets recommended, you2. It's an all-around great card that has been revamped (the R9 290 was the previous performance-per-dollar king). I mean, we've talked about a LOT of cards in this thread, and it's the R9 390 that sits alone as a card most people will have a hard time calling a bad choice.

The other thing to pay attention to, you2, is that these cards we're all talking about will all be obsolete next year. With the flurry of hardware improvements coming over today's cards, it is in your BEST interest to purchase the cheapest, fastest card now, and then sell and purchase something next year or the year afterwards rather than investing a HUGE amount now and being upset next year when a $300 GPU makes your GPU look like an entry-level piece of trash.
 

cmdrdredd

Lifer
Dec 12, 2001
What does this have to do with TPU's review?


What about throttling? I assume you care about that?

I never said it had anything to do with that review. What I did say is that there have been both GE and GW titles that do things in the game that aren't necessarily beneficial to the competition. That was in direct response to criticism of GW.

As for throttling, what does that have to do with noise levels? Noise on a GPU comes from fans right? When fans get faster and push more air they get louder right? So it stands to reason you avoid temperature throttling by spinning up the fans. I don't know what you really mean with your question. My comment was pretty easy to understand I thought. If the performance is good, the fans can be loud to keep it cool and I'm ok with that since generally the noise only goes up when loading the card(s) (presumably during gaming) and I'll have my sound up so I can't hear the card(s) anyway.
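As a purely illustrative sketch of that trade-off (not any vendor's real fan-control or firmware code), the hypothetical C++ snippet below shows how a fan curve and a throttle temperature interact: cap the fan for acoustics and the card reaches the throttle point and drops clocks; spin the fan up and it holds boost but gets louder.

```cpp
// Illustrative sketch only: fan curve vs. throttle temperature.
struct FanCurvePoint { float tempC; float fanPercent; };

// Linearly interpolate fan duty (%) from a simple user-style fan curve.
float FanDutyFor(float tempC, const FanCurvePoint* curve, int points)
{
    if (tempC <= curve[0].tempC) return curve[0].fanPercent;
    for (int i = 1; i < points; ++i) {
        if (tempC <= curve[i].tempC) {
            float t = (tempC - curve[i - 1].tempC) /
                      (curve[i].tempC - curve[i - 1].tempC);
            return curve[i - 1].fanPercent +
                   t * (curve[i].fanPercent - curve[i - 1].fanPercent);
        }
    }
    return curve[points - 1].fanPercent; // at or past the top of the curve
}

// If the temperature still reaches the throttle point (e.g. because the fan
// curve is capped for quietness), the card drops from boost toward base clock.
float EffectiveClockMHz(float tempC, float throttleTempC,
                        float boostMHz, float baseMHz)
{
    return (tempC >= throttleTempC) ? baseMHz : boostMHz;
}
```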
 

flopper

Senior member
Dec 16, 2005
As an HD7950 owner, I have 0 issues with drivers crashing

http://www.bjorn3d.com/2015/06/msi-r9-390-gaming-8g-amd-300-series-with-a-custom-kick-from-msi/4/

As the R9 390 is the PERFORMANCE PER DOLLAR KING, there is a REASON it gets recommended, you2. It's an all-around great card that has been revamped (the R9 290 was the previous performance-per-dollar king). I mean, we've talked about a LOT of cards in this thread, and it's the R9 390 that sits alone as a card most people will have a hard time calling a bad choice.

The other thing to pay attention to, you2, is that these cards we're all talking about will all be obsolete next year. With the flurry of hardware improvements coming over today's cards, it is in your BEST interest to purchase the cheapest, fastest card now, and then sell and purchase something next year or the year afterwards rather than investing a HUGE amount now and being upset next year when a $300 GPU makes your GPU look like an entry-level piece of trash.

The 390 is the gamers' card atm.
4K needs 2 cards anyhow.