Buying new card, asking for opinions GTX 780 vs R9 290

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

toyota

Lifer
Apr 15, 2001
12,957
1
0
Gigabyte Windforce 780 GHz (or the Ti version if you want to spend). Quieter, cooler, and yes I prefer Nvidia's drivers. I seriously doubt "next-gen" (HA) consoles with a tablet SoC will demand that much vRAM or that many threads. Just look at the dumbing down of Watch Dogs.
Comments like that make no sense. The previous consoles were a joke, but many games still pushed high-end cards to their limits with all settings cranked. We heard the same silly excuses all the time during the previous gen: first the 8800GT was fast enough because games were ports, then the GTX 260 was overkill because games were ports, then the GTX 470 was overkill because games were ports. None of that was ever true for long, because PC versions of games got more and more demanding the whole time...
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Comments like that make no sense. The previous consoles were a joke, but many games still pushed high-end cards to their limits with all settings cranked. We heard the same silly excuses all the time during the previous gen: first the 8800GT was fast enough because games were ports, then the GTX 260 was overkill because games were ports, then the GTX 470 was overkill because games were ports. None of that was ever true for long, because PC versions of games got more and more demanding the whole time...

Pushed to the limit how? Like Crysis? With sloppy ports and little to no optimization?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I'm sort of in the same boat. Can both cards actually sustain performance at 1440p without having to turn the details down?
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Pushed to the limit how? Like Crysis? With sloppy ports and little to no optimization?
What does that even mean, and what does it have to do with what I was talking about? The only point is that the VRAM beyond 1.5GB is fully functional and scales exactly the same across the board.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
If OP is gonna upgrade the video card within 18 months and never run over 1440p, 3GB VRAM is pretty safe.

But PC games often eat up VRAM as you crank settings and resolution, especially with mods. Skyrim is perhaps the poster child: the stock game runs "okay" on consoles, and on PCs with 1GB of VRAM, but if you pile on the effects, mods, and resolution, it will chew through 2GB of VRAM. And that was a LAST-gen console port. Anyone want to bet on how much VRAM Fallout 4 will eat at stock settings maxed out at 1440p? How much with lots of mods loaded?

But once again, I agree that if OP is gonna upgrade the video card within 18 months and never run over 1440p, 3GB VRAM is pretty safe.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
OP, read reviews of both cards. The MSI Gaming R9 290 sounds like a solid card. I have an EVGA Classified GTX 780 in my rig below and love it. It appears that in your situation, with $200 of GC from Micro Center, the MSI R9 290 might be a better fit.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
If you play BF4 multiplayer, go R9 290. It's the only reason I sent my GTX 780 back: the DX11 overhead makes the game run like crap on the GTX 780.

If you aren't playing BF4, then you'd be fine with the GTX 780.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Comments like that make no sense. The previous consoles were a joke, but many games still pushed high-end cards to their limits with all settings cranked. We heard the same silly excuses all the time during the previous gen: first the 8800GT was fast enough because games were ports, then the GTX 260 was overkill because games were ports, then the GTX 470 was overkill because games were ports. None of that was ever true for long, because PC versions of games got more and more demanding the whole time...
I don't know what you are trying to say, but I am pretty sure a 9800GT or HD 4770 can run any game ported from a last-gen console.

SORRY FOR GOING OFF TOPIC.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I don't know what you are trying to say, but I am pretty sure a 9800GT or HD 4770 can run any game ported from a last-gen console.

SORRY FOR GOING OFF TOPIC.
You really need to pay more attention; I don't see the point in repeating the same thing if you did not get it the first time. The easiest way to explain it: games kept getting more and more demanding on the PC despite the fact that they were console ports.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
You really need to pay more attention; I don't see the point in repeating the same thing if you did not get it the first time. The easiest way to explain it: games kept getting more and more demanding on the PC despite the fact that they were console ports.

You can go and look: last-gen games ported from consoles were limited and could easily run on cards like the 9800GT, but only if you ran them at the same resolution. I don't need to remind you that consoles are closed hardware.

Games only get really demanding when they are developed on PC, like Far Cry 3, Crysis 2 and 3, Battlefield 3 and 4, and lastly Watch Dogs.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Pushed to the limit how? Like Crysis? With sloppy ports and little to no optimization?

You don't understand PC gaming. PC games are optimized only in the sense that they have a range of options for low-end and high-end systems. The low-end systems get what consoles get, and they don't require a lot of resources or power, but the high-end settings are geared towards high-end PCs. Those settings are far superior to consoles.

It is up to the user to optimize the settings for their own PC, as every system is different, and PC users are used to having the choice of high FPS or high IQ, sometimes both.

Crysis was great. It worked on low-end and high-end systems of the time, and to get good performance at the highest settings you had to have a beefy system, even at the typical resolution of the time it was released (1024x768). If you had a higher resolution, you'd use lower settings.

PC games are not optimized so that most people can play at max settings. Many PC games either require the absolute top-end PC or even wait a couple of years to be maxed out. That does not mean they were not optimized, only that they gave options for everyone, and sometimes even for the future.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
You really need to pay more attention; I don't see the point in repeating the same thing if you did not get it the first time. The easiest way to explain it: games kept getting more and more demanding on the PC despite the fact that they were console ports.

I think you're both correct to an extent. I think desprado is saying that the VRAM increase is due to resolution increases. From 2008, VRAM requirements did go up. Console ports from 2010 use more VRAM than console ports from 2007; you're correct. However, resolution went up as well. VRAM scales with resolution and anti-aliasing, and VRAM use also climbs steeply with higher AA levels.

Anyway, back in 2007/2008 there were certainly a ton of people still using 4:3 resolutions such as 1024x768. 1080p wasn't quite commonplace on the PC at that time, so the resolution increase is partially responsible for the VRAM increase (if I'm understanding desprado correctly here).

So you're essentially both correct. VRAM use increased due to games, but a lot of it was due to the resolution increase over time compared to 2006-2007. Now, desprado seems to suggest that the resolution increase is the only reason for the VRAM increase. That wasn't the sole factor; there are multiple reasons. From my memory, most of my buds were using 1024x768 or 1280x1024 around the 2007 era. Certainly, going from that to 1080p increased VRAM requirements. But games definitely used more in the way of target AA levels and assets as well, which is another contributing factor.

I do strongly disagree with desprado that console ports didn't increase in quality over time. They definitely did, and the VRAM use from those ports did increase a tad. The jump in graphics from the 2005-era Xbox 360 to the current day is a noticeable difference in quality, for sure. Now, I personally don't think this mirrors the current situation. We're in a place where PC gamers are using more VRAM for anti-aliasing than for game assets. That seems pretty backwards to me, but it's how things are. Game assets aren't increasing at the same rate as they did with the Xbox 360; the Xbox 360 was a very powerful console at launch, and the PS4 and XB1 aren't. So I don't foresee anything drastic occurring in the next several years in terms of "ports." We'll see, though. But if you do feel that way, there are options: 6GB of VRAM is only a $50 premium with the GTX 780 and GTX 780 Ti, and those cards are being released in a month. And then there's what AMD has, if you prefer that.
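The resolution/AA scaling described above is easy to ballpark. Here's a rough back-of-envelope sketch (my own assumptions, not anything from the thread): 4 bytes/pixel of color plus 4 bytes/pixel of depth/stencil, with MSAA multiplying the render-target footprint by the sample count. Real GPUs compress these buffers, so treat the numbers as an upper bound, and note this covers only render targets, not textures or geometry.

```python
# Rough render-target VRAM estimate vs. resolution and MSAA level.
# Assumes 4 B/px color + 4 B/px depth/stencil (8 B/px total), and that
# MSAA scales the footprint linearly with sample count. Drivers compress
# these buffers, so actual usage is typically lower.

def render_target_mb(width, height, msaa_samples=1, bytes_per_pixel=8):
    """Approximate color + depth/stencil footprint in MiB."""
    return width * height * bytes_per_pixel * msaa_samples / (1024 ** 2)

for w, h in [(1024, 768), (1920, 1080), (2560, 1440)]:
    for aa in (1, 4):
        print(f"{w}x{h} at {aa}x MSAA: ~{render_target_mb(w, h, aa):.0f} MiB")
```

Even under these crude assumptions, the move from 1024x768 with no AA to 1440p with 4x MSAA multiplies the render-target cost by roughly 19x, which is why resolution and AA together dominated the VRAM growth discussed here.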
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
You can go and look: last-gen games ported from consoles were limited and could easily run on cards like the 9800GT, but only if you ran them at the same resolution. I don't need to remind you that consoles are closed hardware.

Games only get really demanding when they are developed on PC, like Far Cry 3, Crysis 2 and 3, Battlefield 3 and 4, and lastly Watch Dogs.
Sorry, but you are wrong and still not paying much attention to everything being said. Again, the excuse will always be that the consoles are way weaker, but it does NOT matter: games have gotten, and will keep getting, more and more demanding on the PC, and that is a fact.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I think you're both correct to an extent. I think desprado is saying that the VRAM increase is due to resolution increases. From 2008, VRAM requirements did go up. However, resolution went up as well. VRAM scales with resolution and anti-aliasing, and VRAM use also climbs steeply with higher AA levels.

Anyway, back in 2007/2008 there were certainly a ton of people still using 4:3 resolutions such as 1024x768. 1080p wasn't quite commonplace on the PC at that time, so the resolution increase is partially responsible for the VRAM increase (if I'm understanding desprado correctly here).

So you're essentially both correct. VRAM use increased due to games, but a lot of it was due to the resolution increase over time compared to 2006-2007. Now, desprado seems to suggest that the resolution increase is the only reason for the VRAM increase. That wasn't the sole factor; there are multiple reasons.
He does not understand that requirements in 2013, even for "ports," have made massive leaps over what was required in 2007, and running newer games at higher settings takes quite a bit of horsepower. So again, the point I was making is that games will always get more demanding on the PC, whether they are ports or not. It's foolish to look at the hardware in the consoles and think that because you exceed it you will be fine, especially way down the road.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Yeah. I don't think desprado is giving enough credit to console-port quality increases over the years. I agree with you that they increased in quality and requirements considerably; console ports nowadays are vastly better than what was churned out in 2007. He's definitely partially incorrect, especially on that argument.

I do not think XB1 and PS4 ports will match that level of increasing quality, since the PS3 and Xbox 360 were very powerful at launch, and, well, the PS4 is the PS4. In terms of power, it's a far cry from how good the Xbox 360 was circa 2005: the 360 had legitimately one of the best GPUs possible in 2005, and a great CPU; the PS4 and XB1, not so much. We'll see, though. Like I said earlier, more VRAM is going to be very cheap soon (a $50 premium for a 780/780 Ti 6GB, and AMD's 4GB cards), so there are options either way.
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
"1.5GBs of usable VRAM on a 2GB card, really NV"

That is not true at all. A 660 Ti has 2GB of VRAM and can use it all; performance does not trail off when going over 1.5GB on a 660 Ti. I know this because I owned one for a year and tested the crap out of it. I also had a 192-bit GTX 560 SE 1GB, and it scaled fine past 768MB too.

You have your experience and I have mine. SLI GTX 660 Ti turns into a stuttering mess in Skyrim once I hit 1540MB of VRAM load.

I didn't learn how to adjust texture sizes on mods for my own health. After realizing my VRAM cap, I modded a lot of files to stay under 1540MB, and magically the stuttering stopped.
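The texture-size modding described above comes down to simple arithmetic: a texture's VRAM cost grows with the square of its resolution, so halving a mod's texture size roughly quarters its footprint. A rough sketch (assuming uncompressed RGBA at 4 bytes/texel and a full mip chain adding about a third; DXT/BC-compressed textures would be 4-8x smaller, so these are upper bounds):

```python
# Why downsizing mod textures frees so much VRAM: cost is quadratic in
# texture resolution. Assumes uncompressed RGBA8 (4 B/texel); a full
# mip chain adds ~1/3 on top of the base level.

def texture_mb(size, bytes_per_texel=4, mipmaps=True):
    """Approximate VRAM for one size x size texture, in MiB."""
    base = size * size * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 ** 2)

for size in (4096, 2048, 1024):
    print(f"{size}x{size}: ~{texture_mb(size):.1f} MiB")
```

Dropping a handful of 4096x4096 mod textures to 2048x2048 saves tens of MiB each, which is how a load can be nudged from just over a card's effective limit to just under it.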
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
You have your experience and I have mine. SLI GTX 660 Ti turns into a stuttering mess in Skyrim once I hit 1540MB of VRAM load.

I didn't learn how to adjust texture sizes on mods for my own health. After realizing my VRAM cap, I modded a lot of files to stay under 1540MB, and magically the stuttering stopped.
I guess the problem wasn't so much SLI, but SLI with mid/lower-end cards. That's not generally a good idea.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
You have your experience and I have mine. SLI GTX 660 Ti turns into a stuttering mess in Skyrim once I hit 1540MB of VRAM load.

I didn't learn how to adjust texture sizes on mods for my own health. After realizing my VRAM cap, I modded a lot of files to stay under 1540MB, and magically the stuttering stopped.
Well, magically, no stuttering going over 1.5GB on my 660 Ti or going over 768MB on my GTX 560 SE. If VRAM with the mixed memory configuration were the issue, then it would happen for me too, along with everyone else. Bottom line: there was some other issue going on.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I guess the problem wasn't so much SLI, but SLI with mid/lower-end cards. That's not generally a good idea.

These cards were a temporary setup, as I sold off my 7970 during the bitcoin mining craze; it cost me $140 to get SLI going. Call me an optimist, but I was not expecting the issues I ran into with this adventure, from a dead card to constant driver issues. I would rate the four months I've owned this setup as the worst period in my PC gaming history. But not once did I flame Nvidia; I just chalked it up to bad luck.

Either way, when they worked, I didn't get the microstutter-free experience that was claimed versus CrossFire; frankly, I found them to be equal. I moved to a 1080p screen, but the stutter persisted. The recent lack of an SLI profile for WildStar broke me. Actually, it wasn't the lack of an SLI profile; it was the BSODs I got from turning SLI off in the NV control panel. Frankly, I never got it turned off; I ran Driver Sweeper on the driver suite and took out the second card. So I'm just looking to buy a card ASAP.
 
Feb 19, 2009
10,457
10
76
I guess the problem wasn't so much SLI, but SLI with mid/lower-end cards. That's not generally a good idea.

Well, even SLI 680s will run into problems with 2GB of VRAM in a few games, and it wasn't that long ago that the 680 was top dog.

Rail, if you decide on the R9 290, stay away from anything except the Sapphire Tri-X and PowerColor PCS. Those two have the best custom coolers this gen, AMD OR NV, period, bar none.

I'd go with the R9 290 because here even the R9 290 PCS is $100 cheaper than most 780s, so it's not a contest. But where you are, it depends on the price and how much you value the extra NV features.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Well, magically, no stuttering going over 1.5GB on my 660 Ti or going over 768MB on my GTX 560 SE. If VRAM with the mixed memory configuration were the issue, then it would happen for me too, along with everyone else. Bottom line: there was some other issue going on.

I can only go by my experience. Modding files to go from 1580MB to 1502MB alleviated my stuttering; nothing else did. Even running at a lower resolution didn't help. Whatever the cause was, I have only one culprit.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Well, even SLI 680s will run into problems with 2GB of VRAM in a few games, and it wasn't that long ago that the 680 was top dog.

Rail, if you decide on the R9 290, stay away from anything except the Sapphire Tri-X and PowerColor PCS. Those two have the best custom coolers this gen, AMD OR NV, period, bar none.

I'd go with the R9 290 because here even the R9 290 PCS is $100 cheaper than most 780s, so it's not a contest. But where you are, it depends on the price and how much you value the extra NV features.

Gonna see if Micro Center price-matches the MSI Gaming 780 off Newegg: $480. Tough to beat, in my opinion.
 

leeb2013

Junior Member
Nov 14, 2013
10
0
0
Generally, I'd prefer Nvidia; their cards just seem more efficient and cooler.

However, I got a Sapphire Tri-X R9 290 because:

It was cheaper in Oz.

I run 3 monitors and it has 4GB of VRAM. I don't believe 3GB is enough, with BF4 already using 2.7GB and Titanfall using all 4GB (despite the crap graphics); I'm sure others will follow suit.

I wanted the Sapphire for its flex DVI outputs. I had issues with DP-to-DVI conversion, and flex does not need it; I didn't know whether Nvidia cards did or not.

The Sapphire cooler is amazing and very quiet. When gaming I can barely hear it, even with the GPU pushed to 100%, and it holds the temp around 65-70C. Furmark pushes it up to 82C at 51% fan speed, which you can hear, but it isn't bad.

For the modern games I play (mainly FPS), the benchmarks seemed to favour the R9 over the 780 with the odd exception; in fact, many favoured it over the Titan.

AMD is generally more permissive with overvolting/overclocking than Nvidia, although I did find a good BIOS editor for my GTX 680.

I may want to CrossFire in the future, and AMD's bridgeless PCIe solution may be better than using a bridge, although I've not looked into it much.
 
Feb 19, 2009
10,457
10
76
@leeb2013

In Oz it's no contest; people who buy NV here specifically want NV features regardless of price, or they simply have no clue.

I mean, compare this (from our premier PC etailer, often with the best prices): R9 290 Tri-X, $539
http://www.pccasegear.com/index.php...=26373&zenid=906ec4b552efb5607660840e5d083a45

To a MSI Gaming 780: $679
http://www.pccasegear.com/index.php...=23962&zenid=906ec4b552efb5607660840e5d083a45

Seriously, what the hell. Prices are like this in many Asian countries as well. There appears to be a massive NV tax that isn't occurring in the USA.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
You don't understand PC gaming. PC games are optimized only in the sense that they have a range of options for low-end and high-end systems. The low-end systems get what consoles get, and they don't require a lot of resources or power, but the high-end settings are geared towards high-end PCs. Those settings are far superior to consoles.

It is up to the user to optimize the settings for their own PC, as every system is different, and PC users are used to having the choice of high FPS or high IQ, sometimes both.

Crysis was great. It worked on low-end and high-end systems of the time, and to get good performance at the highest settings you had to have a beefy system, even at the typical resolution of the time it was released (1024x768). If you had a higher resolution, you'd use lower settings.

PC games are not optimized so that most people can play at max settings. Many PC games either require the absolute top-end PC or even wait a couple of years to be maxed out. That does not mean they were not optimized, only that they gave options for everyone, and sometimes even for the future.

If they were optimised at max settings, you wouldn't have horrible ports like Crysis 2 and 3: Crysis 2 rendered tessellation on everything even when you couldn't see it, and the genius who developed Crysis 3 decided to put the physics on the CPU. Games only need more grunt because they are sloppy ports.