Radeon 7970 3GB vs. GTX 680 4GB vs. GTX 680 2GB for 1440p



  • Radeon 7970 3GB

  • GTX 680 4GB

  • GTX 680 2GB


Results are only viewable after voting.

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Funny... not!
In my country (Poland), $450 gets you a 7950 at best...
I would go for AMD.
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
Why would I do that? That is silly. Bias?
10-15% IS a small difference. Think about it: what is that, really? It certainly doesn't mean the difference between playable and unplayable. 99% of all people wouldn't even notice 10-15%. We are still speaking about the average, just a reminder ;)
The GTX 680 is 9% faster than the GTX 670 at TPU, and the 7970 GE is 10.5% faster than the GTX 680 at TPU (both at 1600p). So the distance between those pairs is basically the same. People always recommend the 670 over the 680 because the performance delta is so small, but when it comes to AMD vs. Nvidia, the same performance delta suddenly is not so small? That is bias at its best.

20-25% is, in my opinion, a significant difference that actually begins to matter. With the current cards, it comes down to which games you like. It's best to judge on a game-by-game basis.

Just to make it clear:
I would recommend the 7970 GE because it is a tad faster and cheaper, especially with the good game bundle. But I don't like people who blow things out of proportion when the numbers don't lie.

Errr... hello, 10-15% isn't a small difference; it's the difference between a 6" and a 7" epeen, which for all you girly men would be quite significant, I'm sure.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Funny... not!
In my country (Poland), $450 gets you a 7950 at best...
I would go for AMD.

http://techplanet.pl/produkty/karty...-ddr5384bit-dvihdmimdp-pci-express,30587.html

Not quite $450, but close. Remember, the US doesn't pay VAT, and sales tax is mostly excluded from listed prices as well. Ah, and the great divide in salaries: 4x the average hourly wage, electricity several times cheaper, and yet they argue that a slightly less power-hungry component will shave a few dollars off their utility bill :D It's about $470 minus 23% VAT (as if 22% wasn't enough), which leaves roughly $360, so the prices are pretty similar; it's the government that is robbing us.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Why would I do that? That is silly. Bias?
10-15% IS a small difference. Think about it: what is that, really? It certainly doesn't mean the difference between playable and unplayable. 99% of all people wouldn't even notice 10-15%. We are still speaking about the average, just a reminder ;)
The GTX 680 is 9% faster than the GTX 670 at TPU, and the 7970 GE is 10.5% faster than the GTX 680 at TPU (both at 1600p). So the distance between those pairs is basically the same. People always recommend the 670 over the 680 because the performance delta is so small, but when it comes to AMD vs. Nvidia, the same performance delta suddenly is not so small? That is bias at its best.

20-25% is, in my opinion, a significant difference that actually begins to matter. With the current cards, it comes down to which games you like. It's best to judge on a game-by-game basis.

Just to make it clear:
I would recommend the 7970 GE because it is a tad faster and cheaper, especially with the good game bundle. But I don't like people who blow things out of proportion when the numbers don't lie.

15% of 80fps is only 12fps. If you can really see a 12fps difference when you're over 60fps to begin with, you have the eyes of Superman. I can tell the difference between 60 and 80fps to a degree in some games like Battlefield, but that's a 20fps difference.

Neither card is looking at a 25fps slideshow in any game I know of at that resolution. Even so, 15% more than 30fps is only 34.5fps; both would be crap, and you'd need to rethink your settings anyway. I don't think 15% is that much when you look at the numbers.
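
Just to spell that out, here's a quick Python sketch with the same numbers (purely illustrative): the same relative gap translates into very different absolute fps depending on the baseline.

Code:
def fps_gain(base_fps, pct_faster):
    """Absolute fps gained when one card is pct_faster percent above base_fps."""
    return base_fps * pct_faster / 100.0

print(fps_gain(80, 15))  # 12.0 -> a 15% gap on an 80fps baseline is 12fps
print(fps_gain(30, 15))  # 4.5  -> a 15% gap on a 30fps baseline is only 4.5fps (30 -> 34.5fps)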

[H]'s review shows the HD 7970 GHz being able to run FC3 with HDAO at 1600p while staying close to 40 fps, which the GTX 680 cannot. The OP's resolution is 1440p, so the 1600p performance is relevant. HDAO provides the best image quality in Far Cry 3.

There are many games where the gap is 15-20% or more at 1440p: BF3, MOH Warfighter, Sleeping Dogs, Skyrim, Witcher 2, Alan Wake, Metro 2033. You just need to look at the reviews before saying it occurs in only one game.

HDAO is not optimized for Nvidia hardware in that title; it's like tessellation was for AMD in Crysis 2 before. This has been known; supposedly the new 310.70 drivers help some, but they don't specify whether HDAO performance improves or not. Either way, the difference is relatively minor considering the performance jump. I wouldn't feel bad if I had to shut off that particular setting. If I had to turn down shadows and the like, then I'd be upset. Luckily I don't have to shut off any setting with SLI. Probably sometime in the future I'll be looking at that situation.

At 1440p, a single card can't do 60 FPS in some games at max settings.

It really depends on the game you're looking at. The OP mentioned nothing really GPU-intensive, which implies to me that a GHz Edition 7970 is a bit too much for what he is looking to accomplish.

I'd be looking at the 670 or 7950 instead, maybe even a 7870 with a bit of an overclock. No specific games were mentioned, though.

This is about single cards, not SLI/CF.

It has been mentioned that single cards can have microstutter to some degree too. Depending on the game and resolution (2560x1600 in this case), you can see significant fps drops and poor gameplay.
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
15% of 80fps is only 12fps. If you can really see a 12fps difference when you're over 60fps to begin with, you have the eyes of Superman. I can tell the difference between 60 and 80fps to a degree in some games like Battlefield, but that's a 20fps difference.

15% is just the average; you hardly ever get that exact difference in an actual game. In some games there is no difference, but in others the difference is massive. Funny how nV fanboys were ecstatic about the GTX 580, where the average performance difference over its predecessor was similar, and everyone had to upgrade. Not to mention the almost nonexistent difference after OC.
[Benchmark charts: Sniper Elite V2, Sleeping Dogs, and Alan Wake at 2560x1600]


In the rest of the games they may be similar, but in these the difference will be very hard NOT to notice. I bet that in the future the performance gap will only get bigger and bigger.
In games where the GTX 680 is faster, the difference is really unnoticeable:
[Benchmark chart: Borderlands 2 at 2560x1600]

Well under 10%, and anything under 10% should not be noticeable. OC both cards and that difference will melt away, because the 7970 overclocks better.

I don't even know why people are arguing that he should buy a card that is BOTH slower and more expensive, just because THEY are biased. Shove your bias up your body part and keep it there.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
You guys keep mentioning the 580. WHY? What relevance does it have? I do not remember people scrambling to buy the 580 if they had a 480.

Until you get close to 60fps it's not enough, IMO. I still stand by what I said: you wanna keep claiming "15% faster" like it's some magical jump, and I'll keep pointing out that 15% is not a lot of FPS difference. Now, looking at the graphs above, going from 41 to 58fps is a difference of almost 30%, which is a lot different from what you guys are constantly claiming.

Also, I wonder where they got 4x AA in Sleeping Dogs from, because that's not how the game's settings work. Anyone who has the game can attest that the options are off, low, medium, and high.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
For those who can't count: the 7970 GHz is 42%, 28%, and 21% faster, and 9% slower, in those games. Yeah, really no difference in gameplay.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
You guys keep mentioning the 580. WHY? What relevance does it have? I do not remember people scrambling to buy the 580 if they had a 480.

Until you get close to 60fps it's not enough, IMO. I still stand by what I said: you wanna keep claiming "15% faster" like it's some magical jump, and I'll keep pointing out that 15% is not a lot of FPS difference. Now, looking at the graphs above, going from 41 to 58fps is a difference of almost 30%, which is a lot different from what you guys are constantly claiming.

Also, I wonder where they got 4x AA in Sleeping Dogs from, because that's not how the game's settings work. Anyone who has the game can attest that the options are off, low, medium, and high.

Great math, bro; it's well over 41%. If you can't do basic math, please refrain from further discussion.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Great math, bro; it's well over 41%. If you can't do basic math, please refrain from further discussion.

So? You kept saying 15%, 15%, 15% faster before.

30% of 58 is 17.4, so 58 - 17.4 = 40.6.

So the 680 is 30% slower, no?

It's ~70% of the performance: 41 / 58 x 100 ≈ 70.69.

The math you're doing is backwards. You're taking 41fps and adding 40% of 41, which is 17.4. I did it by calculating what percentage of 58 is 41, and it's about 70%. That means there's 30% unaccounted for, which means 30% more performance brings you to the level of the faster card.

When talking percentages, you start with 100%.
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
So? You kept saying 15%, 15%, 15% faster before.

30% of 58 is 17.4, so 58 - 17.4 = 40.6.

So the 680 is 30% slower, no?

It's ~70% of the performance: 41 / 58 x 100 ≈ 70.69.

The math you're doing is backwards. You're taking 41fps and adding 40% of 41, which is 17.4. I did it by calculating what percentage of 58 is 41, and it's about 70%. That means there's 30% unaccounted for.

OMG... I'm not talking to you until you take at least a basic course in maths. If you have two numbers like 58 and 41, then 58 is 41% more than 41, and 41 is 30% less than 58. OMG. GET BACK TO SCHOOL, KID.
 
Last edited:

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
Both of you are correct in your math.

If 58 is the base, then 41 is a 30% decrease in performance from 58.
If 41 is the base, then 58 is a 41% increase in performance from 41.

Since what we are talking about is an "increase in performance," 41% is the correct math. :)
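
Spelled out as a quick Python sketch (using the 58fps and 41fps figures from the charts above), the percentage depends entirely on which card you treat as the base:

Code:
def pct_change(new_fps, base_fps):
    """Percentage change of new_fps relative to base_fps."""
    return (new_fps / base_fps - 1.0) * 100.0

print(pct_change(58, 41))  # ~ +41.5 -> 58fps is about 41% faster than 41fps
print(pct_change(41, 58))  # ~ -29.3 -> 41fps is about 30% slower than 58fps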
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Both of you are correct in your math.

If 58 is the base, then 41 is a 30% decrease in performance from 58.
If 41 is the base, then 58 is a 41% increase in performance from 41.

Since what we are talking about is an "increase in performance," 41% is the correct math.

But it's getting 70% of the performance, which is the number I was really figuring out.

Anyway, a better way for me to think about it is to use 60fps as the baseline: how much more performance is required to hit 60fps? That's my baseline standard. Under 60 is no good unless it's real close.

Also, don't use all caps and don't call people "kid" just because you have an agenda to push here; that's ridiculously immature.
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Both of you are correct in your math.

If 58 is the base, then 41 is a 30% decrease in performance from 58.
If 41 is the base, then 58 is a 41% increase in performance from 41.

Since what we are talking about is an "increase in performance," 41% is the correct math. :)

No, he's not correct; in fact, he couldn't be more wrong. I used the word "faster," not "slower." He's a complete math illiterate.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
LOL, I just figured out that 41 is 70% of 58. Where's the 40%? We're talking about the fps number here.

Reverse your calculation, Id.... 58/41, how much is that? You just showed that you can use a calculator, so calculate it yourself.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I'm done derailing the thread, but enjoy talking to yourself and upping your post count. You know, the edit button works fine.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
I'm done derailing the thread, but enjoy talking to yourself and upping your post count. You know, the edit button works fine.

Yeah, I expected nothing else from you; it takes balls to admit that you don't know 4th-grade math.
TechPowerUp should implement a rating algorithm like this one:
http://www.computerbase.de/artikel/grafikkarten/2012/test-vtx3d-hd-7870-black-tahiti-le/3/

That would partly take care of the math illiterate; other than that, it's just convenient.
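
Something along these lines, I imagine; a rough Python sketch of that kind of overall rating, assuming it amounts to normalizing each game against a reference card and taking a geometric mean (ComputerBase's exact method may differ, and the fps numbers below are made up apart from the 58 vs. 41 pair from earlier):

Code:
from math import prod

def performance_rating(card_fps, reference_fps):
    """Geometric mean of per-game fps ratios against a reference card.
    A result of 1.15 reads as 'about 15% faster overall'."""
    ratios = [card_fps[game] / reference_fps[game] for game in reference_fps]
    return prod(ratios) ** (1.0 / len(ratios))

# Made-up example numbers (only the 58 vs. 41 pair appears earlier in the thread):
gtx_680 = {"Sniper Elite V2": 41, "Sleeping Dogs": 45, "Borderlands 2": 75}
hd_7970_ghz = {"Sniper Elite V2": 58, "Sleeping Dogs": 58, "Borderlands 2": 69}

print(performance_rating(hd_7970_ghz, gtx_680))  # ~1.19 -> roughly 19% faster overall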




If you can't make your point without attacking members here, don't post.


esquared
Anandtech Forum Director
 
Last edited by a moderator:

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,230
2
0
Lol, it's amazing the arguments people come up with to justify their bias.

Since when does anyone measure cards by how much "slower" they are?
What a lame attempt at making Nvidia look better... I look forward to seeing the same kind of math when the situation is reversed.
 

Eureka

Diamond Member
Sep 6, 2005
3,822
1
81
Someone on here can't do math.

Anyway, I think everyone missed the point, kind of. First off, the OP said he's not playing GPU-intensive games. Why is he paying through the nose for a top card? Has no one in here recommended a 7950/670, or even a 7870/660? All of those cards will drive games at 1440p, and if he's not using the full power of a top card, why bother?

Secondly, even if it were down to these choices, if the games aren't top-end it doesn't really matter which card he gets. I believe the 7970 would be much more future-proof than the 680, with its larger VRAM and wider memory bus.