Why would I do that? That's silly. Bias?
10%-15% IS a small difference. Think about it: what is that, really? It certainly doesn't mean the difference between playable and unplayable. 99% of people wouldn't even notice 10%-15%. And just a reminder, we are still talking about the average.
GTX680 is 9% faster than GTX670 at TPU, and 7970 GE is 10.5% faster than GTX680 at TPU (both at 1600p). So basically the distance between those pairs is the same. People always recommend the 670 over the 680 because the performance delta is so small. But when it comes to AMD vs Nvidia, the same performance delta suddenly is not so small? That is bias at its best.
20-25%, in my opinion, is a significant difference that actually begins to matter. With the current cards, it comes down to what games you like; it's best to judge on a game-by-game basis.
Just to make it clear:
I would recommend the 7970 GE because it is a tad faster and cheaper, especially with the good game bundle. But I don't like it when people blow things out of proportion; the numbers don't lie.
I'm using a GTX 680 2GB myself on a Dell U2713HM, always maxing everything in every game.
Funny...not!
In my country (Poland), $450 gets you a 7950 at best...
I would go for AMD
[H]'s review shows the HD 7970 GHz Edition running FC3 with HDAO at 1600p while staying close to 40 fps, which the GTX 680 cannot. The OP's resolution is 1440p, so the 1600p performance is relevant. HDAO provides the best image quality in Far Cry 3.
There are many games where the gap is 15-20% or more at 1440p: BF3, MoH Warfighter, Sleeping Dogs, Skyrim, The Witcher 2, Alan Wake, Metro 2033. You just need to look at the reviews before claiming it occurs in only one game.
At 1440p, a single card can't hit 60 fps in some games on max settings.
This is about single cards, not SLI/CF.
15% of 80 fps is only 12 fps. If you can really see a 12 fps difference when you're already over 60 fps, you have the eyes of Superman. I can tell the difference between 60 and 80 fps to a degree in some games like Battlefield, but that's a 20 fps difference.
You guys keep mentioning the 580. Why? What relevance does it have? I don't remember people scrambling to buy a 580 if they already had a 480.
Until you get close to 60 fps, it's not enough, IMO. I still stand by what I said: you want to keep claiming "15% faster" like it's some magical jump, and I'll keep pointing out that 15% is not a big fps difference. Now, looking at the graphs above, going from 41 to 58 fps is a difference of almost 30%, which is a lot different from what you guys are constantly claiming.
Also, I wonder where they got 4x AA in Sleeping Dogs from, because that's not how the game's settings work. Anyone who owns the game can attest that the options are Off, Low, Medium, High.
Great math, bro; it's over 41%. If you can't do basic math, please refrain from further discussion.
So? you kept saying 15%, 15%, 15% faster before.
You know what the concept of an average is? If you add several numbers together and then divide by how many numbers there are, you get an average.
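For what it's worth, the averaging described above is trivial to sketch; the per-game deltas here are made up for illustration, not taken from any review:

```python
# Averaging a handful of hypothetical per-game performance deltas (percent).
# These numbers are invented examples, not actual benchmark results.
deltas = [9, 21, 15, 12, 18]

average = sum(deltas) / len(deltas)
print(average)  # 15.0
```

A 15% average can hide per-game gaps anywhere from single digits to 20%+, which is exactly why judging game by game matters.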
30% of 58 is 17.4, so 58 - 17.4 = 40.6.
So the 680 is 30% slower, no?
It's ~70% of the performance: 41 / 58 × 100 ≈ 70.7.
The math you're doing is backwards. You're taking 41 fps and adding 40% of 41, which is 17.4. I did it by calculating what percentage of 58 is 41, and it's about 70%. That means there's 30% unaccounted for.
OMG... I'm not talking to you until you take at least a basic course in math.
Both of you are correct in your math.
If 58 is the base, then 41 is a 30% decrease in performance from 58.
If 41 is the base, then 58 is a 41% increase in performance from 41.
Since what we are talking about is an "increase in performance," 41% is the correct figure.
LOL, I just figured out that 41 is 70% of 58. So where's the 40%? We're talking about the fps number here.
I'm done derailing the thread but enjoy talking to yourself and upping your post count. You know the edit button works fine.
Thanks. I didn't think my comments on this forum were all that valuable; I'm more of a reader than a writer.
You've got over 9k posts. LOL