Look at the individual charts to confirm what I said rather than their misleading averaged figures.
When did anyone conclude that 384-bit bus cards are automatically faster than 256-bit bus cards? That in itself is an erroneous statement made in this thread. We know, for instance, that the GTX680 is faster than the 384-bit bus HD7950.
Yet, the 770 beat (or equaled) the 7970GE in 4 games vs. 2 @ 2560x1440 and in 5 games vs. 1 @ 5760x1200. After this I will never rely on averaged game performance for cards, including TPU's, but will instead look at each individual game's performance to arrive at what I think is a clearer view of a card's overall performance.
Did you actually analyze the significance of those "wins"?
770 vs. 7970GE - 2560x1440
BF3 = +1 fps
Bioshock = +0.6 fps
Crysis 3 = +0.8 fps
Far Cry 3 = +4.5 fps
GRID 2 = -14.7 fps
Hitman Absolution = -7.2 fps
770 vs. 7970GE - 3 monitors
BF3 = tie
Bioshock = +0.5 fps
Crysis 3 = +1.1 fps
Far Cry 3 = +1.0 fps
GRID 2 = +4.2 fps
Hitman Absolution = -4.1 fps
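To make the small-wins-vs.-large-losses point concrete, here is a quick sketch (in Python, using the 2560x1440 deltas listed above) showing how the 770 can "win" more games while still coming out behind on a simple average of the per-game fps differences:

```python
# Per-game fps deltas (770 minus 7970GE) at 2560x1440, from the list above
deltas = {
    "BF3": 1.0,
    "Bioshock": 0.6,
    "Crysis 3": 0.8,
    "Far Cry 3": 4.5,
    "GRID 2": -14.7,
    "Hitman Absolution": -7.2,
}

wins = sum(1 for d in deltas.values() if d > 0)
losses = sum(1 for d in deltas.values() if d < 0)
avg = sum(deltas.values()) / len(deltas)

print(f"770 wins {wins} games, loses {losses}")  # 4 wins, 2 losses
print(f"average delta: {avg:+.2f} fps")          # -2.50 fps overall
```

The four wins total +6.9 fps while the two losses total -21.9 fps, so the average lands at -2.5 fps in the 7970GE's favor, which is exactly why counting wins without weighing their size is misleading.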
The wins the 770 has don't mean anything because they are so minor. When the 7970GE won, it won by much larger margins, which makes it the more consistent card in those games. At triple-monitor resolution, both cards are basically tied. You keep ignoring the $100-150 price difference, which makes no sense when the 770 cannot provide a higher level of real-world playability in these titles.
If you want to ignore averages in GPU reviews on the net, that's up to you. We can't possibly know what games Gamer ABCD plays. We always use averages for that reason. If all you do is play games where NV is faster (i.e., Secret World, Lost Planet 2, Assassin's Creed IV), by all means get an NV card. That is how GPU buying has always worked for each individual consumer. The whole point of looking at averages is to include as many games as possible as it gives a better representation of the card's overall performance in different games.
Also, you say that the GTX770 beat the HD7970GE, but you completely ignored that the wins are small. If you choose to exclude GRID 2, then why shouldn't we exclude Far Cry 3? It doesn't work that way. With the 770 you pay $100-150 more for a card that merely ties or, per TPU/ComputerBase, loses to the 7970GE at high-resolution gaming.
Finally, anyone who is dead serious about triple-monitor gaming understands that a single 770/7970GE isn't enough for these titles to begin with.
There is also TechReport, which found that the 7970GE actually beat the 770 in frame-rate smoothness:
"Meanwhile, the GTX 770 is in a tougher spot. When it was introduced a couple of weeks ago, its $399.99 price tag undercut the Radeon HD 7970 GHz Edition. The price advantage was especially welcome since the 7970 GHz is apparently still the faster card."
We have then at least 4 reviews (TPU, ComputerBase, Alt+Esc, Tech Report) that all universally showed that 770 cannot beat 7970GE at high resolution gaming on average despite an unjustifiable price increase. If you want to ignore those reviews, it's totally up to you. No matter how you slice it, 770 is hugely overpriced right now and nearly every recent review site confirms this fact.
Comparison of 256-bit vs. 384-bit is really irrelevant since gamers look at features, price and performance. Specs on paper are meaningless in this case.
Anyways, amenx was talking about memory size and memory bus width, which are not directly related to core clock speed. Even if overclocking the 7970 gives it more of an advantage, it is not the memory bus or memory size that grants this advantage, which is what I think amenx's point was.
Ok but this comparison came out of nowhere. Who has claimed in this thread or otherwise that a 384-bit bus card needs to be faster than a 256-bit card? GTX680 vs. 7950 already proved otherwise.
If you have a car with a twin-turbo 3.8L 6-cylinder engine vs. a car with a 6.2L naturally aspirated V8, both approaches can produce a very fast car. We already know you can't directly compare NV and AMD GPUs based on memory bandwidth. Not sure why this is even mentioned while the terrible price/performance of the 770 is completely ignored? Sounds like a strawman to me. How AMD or NV derive their performance makes no difference to me as a gamer. If NV releases a Volta card with 2560 CUDA cores that beats AMD's card with 6000 shaders, I couldn't care less. I don't buy CUDA cores or shaders --> I buy performance.