[Hardcorp] GALAXY GTX 660 Ti GC OC vs. OC GTX 670 & HD 7950

Page 13 - AnandTech Forums

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Nobody cares in large volumes yet. GCN architecture itself is compute heavy. This was explained in great detail in AnandTech's review of Tahiti XT and why AMD went that route. I said it's a risk but I am not a strategy officer or a GPU designer for AMD or NV graphics. I am just telling you what they are telling us. Read the Surround Computing presentation at TechPowerup. It's not my strategy, it's AMD's strategy and HSA has been discussed for a while now; and compute/GPU processing is a key factor in making it work. HSA may or may not allow AMD to hang in there with their APUs against the otherwise much superior Intel CPUs.

You're telling me AMD's PR nonsense. GK104 and the lower chips are compute heavy, too. They are better than everything AMD brought to the market before GCN. The difference is that AMD did a GF100/GF110: a very inefficient way of designing one GPU for every market. Even nVidia went away from this.

Currently, compute is already used in 3 games, and in all three GK104 goes 0/3 against the 7970. Whether a lot more games will use DirectCompute shaders I don't know. If they do, GK104 is basically scrap. The guys at NV are very intelligent and I bet they know there is a trend towards GPU computing in the industry. If more games use DirectCompute shaders for graphical effects, you can bet NV will address it.

Wow - 3 games? THREE games? You must be kidding me. Ignoring Dirt 2 and 3? Max Payne 3? Batman AC? Anno 2070? Lost Planet 2? Dragon Age 2? Crysis 2? Battlefield 3? Metro 2033? Secret World?
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Wow - 3 games? THREE games? You must be kidding me. Ignoring Dirt 2 and 3? Max Payne 3? Batman AC? Anno 2070? Lost Planet 2? Dragon Age 2? Crysis 2? Battlefield 3? Metro 2033? Secret World?

Hey sontin, DirectCompute in BF3, Civ V and Dirt 3 does not count!

Only Sniper Elite v2, Sleeping Dogs and Dirt Showdown matter.
Because AMD optimized those games superbly. They really tweaked them like there's no tomorrow.

Sleeping Dogs

[benchmark chart]


Sniper Elite v2

[benchmark chart]



Unlike Nvidia and Crytek, cheating and tessellation brute-forcing their way to an fps victory in

Crysis 2

[benchmark chart]


:hmm:
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Wow - 3 games? THREE games? You must be kidding me. Ignoring Dirt 2 and 3? Max Payne 3? Batman AC? Anno 2070? Lost Planet 2? Dragon Age 2? Crysis 2? Battlefield 3? Metro 2033? Secret World?

Metro 2033 uses compute shaders for DOF just as Sniper Elite V2 does. Bandwidth is crucial for compute shader performance. The GTX 680 gets hammered in these games, especially with DOF and MSAA or DOF and AAA. The GTX 580 is much faster than the HD 6970 in Metro 2033, which is a TWIMTBP game.

Metro 2033 DOF with AAA - HD 7970 GHz is 17.5% faster than GTX 680
http://www.guru3d.com/article/radeon-hd-7970-ghz-edition-review/16

Metro 2033 DOF with MSAA - HD 7970 GHz is 25% faster than GTX 680
http://www.legitreviews.com/article/1979/8/
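For reference, those "X% faster" figures are just the fps ratio between the two cards. A quick sketch of the arithmetic (the fps numbers below are hypothetical, not taken from the linked reviews):

```python
def pct_faster(fps_a: float, fps_b: float) -> float:
    """Return how much faster card A is than card B, in percent."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical example: 47 fps vs 40 fps works out to 17.5% faster.
print(round(pct_faster(47.0, 40.0), 1))
```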

Anno 2070 runs faster on HD 7900 cards. An HD 7950 OC will be faster than a GTX 670 OC, because an HD 7950 (800 MHz) already matches a GTX 670 boosting to 1 GHz.

http://www.hardware.fr/articles/869-8/benchmark-anno-2070.html

http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/23375-test-nvidia-geforce-gtx-660-ti.html?start=10

Compute shaders are the way forward with games looking to emulate the photorealism of movies. Complex lighting calculations are done using compute shaders in Dirt Showdown.

http://www.rwlabs.com/article.php?cat=articles&id=636&pagenumber=2
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Metro 2033 uses compute shaders for DOF just as Sniper Elite V2 does. Bandwidth is crucial for compute shader performance. The GTX 680 gets hammered in these games, especially with DOF and MSAA or DOF and AAA. The GTX 580 is much faster than the HD 6970 in Metro 2033, which is a TWIMTBP game.

Metro 2033 DOF with AAA - HD 7970 GHz is 17.5% faster than GTX 680
http://www.guru3d.com/article/radeon-hd-7970-ghz-edition-review/16

Metro 2033 DOF with MSAA - HD 7970 GHz is 25% faster than GTX 680
http://www.legitreviews.com/article/1979/8/

25% is less than in Dirt:Showdown and Sniper. Hm.

Batman AC with 8x MSAA is running faster on HD 7970 cards because of better bandwidth. Anno 2070 runs faster on HD 7970 cards.

http://www.hardware.fr/articles/869-8/benchmark-anno-2070.html

http://www.hardwareluxx.de/index.ph...ten/23375-test-nvidia-geforce-gtx-660-ti.html
What does MSAA have to do with DirectCompute? UE3 is a deferred rendering engine. MSAA will always need bandwidth in such engines; look at some scenes in Max Payne 3.
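A rough back-of-envelope on why MSAA is bandwidth-hungry in a deferred engine: every G-buffer target has to store one value per sample, so 4x MSAA roughly quadruples the G-buffer footprint. The four 32-bit render targets below are an assumed layout for illustration, not UE3's actual one:

```python
# Back-of-envelope G-buffer size at 1920x1080, with and without 4x MSAA.
# Assumption: four 32-bit (RGBA8-equivalent) render targets in the G-buffer.
width, height = 1920, 1080
bytes_per_sample = 4 * 4  # four targets x 4 bytes each
msaa = 4

no_msaa_mb = width * height * bytes_per_sample / 2**20
msaa_mb = width * height * bytes_per_sample * msaa / 2**20

print(f"G-buffer without MSAA: {no_msaa_mb:.0f} MiB")
print(f"G-buffer with 4x MSAA: {msaa_mb:.0f} MiB")
```

Every pass that reads or writes those targets pays that multiplied cost, which is why the wider-bus cards pull ahead once MSAA is enabled.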

The GTX 680 is on par with or slightly slower than the 7970 in Anno 2070. That's normal for a chip which is 19% smaller and has 19% less compute performance. Btw: the GTX 580 is only 10% faster than the 6970 in Anno 2070 at 1080p.

Compute shaders are the way forward with games looking to emulate the photorealism of movies. Complex lighting calculations are done using compute shaders in Dirt Showdown.

http://www.rwlabs.com/article.php?cat=articles&id=636&pagenumber=2
"Complex lighting calculations"? You mean like in every game with a deferred rendering engine? D:
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
25% is less than in Dirt:Showdown and Sniper. Hm.

25-30% is the margin by which the HD 7970 GHz leads the GTX 680, and the HD 7970 OC leads the GTX 680 OC, in Sniper Elite V2.

The GTX 680 is on par with or slightly slower than the 7970 in Anno 2070. That's normal for a chip which is 19% smaller and has 19% less compute performance. Btw: the GTX 580 is only 10% faster than the 6970 in Anno 2070 at 1080p.
I mentioned Anno 2070 because you asked why it is ignored in favor of the three games (Dirt Showdown, Sniper Elite V2, Sleeping Dogs). Anno 2070 is one of the most demanding DX11 games. At its highest settings the HD 7970 GHz is significantly faster than the GTX 680, even at 1080p.


"Complex lighting calculations"? You mean like in every game with a deferred rendering engine? D:
Dirt Showdown uses a new rendering technique called Forward+, which combines the advantages of both forward and deferred rendering while avoiding the drawbacks.

http://blogs.amd.com/play/2012/06/20/dirt-showdown-on-gcn/
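Forward+ in a nutshell: the screen is split into tiles, a compute pass builds a per-tile list of the lights that can affect that tile, and the forward shading pass then evaluates only those lights per pixel. A toy CPU-side sketch of the culling idea (tile size, light representation, and numbers are all illustrative; on the GPU this runs as a compute shader, one thread group per tile):

```python
# Toy sketch of Forward+ tiled light culling. Lights are modeled as
# screen-space circles (center + radius); real implementations cull in 3D
# against a per-tile frustum. All values here are made up for illustration.
TILE = 16  # 16x16 pixel tiles, a common choice

def lights_for_tile(tile_rect, lights):
    """Keep only the lights whose screen-space circle touches the tile."""
    x0, y0, x1, y1 = tile_rect
    visible = []
    for cx, cy, radius in lights:
        nearest_x = min(max(cx, x0), x1)  # closest point of the tile to the light
        nearest_y = min(max(cy, y0), y1)
        if (cx - nearest_x) ** 2 + (cy - nearest_y) ** 2 <= radius ** 2:
            visible.append((cx, cy, radius))
    return visible

lights = [(10, 10, 8), (100, 100, 5)]
print(lights_for_tile((0, 0, TILE, TILE), lights))  # only the first light overlaps
```

The payoff is that a scene with hundreds of lights only pays per pixel for the handful of lights in that pixel's tile, while keeping forward rendering's MSAA and material flexibility.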

So it remains to be seen how well AMD evangelizes this technology. As far as compute shaders are concerned, irrespective of whether it's a forward, deferred, or Forward+ renderer, they are going to be more widely used as game developers look to match the photorealistic quality of movies.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I like the idea of Forward+ rendering, and AMD trying to evangelize it may force nVidia to improve through software and hardware. Competition doesn't just provide more dollar value but also innovation that may improve performance, immersion, fidelity, and the overall gaming experience, too.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
HIS IceQ Radeon HD 7870 GHz Edition = $199.99 on Newegg. Wow! $100 less than the 660Ti. This card will be selling out every day at this price.

Even nVidia went away from this.

We don't know that yet. This generation may have been an exception due to 28nm wafer constraints and yield issues. It would have been very expensive to manufacture a 600mm^2 die for the consumer space and sell it at a reasonable price, I imagine. The GTX 690 is $1k, and that's 294mm^2 x 2 of 28nm silicon. You get the picture that NV is probably waiting until things improve, which would make GK110 viable. The other possibility is that they will sell the full 15-SMX GK110 parts starting in Q4 2012, build up a supply of salvage parts for 4 months, and then launch those chips in the consumer market by Spring 2013.

Wow - 3 games? THREE games? You must be kidding me. Ignoring Dirt 2 and 3? Max Payne 3? Batman AC? Anno 2070? Lost Planet 2? Dragon Age 2? Crysis 2? Battlefield 3? Metro 2033? Secret World?

Those games don't use DirectCompute as far as I am aware, so I'm not sure why you brought them up. We were just saying that thus far NV is behind in new games that use compute shaders for lighting effects, and NV has not been able to correct this. Unlike in BF3, the performance delta between the 7970 GE and the 680 in those games is huge.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
I mentioned Anno 2070 because you asked why it is ignored in favor of the three games (Dirt Showdown, Sniper Elite V2, Sleeping Dogs). Anno 2070 is one of the most demanding DX11 games. At its highest settings the HD 7970 GHz is significantly faster than the GTX 680, even at 1080p.

Is there a reason why you are using the 7970 GHz edition and not the normal one?
I hope that this card is faster, because it needs 41% more power for 16% more performance in Anno 2070: http://www.hardware.fr/articles/873-7/consommation-performances-watt.html

The normal version is 4% faster. That's nothing for a "compute heavy" architecture.
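Plugging those quoted deltas in: relative perf/watt is just the performance ratio divided by the power ratio, so the claim above works out like this (using only the 16% / 41% figures from this post):

```python
# Relative perf/watt of the 7970 GHz vs. the regular 7970 in Anno 2070,
# using the 16% performance / 41% power deltas quoted above.
perf_ratio = 1.16    # +16% performance
power_ratio = 1.41   # +41% power draw
perf_per_watt = perf_ratio / power_ratio
print(f"{perf_per_watt:.2f}x")  # ~0.82x, i.e. worse perf/watt than the regular card
```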

Dirt Showdown uses a new rendering technique called Forward+ which allows the advantages of both forward and deferred rendering while avoiding the drawbacks.

http://blogs.amd.com/play/2012/06/20/dirt-showdown-on-gcn/
Wow - someone is repeating AMD's marketing.
Forward+ is a useless feature for every nVidia user and every AMD user with less than 2.5 TFLOP/s of SP compute performance. Max Payne 3 is playable at more than 50 FPS on a GTX 560 Ti. In Dirt: Showdown, GTX 560 Ti users get less than 30 FPS without AA and AF: http://www.computerbase.de/artikel/grafikkarten/2012/test-amd-radeon-hd-7950-mit-925-mhz/26/

Yes, that's "avoiding the drawbacks".

So it remains to be seen how well AMD evangelizes this technology. As far as compute shaders is concerned irrespective of whether its a forward , deferred or forward+ renderer these are going to be more widely used as game developers look further to match the photorealistic quality of movies.
And what does "match the photorealistic quality of movies" have to do with this? Sniper Elite V2 looks like shit and runs like shit on Kepler. How is DirectCompute used to make this game look "photorealistic"? Btw, the DoF in the benchmark is hilarious and stupid at the same time.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I hope that this card is faster, because it needs 41% more power for 16% more performance in Anno 2070: http://www.hardware.fr/articles/873-7/consommation-performances-watt.html

The HD 7970 is 8% faster than a stock 680 and the GE is 19% faster. No point in using an overclocked 680 in your calculations, since the GE card can overclock to 1150-1250 MHz and you can buy a quiet after-market 1000 MHz HD 7970 for $420. Also, another way to look at it: the 7950 V2 is just 3 fps slower than a GTX 680 for $180 less.

[benchmark chart]


When you move to 2560x1600, that lead grows to 23-26% for the 7970 GE.

As has been stated, using the power consumption numbers for the reference 7970 / GE cards is not very useful, since most people on our forum buying a 7970 who intend to overclock and want a quiet card will get after-market versions. This Visiontek 7970 GE uses basically the same power as a 680 and costs $430. You can buy the Sapphire Dual-X for $428, the Gigabyte Windforce 3x 1GHz for $420, or the Sapphire Vapor-X for $450.

When a person is buying a top-of-the-line card and putting it in a modern rig, 30-40W of power is nothing if your card is giving up 19-26% performance the minute you step outside the popular BF3/WOW titles. Not everyone just plays BF3/WOW. If you only play World of Planes, The Secret World, BF3, WOW, or Lost Planet 2, then this may not matter to you.

What if a gamer wants to play something new? It's the same story with NV: the minute you step outside the most popular games, the GTX 680 just can't compete given its $500 price. Then there is the funny fact that the 7970 GE is faster in Crysis 1 / Metro 2033 / Witcher 2, which still happen to be among the top 5 best-looking games on the PC. The GTX 680 also lost in what seems to be 90% of recent releases: Guild Wars 2, Darksiders II, Dirt Showdown, Sniper Elite V2, and Sleeping Dogs, with The Secret World the one it won. It's not that the GTX 680 is a bad card, it's just terribly overpriced right now for the performance it offers. At $400-430 it would be a viable purchase.

[Firefall screenshot]

[Firefall 2560x1600 benchmark chart]


For a card that costs $500+ and can't beat a $420 card, one has to wonder what the premium is for. Just 3 fps faster than an HD 7950 860 MHz FLEX? Some people are not fine with buying a $500 videocard, playing a lesser-known game, and seeing it perform barely faster than a $320 competitor. Even in the most popular titles, NV has already lost the lead in Dirt 3, Batman AC, and Skyrim. There aren't many games where the 680 is winning much, and when it wins, it's by the slimmest of margins, not the 20-30% the 7970 GE leads by in the titles it plays well.

==================================

NV's pricing premium is very hard to justify right now.

HD7750/7770 = no competing 28nm card from NV for 9 months now.
HD7850 2GB = $200 => NV has no competing card. This card made the GTX 560 Ti 448, GTX 570, and GTX 580 irrelevant for people who follow videocards.
HD7870 2GB = as low as $200 to $240 vs. $300 for a 660 Ti that offers just 10% more performance.
HD7950 3GB = ~$320 versions that are faster than a 660 Ti out of the box, and OC = GTX 670 OC for $60-80 less than after-market 670s. I can see how the 670 may make sense for someone who doesn't want to OC. Also, it's a good choice for SLI.
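The price/performance argument in the list above can be made concrete. Using only the post's own numbers for the 7870-vs-660 Ti case ($240 vs. $300, ~10% more performance), a quick sketch:

```python
# Price/performance check using the figures quoted in the post above:
# HD 7870 at $240 vs. GTX 660 Ti at $300 with ~10% more performance.
price_7870, price_660ti = 240.0, 300.0
perf_7870, perf_660ti = 1.00, 1.10  # normalized performance

ppd_7870 = perf_7870 / price_7870    # performance per dollar, arbitrary units
ppd_660ti = perf_660ti / price_660ti

print(f"660 Ti premium: {price_660ti / price_7870 - 1:.0%} for {perf_660ti - 1:.0%} more performance")
print(f"7870 offers {ppd_7870 / ppd_660ti - 1:.0%} more performance per dollar")
```

In other words, a 25% price premium buying 10% more performance leaves the cheaper card ahead on performance per dollar.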
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
The HD 7970 is 8% faster than a stock 680 and the GE is 19% faster. No point in using an overclocked 680 in your calculations, since the GE card can overclock to 1150-1250 MHz and you can buy a quiet after-market 1000 MHz HD 7970 for $420. Also, another way to look at it: the 7950 V2 is just 3 fps slower than a GTX 680 for $180 less.

Also, when you move to 2560x1600, that lead grows to 23-26% for the 7970 GE.

It's 4%. Every GTX 680 goes up to 1124 MHz.
And 23% with FXAA is nothing if you need 41% more power. It's the same problem nVidia had with GF100/GF110. The only difference was that a GTX 580 was as fast as a 6970 and used 25% more power.

When a person is buying a top-of-the-line card and putting it in a modern rig, 30-40W of power is nothing if your card is giving up 19-26% performance the minute you step outside the popular BF3/WOW titles. Not everyone just plays BF3/WOW.
And you're ignoring that people can buy custom GTX 670 cards which make more sense than a GTX 680, like the Gigabyte GTX 670: a little bit faster than a GTX 680 with the same PCB, less power, and huge OC potential without skyrocketing the power consumption.

Yep, and what about all the people who want to play Firefall at 1080p?
[Firefall 1920x1080 benchmark chart]


How many are playing Sleeping Dogs and Sniper?

I'm playing Sleeping Dogs, and it's running great on Kepler. The problem with Sleeping Dogs is that the engine needs >55 FPS to be stutter-free. That makes OGSSAA a useless feature for everyone. And without it there is no real difference between a GTX 680 and a 7970, even at 1440p.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Naming games that nobody cares about and claiming "see, Nvidia sucks in games that aren't popular" means zero. What matters is what people are playing. I'm sorry that BF3 is popular, I'm sorry that people still enjoy Skyrim (although I think most people are burned out on it), I'm sorry that WoW has millions of people still playing, I'm sorry Sniper Elite didn't compete with BF3 on popularity, and I'm sorry that Dirt Showdown isn't outselling the other titles released around the same time.

I'm sorry that very few people have heard of Firefall or are following it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
How many are playing Sleeping Dogs and Sniper?

I'd rather play Sleeping Dogs than Lost Planet 2, Hawx 2, The Secret World, or Guild Wars 2. I don't play MMOs. Someone else loves BF3 above all. The point is, if you look at 20-30 games, AMD is winning in a lot more of them. Further, consider the price and it's obvious NV needs to drop prices. They are losing in price/performance at all levels below $400, and in overall performance at the $430-500 level as well. The GTX 670 is about the only thing they've got.

Check out this GTX 680 vs. 7970 GE review: when you tally up the results, it's clear that when NV wins, it's by less than 10%, and when the 7970 GE wins, it's by 15-30%. Pretty much most reviews show 7970 GE > GTX 680. When the HD 7970 GE goes for $430-450, it's a no-brainer against the 680, since it's faster and only a handful of 680s can overclock as well as the GTX 680 Lightning. The Vapor-X is the flagship 7970 GE card at $450, while the MSI GTX 680 Lightning is $550 on Newegg. $100 more for what exactly? PhysX?

As for the rest of the lineup, outside of the GTX 670 NV has nothing worth buying, by virtue of being MIA for three quarters now without any 28nm next-gen parts on the desktop. The 660 Ti is hopelessly overpriced against the 7870, loses badly to an overclocked 7950, and can't even win against after-market 7950s out of the box.

I'm sorry that very few people have heard of Firefall or are following it.

I'm not saying anyone cares about Firefall at all. I think you guys are missing the point I am making: when NV is winning, it's by the slimmest of margins; when AMD is winning, it's by 10-20%, sometimes 30%, and it happens when games are really demanding, not in WOW where a GPU is getting 100 fps. NV only leads by 10% or less in BF3 with MSAA, and yet this game is talked about non-stop. Not sure why you mentioned Skyrim, since NV loses in Dirt 3, Skyrim, and Batman AC. Skyrim with mods is a no-contest. Those are very popular games, but you guys didn't talk about them even though the 680 loses in all 3. Also, you have to take into account that you are paying a price premium to get that 10% extra in BF3. The 660 Ti looks even worse in that regard, since its price premium vs. the 7870 is 30-40%, and compared to an overclocked 7950 it has no chance.

You guys keep talking about BF3 and WOW like they are the most important games. Sure, maybe they are, but when the HD 7970 GE is winning 80% of benchmarks against the 680 across 18 games, and it costs less, the GTX 680 is overpriced. That's how it works and always has worked in the past. No one cherry-picked Crysis 1 for the GTX 480 vs. the 5870; we looked at 15-20 games when we declared the 480 the winner.

If you play BF3/WOW/The Secret World and Crysis 2 90% of the time, then sure, maybe a $400 GTX 670 makes sense for you. It's also a good option for SLI.

All it takes is 10 seconds looking at a review that covers a variety of games to see, clear as day, that the 660 Ti is overpriced.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Is there a reason why you are using the 7970 GHz edition and not the normal one?
I hope that this card is faster, because it needs 41% more power for 16% more performance in Anno 2070: http://www.hardware.fr/articles/873-7/consommation-performances-watt.html

The normal version is 4% faster. That's nothing for a "compute heavy" architecture.

Wow - someone is repeating AMD's marketing.
Forward+ is a useless feature for every nVidia user and every AMD user with less than 2.5 TFLOP/s of SP compute performance. Max Payne 3 is playable at more than 50 FPS on a GTX 560 Ti. In Dirt: Showdown, GTX 560 Ti users get less than 30 FPS without AA and AF: http://www.computerbase.de/artikel/grafikkarten/2012/test-amd-radeon-hd-7950-mit-925-mhz/26/

Yes, that's "avoiding the drawbacks".

And what does "match the photorealistic quality of movies" have to do with this? Sniper Elite V2 looks like shit and runs like shit on Kepler. How is DirectCompute used to make this game look "photorealistic"? Btw, the DoF in the benchmark is hilarious and stupid at the same time.

Even if I accept your view that Sniper Elite V2 does not look amazing, are you going to say the same about Metro 2033? GTX 600 cards suck in Metro 2033, especially with DOF and MSAA. :thumbsdown:

As for perf/watt, it can really be taken out of context if you compare a single game. Across the most demanding games, the HD 7970 wins more often, and with significant margins, which is when you need all that performance. There are lots of HD 7970 (1 GHz) cards running at 1.175V and using less power than an HD 7970 GHz, even when overclocked to 1125 MHz.

http://www.anandtech.com/show/5314/...ouble-dissipation-the-first-semicustom-7970/6

Power (W):
XFX HD 7970 (1125 MHz) - 424
GTX 480 - 425

http://www.anandtech.com/show/6025/radeon-hd-7970-ghz-edition-review-catching-up-to-gtx-680/17

HD 7970 GE (1050 MHz) - 429

If somebody chooses an HD 7970 GHz, it would be more for maximizing their chances of a 1200+ MHz overclock. The HD 7970 (1 GHz) chips are good for perf/watt.
 
Feb 19, 2009
10,457
10
76
You guys are trying to claim Sleeping Dogs isn't popular? It was hitting the top 4 in Steam sales last I checked, and it's relatively new.

One of the highest-rated games in recent times to boot, with a huge cast of voice actors/actresses.

BF3 and Skyrim are popular, no doubt. No major lead for the 680 there; heck, as resolution is increased it even loses. What's the point?

Last I checked it's also as fast in Guild Wars 2. What's the rambling about HAWX 2 and Lost Planet 2? Don't be lame, please. We're talking recent DX11 titles showcasing the huge performance lead for Tahiti vs. Kepler. It's a pretty damn good sign of things to come.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
It doesn't show anything for the future. Nobody knows what Nvidia's next one will do and whether AMD will compete well or not.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
BF3 and Skyrim are popular, no doubt. No major lead for the 680 there; heck, as resolution is increased it even loses. What's the point?

Not sure why people still talk about Skyrim as a win for NV. The 7970 GE blows the 680 away in that game: a 20% lead with 8xMSAA or SSAA at 1080p. NV hasn't had a lead in Skyrim since June 22nd. AT's review is being lazy by not including mods and keeping the cards CPU-limited at their 4xMSAA settings. That's not real-world Skyrim testing.

I am still waiting for a reasonable explanation in this thread of why someone should spend $300 on a GTX 660 Ti when it can't handle MSAA, yet costs 35-40% more than an HD 7870.

A GTX 660 Ti that is slower than a 1.5-year-old GTX 580 with MSAA, yet costs $300. That's a fail.
[Skyrim 2560x1600 benchmark chart]


Last i checked its also as fast in Guild Wars 2.

I haven't seen anyone here mention the SLI scaling issues that exist in GW2. Someone posted a link to a guy playing GW2 on 4 MSI Lightning 7970s with what looks to me like fairly decent CF scaling.

I find it interesting that we should suddenly rank The Secret World and WOW above Crysis 1 and Metro 2033, when TSW incorporates the world's ugliest use of tessellation, looks like a blur-fested mess with TXAA, and has crappy graphics to boot, while WOW is blasting along at 110-130 fps on modern GPUs. Why would I spend $500 to go from 110 fps to 140 fps in WOW? That game is going to benefit more from a 5.0 GHz Core i CPU overclock to sustain those min fps, I bet. Considering that game actually looks better with FXAA + no tessellation than with TXAA + tessellation, the performance comparison becomes 91 fps for a 925 MHz 7970 vs. 96 fps for a GTX 680. :sneaky:

Sorry, but not many people are going to spend $500 on a GTX 680 to get 5 fps more than a 925 MHz 7970 that costs $80 less.
 
Feb 19, 2009
10,457
10
76
It doesn't show anything for the future. Nobody knows what Nvidia's next one will do and whether AMD will compete well or not.

It certainly would offer more potential future-proofing for current owners; GCN is a gaming and compute beast, while Kepler is a gaming beast. If more compute is used in gaming = ??

You can stretch it all you want, but if more game engines use DX11 compute for lighting, and especially if they go with Forward+, I'm more confident in my OC 7950 performing well in future titles than if I had bought a 670. Even in heavy tessellation, as per Crysis 2, it's winning. There's no way for NV to exploit an architectural weakness in Tahiti with TWIMTBP (because it's inherently the stronger architecture, similar to the 480 vs. 5870, where the 480 was stronger), unless they force PhysX on by default...
 
Feb 19, 2009
10,457
10
76
Not sure why people still talk about Skyrim as a win for NV. The 7970 GE blows the 680 away in that game.

Exactly. Times have changed; AMD really delivered with their drivers. In games where NV won, they now draw or lose to AMD, but in games that AMD won, and won with big margins, NV is still faceplanting with no improvement in sight.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I buy for now, if I was buying for the future I'd invest in something other than my PC.

Exactly. Times have changed; AMD really delivered with their drivers. In games where NV won, they now draw or lose to AMD, but in games that AMD won, and won with big margins, NV is still faceplanting with no improvement in sight.

Is that why people running dual-card setups have more micro-stutter issues with AMD? Faceplant? Not really. Nvidia delivered from day one; it's not their fault AMD has a slow driver team and needed to play catch-up. I didn't buy on a "maybe in the future the performance will catch up." That's the difference.
 
Feb 19, 2009
10,457
10
76
I buy for now, if I was buying for the future I'd invest in something other than my PC.

Is that why people running dual-card setups have more micro-stutter issues with AMD? Faceplant? Not really. Nvidia delivered from day one; it's not their fault AMD has a slow driver team and needed to play catch-up.

NV seems to have delivered crap performance in recent games lately. Let's not mention that? And catch up? NV hasn't even caught up in Alan Wake, and that's old.

As for buying for now, well, yeah, the OC 7950 is faster for cheaper NOW. It's also more likely to be faster in future DX11 games.

When all else fails, let's bring up the SUBJECTIVE issue of micro-stutter, which affects how many people?

Edit: And if anyone brings up Sleeping Dogs as "not popular": recent launch, consistently top ten: http://www.gamespot.com/ & http://store.steampowered.com/
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
I am playing Sleeping Dogs on an HD 6950 2GB (860 MHz) with all settings maxed except anti-aliasing at High instead of Extreme. 35-45 fps, and I am really enjoying the game.

For people who keep saying Sleeping Dogs is not popular: it's topping the charts in the UK.

http://www.eurogamer.net/articles/2012-08-28-uk-chart-sleeping-dogs-denies-darksiders-2

vgchartz shows both the X360 and PS3 versions in the top 5 global sellers.

http://www.vgchartz.com/

Sleeping Dogs with maximum settings and Extreme anti-aliasing is much faster on the HD 7900 cards.

http://www.xbitlabs.com/articles/graphics/display/msi-n680gtx-lightning_8.html#sect3

At 1080p it is definitely playable on maximum settings on an HD 7970 (1.1 GHz). At 1600p the anti-aliasing will have to be run at High, and it should be fine on an HD 7970.
 

brandon888

Senior member
Jun 28, 2012
537
0
0
bla bla bla bla amd sux .... bla bla bla nvidia sux ..... aren't you all tired ?

RussianSensation

No doubt the 660 Ti is weak for AA, but this card is made for 1080p, not for 1600p... even Nvidia said that on their main site :) It is made for most gamers, who use 1080p...


I went with the 670 anyway, because it will handle AA better even at 1080p... That doesn't mean the 660 Ti is bad... but I agree, it's a bit overpriced ;/


If I liked AMD, I would go for the 7950 Twin Frozr at the same price, for sure :)

No offence to Nvidia fans... I'm on your side, just trying to be objective...




Anyone else think it would have been better to cut one more SMX rather than ROPs and bus width, and make the 660 Ti with 1152 CUDA cores and a 256-bit bus?
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Even if I accept your view that Sniper Elite V2 does not look amazing, are you going to say the same about Metro 2033? GTX 600 cards suck in Metro 2033, especially with DOF and MSAA. :thumbsdown:

DoF is a worthless feature in games. Why should I sacrifice performance for something my eyes do for free? DoF only works for media that people consume passively. And yes, the benchmark looks bad.
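For context on what is being argued about: a compute-shader DoF pass blurs each pixel by an amount derived from the thin-lens circle-of-confusion formula. A sketch of that calculation (the lens parameters below are purely illustrative):

```python
# Thin-lens circle of confusion: the per-pixel blur diameter that a
# compute-shader DoF pass approximates. Parameter values are illustrative only.
def coc_mm(subject_mm: float, focus_mm: float, focal_mm: float, f_stop: float) -> float:
    """Blur-circle diameter (mm, on the sensor) for a point at subject_mm,
    with the lens focused at focus_mm."""
    aperture = focal_mm / f_stop  # aperture diameter in mm
    return abs(aperture * focal_mm * (subject_mm - focus_mm) /
               (subject_mm * (focus_mm - focal_mm)))

# A point 1 m behind a 3 m focus plane, seen through a 50 mm f/2 lens:
print(round(coc_mm(4000.0, 3000.0, 50.0, 2.0), 3))
```

The shader evaluates something like this per pixel from the depth buffer, then gathers neighboring samples over a blur radius scaled by the result, which is why the pass is both compute- and bandwidth-heavy.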

As for perf/watt, it can really be taken out of context if you compare a single game. Across the most demanding games, the HD 7970 wins more often, and with significant margins, which is when you need all that performance. There are lots of HD 7970 (1 GHz) cards running at 1.175V and using less power than an HD 7970 GHz, even when overclocked to 1125 MHz.

http://www.anandtech.com/show/5314/...ouble-dissipation-the-first-semicustom-7970/6

Power (W):
XFX HD 7970 (1125 MHz) - 424
GTX 480 - 425

http://www.anandtech.com/show/6025/radeon-hd-7970-ghz-edition-review-catching-up-to-gtx-680/17

HD 7970 GE (1050 MHz) - 429

If somebody chooses an HD 7970 GHz, it would be more for maximizing their chances of a 1200+ MHz overclock. The HD 7970 (1 GHz) chips are good for perf/watt.
Fine, but you brought the GHz edition into the discussion. Now live with it.

Sleeping Dogs with maximum settings and Extreme anti-aliasing is much faster on the HD 7900 cards.

http://www.xbitlabs.com/articles/graphics/display/msi-n680gtx-lightning_8.html#sect3

At 1080p it is definitely playable on maximum settings on an HD 7970 (1.1 GHz). At 1600p the anti-aliasing will have to be run at High, and it should be fine on an HD 7970.

No, it's not playable: http://www.pcgameshardware.de/Sleep...sts/Sleeping-Dogs-im-DirectX-11-Test-1020709/

That's in-game and not the benchmark. 38 FPS on a 7970 @ 1200 MHz is not playable in this game.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
DoF is a worthless feature in games. Why should I sacrifice performance for something my eyes do for free? DoF only works for media that people consume passively. And yes, the benchmark looks bad.

Next you'll say super-sampling anti-aliasing should not be used in games because it's too bandwidth-intensive. :thumbsdown:

Blame the developers for using SSAA in Sleeping Dogs. You don't seem to understand: the features are available, so use them as you want. But don't draw general performance conclusions based on how you utilize the features of the game. You might like to play with AAA in Metro 2033 or FXAA in BF3, but others might just not prefer the blurring of the image and still prefer MSAA or SSAA (if the developer implemented it, as they have in Sleeping Dogs). :D