
Geforce GTX 670 for 2560 x 1440 gaming

Page 5

f1sherman

Platinum Member
Apr 5, 2011
2,243
0
0
well you can clearly see here that FXAA can blur in some games. http://hardforum.com/showthread.php?t=1698508&highlight=fxaa&page=3
yes, but if you're not satisfied with driver or game FXAA,
you can use FXAA 3.11 injection and fine tune blur amount, pre/post sharpening,

and SMAA gives good results out of the box
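For those curious what the injectors actually tune: FXAA's core move is to find high-contrast luma edges and blend the pixel toward its neighbours, with the blend strength exposed as the blur/sharpness knob. Here's a hypothetical, heavily simplified 1-D sketch in Python (real FXAA 3.11 searches along 2-D edge directions; `subpix` here only mimics the role of the injector's blur-amount setting):

```python
def luma(rgb):
    # Rec. 601 luma weights, used for edge detection
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def fxaa_row(pixels, edge_threshold=0.25, subpix=0.5):
    """Blend each pixel toward a 1-2-1 average of its neighbours wherever
    local luma contrast exceeds edge_threshold. subpix is the tunable
    blend amount (0 = off, 1 = full blend)."""
    out = list(pixels)
    for i in range(1, len(pixels) - 1):
        lumas = [luma(pixels[j]) for j in (i - 1, i, i + 1)]
        contrast = max(lumas) - min(lumas)
        if contrast > edge_threshold:
            blurred = tuple((a + 2 * b + c) / 4
                            for a, b, c in zip(pixels[i - 1], pixels[i], pixels[i + 1]))
            out[i] = tuple((1 - subpix) * old + subpix * new
                           for old, new in zip(pixels[i], blurred))
    return out

# A hard black/white edge gets ramped; flat areas are left untouched.
row = [(0.0, 0.0, 0.0)] * 3 + [(1.0, 1.0, 1.0)] * 3
smoothed = fxaa_row(row, subpix=1.0)
```

Lowering `subpix` trades less blur for more residual aliasing, which is essentially the fine-tuning described above.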

IQ is truly subjective, based on tastes and tolerances. Personally, I would trade off a bit of clarity for a clean screen while moving. My beef, really since shaders and specular effects were introduced to titles, has been that innovation was needed to combat the aliasing these modern capabilities create.

Doom 3 and Far Cry were benchmarking staples years ago -- always about the numbers, but what these titles offered was screaming aliasing, from specular highlights to alpha textures. You would see zoomed-in investigations comparing image quality, but aliasing in motion, where most of it was actually seen, was rarely discussed.

Transparency AA was a godsend to have.

FXAA and MLAA are godsends: broad compatibility, a low performance hit, and they enhance the entire screen.

Super-sampling is a godsend if one has the performance headroom for the title.

Multi-sampling is a godsend.

These are tools offered to the gamer to find the right balance for their subjective tastes, tolerances, application, platform and display. If developers can have tools, why not gamers?
pretty much, there's a time and game for everything - from FXAA, to down-and-super-sampling :)
 

Grooveriding

Diamond Member
Dec 25, 2008
9,080
1,218
126
Post-AA modes are a blurfest. Nothing innovative about taking a step back in IQ and trying to pass it off as an improvement.

Don't be so dramatic. It's embarrassing to read something like 'disgusted by not being grateful' for junk like post-AA. Really?

Post-AA was introduced by a third party, and then AMD/NV both decided to roll out their knock-off versions of the feature. It's a blurfest; no game avoids looking blurry with it on. If you prefer to settle for sub-par IQ, that is your choice. At least be realistic about what you're getting.

I don't have to use post-AA in any title because I bought a setup that can handle not having to use it.

Trying to claim FXAA is anywhere near the quality of even simple MSAA is laughable. Why go down that road when a couple of screenshots will erode the foundation of that claim? Post-AA actually tends to look better in screenshots as well; in actual gameplay, in motion, it gets much worse and really blurry. Battlefield 3 is atrocious with FXAA on -- blur city.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
0
0
Grooveriding you are right, I shouldn't fret about this.

As SirPauly said, AA is a matter of taste and preference,
so it shouldn't come as a surprise that Joe the Gamer is not very subtle about it.



Oh and next time... if in motion, it gets much worse and really blurry.

go into Game Settings AND TURN OFF THE MOTION BLUR

^_^
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
746
126
Despite what everyone says in this thread

My single 7970 with 12.6 catalyst drivers at 2560x1440 gets about 55fps on ultra in BF3 on 64 man servers. This is overclocked to only 1100/1500. I'm not sure what the min frame rates are, probably something around 35 but I average 55.
Some people play single-player too and 7970 can handle MSAA better at 2560x1600 than a GTX670.



I'd rather take playable performance over AA. AA isn't 100% absolutely necessary. If you can have both fine, with a single card at 2560x1440/1600 you can't. That was my point. I'm saying you can show 8xaa is better with one card over another but when both are around 35fps I'll take neither and remove AA entirely.
Yeah, but that's not always the case. You can still use lower AA, such as 2-4x AA, and beat the GTX670 without much effort at 2560x1440/1600. The superclock Giga card is factory pre-overclocked to 1080mhz (the Vortex II on the Egg is 1100mhz for $440). Take a look at Crysis 2, where the GTX670 manages just 28 fps and the HD7970 at 1080mhz gets 34 fps. Do you think that if you remove AA from Crysis 2, the GTX670 will suddenly beat it? With overclocking, the GTX670 won't catch up either, since the 7970 can scale to 1150mhz+:



Even with FXAA, 670 still can't win at 2560x1440/1600:


With $20-40 now separating factory aftermarket 7970s and GTX670s, stock vs. stock or overclocked vs. overclocked, the 670 is not necessarily the better card. With the Radeon you get the option of turning on 2-4x AA while still being faster than the 670. For example, in Batman AC, the HD7970 GE has enough power for AA at 2560x1600 (7970 GE = 51.2, GTX680 = 45.8, GTX670 = 39.4).
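As a sanity check on the percentage claims being thrown around, the deltas follow from trivial arithmetic on the fps figures quoted in this thread (the numbers are the posts' own, not independently measured):

```python
def advantage_pct(fps_a, fps_b):
    # How much faster card A is than card B, in percent
    return (fps_a / fps_b - 1.0) * 100.0

# Batman: AC at 2560x1600, figures quoted above
batman_ge_over_670 = advantage_pct(51.2, 39.4)    # roughly 30%
# Crysis 2 at 2560x, figures quoted above
crysis_7970_over_670 = advantage_pct(34.0, 28.0)  # roughly 21%
```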
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
26,848
280
126
Russian, you missed my point. Since neither is playable in my opinion with 4xAA you can turn AA off and make both playable. The option then turns out to be a wash unless you get into playing online and in that case you'll have to run 1080p or run a dual GPU solution.

I suppose it depends on the intention of the individual to some degree.

Despite what everyone says in this thread

My single 7970 with 12.6 catalyst drivers at 2560x1440 gets about 55fps on ultra in BF3 on 64 man servers. This is overclocked to only 1100/1500. I'm not sure what the min frame rates are, probably something around 35 but I average 55.

Keep in mind that gulf of oman for example, uses a TON of vram (beyond 2GB) so the 7970 simply crushes the competition simply because of the ram issue.
Not to get into a VRAM debate, but what shows up as used in the OSD isn't always accurate. It's been reported that Afterburner and PrecisionX may report huge VRAM usage while other tools show much lower amounts actually in use.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
746
126
Russian, you missed my point. Since neither is playable in my opinion with 4xAA you can turn AA off and make both playable. The option then turns out to be a wash unless you get into playing online and in that case you'll have to run 1080p or run a dual GPU solution.

I suppose it depends on the intention of the individual to some degree.
Yes of course in cases where AA brings down performance to unplayable levels, and in heavier titles such as BF3, 4xMSAA probably needs 2 GPUs at 2560x1600.

Imo, this argument becomes circular in nature though. If you are going with a single $300-500 GPU and don't intend to use AA, and think the 15-20% performance delta is a wash when modern GPUs are "equally slow" so to speak (at 30-35 fps), a gamer can simply save $90 and get MSI TwinFrozr 7950 and overclock that to 1.1ghz and put that $90 aside towards the next generation upgrade. Using the same train of thought, the performance difference will be "equally slow" with a $400 GTX670. Why spend the extra $90 for it then if it won't be any more playable?
 
Last edited:

The_Golden_Man

Senior member
Apr 7, 2012
816
0
0
Players who have 2560x1440 or 2560x1600 monitors should go for GTX 670/680 SLI, GTX 690 or 7950/7970 Crossfire.

Even I, who game at 1920x1200, see the benefits of 2x GTX 670 SLI. In games like The Witcher 2 I always stay over 60FPS (100FPS+ very often) with everything maxed out except Uber Sampling. Another thing I've noticed in this game, and a few others, is that if I stay at 80FPS+ with Vsync disabled I get no input lag and very responsive gameplay with no screen tearing. If it drops down toward 60FPS I get screen tearing with Vsync disabled. Vsync enabled is fine as long as it stays at 60FPS, but it has minor input lag and does not feel as fast and fluid as ~80FPS+ with Vsync disabled. Oh, and I have a 60Hz LCD.
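Those Vsync observations line up with simple frame-time arithmetic. A rough sketch (the latency model here is deliberately simplified; real input lag also depends on buffering and the engine):

```python
def frame_time_ms(fps):
    # Time budget per frame at a given frame rate
    return 1000.0 / fps

refresh_ms = frame_time_ms(60)   # a 60Hz LCD scans out every ~16.7ms
at_80fps = frame_time_ms(80)     # 12.5ms per frame: a fresh frame is almost
                                 # always ready at each scanout
# With Vsync on, a frame that just misses a refresh waits for the next one,
# so presentation can be delayed by up to a full refresh interval -- the
# minor input lag described above.
worst_case_extra_lag_ms = refresh_ms
```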

But each to his own.
 

cmdrdredd

Lifer
Dec 12, 2001
26,848
280
126
Yes of course in cases where AA brings down performance to unplayable levels, and in heavier titles such as BF3, 4xMSAA probably needs 2 GPUs at 2560x1600.

Imo, this argument becomes circular in nature though. If you are going with a single $300-500 GPU and don't intend to use AA, and think the 15-20% performance delta is a wash when modern GPUs are "equally slow" so to speak (at 30-35 fps), a gamer can simply save $90 and get MSI TwinFrozr 7950 and overclock that to 1.1ghz and put that $90 aside towards the next generation upgrade. Using the same train of thought, the performance difference will be "equally slow" with a $400 GTX670. Why spend the extra $90 for it then if it won't be any more playable?
Then you have the question of drivers. I know you hate hearing it but if I were to put money down on a bet on who would have a performance driver for a new game first, I'd bet on Nvidia. That's my experience anyway.


I think this quote from Hardocp forums really sums up the AA debate for me and I agree with it in a lot of ways.

With regards to some of the 'blurring' in the screenshots, in some cases there is definitely detail reduction, but in other cases, there's simply a reduction in shader aliasing. MSAA appears to be parroted as the superior technique in this thread, and in tandem with a post-process anti-aliasing method, that's accurate, but MSAA alone does not attack shader aliasing. That's easily one of the more objectionable artifacts in modern games, but we've become so accustomed to seeing it that a reduction in it appears incorrect.
For many of you, the first response will be to judge TXAA compared to other AA or no-AA filters using still images, and then remark how TXAA does look less aliased, but also looks less sharp. This is the correct response too, because it is physically impossible to remove aliasing, especially temporal aliasing, without resulting in a perceptual reduction of sharpness. Motion however is where the real battle for AA is fought, and where TXAA really starts to shine compared to all prior methods.
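The shader-aliasing point in that quote is easy to demonstrate with a toy example (hypothetical shader and renderer, not any real pipeline): a high-frequency shading term evaluated once per pixel can only produce hard 0/1 flicker, while averaging several shading samples per pixel, as supersampling does and as MSAA's geometry-edge-only resolve does not, recovers the intermediate values that read as smooth:

```python
import math

def shade(x):
    # Hypothetical high-frequency specular-style term; it flips rapidly
    # across the pixel, the classic source of shader aliasing.
    return 1.0 if math.sin(40.0 * x) > 0.0 else 0.0

def render(width, samples_per_pixel):
    # Average samples_per_pixel shading evaluations inside each pixel.
    # samples_per_pixel=1 is ordinary shading; more approximates supersampling.
    img = []
    for px in range(width):
        total = sum(shade(px + (s + 0.5) / samples_per_pixel)
                    for s in range(samples_per_pixel))
        img.append(total / samples_per_pixel)
    return img

hard = render(32, 1)   # only 0.0 or 1.0 values: harsh, crawls in motion
soft = render(32, 8)   # intermediate grey levels: filtered shader output
```

The "less sharp" look the quote defends is exactly those intermediate values replacing hard flicker.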
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
0
0
Even with FXAA, 670 still can't win at 2560x1440/1600
The GTX 670 wins not only over the HD 7970 but the HD 7970ghz edition. Why are you using this OC edition and blanketing all HD 7970's?

How about a factory GTX 670 OC model vs a Factory HD 7970 OC model and then compare performance, power efficiency, acoustics, thermals and price.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
0
0
The GTX 670 wins not only over the HD 7970 but the HD 7970ghz edition. Why are you using this OC edition and blanketing all HD 7970's?

How about a factory GTX 670 OC model vs a Factory HD 7970 OC model and then compare performance, power efficiency, acoustics, thermals and price.
I didn't know the GTX 670 was faster than the 7970, let alone the GHz edition? I thought AMD had taken the performance crown back with the GHz edition, with regards to single GPU's.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
131
106
I didn't know the GTX 670 was faster than the 7970, let alone the GHz edition? I thought AMD had taken the performance crown back with the GHz edition, with regards to single GPU's.
AMD didn't take the performance crown back.

Anandtech said:
The end result is that while AMD has tied NVIDIA for the single-GPU performance crown with the Radeon HD 7970 GHz Edition, the GeForce GTX 680 is still the more desirable gaming card. There are a million exceptions to this statement of course (and it goes both ways), but as we said before, these cards may be tied but they're anything but equal.
Not to mention, the HD7970GE is still nowhere to be found. Vaporware?
 
Last edited:

YBS1

Golden Member
May 14, 2000
1,931
111
106
Then you have the question of drivers. I know you hate hearing it but if I were to put money down on a bet on who would have a performance driver for a new game first, I'd bet on Nvidia. That's my experience anyway.
This is a really big point, and just what I was thinking as I read RussianSensation's first couple of posts in this thread. He was referring to how others were posting benchmarks using outdated drivers, that they were irrelevant now. No, not really. In fact they pretty much drive home a simple point. BF3 was released October of last year and in public beta for quite a bit before that. It's taken how long now for AMD to be "on par"??? This isn't a one off occurrence, it happens over and over with AMD, and it's amplified further under multi-gpu situations.

The flip side to this...Release day of Ghost Recon Future Soldier running a GTX680. Launch game....say to myself "Wow, this is running like moldy ass, WTH?" Open nV control panel, click check for updates, "profile update found". Relaunch, uh huh.....smooth as glass. This is how it should be and this is what AMD is failing at, not hardware. They (almost) always have good hardware.

I don't want to play games 6 months from now when they get the drivers sorted out....I want to play them on MY schedule.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
746
126
The GTX 670 wins not only over the HD 7970 but the HD 7970ghz edition. Why are you using this OC edition and blanketing all HD 7970's?

How about a factory GTX 670 OC model vs a Factory HD 7970 OC model and then compare performance, power efficiency, acoustics, thermals and price.
How is it 5 pages into this thread and you are still on this? I already addressed both of those points you mentioned earlier in the thread but maybe you missed it -- here.

It's been stated at least 3-4x in this thread that the discussion is about the GTX670 at $400 OR the $420-440, 1000-1100mhz quiet aftermarket versions of the HD7970, which may be worth spending the extra $ on at this resolution for the OP. Not one person in this thread has once said anything about a 925mhz HD7970. This isn't about HD7970 vs. GTX670, but about what the best card in the $400 range is for 2560x1440/1600, providing info to help the OP make a more informed decision and give other gamers who haven't upgraded yet more up-to-date information using the latest drivers.

Also, in case you want to know, the GTX670 doesn't even win against a stock 925mhz HD7970 at 2560x1440/1600:

Computerbase
TPU

So let's not start making stuff up now.

Well, well...

waiting to see what RussianSensation has to say about this :D
5 pages later and we still get a post such as this one below despite 5 professional reviews stating otherwise.

AMD didn't take the performance crown back.
Can you please link 1 review that shows that HD7970 (esp. at 1000-1100mhz) isn't faster than GTX670 at 2560x1440/1600? Also, feel free to find a review anywhere where a GTX680 is recommended over HD7970 for higher resolutions (not talking about SLI drivers here). I provided information from at least 5 separate professional publications (TechPowerup, Computerbase, H4Tu, TechReport, TechSpot) and in each of those HD7970 1050mhz is the fastest GPU in the world.

Here is Review #6 from KitGuru:
"Until today, the incredible KFA2 GTX680 Limited OC Edition claimed the ultimate single GPU performance spot, however in the majority of the real world game testing, the Sapphire HD7970 6GB Toxic Edition managed to outperform the overclocked GTX680. The performance results are unquestionably impressive. In 7 out of 11 tests, The Sapphire HD7970 6GB Toxic Edition outperformed the KFA2 GTX680 Limited OC Edition."

Here is Review #7 from Xbitlabs where GTX680 @ 1290mhz couldn't even beat an HD7970 @ 1165, which means GTX670 has no chance at all.

Here is Review #8 from BitTech.net: "the 7970 3GB GHz Edition is the fastest single GPU card when compared to the GTX 680 2GB across all of our benchmarks"

This is a really big point, and just what I was thinking as I read RussianSensation's first couple of posts in this thread. He was referring to how others were posting benchmarks using outdated drivers, that they were irrelevant now. No, not really. In fact they pretty much drive home a simple point. BF3 was released October of last year and in public beta for quite a bit before that. It's taken how long now for AMD to be "on par"??? This isn't a one off occurrence, it happens over and over with AMD, and it's amplified further under multi-gpu situations.
BF3's performance was fixed right around the launch of HD7970 GE (June 22nd). GTX680 launched on March 22, 2012 (or almost 3 months after HD7970). AMD fixed BF3 performance about 3 months after GTX680 launched. Until March 22nd, HD7970 (esp. overclocked) was faster than GTX580 in BF3. So if waiting 3 months is such a big deal to get good BF3 performance for you, you would have been gaming on HD7970 for 3 months and not waited for GTX680 since it was faster than GTX580. If BF3 performance was such a big deal, how did you endure it on GTX500 series cards until March 22nd then?

Secondly, what about the games where AMD was faster from day 1: Anno 2070, Bulletstorm, Serious Sam 3, Alan Wake, Metro 2033, Deus Ex? Those don't count for some reason? And what about games where NV's performance was laughable due to their own driver issues -- you didn't mention Shogun 2, despite that being a modern game like BF3. Let's not pretend only AMD has driver issues with newer games.

Thirdly, this isn't about NV vs. AMD. I don't know why you guys started talking about how things were in March, etc. How does this apply to this particular thread and the recommendation to the OP? I provided information to the OP regarding current up-to-date drivers and performance in games. It's not about how the GTX680 performed 3 months ago but about what the best card for 2560x1440/1600 is today. Also, why is SLI even being brought into this? No one is saying anything about multi-GPUs or drivers for such setups. Please read the OP. The responses addressed specifically single-GPU gaming at 2560x1440/1600.

I figured this thread couldn't possibly be neutral, especially since with the latest drivers and price drops AMD has regained both the price/performance (HD7950) and performance crown (HD7970 GE). This is exactly why this thread is so useful because it seems the perception for the cards is still stuck in March 2012, but it's no longer the case. New buyers should be aware of more up-to-date information imo.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
0
0
How is it 5 pages into this thread and you are still on this?
Indeed! For the life of me, I don't understand how you can claim a 20 percent advantage with an HD 7970 when it was actually the GHz edition. Nor do I understand how one can say the GTX 670 can't win with FXAA -- yet in the very benchmark you offer, the GTX 670 is clearly leading over both the HD 7970 and the HD 7970 GHz edition.

Overclocking is not absolute, and some receive better cores than others.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
1
0
Well, well...

waiting to see what RussianSensation has to say about this :D
I'll tell you what I have to say. I bought crossfire 7970s on release and immediately overclocked them to 1150mhz in crossfire, which I was able to do easily I might add thanks to not having locked voltage (nvidia may want to emulate this with GK110). Now after GTX 680 was released I saw all the hype and all the websites telling me that the GTX 680 is now the clear winner and the single GPU champion, so I fell for the hype and immediately bought 2 EVGA GTX 680 cards to run in SLI.

Fast forward a few days later. Keep in mind, like I said above, my crossfire 7970s were overclocked to 1150 in xfire from day 1. Since all the websites told me that the GTX 680 was the single-GPU winner, I was expecting major performance increases across the board, and I had a direct comparison of overclocked 7970s in crossfire vs 680s in SLI. What I actually found is that many games were slower, sometimes significantly so (Crysis Warhead, Metro 2033, Alan Wake). I mentioned that here on this very board and of course I got attacked and was told I was absolutely crazy by a lot of hardcore nvidia fans here. That isn't to say AMD had a victory; nvidia definitely won other games like Skyrim and BF3. But it wasn't what the websites led me to believe; it wasn't a clear win, especially for a card touted as the "single GPU champion".

Well, fast forward 4 months and people are seeing what I saw before, but obviously nobody would believe it because the hype machine tells you the 680 is the best. The 7970 scales really, really well with higher clockspeeds and outperforms the 680 in many titles when overclocked. The point I'm getting at is that the 7970 scales with clockspeed much better than the 680 does, and after having used both extensively I'll say that is a 100% fact. Obviously the 7970 loses to the 680 when clocked at 925mhz, and I'm sure all of the quad GTX 480 users here will mock the 7970's power consumption. Whatever.

Now, having said all that, nvidia had a clear victory over AMD on the software side; there is little doubt that nvidia had a smoother experience than AMD did, and I think this is one critical reason why nvidia won more mind share this time around. I'm not slighting the 680 here; it is an awesome card and I enjoy my cards greatly. As I said, nvidia had better software support, and this is one reason the 680s had a much smoother launch than AMD did -- heck, AMD didn't even have WHQL drivers for a month or so, and that is ridiculous. But now that the software issues are sorted (from what I understand), nobody should be stating that either one wins. They're both great cards; just buy what you prefer, they both perform similarly in real-world scenarios.
 
Last edited:

YBS1

Golden Member
May 14, 2000
1,931
111
106
People can choose to believe all they want and if their brand loses, they start bringing drivers into it, or something else like PhysX.
I'm running SLI'ed MSI Lightning 680's; I'm not "losing" to anything. If it's playable on a 7970 then generally it's going to be playable on the 680 and vice versa. This is for the most part going to be true with any of the cards in the ~$400 and up category.

Of course people are going to bring up drivers as they have a very real and tangible benefit to the end user experience. Inadvertently you showed this to be true by highlighting how well the Catalyst driver has come along over the months, of course this really just demonstrates how inadequate it was to begin with. Expect this cycle to continue with most new game releases, as it's par for the course with AMD lately.

The only thing AMD really has a clear win in is price, and I'm not going to deal with their driver and multi-gpu B.S. to save a few bucks.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
746
126
That core wasn't 1290 for the GTX 680 and yet you continue to say 1290; it was 1212.
Who cares about the base core speed? It's 1212mhz + GPU Boost = 1290mhz effective. So it's effectively a 1290mhz GTX680 card vs. an 1165mhz HD7970. In other words, an overclocked GTX680 could not beat an overclocked HD7970, so what conclusion can be made regarding GTX670 OC vs. HD7970 OC?

Also, if you read the responses carefully over 5 pages, I specifically stated the OP can do better if he wants to spend $20-40 extra for factory pre-overclocked aftermarket 7970 versions. I also said that an overclocked HD7970 will open up a 20% lead over an overclocked 670. Did I once reference 925mhz HD7970 cards to the OP? All the benchmarks I linked implicitly assume an HD7970 at 1000-1100mhz (i.e., the HD7970 GE is a good approximation of that). If you can't understand that, I don't know what to tell you. This is why the benchmarks I linked use 1050mhz GE cards and where I am getting the 20% extra performance advantage. Thirdly, even at stock 925mhz, the HD7970 is still faster at 2560x1440/1600. What's so hard to understand about this?

All these points have been addressed as early as Post #5 and you are still talking about 925mhz HD7970 for some unknown reason.

I'm running SLI'ed MSI Lightning 680's; I'm not "losing" to anything. If it's playable on a 7970 then generally it's going to be playable on the 680 and vice versa. This is for the most part going to be true with any of the cards in the ~$400 and up category.

The only thing AMD really has a clear win in is price, and I'm not going to deal with their driver and multi-gpu B.S. to save a few bucks.
If we were discussing multi-GPUs in this thread, I would have recommended GTX670 SLI. You started talking about driver scaling, profile updates (i.e., making this GTX680 SLI vs. HD7970 CF comparison). As such, you chose to address my posts by bringing extraneous information to justify why you bought GTX680 SLI which has nothing to do with recommending a single-GPU for the OP at 2560x1440/1600. I never once recommended HD7970 CF in this thread over GTX670/680 SLI or otherwise.

In Post #60, I addressed the points you are making in case the OP decided to proceed with a multi-GPU solution down the line:

"I don't see how buying a slower card with worse overclocking and less VRAM when spending $400+ is a valid reason to choose the 670 right now for a gaming enthusiast unless someone wants specific NV features (PhysX, 3D Vision Surround, Adaptive Vsync) or is going for SLI, which admittedly is smoother than CF." :)

I'll tell you what I have to say. ... They're both great cards, just buy what you prefer, they both perform in real world scenarios similarly.
Great post, but it's not going to change the fact that most people still think that GTX670 > HD7970 and GTX680 > HD7970 1050mhz (GE) performance wise. The more up-to-date information in this thread should be an eye-opener for people who haven't upgraded yet and are still stuck in the March 2012 time frame. That this thread has gone 5 pages shows that some people are probably surprised to see that the GTX670/680 are no longer the "slam dunk" choice after rounds of price cuts and driver updates on the AMD side.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
26,848
280
126
GPU boost isn't constant and locked. It varies from benchmark to benchmark and game to game depending on the GPU load. Even on the same benchmark I can see higher or lower boost clocks. It's pretty weird so it's hard to lock down a constant number and say "it was running at x Mhz." I wish they didn't do this, but I understand why they did I guess. It just makes it harder to find the right overclock.
 
Last edited:

YBS1

Golden Member
May 14, 2000
1,931
111
106
If we were discussing multi-GPUs in this thread, I would have recommended GTX670 SLI. You started talking about driver scaling, profile updates (i.e., making this GTX680 SLI vs. HD7970 CF comparison). As such, you chose to address my posts by bringing extraneous information to justify why you bought GTX680 SLI which has nothing to do with recommending a single-GPU for the OP at 2560x1440/1600. I never once recommended HD7970 CF in this thread over GTX670/680 SLI or otherwise.
I didn't recommend anything to the OP; I was simply offering an explanation as to why many people would choose even the 670 over the 7970 in general, though the 7970 will usually bench faster. I mean really, looking at this as unbiased as possible: taking 8 months to get your drivers where they should be for the biggest benchmark/performance title of the past year (like the game itself or not) is not acceptable. Most likely the only reason the performance came up to where it has in BF3 was because it was taking a drubbing against the 680 in reviews. Otherwise they probably wouldn't have bothered improving it at all, and if I owned a 7970 that would kinda piss me off.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
0
0
Great post, but it's not going to change the fact that most people still think that GTX670 > HD7970 and GTX680 > HD7970 1050mhz (GE) performance wise.
Where do you get this stuff from?

The HD 7970 clearly offers more performance overall than a GTX 670, but the key decider may be: is the efficiency, differentiation and cheaper MSRP of the GTX 670 compelling enough to warrant strong sales compared to the HD 7970?

Personally, I'd let the market decide.
 

cmdrdredd

Lifer
Dec 12, 2001
26,848
280
126
I didn't recommend anything to the OP; I was simply offering an explanation as to why many people would choose even the 670 over the 7970 in general, though the 7970 will usually bench faster. I mean really, looking at this as unbiased as possible: taking 8 months to get your drivers where they should be for the biggest benchmark/performance title of the past year (like the game itself or not) is not acceptable. Most likely the only reason the performance came up to where it has in BF3 was because it was taking a drubbing against the 680 in reviews. Otherwise they probably wouldn't have bothered improving it at all, and if I owned a 7970 that would kinda piss me off.
I think that they would have increased performance in BF3 anyway, but I agree that they were really taking a huge hit in the reviews. We all know that weighs on the minds of gamers.

[benchmark charts: 3/22/12 vs. 7/20/12 drivers]
 
