[H] - GTX680 3-Way SLI vs. 7970 Tri-Fire review


Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
At least AMD didn't fubar tri and quad scaling for single monitors. Seriously, that's just pure fail there.

Fail review. U shouldn't test single monitor gaming with 3 or 4 680s or 7970s.
 

aaksheytalwar

Diamond Member
Feb 17, 2012
3,389
0
76
These GPUs can't even keep their minimums much above 60 FPS at these settings anyway, so these are the ideal settings to test: they're the best settings that are still perfectly enjoyable with zero compromise.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
He probably has them running on PCIe 3.0, whereas the review is on 2.0.

You have to be a true fan to cite the scaling while ignoring the fact that it comes with more pronounced MS because of it.

Everyone knows the 5xxx series had issues with scaling, and AMD solved that with the 6xxx by no longer attempting to combat MS.
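That's the crux of the scaling-vs.-MS argument: the average FPS that scaling charts report says nothing about how evenly those frames arrive. A tiny illustration with a made-up pacing metric (purely for the sake of example, not any real benchmarking tool):

```python
def pacing_metric(frame_times_ms):
    """Average jump between consecutive frame times, in milliseconds.
    Illustrative only: a big value means alternating short/long frames,
    i.e. the microstutter feel, even when the average FPS looks fine."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

# A single GPU at a steady 16.6 ms and an AFR pair alternating 8 ms / 25 ms
# both average roughly 60 FPS, but their pacing is nothing alike.
print(pacing_metric([16.6] * 6))     # ~0.0  -> smooth
print(pacing_metric([8, 25] * 3))    # ~17.0 -> pronounced microstutter
```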

He has an i7-920 @ 3GHz I think, and he runs 2560x1600 and 3840x2160 with SGSSAA if performance allows it.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
This whole debate about what is a sufficient amount of VRAM, or too much, needs to be retired to the equine graveyard. It's been had numerous times.

IMHO, a second 7970 SKU wouldn't generate any more sales at the moment. It would just take sales from the more expensive SKU. Maybe in the future the market will change and make it a viable option, business-wise. I believe we are likely to see a 1.5GB 7950 first. The 7870 is likely going to drop in price to make it a closer value compared to the 7850. That will create enough space between the 7870 and 7950 for an additional SKU to help maintain the prices of the 3GB models. Whatever nVidia does will also affect what AMD does at these market points. In the end, though, it will be a business decision, not what's best for you or me.
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
And you are missing MY POINT. You don't gimp your fastest card by not giving it enough VRAM to handle every playable situation. You don't cut down the VRAM and tell customers not to play games with settings that create oddball situations. 1.5GB will limit you now in some cases, so it would be stupid to equip a 7970 with it.
Don't be silly, no card needs more than 256MB of memory; since the Frostbite engine dynamically adjusts the level of detail, you wouldn't suffer any loss of quality.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Maybe "U" in fact should when even 4 flagship cards can't get you a solid 60fps at 2560x1600.
That proves how poorly optimized the game is.Tri or quad fire or sli should be tested in eyefinity resolutions.
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
Read the forum thread over at HardOCP... it scares me that so many people are ignorant about microstuttering.

It's like watching lemmings...go ape*bleep* over the current FOTM:

OMG...GHZ!!!!
OMG...FPS!!!!

WTF...IPC?!
WTF...Micro-what?!

:(

Using a frame limiter to match the refresh rate of the monitor can eliminate practically all traces of MS, but it can be annoying to get it to work across all games.
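For anyone curious what the limiter is actually doing: it just paces the presents to the refresh interval so frames can't bunch up. A minimal sketch of the idea (render_frame() is a stand-in, not any particular tool's API):

```python
import time

REFRESH_HZ = 60                      # match this to the monitor
FRAME_TIME = 1.0 / REFRESH_HZ

def run_limited(render_frame, n_frames):
    """Minimal frame limiter sketch: never present faster than the refresh
    rate, so frames from the GPUs end up evenly spaced instead of bunched."""
    deadline = time.perf_counter()
    for _ in range(n_frames):
        render_frame()               # stand-in for rendering one frame
        deadline += FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)    # burn off the leftover frame time
        else:
            deadline = time.perf_counter()  # missed the slot; don't spiral
```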
 

Mistwalker

Senior member
Feb 9, 2007
343
0
71
Who ever promised you that?

Fallacy time?
What? I was responding to a post stating single monitor shouldn't be tested at all, and my point was that it can be far from a useless inclusion.

That proves how poorly optimized the game is.Tri or quad fire or sli should be tested in eyefinity resolutions.
Personally I agree about the poor optimization and the desire for more focus on the (multi-monitor) resolutions people will be buying these cards to play at. Just remarking that in certain cases even single-monitor resolutions aren't maxed out, so it's informative to include those (as a measure of just how unoptimized some things remain, if nothing else).
 

realjetavenger

Senior member
Dec 8, 2008
244
0
76
Using a frame limiter to match the refresh rate of the monitor can eliminate practically all traces of MS, but it can be annoying to get it to work across all games.

I was getting really tired of microstutter. Last night I tried adaptive vsync for the first time, and so far MS has been reduced to the point where it no longer bothers me. Sure, it's only after one night of using it, but it looks promising.
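Adaptive vsync, as advertised, just toggles vsync per frame: on while the GPU can hold the refresh rate, off when it drops below, so you get a bit of tearing instead of the hard drop to half refresh. A toy sketch of that decision (hypothetical names, nothing driver-specific):

```python
REFRESH_HZ = 60

def vsync_for_next_frame(last_frame_time_s: float) -> bool:
    """Toy version of the adaptive vsync decision: keep vsync on while the
    GPU holds the refresh rate, drop it when frame rate falls below so the
    game tears slightly instead of snapping down to half refresh."""
    return (1.0 / last_frame_time_s) >= REFRESH_HZ
```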
 

realjetavenger

Senior member
Dec 8, 2008
244
0
76
Again... BF3 adjusts LOD depending on available VRAM; the idea is to max out your VRAM regardless of how much you have. If you have 3GB, BF3 will max it out. If you have 2GB, BF3 will max it out. It's not like your performance will degrade if you go from 3GB to 2GB; the LOD just lowers slightly and the performance will be the same. This was all covered by a DICE developer at last year's GeForce LAN....

This is the only game I'm aware of that even has such a mechanic. Anyway, I'm saying, give customers a choice. I don't see an issue with that - it won't happen anyway because AMD are clueless.

I remember when DICE was talking about this before the game was released. It sounded awesome that they had created an engine capable of this. However, unless they only meant it for GPUs that have more than 1GB of VRAM, I don't believe it.
Now, FWIW, I am only a sample of one, but my cards only have 1GB of VRAM. If I turn up textures, the game becomes a slideshow where everything starts teleporting, even though Fraps is telling me the FPS is 50 or more.
This is with textures set to high or ultra. When turning textures down to medium with no other changes, the FPS goes up a hair but the game runs smoothly. So, if the claim by DICE were true, shouldn't the engine adjust the LOD so that the game doesn't become a slideshow? Or am I looking at it upside down?
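For reference, the mechanic being debated (pick texture/LOD detail from whatever VRAM budget is present, rather than choking when it runs out) boils down to something like this. Purely an illustrative sketch; none of these names or thresholds come from Frostbite:

```python
def pick_texture_quality(vram_mb: int) -> str:
    """Illustrative only: choose a texture/LOD tier from the VRAM budget.
    The tiers and thresholds here are invented for the example."""
    if vram_mb >= 2048:
        return "ultra"
    if vram_mb >= 1536:
        return "high"
    if vram_mb >= 1024:
        return "medium"
    return "low"

# The complaint above: with a 1GB card and textures forced to ultra, the
# budget apparently isn't stepped back down, so data spills out of VRAM
# and the game turns into a slideshow despite a healthy FPS counter.
print(pick_texture_quality(1024))   # -> "medium"
```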
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
I was getting really tired of microstutter. Last night I tried adaptive vsync for the first time, and so far MS has been reduced to the point where it no longer bothers me. Sure, it's only after one night of using it, but it looks promising.

I'm looking forward to trying out adaptive vsync; it sounds great on paper.
 
Feb 6, 2007
16,432
1
81
Important information for the discerning customer willing to spend $2,000 on graphics cards but no more than $200 on a monitor. I'm going to go out on a limb and guess that people who are actually going to run three or four 7970s or 680s are going to be running multiple monitor setups, and, as of right now, the only ATI driver that allows that is 5 months old. That's a pretty major oversight, and it makes nVIDIA the de facto winner at the bleeding edge level (multi-GPU with multi-monitor).
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
Important information for the discerning customer willing to spend $2,000 on graphics cards but no more than $200 on a monitor. I'm going to go out on a limb and guess that people who are actually going to run three or four 7970s or 680s are going to be running multiple monitor setups, and, as of right now, the only ATI driver that allows that is 5 months old. That's a pretty major oversight, and it makes nVIDIA the de facto winner at the bleeding edge level (multi-GPU with multi-monitor).

AMD does seem to be pretty slow with driver fixes and support recently, and some problems have just been ignored completely. I guess it's due to supporting 3 architectures at once (VLIW-4, VLIW-5 and GCN), so it's good to hear they are dropping VLIW-5 support in the near future, but I hope this translates into strides with their GCN drivers.
 

blckgrffn

Diamond Member
May 1, 2003
9,687
4,348
136
www.teamjuchems.com
AMD does seem to be pretty slow with driver fixes and support recently, and some problems have just been ignored completely. I guess it's due to supporting 3 architectures at once (VLIW-4, VLIW-5 and GCN), so it's good to hear they are dropping VLIW-5 support in the near future, but I hope this translates into strides with their GCN drivers.

Oh, I don't think they are dropping VLIW-5 any time soon. There are 7xxx series cards (mobility) that are still VLIW-5, and Llano. That's here for a looooong time to come :)

It's Pre-DX11 cards that they have pushed in front of the bus :)
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
Oh, I don't think they are dropping VLIW-5 any time soon. There are 7xxx series cards (mobility) that are still VLIW-5, and Llano. That's here for a looooong time to come :)

It's Pre-DX11 cards that they have pushed in front of the bus :)

Of course, I forgot that the 4000 and the 5000 series are both VLIW-5, d'oh! :) Shame that not all of the 7000 mobility series is GCN too, oh well.

Well, that's not very good news then, tbh. Ever since the introduction of a new arch it seems that AMD has been slower to respond. Hell, I'm still waiting for 6900 series support for GTA IV!
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
You might as well add that Inquirer link to your sig now; it's pretty well known that the severance of the GlobalFoundries relationship is the reason for the one-time charge-off and loss. Don't let that curb your enthusiasm though, since the graphics market share chart was released last week as well :thumbsup:
 

jacktesterson

Diamond Member
Sep 28, 2001
5,493
3
81
I would love to see this review again with the 12.7 betas used.

I know they improved performance greatly for me.
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
So would I, but the problem I have is that it's more than 6 months since they released this card, and AMD typically stop making improvements to a card's performance once its successor arrives on the scene. So we probably have until this autumn before the 7900 is 'retired' and we likely see no further improvements. I hope I'm wrong, but that appears to me to be the case (I've been with ATI/AMD since a 4870 replaced my 8800 GTX, but the green team's driver support is pushing me back towards their camp despite myself). AMD need to spend a little more on driver engineers for their GPUs so stuff works off the bat, not 6 months after launch.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
So would I, but the problem I have is that it's more than 6 months since they released this card, and AMD typically stop making improvements to a card's performance once its successor arrives on the scene. So we probably have until this autumn before the 7900 is 'retired' and we likely see no further improvements. I hope I'm wrong, but that appears to me to be the case (I've been with ATI/AMD since a 4870 replaced my 8800 GTX, but the green team's driver support is pushing me back towards their camp despite myself). AMD need to spend a little more on driver engineers for their GPUs so stuff works off the bat, not 6 months after launch.

I'm still seeing improvements on my 5xxx cards.