[H] - GTX680 3-Way SLI vs. 7970 Tri-Fire review

Page 4

blackened23

Diamond Member
Jul 26, 2011
You are right, AMD is never conservative and just puts useless amounts of VRAM on their cards. Perhaps you should email them and let them know that 1.5GB would have been plenty, since no game needs more than that even at 2560x1600. AGAIN, I hit my VRAM limit in some games at just 1920x1080 with just a GTX 570, but I guess higher settings or a higher resolution would never go over 1.5GB according to you.

Actually, if you've read any launch review, they designed the 7970 for surround gaming. THAT'S why it has 3GB.

I'm still asking for proof that 3GB benefits a single-screen resolution; I have not seen any. I provided my citations earlier. Benchmarks, please.
 

Jaydip

Diamond Member
Mar 29, 2010
At which graphs, for which card? They do not have a 1.5GB card that is as fast as a 7970.
They have the 580 in Skyrim; look at my earlier link.
The BF3 graphs between the 680 and 7970 at Eyefinity resolution.
 

toyota

Lifer
Apr 15, 2001
Actually, if you've read any launch review, they designed the 7970 for surround gaming. THAT'S why it has 3GB.

I'm still asking for proof that 3GB benefits a single-screen resolution; I have not seen any. I provided my citations earlier. Benchmarks, please.
You seem like a pretty smart guy, so answer me this: if my GTX 570 can use its maximum VRAM in some games at 1920x1080, wouldn't you think that at higher settings and a higher resolution a faster 7970 could use more than 1.5GB of VRAM? And again, FPS don't tell the whole story; I know that from hitting VRAM limits on my 896MB GTX 260.
 

Jaydip

Diamond Member
Mar 29, 2010
I don't understand why people have a problem with a 1.5GB 7970 at $399. It would put tremendous pressure on NV, as the 670 Ti won't be able to match it. This would force NV to bring out better cards at lower price points; a win-win for consumers, I think.
 

Jaydip

Diamond Member
Mar 29, 2010
You seem like a pretty smart guy, so answer me this: if my GTX 570 can use its maximum VRAM in some games at 1920x1080, wouldn't you think that at higher settings and a higher resolution a faster 7970 could use more than 1.5GB of VRAM? And again, FPS don't tell the whole story; I know that from hitting VRAM limits on my 896MB GTX 260.
As I said n times in this thread already, there will always be exceptions :)
 

toyota

Lifer
Apr 15, 2001
As I said n times in this thread already, there will always be exceptions :)
And as I have said, you don't buy a flagship card to only play games where there are no exceptions. It's silly to spend over 400 bucks and EVER have to turn down settings for the sole reason of not having enough VRAM. Next year we can look back at this thread and laugh, just like with all of the other VRAM arguments where people thought 512MB, 768MB, or 1GB was enough.
 

Jaydip

Diamond Member
Mar 29, 2010
And as I have said, you don't buy a flagship card to only play games where there are no exceptions. It's silly to spend over 400 bucks and EVER have to turn down settings for the sole reason of not having enough VRAM. Next year we can look back at this thread and laugh, just like with all of the other VRAM arguments where people thought 512MB, 768MB, or 1GB was enough.
We can laugh now if you want :)
As I said, spending $400 doesn't entitle you to play all games at max settings. It has nothing to do with VRAM but mostly with the GPU itself.
 

aaksheytalwar

Diamond Member
Feb 17, 2012
The answer is simple. There may not be many games that need more than 1.5GB of VRAM today, but within another 6-12 months there will probably be a number of taxing games that need more than 1.5GB; perhaps not a full 3GB, but more than 1.5GB. Within 12-18 months, half of the most-wanted games may need between 2GB and 3GB of VRAM, by which point 4GB would be the norm on high-end GPUs.

The 768MB 460 had a good 12-month run but became crap after that, while the 1GB 460 lasted twice as long.

The 1GB 4870 cost a bit more than the 512MB version but lasted at least 1.5 times as long.

So a top-end card with 1.5GB of VRAM makes no sense.

It does make sense for the 7950, but the price would only drop by $50 at most, putting it in a very tough spot between the 7870 and 7950, which are very close in performance and price anyway. If a 1.5GB 7950 were priced at $350 it might sell well, but then the 7870 would need to go below $300, and the 7850 wouldn't last at $250+. So their whole strategy of high pricing takes a hit there.
 

fuzzymath10

Senior member
Feb 17, 2010
Splitting a low-volume SKU will also probably increase fixed production costs. When buying a flagship GPU, I can't imagine cost is as significant a factor as it is in the $200-300 class. The "at any cost" mentality is probably driving flagship sales sufficiently that it offsets the cost of getting a second cut-down model onto shelves. With lower sales volumes, the ideal production quantity is harder to predict, and the risk of idle inventory is high.

If having flagship GPU performance were important to me, I would acknowledge that I have to pay to play, and $50 isn't a whole lot (in the 10% ballpark, which should be immaterial for anyone desiring premium products).
 

aaksheytalwar

Diamond Member
Feb 17, 2012
Splitting a low-volume SKU will also probably increase fixed production costs. When buying a flagship GPU, I can't imagine cost is as significant a factor as it is in the $200-300 class. The "at any cost" mentality is probably driving flagship sales sufficiently that it offsets the cost of getting a second cut-down model onto shelves. With lower sales volumes, the ideal production quantity is harder to predict, and the risk of idle inventory is high.

If having flagship GPU performance were important to me, I would acknowledge that I have to pay to play, and $50 isn't a whole lot (in the 10% ballpark, which should be immaterial for anyone desiring premium products).
Perhaps this as well.
 

jacktesterson

Diamond Member
Sep 28, 2001
I've generally used more AMD/ATI cards over the years than Nvidia and have always been happy.

I still am now, but what's behind the delay on driver updates for the 7000 series?
 

Elfear

Diamond Member
May 30, 2004
Not modding isn't turning down settings though, but after reading what else you had to say, the extremist logic is clearly at play.

Turning off AA IS turning down settings, though. If you think that is extremist thinking, then you're in the wrong forum.
 

blackened23

Diamond Member
Jul 26, 2011
And still there are no benchmarks to corroborate claims that more than 1.5GB is beneficial at single-monitor resolutions. After staring at two pages of benchmarks from GTX 580 1.5GB vs. 3GB reviews showing no difference, I just want some proof of the claims here ;) Citation needed; I already provided some earlier using a GTX 580 1.5GB vs. 3GB testbed.

As far as being future-proof... yeah, right... let's face it, games are developed on consoles first, then the PC.
 

Lonbjerg

Diamond Member
Dec 6, 2009
Read the forum thread over at HardOCP... it scares me that so many people are ignorant about microstuttering.

It's like watching lemmings...go ape*bleep* over the current FOTM:

OMG...GHZ!!!!
OMG...FPS!!!!

WTF...IPC?!
WTF...Micro-what?!

:(
 

toyota

Lifer
Apr 15, 2001
And still there are no benchmarks to corroborate claims that more than 1.5GB is beneficial at single-monitor resolutions. After staring at two pages of benchmarks from GTX 580 1.5GB vs. 3GB reviews showing no difference, I just want some proof of the claims here ;) Citation needed; I already provided some earlier using a GTX 580 1.5GB vs. 3GB testbed.

As far as being future-proof... yeah, right... let's face it, games are developed on consoles first, then the PC.
Please, not the "games are made for consoles" BS. Have PC games, including ports, gotten more demanding in the last 5-6 years while consoles have been "limiting" us? HELL YES. Even in the last year games have become much more demanding than they used to be, so give that tired old excuse a rest.
 

Elfear

Diamond Member
May 30, 2004
And still there are no benchmarks to corroborate claims that more than 1.5GB is beneficial at single-monitor resolutions. After staring at two pages of benchmarks from GTX 580 1.5GB vs. 3GB reviews showing no difference, I just want some proof of the claims here ;) Citation needed; I already provided some earlier using a GTX 580 1.5GB vs. 3GB testbed.

As far as being future-proof... yeah, right... let's face it, games are developed on consoles first, then the PC.

The VRAM debate has heated up in the last few weeks, and I've been thinking about starting a thread with some user input. The overwhelming majority of games out there will run just fine on a single screen with 1.5GB of VRAM. I'm curious about the few games that might push past that, though. Benchmarks would go a long way toward settling the matter.

I'll try to make time in the next couple of days to start a thread and see if we can get some data from AnandTech members.
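
For anyone who wants to contribute numbers, something like the rough sketch below can log VRAM usage over time on NVIDIA cards. It assumes the nvidia-ml-py (pynvml) bindings are installed and a single GPU at index 0; it's only an illustration, and AMD owners would need a different monitoring tool.

```python
# Rough sketch: log VRAM usage every few seconds while a game is running.
# Assumes the nvidia-ml-py package (pynvml) and an NVIDIA GPU at index 0.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; change the index for other cards

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .used and .total are in bytes
        print("VRAM used: %d MiB of %d MiB" % (mem.used // 2**20, mem.total // 2**20))
        time.sleep(5)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```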
 

blastingcap

Diamond Member
Sep 16, 2010
The VRAM debate has heated up in the last few weeks, and I've been thinking about starting a thread with some user input. The overwhelming majority of games out there will run just fine on a single screen with 1.5GB of VRAM. I'm curious about the few games that might push past that, though. Benchmarks would go a long way toward settling the matter.

I'll try to make time in the next couple of days to start a thread and see if we can get some data from AnandTech members.

I would go one step further and say 1GB is enough for 1080p or less, if you are willing to live with lower MSAA or less-intensive AA types.

The false logic of people in this thread completely ignores WHY VRAM requirements have gone up even during the lifetime of the Xbox 360 and PS3.

Part of it is that PC games sometimes get higher-res textures and such, true.

But part of it is because console programmers got more efficient as they climbed the learning curve.

And lastly, and this is VERY important: resolutions went up. Six years ago, many people were gaming at lower resolutions than they are today, and that explains a huge part of why VRAM requirements went up. 1366x768 is half the pixel count of 1080p, for instance. Six years ago I was at 1280x1024, which is about 57% of 1920x1200.
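
A quick, purely illustrative check of those pixel-count ratios (this just restates the arithmetic above):

```python
# Illustrative only: pixel counts for the resolutions mentioned above.
resolutions = {
    "1366x768": 1366 * 768,    # 1,049,088 pixels
    "1920x1080": 1920 * 1080,  # 2,073,600 pixels
    "1280x1024": 1280 * 1024,  # 1,310,720 pixels
    "1920x1200": 1920 * 1200,  # 2,304,000 pixels
}

# ~0.51: 1366x768 is roughly half the pixels of 1080p.
print(resolutions["1366x768"] / float(resolutions["1920x1080"]))

# ~0.57: 1280x1024 is about 57% of 1920x1200.
print(resolutions["1280x1024"] / float(resolutions["1920x1200"]))
```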

If you plan to keep your current resolution for the next two years, chances are you will not need more than whatever VRAM you've already got, unless you are already hitting the VRAM wall today and can't bear the thought of turning down AA a little or using a less stressful AA type. New consoles won't come out at affordable prices until 2+ years from now, so most games are unlikely to really push VRAM until then, mods and ultra-high-res texture packs notwithstanding.

I also think you rapidly get diminishing returns on VRAM-hogging stuff like AA and high-res texture packs, by the way. Going from 0x to 2x MSAA is nice, and maybe even 2x to 4x MSAA can be a notable difference, but 4x to 8x MSAA won't make nearly as much of a difference. Just as one example.

Also, for many (most?) people, by the time their VRAM becomes a limiting factor in more than a few games, the GPU itself will likely be outdated anyway. I'd rather have a fast card with 1GB VRAM than a medium-speed card with 1.5GB VRAM for instance, because by the time 1GB VRAM really isn't enough, that medium-speed 1.5GB VRAM card will likely be too slow anyway. A little extra VRAM will not do much to "futureproof" yourself... not that I believe in the concept of futureproofing yourself in an industry as fast-moving as GPUs anyway. (All bets are off if you decide on a resolution upgrade of course.)
 
Feb 6, 2007
The main thing I took away from that article was that ATI's driver support has been unforgivable this generation, first in taking months to release WHQL drivers that supported the 7970, and still not having WHQL drivers that support CrossFire on Eyefinity setups (which, let's face it, if you're running two or more 7970s, you're probably hooking them up to multiple monitors). It's absolutely ridiculous, and if I were in the market for a multi-GPU, multi-monitor setup, I couldn't even consider ATI's offering this generation. It simply doesn't work. And that's shocking.

All other things aside (minimum framerates, microstuttering, ease of hardware installation/setup, cost), the fact that 5 months after release there are still no working drivers for ATI's flagship in extreme rigs is a dealbreaker.
 

jacktesterson

Diamond Member
Sep 28, 2001
Read the forum thread over at HardOCP... it scares me that so many people are ignorant about microstuttering.

It's like watching lemmings...go ape*bleep* over the current FOTM:

OMG...GHZ!!!!
OMG...FPS!!!!

WTF...IPC?!
WTF...Micro-what?!

:(


I rarely notice it.
 

blastingcap

Diamond Member
Sep 16, 2010
The main thing I took away from that article was that ATI's driver support has been unforgivable this generation, first in taking months to release WHQL drivers that supported the 7970, and still not having WHQL drivers that support CrossFire on Eyefinity setups (which, let's face it, if you're running two or more 7970s, you're probably hooking them up to multiple monitors). It's absolutely ridiculous, and if I were in the market for a multi-GPU, multi-monitor setup, I couldn't even consider ATI's offering this generation. It simply doesn't work. And that's shocking.

All other things aside (minimum framerates, microstuttering, ease of hardware installation/setup, cost), the fact that 5 months after release there are still no working drivers for ATI's flagship in extreme rigs is a dealbreaker.

What else is new? NVidia almost always has better SLI support than AMD has CF support.

That said, I think multi-GPU setups are almost always a bad idea for power/heat/noise and perf/price reasons. Microstutter, driver bugs, lack of profile support, etc... who needs that headache?