Resolution. As it goes up, Cayman gets faster relative to Fermi.
Now, there is (or was) an issue with bandwidth over the CrossFire connection that seems to bottleneck scaling as you add more cards. This might be avoided, or at least limited, by going 6990+6970: only two cards instead of three or more discrete cards. That might be why this particular combo competes so well against the nVidia solutions; we've seen comparisons where tri/quad CrossFire wasn't as dominant against tri/quad SLI. If so (and I'm only speculating), it's not an issue of architecture, simply of hardware implementation.
There's an article @ Kitguru that says the next-gen CrossFire will be "the fastest yet" thanks to an improved interface between the cards. Maybe the best is yet to come?
http://www.kitguru.net/components/graphic-cards/dragan/3rd-generation-crossfire-will-be-fastest-yet/
No matter. For now, the 6990+6970 trifire setup is the dominant setup in its class. This latest article at [H] leaves no doubt about that.
I grabbed a pair of Palit 3GB GTX580s to replace my EVGA reference cards, after talking to Vega (http://hardforum.com/member.php?u=93212) @ HardOCP. I don't game at 3x(2560x1600) but at 3x(1920x1200), and I don't liquid cool. I also run ECC memory, which limits my CPU overclocks. The cost was about the same as a 6990+6970, and, to what may prove to be my dismay, I do prefer the green team.
I had the sudden feeling last night that nVidia's cards, compute monsters though they are, were simply severely VRAM limited. What prompted the thought was a guy at work who installed standard nV drivers on his Quadro, which then turned into (basically) a GTX470 with 6GB of memory, while loading a huge map in ARMA II.
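To put some rough numbers behind the VRAM-limitation hunch, here's a back-of-envelope sketch of how much memory just the framebuffers eat as resolution and AA go up. This is my own simplified model, not anything from the article: it counts only double-buffered color plus a depth/stencil buffer, all stored per-sample under MSAA, and ignores textures, geometry, and driver overhead, which in real games usually dominate.

```python
# Rough framebuffer VRAM estimate (assumption: color front/back buffers
# plus one depth/stencil buffer, each stored per-sample under MSAA).
# Real usage is far higher once textures and driver overhead are counted.

def framebuffer_mb(width, height, msaa=1, buffers=2, bytes_per_pixel=4):
    """Approximate framebuffer memory in MiB for a given mode."""
    pixels = width * height
    color = pixels * bytes_per_pixel * msaa * buffers   # front + back buffer
    depth = pixels * 4 * msaa                           # 24-bit depth + 8-bit stencil
    return (color + depth) / (1024 * 1024)

modes = [(1920, 1200, "single 1920x1200"),
         (2560, 1600, "single 2560x1600"),
         (3600, 1920, "3x1920x1200 portrait"),
         (7680, 1600, "3x2560x1600")]

for w, h, label in modes:
    for msaa in (1, 4, 8):
        print(f"{label:>22} @ {msaa}xMSAA: {framebuffer_mb(w, h, msaa):7.1f} MiB")
```

Even under this bare-bones model, 3x2560x1600 with 8xMSAA runs to well over a gigabyte of buffers alone, which is where a 1.5GB card starts to feel the squeeze.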
I'll post benchmarks (and pics) using Civ V, Metro and BF2 @ 3600x1920, as they seem to be the most popular benchmarks (I only play Civ V).
Daimon
Don't forget pics of the system too.
So you use portrait mode?
Why test those games? They're not GPU intensive; two of them are console ports, in fact. Hardly anyone gives a flying toss about GPU PhysX and the 15 games in the world that use it, 80% of which are utter trash.
They picked the most intensive games around, and the 580 tri-SLI got taken to the cleaners with its $500 more expensive price tag.
It's an excellent review, because it highlights for a buyer considering wasting $1000 on 580 dual SLI that for the same money they can get faster performance from a 6990+6970 tri-fire. And the icing on the cake is that the $1000 6990+6970 tri-fire is faster than the $1500 580 tri-SLI setup as well.
There is no argument here. If the best one can muster is turning on PhysX in one game, or resorting to benches of a console port run with drivers that didn't support CrossFire, they may as well not come to the party.
What % of people are using these card setups? And of those, what % are using the multi-monitor resolution?
The reviews are fun to look at, but mean nothing to pretty much anyone.
Embarrassment, LOL

Wrong. You can rationalize all day, but the high end holds symbolic and psychological importance for market share in the overall "gamer's" market.
Nvidia's failure to produce video cards with more than 1.5GB of VRAM this generation is an embarrassment for a company that has always prided itself on producing premium products at a premium price. Especially in the 1GB and 1.28GB configurations, what you've got now is hardware that is overpriced compared to AMD's and hampered in multi-monitor configurations or when using supersampling AA modes. Even 3D uses more VRAM than 2D, which is why I wish AMD had its own first-party 3D solution instead of the convoluted crap they have now.
Also, 3x1080p is actually not *that* expensive these days, and thousands of gamers are moving toward Eyefinity/Surround, so yes, it is important to those people, aka people in this forum and other similar forums.
Embarrassment, LOL
Lot of drama. You're ignoring that you can buy a 3GB 580 if that's the intended mountain to climb.
The 580 @ 1.5GB is enough for 2560x1600 30-inch monitors; some people decide to get two or three of these cards for added fps, or for 3D.
Multi-monitor is more of a niche than 3D, something AMD cards are way behind in; the iZ3D option for AMD 3D doesn't support CrossFire.
It's good for gamers that either brand has its advantages.
edit: The issue is not that 3x 1080p monitors cost that much. I'm never going to upgrade to that setup; it's just not a desirable option for some of us who would consider a 30-inch monitor ideal, and that does cost a lot. There are people with dual GTX 580/570-type power pushing single monitors.
My worst mistake was being loyal to ATI for the last 8 years. I'm sorry, but their cards just feel castrated at the important resolutions.
I tried staying quiet, but lol: a configuration that almost no one plays at gets so much attention because AMD wins, but a case where the 570 stomps a more expensive 6970 in an overclock test that is far more useful to a lot more people gets ignored: http://www.insidehw.com/Reviews/Grap...d-to-head.html

HardOCP is more popular and established than that website, which I've never even heard of, so it's not surprising you're seeing their articles made into topics of discussion. If you think your article should get attention, then create a new thread. So it's head-scratching how you come to the conclusion that people are ignoring something (this article) they most likely didn't even know existed.
And very lol at the guy saying Nvidia failed for not adding more than 1.5GB of VRAM. That would have been useless to almost everyone. 2GB saved AMD because of noobs who think it's necessary for 1920x1200 and below. It would have been very ugly for AMD if they hadn't added 2GB of VRAM; they would have lost on every front. You know, resolutions that people actually play at.

AMD put 2GB on these cards so they wouldn't flounder at the more demanding resolutions. You know, the resolutions that actually put these powerful cards through the most work. Resolutions below 1080p can get by with 1GB of VRAM (for now), but they can also get by with slower GPUs.
Surprising, considering GeForce 3 > Radeon 8500, GeForce 6800GT > X850 Pro, and the GeForce 8xxx series walked all over the HD 29xx/38xx. So how did you manage to only purchase ATI cards all this time? hehe
Either way, don't take it too personally. Once HD 7000 and Kepler arrive, this 'refresh' generation will be forgotten rather quickly. What won't be forgotten is the $100-200 saved by going with 6950/70 cards over the laughably overpriced 580, which will be just as obsolete 12 months from now as the 15% slower 6970 cards.
Edit: @ AdamK47: I'll kill your i7 on Handbrake! (but not on FPS gaming, at all)
I had to look up what Handbrake is.
NewEgg has the Palit 3GB cards again... same bullshit "reviews" from people who don't own the card. I wonder why NewEgg allows that.
http://www.newegg.com/Product/Produc...scrollFullInfo
Daimon
De-activated. So these cards are not for sale anymore?
