
R580 Beating GTX 512 by 25%+

I think in terms of high FPS, the 7900GTX will win hands down. But in terms of making a card that will take less hits with all eye-candy on and remain at a 65++ FPS, the X1900XT will win. But not hands down =)
 
Originally posted by: TecHNooB
I think in terms of high FPS, the 7900GTX will win hands down. But in terms of making a card that will take less hits with all eye-candy on and remain at a 65++ FPS, the X1900XT will win. But not hands down =)

Percentage-wise, ATI cards are in fact much more efficient, losing less performance when AA is enabled. With that in mind, Nvidia does seem to perform better when no AA/AF is enabled, but I don't recall anyone bringing this up. Wouldn't this suggest that Nvidia cards will generally age better once you can no longer enable AA/AF?

Also I thought X1800xt/xl are 16x1 and R580 will be 16x3? Can someone correct me on this?
 
SLI isn't worth the money IMO... I mean, if I buy one card for 100% of the performance, why buy two for only 160%? Or 170%? Sure, I'm a tech guy, I know it doesn't work that way with bandwidth/cross-chip communication and all that... but I'm not getting 30% off the price of the second card either. And there's the possibility that some/most games (dunno what the split is these days) don't benefit from it; some games don't even need to benefit from it. I dug up the old 7800 GTX 512MB review on AnandTech... seems that in DoD:S at 1600x1200 4xAA/8xAF, a single 512 delivers 60.8 fps. Smooth, I'd reckon, since 60 is just that. The SLI 512s, though, give 59.9 fps. It's not worth paying another 700 bucks for the same smoothness. At higher resolutions, yes, it'll become more necessary, but I'm not planning on squinting at my 19" CRT, and I think the majority of people play at 1600x1200 or less.

Funny thing is, though, on this computer even with quad SLI I would see frame drops to sub-25 fps levels. Bloody HDD. Anyway, SLI's really only useful if you want to play on a big screen, at a big resolution, and you have too much money anyway, which cuts out like 99% of all consumers. So it's not really SLI FTW. It's SLI because the single cards aren't fast enough to power my huge expensive monitor, which I HAVE to run at high resolutions or else it looks like crap.
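The cost-per-frame arithmetic in the post above can be sketched out. This is a back-of-the-envelope illustration only; the $700-per-card price is the figure quoted in the post, and the fps numbers are the DoD:S results cited from the review:

```python
# Rough cost-per-frame comparison, single card vs. SLI, using the DoD:S
# numbers quoted above (60.8 fps single, 59.9 fps SLI at 1600x1200 4xAA/8xAF)
# and the ~$700 card price mentioned in the post. Illustrative only.

def cost_per_fps(price_usd, fps):
    """Dollars spent per frame per second delivered."""
    return price_usd / fps

single = cost_per_fps(700, 60.8)    # one 7800 GTX 512
sli = cost_per_fps(2 * 700, 59.9)   # two of them in SLI

print(f"single card: ${single:.2f} per fps")
print(f"SLI pair:    ${sli:.2f} per fps")
# At this setting the SLI pair costs roughly 2.0x as much per frame.
print(f"ratio: {sli / single:.1f}x")
```

With a benchmark where SLI fails to scale, doubling the outlay roughly doubles the cost per frame, which is the poster's point.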
 
Originally posted by: Corporate Thug
Originally posted by: Killrose
The release of the X1900XT/X-whatever will at least make prospective buyers of 512MB 7800GTXs happy; the price will now be substantially lower. $750 won't be possible to charge for one of these anymore.

The 'ol price/performance ratio will come into play.

I'm not sure about that. Just take a look at the 6800U 512MB. They are still going for $500+.


Think that's bad? Check this $h!t out!
 
The problem with SLI & Crossfire, as I see it, is that you don't get a 100% improvement in performance.
You maybe get 80%, and with many games, a lot less.

So the "value" of buying two cheaper - let's say midrange - cards vs. one more expensive card is somewhat lost, not to mention the added cost of the SLI motherboard.

I've been considering doing something like SLI 7800GTs myself, but every time I think about it, the negatives outweigh the positives...
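The scaling claim in the post above — maybe an 80% gain from the second card, often much less — can be made concrete with a small helper. The frame rates below are made-up illustrative numbers, not benchmarks:

```python
def sli_efficiency(single_fps, dual_fps):
    """Fraction of a second card's theoretical contribution actually realized.
    Perfect scaling would double the frame rate (efficiency == 1.0)."""
    gain = dual_fps - single_fps
    return gain / single_fps

# Illustrative numbers: a title that scales well vs. one that barely scales.
good = sli_efficiency(50, 90)   # +40 fps on a 50 fps base -> 0.8 (the ~80% case)
poor = sli_efficiency(60, 66)   # +6 fps on a 60 fps base  -> 0.1 ("a lot less")
print(good, poor)
```

Comparing that efficiency number against the second card's price (plus the SLI board premium) is exactly the midrange-pair-vs-one-big-card trade-off being weighed here.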
 
Originally posted by: TecHNooB
I think in terms of high FPS, the 7900GTX will win hands down. But in terms of making a card that will take less hits with all eye-candy on and remain at a 65++ FPS, the X1900XT will win. But not hands down =)

ATI is better at processing complex shaders; Nvidia is trying to win with pure muscle.
 
Originally posted by: PC Surgeon
Originally posted by: Corporate Thug
Originally posted by: Killrose
The release of the X1900XT/Xwhatever will at least make prospective buyers of 512mb 7800GTX's happy, the price will now be substantialy lower. $750 won't be possible to charge for one of these anymore.

The 'ol price/performance ratio will come into play.

i'm not sure about that. just take a look at the 6800U 512mb. they are still goin for $500+


Think that's bad? Check this $h!t out!

That belongs in the hotdeals forum
 
Originally posted by: xtx4u
Originally posted by: PC Surgeon
Originally posted by: Corporate Thug
Originally posted by: Killrose
The release of the X1900XT/Xwhatever will at least make prospective buyers of 512mb 7800GTX's happy, the price will now be substantialy lower. $750 won't be possible to charge for one of these anymore.

The 'ol price/performance ratio will come into play.

i'm not sure about that. just take a look at the 6800U 512mb. they are still goin for $500+


Think that's bad? Check this $h!t out!

That belongs in the hotdeals forum


LOL!
 
Originally posted by: Rollo
You can rationalize having less flexibility and power in your computer graphics all you like, but the fact remains, SLI is MUCH better.

Who's rationalizing anything? And your above claim is only true for games that work with dual PEG solutions. A single high-end board has 100% compatibility in today's games. . .SLI and X-fire do not.

As for your Call of Duty 2 example, I played it at 1680x1050 with 2x AA, 4x AF, DX9, high texture settings, etc., and it ran fine on a X800 XT (500/500). Hardly the poster child for your SLI advertising, IMO.
 
Originally posted by: Rollo
From my perspective, nVidia parts are already offering a pretty wide variety at the highend. With 256 7800 GTX SLI, 256 7800GT SLI, 256 7800GT SLI on one card, 512 7800GTX, and 512 7800GTX SLI either for sale or supposedly for sale in a week or two, users have a lot of nVidia options without them even launching a part.

At this point, SLI options FTW. Come month end, who knows?

you are absolutely right . . . until the 24th when the balance shifts and ATi releases their new lineup . . . then nVidia must reply and everything changes . . . i even expect to see a more refined x-fire to compete with sli.

which is good . . . the performance of the 512gtx will become affordable.
 
Wow.

Over three pages arguing about a rumored, yet-to-be-released part being faster than an existing sort-of-available part, that may be outperformed by another yet-to-be-released part for which we also only have rumored specs.

Oh yeah. And the old "SLI is pointless/SLI is king" debate thrown in for good measure.

Move along. Nothing to see here.
 
Originally posted by: Tanclearas
Wow.

Over three pages arguing about a rumored, yet-to-be-released part being faster than an existing sort-of-available part, that may be outperformed by another yet-to-be-released part for which we also only have rumored specs.

Oh yeah. And the old "SLI is pointless/SLI is king" debate thrown in for good measure.

Move along. Nothing to see here.

Too funny :laugh:


 
Nvidia more or less won 2005, due to ATI being a no-show for such a long time, but 2006 could end up being a major scrap! Bring on the speed, bring on the innovation!!
 
Originally posted by: John Reynolds
Originally posted by: Rollo
You can rationalize having less flexibility and power in your computer graphics all you like, but the fact remains, SLI is MUCH better.

Who's rationalizing anything? And your above claim is only true for games that work with dual PEG solutions. A single high-end board has 100% compatibility in today's games. . .SLI and X-fire do not.
:roll:
Examples of some games that are not compatible with SLI and Crossfire?

As for your Call of Duty 2 example, I played it at 1680x1050 with 2x AA, 4x AF, DX9, high texture settings, etc., and it ran fine on a X800 XT (500/500).
Wow John! You must have one of those magic X800XTs I've heard about!

All the ones I see reviewed REALLY SUCK at COD2!
http://www.xbitlabs.com/articles/video/display/games-2005_9.html

Hmmm.
A whole 13.2 fps at 1280x1024 with 4xAA/16xAF on a 4000+.

http://www.firingsquad.com/hardware/call_of_duty_2_midrange_graphics/page8.asp

Wow! The X800XL, a bit below your card, can muster a whole 18 fps at 1600x1200 with 2xAA/8xAF on a 3500+.

How wrong I was. 🙁 I'm sold. X800XTs are CLEARLY the card to have for sub 30 fps gaming fury!
 
Originally posted by: Rollo
:roll:
Examples of some games that are not compatible with SLI and Crossfire?

Roll your little eyes all you want, but there are a helluva lot of games that aren't compatible with either dual PEG solution. Of course the major titles have profiles in the drivers for them.

Wow John! You must have one of those magic X800XTs I've heard about!

All the ones I see reviewed REALLY SUCK at COD2!

. . . .

How wrong I was. 🙁 I'm sold. X800XTs are CLEARLY the card to have for sub 30 fps gaming fury!

Considering I played the game on a FX-57 and 2GB of RAM, I probably averaged around 30-35fps except for really intense firefights. The frame rate never had a negative impact on my sniping, that's for sure. That out of the way, I never said it was the best, the highest frame rate, I said it played fine for me. I don't have to have that marketing-driven, magic bullet # of 60fps floating in front of me to play a game.

And for someone who apparently goes running to the board mods quite often crying foul over the posting style of others, you need to learn how to have a mature conversation yourself.
 
Originally posted by: John Reynolds
Originally posted by: Rollo
:roll:
Examples of some games that are not compatible with SLI and Crossfire?

Roll your little eyes all you want, but there are a helluva lot of games that aren't compatible with either dual PEG solution. Of course the major titles have profiles in the drivers for them.
If you actually knew something about SLI, you'd know that:
1. You can easily create your own profiles; the ones in the drivers are just the settings nV found the most benefit at.
2. You can select which SLI mode you want to force on non-profiled games in the control panel: AFR1, AFR2, and SFR. For CPU-limited games that get no scaling benefit from two cards, you can choose SLI AA and use 8x or 16x AA.
So you see, just about every game gets some benefit from SLI.


Considering I played the game on a FX-57 and 2GB of RAM
Errrr, so what? You think an FX-57 is going to boost you much over a 4000+? CPU gains among A64s are trivial. The XBit review used 2GB of RAM as well.


I probably averaged around 30-35fps except for really intense firefights.
You consider that playable??!?!
I consider that a reason to buy SLI. You admit your game is lagging and chugging during firefights, and running at a pretty low AVERAGE when it's not. I want my games to look like real life, or movies. You apparently don't.


The frame rate never had a negative impact on my sniping, that's for sure.
What's the difference? If you have to watch the game characters moving like break dancers, does your "sniping" really matter?

That out of the way, I never said it was the best, the highest frame rate, I said it played fine for me. I don't have to have that marketing-driven, magic bullet # of 60fps floating in front of me to play a game.
Maybe if you did, you'd understand my perspective a little better and mock it less.

And for someone who apparently goes running to the board mods quite often crying foul over the posting style of tohers, you need to learn how to have a mature conversation yourself.
When you spread FUD on a board I care about I'm going to call you on it John.
 
Look at this thread - arguing over the G71 vs R580, dual cards vs single, an OC'd 3500+ vs an FX-57... this is so pointless, even I don't feel like jumping into the argument. But I will say that whatever new cards come out is a good thing, because it's ridiculous that someone can buy a $600+ card and yet it still chokes on certain games like FEAR and COD2.
 
Originally posted by: Rollo
Originally posted by: John Reynolds
Originally posted by: Rollo
You can rationalize having less flexibility and power in your computer graphics all you like, but the fact remains, SLI is MUCH better.

Who's rationalizing anything? And your above claim is only true for games that work with dual PEG solutions. A single high-end board has 100% compatibility in today's games. . .SLI and X-fire do not.
:roll:
Examples of some games that are not compatible with SLI and Crossfire?

As for your Call of Duty 2 example, I played it at 1680x1050 with 2x AA, 4x AF, DX9, high texture settings, etc., and it ran fine on a X800 XT (500/500).
Wow John! You must have one of those magic X800XTs I've heard about!

All the ones I see reviewed REALLY SUCK at COD2!
http://www.xbitlabs.com/articles/video/display/games-2005_9.html

Hmmm.
A whole 13.2 fps at 12X10 4X16X with a 4000+.

http://www.firingsquad.com/hardware/call_of_duty_2_midrange_graphics/page8.asp

Wow! The X800XL, a bit below your card, can muster a whole 18 fps at 16X12 2X8X with a 3500+.

How wrong I was. 🙁 I'm sold. X800XTs are CLEARLY the card to have for sub 30 fps gaming fury!

Rofl @ magic X800XT!
 
Originally posted by: KeepItRed
Originally posted by: Rollo
Originally posted by: John Reynolds
Originally posted by: Rollo
You can rationalize having less flexibility and power in your computer graphics all you like, but the fact remains, SLI is MUCH better.

Who's rationalizing anything? And your above claim is only true for games that work with dual PEG solutions. A single high-end board has 100% compatibility in today's games. . .SLI and X-fire do not.
:roll:
Examples of some games that are not compatible with SLI and Crossfire?

As for your Call of Duty 2 example, I played it at 1680x1050 with 2x AA, 4x AF, DX9, high texture settings, etc., and it ran fine on a X800 XT (500/500).
Wow John! You must have one of those magic X800XTs I've heard about!

All the ones I see reviewed REALLY SUCK at COD2!
http://www.xbitlabs.com/articles/video/display/games-2005_9.html

Hmmm.
A whole 13.2 fps at 12X10 4X16X with a 4000+.

http://www.firingsquad.com/hardware/call_of_duty_2_midrange_graphics/page8.asp

Wow! The X800XL, a bit below your card, can muster a whole 18 fps at 16X12 2X8X with a 3500+.

How wrong I was. 🙁 I'm sold. X800XTs are CLEARLY the card to have for sub 30 fps gaming fury!

Rofl @ magic X800XT!


Don't laugh man, some cards are sprinkled with magic dust aka white powder. When you snort the magic dust, even a voodoo 2 works at 60 fps+ 1920x1200 4xaa/16x af.
 
Hmm, the faster these high-end cards get, the more I'm hoping the 7600 will offer good performance for the budget gamer when it's released.
 
Originally posted by: Rollo
When you spread FUD on a board I care about I'm going to call you on it John.

You know, the really amusing thing is that you probably wrote that post without any thought of looking in the mirror. Hilarious hypocrisy coming from you, the master poster of FUD. The problem with trying to have a discussion with someone like you, Rollo, is that your agenda as a fanboy colors everything you write, every angle of every argument you engage in. One of these days I'll learn to stop wasting my time with your ilk.

Yea, 30-35fps is low but still playable enough for someone who wanted to wait until this spring's refreshes before upgrading. But don't let me dare interrupt your need to advertise for SLI. And, yes, I'm fully aware of user profiles, but there are still games that won't work with SLI's rendering modes. You're the one claiming 100% compatibility, and anyone who knows a thing or two about dual PEG solutions knows that isn't true. My only claim is that, yes, a single board has higher compatibility than a dual PEG setup. Whether or not that's an issue for someone is up to them to decide, since you never see me telling people how they should spend their money.
 
30-35 fps in Call of Duty 2 is absolutely not playable for this game. For some reason, higher framerates are needed for smooth gameplay. I am finding that even around 50 fps in this game is pushing it. If you're playing CoD2 at 30-35 fps, I expect you're getting owned all over the place in multiplayer.

Many other games will do fine at those framerates, but not this one. My GTX averages around 80-85 fps at 1024x768 4xAA/8xAF with everything set to high in DX9 mode. When tons of action, fog, etc. happen, those fps can, and have, dropped to 37 fps. Now if your X800 is averaging 30-35, you know those frames drop into the teens in a heartbeat.
 
Originally posted by: keysplayr2003
30-35 fps in Call of Duty 2 is absolutely not playable for this game. For some reason, higher framerates are needed for smooth gameplay. I am finding that even around 50 fps in this game is pushing it. If your playing CoD2 at 30-35 fps, I expect you to be getting owned all over the place in multiplayer.

Many other games will do fine at those framerates, but not this one. My GTX averages around 80-85 fps at 1024x768 4xAA 8xAF everything set to high in DX9 mode. When tons of action, fog, etc. happen, those fps can, and have dropped to 37 fps. Now if your X800 is averaging 30-35, you know those frames drop into the teens in a heartbeat.

Actually, COD2 runs fine for me at 35-45 fps @ 1280x1024, with everything turned up but AA/AF.
 
Originally posted by: keysplayr2003
30-35 fps in Call of Duty 2 is absolutely not playable for this game. For some reason, higher framerates are needed for smooth gameplay. I am finding that even around 50 fps in this game is pushing it. If your playing CoD2 at 30-35 fps, I expect you to be getting owned all over the place in multiplayer.

Many other games will do fine at those framerates, but not this one. My GTX averages around 80-85 fps at 1024x768 4xAA 8xAF everything set to high in DX9 mode. When tons of action, fog, etc. happen, those fps can, and have dropped to 37 fps. Now if your X800 is averaging 30-35, you know those frames drop into the teens in a heartbeat.
elitist opinion stated as fact. 😛

speaking of elitist:

i noted that "Rollo's benches" were at 4xAA/16xAF and John Reynolds was playing at 2xAA/4xAF . . . definitely makes a difference in playability . . . i doubt there is any "magic dust" involved [unless you are snorting it before posting]

in the CoD2 demo, with my aging x850xt@PE, i find 10x7 [even 11x8] with everything on high and minimal 2xAA/4xAF to be quite playable . . . but then i do not demand flawless FPS
:roll:
[nor do i like CoD2]
 