X1900XT FEAR benches at VRZone


Munky

Diamond Member
Feb 5, 2005
Originally posted by: Cookie Monster
Originally posted by: munky
Originally posted by: keysplayr2003
Originally posted by: munky
Originally posted by: cronic
Originally posted by: RussianSensation
If there is one card that was a joke this generation, it is certainly the 7800GTX. Not only can you get a 7800GT OC for $280-300 that matches 7800GTX performance (while the GTX costs $150 more), but the 7800GTX also gets stomped by the X1800XT where it matters the most -- shader-intensive games.

Proof:

FEAR - 1280x960 4AA/16AF
7800GTX 256 = 39
X1800XT = 54 (+38%)

Call of Duty 2 - 1024x768 4AA/16AF
7800GTX = 46.3
X1800XT = 55.6 (+20%)

Battlefield 2 - 2048x1536 4AA/16AF
7800GTX = 37.6
X1800XT = 48.2 (+28%)

Far Cry - 1600x1200 4AA/16AF
7800GTX = 48.2
X1800XT = 65.2 (+35%)

Splinter Cell: CT - 1600x1200 4AA/16AF
7800GTX = 41.5
X1800XT = 47.4 (+14%)

To make matters worse, the 7800GTX is now losing to the X1800XT even in some OpenGL games.

Quake 4 - 2048x1536 - 4AA/16AF
7800GTX = 37
X1800XT = 47.1 (+27%)

IL2 - 2048x1536 - 4AA/16AF
7800GTX = 41.6
X1800XT = 45.9 (+10%)

Doom 3 - 1600x1200 4AA/16AF
7800GTX = 51.9
X1800XT = 53.7 (+3%)

So unless you are a HUGE fan of Chronicles of Riddick, Pacific Fighters and AOE3, it certainly doesn't look good for the 7800GTX 256MB card as games become more and more shader-intensive. Unless of course you've been smoking some cronic :)
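For anyone wondering where those percentages come from, each one is just the ratio of the two average-fps figures. A minimal sketch, using only the numbers quoted above:

```python
# Sanity-checking the percentage leads quoted above: each is the ratio
# of the two average-fps figures (7800GTX, X1800XT), copied from the post.
benches = {
    "FEAR 1280x960":     (39.0, 54.0),
    "CoD2 1024x768":     (46.3, 55.6),
    "BF2 2048x1536":     (37.6, 48.2),
    "Far Cry 1600x1200": (48.2, 65.2),
    "SC:CT 1600x1200":   (41.5, 47.4),
    "Quake 4 2048x1536": (37.0, 47.1),
    "IL2 2048x1536":     (41.6, 45.9),
    "Doom 3 1600x1200":  (51.9, 53.7),
}

for game, (gtx, xt) in benches.items():
    lead = (xt / gtx - 1.0) * 100.0  # X1800XT lead over 7800GTX, in percent
    print(f"{game:18} {gtx:5.1f} vs {xt:5.1f} fps  (+{lead:.0f}%)")
```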


Funny how you don't compare it to the current performance king, the 7800 GTX 512. I guess we know why: that wouldn't be a fair comparison now, would it? By the way, were those numbers from your own X1800XT? I didn't think so. Why do so few people actually own the X1800 series? Hmmm.....

Maybe you didn't see these benches?
http://www.xbitlabs.com/articles/video/display/games-2005.html

Anyone claiming the 256MB GTX is equal to the X1800XT must be looking at the 0xAA/0xAF benches, because once you add AA and AF, the performance lead of the $1000 512GTX king of the hill becomes a joke.

The performance lead of the GTX 512 over the X1800XT at 0xAA/0xAF is actually stellar in most games in the review you linked to. Crank up AA and AF, and all it does is level the playing field, with the advantage still going to the GTX in "most" games benched there. So you calling a card that can whup an X1800XT with or without AA/AF a joke is actually the real joke. I'd also note that, for some reason, xbit and a ton of other review sites only bench up to 16x12, which is fine, but I'd like to see ultra-high-res benches as well. That 30" Dell LCD is going to be sweet, and high-res benches would apply. Unless they limit the native res on that LCD to something incredibly stupid.

If a card that costs twice as much (if you can even find one), wins by only 1-10% when AA is enabled, and still loses in FEAR (an NV-sponsored game!) is not a joke, then... :roll:

Higher res tests like 2000x1500 would be nice, but only SLI and Crossfire would be able to pull playable fps at those settings.

Shader intensive... those games are in no way shader intensive. Plus, those benches don't prove the X1800XT has more "efficient" shader performance. You are just showing how efficient AA is on the X1800XT.

If you look at benches at 0xAA/0xAF, that really shows which card has better shader performance, and clearly the 7 series does. It has long been known that NV has the edge in raw shader performance. The idea that the X1800 wins on shader efficiency is a common misconception: it's really the X1800's low AA hit that lets it win most of the time, NOT its "efficient" shaders.

That low hit on performance (when AA is applied) is also due to the bottleneck from its limited pixel shaders. However, the R580 isn't bottlenecked by pixel shaders anymore. This can result in a much bigger hit than that of the R520 when AA is applied.

True, but do you think an R580 will have a bigger or a smaller AA performance hit than the 512GTX? I'm thinking that, percentage-wise, the ring-bus memory controller should still enable it to have a smaller hit.
 

Cookie Monster

Diamond Member
May 7, 2005
Well, based on current benches and the G70 architecture, the R580 might have a smaller hit than the 512 GTX. This is all speculation, though; I'm not too sure as of now. What I do know is that the R580 will have a bigger hit than the R520, because on the R580 the bottleneck moves to AA/AF once it is no longer bottlenecked by pixel shaders.

Let's see, the X1800XT takes a 10-15% hit. I'm saying around 20-30%.
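For reference, the "AA hit" being tossed around here is just the percentage drop in average fps once AA is enabled. A minimal sketch; the 100 and 87 fps inputs are hypothetical placeholders, not benchmark results:

```python
def aa_hit(fps_no_aa: float, fps_aa: float) -> float:
    """Percentage drop in average fps once AA is enabled."""
    return (fps_no_aa - fps_aa) / fps_no_aa * 100.0

# Hypothetical example: 100 fps without AA dropping to 87 fps with AA
# is a 13% hit, i.e. inside the 10-15% range claimed for the X1800XT.
print(f"{aa_hit(100.0, 87.0):.0f}% hit")
```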
 

Gamingphreek

Lifer
Mar 31, 2003
That low hit on performance (when AA is applied) is also due to the bottleneck from its limited pixel shaders. However, the R580 isn't bottlenecked by pixel shaders anymore. This can result in a much bigger hit than that of the R520 when AA is applied.

Where in the world did you get that? There is no shader bottleneck on the R5xx series. They win in AF simply because of their programmable AF. They win in AA because they use a different, more efficient anti-aliasing algorithm than Nvidia. Plus, I don't think ATI does supersampling; I think they only do multisampling. SS incurs a HUGE hit compared to MS, and while it can look better, in a lot of cases the difference is minimal.

It has nothing to do with a shader bottleneck.

-Kevin
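To put rough numbers on that SS-vs-MS gap: 4x supersampling shades (and stores) four samples per pixel, while 4x multisampling shades each pixel once and only stores four color/Z samples. A back-of-the-envelope sketch; the 70/30 split between shading work and bandwidth is an arbitrary assumption for illustration, not a real GPU cost model:

```python
def aa_cost(shaded: int, stored: int,
            shade_w: float = 0.7, bw_w: float = 0.3) -> float:
    """Relative per-pixel cost (no-AA baseline = 1.0), assuming cost
    splits 70/30 between shading work and color/Z bandwidth."""
    return shade_w * shaded + bw_w * stored

msaa4 = aa_cost(1, 4)  # 4xMSAA: shade once, store 4 samples -> ~1.9x
ssaa4 = aa_cost(4, 4)  # 4xSSAA: shade and store 4 samples   -> ~4.0x
print(f"4xMSAA ~{msaa4:.1f}x cost, 4xSSAA ~{ssaa4:.1f}x cost")
```

Even with the made-up weights, the shape of the result holds: SS scales the expensive part (shading) with the sample count, MS does not.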
 

Munky

Diamond Member
Feb 5, 2005
This theory has actually been discussed at B3D, and I posted the same thing here a while ago. Basically, we know that without AA/AF the 512GTX has a big lead over the R520, which can be explained by its huge fillrate advantage. Add AA/AF and all of a sudden the R520 becomes almost even with the 512GTX, despite the 512GTX using faster memory. That means for the given fillrate of the R520, its memory clocks and controller efficiency are sufficient to keep the GPU fed with data, while the 512GTX can't use its available bandwidth as efficiently, which causes the massive AA hit.

When you take a card like the R580 and put it in a situation where it can put all 48 of its shaders to good use, the massive fillrate needs more memory bandwidth to feed the GPU, and enabling AA puts even more stress on the memory. Putting it that way simplifies things a lot, and there are other factors involved, but it may turn out that the R580 suffers a bigger AA hit than the R520, because the increase in bandwidth is not proportional to the increase in raw GPU fillrate. However, the AA hit should still be smaller than on NV cards, thanks to a more efficient memory controller.
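That proportionality argument is easy to sanity-check on paper specs. A rough sketch: the R520 and G70 lines use the commonly quoted retail clocks, the R580 line uses rumored pre-launch numbers, and "shading throughput" here is just pixel-shader units times core clock, a crude proxy that ignores architectural differences entirely:

```python
# Memory bandwidth per (crudely estimated) unit of shading throughput.
# (core MHz, pixel shader units, effective mem MHz, bus width in bits)
cards = {
    "X1800XT (R520)":    (625, 16, 1500, 256),
    "7800GTX 512 (G70)": (550, 24, 1700, 256),
    "R580 (rumored)":    (650, 48, 1550, 256),
}

for name, (core, units, mem, bus) in cards.items():
    bw = mem * 1e6 * (bus / 8) / 1e9   # memory bandwidth in GB/s
    shading = core * units             # crude shading-throughput proxy
    print(f"{name:18} {bw:5.1f} GB/s, {bw / shading * 1000:.2f} GB/s per unit")
```

By that admittedly crude measure, the R580 would have barely a third of the R520's bandwidth per unit of shading power, which is exactly why a bigger AA hit is being predicted for it.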
 

Cookie Monster

Diamond Member
May 7, 2005
Originally posted by: munky
This theory has actually been discussed at B3D, and I posted the same thing here a while ago. Basically, we know that without AA/AF the 512GTX has a big lead over the R520, which can be explained by its huge fillrate advantage. Add AA/AF and all of a sudden the R520 becomes almost even with the 512GTX, despite the 512GTX using faster memory. That means for the given fillrate of the R520, its memory clocks and controller efficiency are sufficient to keep the GPU fed with data, while the 512GTX can't use its available bandwidth as efficiently, which causes the massive AA hit.

When you take a card like the R580 and put it in a situation where it can put all 48 of its shaders to good use, the massive fillrate needs more memory bandwidth to feed the GPU, and enabling AA puts even more stress on the memory. Putting it that way simplifies things a lot, and there are other factors involved, but it may turn out that the R580 suffers a bigger AA hit than the R520, because the increase in bandwidth is not proportional to the increase in raw GPU fillrate. However, the AA hit should still be smaller than on NV cards, thanks to a more efficient memory controller.

QFT. This is what I meant to say. :)
 

Cookie Monster

Diamond Member
May 7, 2005
We still don't know. It could be a tweaked/altered G70 core, or just a die shrink, or who knows what else.

Unlike the R580, not many of us know much about the G71: its features, architecture, etc.
It's now been around seven months since the 7800GTX launch, so I do believe NV has had quite some time to tweak the G70 core even further. NV just needs to focus a bit on IQ, and I believe they won't let anyone down.

 

ewitte

Junior Member
Apr 1, 2005
It does not count. I put my X1800XT at 1024x768 with default settings and got 115fps average. Results below:

X2 @ 2.6Ghz
X1800XT 750/850

This is version 1.0 because I can't update the Ukrainian version with the standard US patch. These framerates are not that bad.

All Effects: Maximum or on
All Graphics options: soft shadows off (default), 16xAF, textures medium (due to the 1GB RAM stuttering issue), everything else maximum

All pictures halved in IrfanView
Picture 1: 1024x768 no AA
Picture 2: 1024x768 4xAA
Picture 3: 1920x1200 2xAA (my normal settings for F.E.A.R.)
Picture 4: 1920x1200 4xAA

http://s150233688.onlinehome.us/fear/fear1.jpg
56 minimum / 115 average / 253 maximum

http://s150233688.onlinehome.us/fear/fear2.jpg
42 minimum / 90 average / 235 maximum

http://s150233688.onlinehome.us/fear/fear3.jpg
32 minimum / 52 average / 106 maximum

http://s150233688.onlinehome.us/fear/fear4.jpg
25 minimum / 42 average / 85 maximum

Bonus: 1920x1200, no AA, with soft shadows on
http://s150233688.onlinehome.us/fear/fear5.jpg
14 minimum / 26 average / 48 maximum

1024x768, no AA, with soft shadows on
http://s150233688.onlinehome.us/fear/fear6.jpg
32 minimum / 68 average / 140 maximum
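Those averages also let you put a number on the AA hit debated earlier in the thread. Note that only the 1024x768 pair is a clean no-AA vs 4xAA comparison; the 1920x1200 pair goes from 2xAA to 4xAA:

```python
def pct_drop(before: float, after: float) -> float:
    """Percentage drop in average fps between two runs."""
    return (before - after) / before * 100.0

# Average fps taken from the results posted above.
print(f"1024x768, no AA -> 4xAA: {pct_drop(115, 90):.0f}% hit")   # ~22%
print(f"1920x1200, 2xAA -> 4xAA: {pct_drop(52, 42):.0f}% hit")    # ~19%
```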