ATI X800 Pro or XT = Shader 3.0

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: CarrotStick
ATI has fallen behind. There is no firmware upgrade for shader 3.0 as it is all hardware based. The X800XT is just last generation technology supercharged. I like ATI cards but as of now Nvidia is clearly in the lead....

Really?

That's why in "The Fastest Graphics Cards of Summer 2004" article at Xbitlabs (can't post the link right now as the server is down), which has the most extensive comparison around at 35 games, if you compare the X800 XT to the 6800 Ultra in quality modes (no one plays without AA/AF with these top-end cards) and actually write down on a piece of paper the wins and losses of both cards, you will see that the X800 XT is the faster card. I would agree with you that the 6800 GT is better than the X800 Pro, but the X800 XT is at least as good as the 6800 Ultra. Considering that ATI is actually faster in the only game that supports PS3.0 (Far Cry), I wouldn't put too much hope into PS3.0 performance just yet. ATI is faster in Far Cry, Nvidia is faster in Doom 3; as far as I'm concerned both are excellent games, which makes it 1:1. (ATI is also generally faster in newer Direct3D games such as IL-2 Sturmovik, Unreal Tournament 2004, etc., and even in the other OpenGL games where Nvidia is faster, the X800 XT still runs all of them at 100+ fps at 1600x1200 with 4xAA/8xAF.) And you can't forget that in HL2, most likely, ATI will be faster as well.

By the time games on the Doom 3 engine come out, a new generation of cards will have arrived (be it an updated R480 or a new R520/NV50); people who spend $400-500 on a video card will probably buy the next wave anyway rather than holding on for long-term usage. Once you weigh in the lower PSU requirements, single-slot design, lower heat dissipation, and quieter onboard fan, ATI doesn't seem like such a bad purchase after all. The real problem is availability and price gouging: with the 6800 GT available for under $400 and the X800 XT at around $540, the extra $140 is hard to justify. Of course, if Doom 3 performance is the mother of all benchmarks, then sure, Nvidia is in the lead. Otherwise I would still consider the X800 XT the fastest card.

What's more, I think that by the time PS3.0 becomes mainstream, the current generation of video cards will be too slow by enthusiasts' standards. PS3.0 has not shown any image enhancements over PS2.0 either, aside from some minor performance improvements that are made even less important by ATI's PS2.0b (extended shader instruction set). Of course, 3Dc has not lived up to its expectations either.

Originally posted by: Shamrock

Let's also not forget the ATI card runs HOTTER. And the GT uses one slot; only the Ultra uses two, and that is just for safekeeping.

"Factoring in the 69% efficiency of our power supply, the X800 XT required roughly 17 Watts less than the GF 6800 Ultra in our tests." - Source

Considering that a video card's power consumption is directly proportional to its heat output, you might want to reconsider that statement. But since both cards run without artifacts and do not experience instability, I understand how this might be a moot point. Then again, for those who spend $60 on a CPU cooler for an extra 4-5°C improvement, the extra heat from the card might be something to consider. Let's not forget that ATI cards are also quieter.
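To put rough numbers on how a wall-socket measurement like Xbit's translates into the power the card actually dissipates, here is a quick back-of-the-envelope sketch. The 69% efficiency figure is the one from the quote above; the wall-socket deltas in the example are hypothetical placeholders, not measured values:

```python
# Back-of-the-envelope: extra draw at the wall * PSU efficiency ~= power the
# card itself consumes (and, per the conservation-of-energy argument, the heat
# it dissipates). The 69% efficiency is from the Xbit quote; the wall-socket
# deltas below are hypothetical placeholders, not measurements.
PSU_EFFICIENCY = 0.69

def card_power_watts(wall_delta_watts, efficiency=PSU_EFFICIENCY):
    """Estimate DC power delivered to the card from the extra AC draw
    measured at the wall when the card goes from idle to full load."""
    return wall_delta_watts * efficiency

for name, wall_delta in [("hypothetical card A", 105.0), ("hypothetical card B", 80.0)]:
    print(f"{name}: ~{card_power_watts(wall_delta):.0f} W dissipated as heat")
```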
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
Those HL2 benches, for all we know, are running the same lower shader path on the 6800s that the NV30 used -- so I would take them with a grain of salt until the game comes out.

Originally posted by: Shamrock

Yeah, at the expense of lowering IQ via the "adaptive algorithm", aka AF cheats... which, btw, CANNOT be turned off by the consumer -- it has been hacked off, though, and performance comes out approximately 22% lower. It uses "TRY"linear, not trilinear. I'll give you this, though: NV has the same thing, BUT IT HAS THE OPTION of being turned off.

The adaptive-Trilinear on the X800 is much better (at blending mipmaps) than the Brilinear on the 6800. It's NV that is getting the advantage of lower IQ in the benchmarks.

With the FX series, NV was so far behind the R300 technically that they reduced IQ drastically to try and keep up in the benchmarks. The problem is that they pushed their Trilinear so far towards Bilinear that it became an IQ problem people could spot (that's how it was noticed in the first place). They couldn't have such lousy filtering on the 6800, so they had to ratchet things back towards the quality side. So while the current Brilinear is better than it was in previous drivers, they are still pushing filtering close to the point where it becomes an IQ problem -- that's why NV had to put an OFF switch on their Trilinear optimizations. On the other hand, ATI's adaptive-Tri is so close in quality to their "old" Tri that there really isn't any point in putting a switch on the control panel to turn it off. You won't get better IQ with the "old" Trilinear.
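For anyone wondering what "Brilinear" actually does under the hood, here is a rough sketch of the idea: full trilinear blends the two nearest mip levels across the whole mip band, while Brilinear-style optimizations only blend inside a narrow window around the transition and fall back to plain bilinear elsewhere. The window width in the sketch is an arbitrary illustrative value, not either vendor's actual threshold -- the point is just that the narrower the window, the more visible the mip transitions get:

```python
# Toy illustration of "brilinear" vs. full trilinear filtering.
# Full trilinear always blends the two nearest mip levels by the fractional LOD;
# brilinear-style optimizations only blend inside a narrow window around the
# mip transition and snap to pure bilinear elsewhere (fewer texture samples).
# The window width here is an arbitrary illustrative value.

def trilinear_blend(lod_fraction):
    """Blend weight between mip N and mip N+1 for full trilinear."""
    return lod_fraction  # linear all the way across the mip band

def brilinear_blend(lod_fraction, window=0.25):
    """Blend weight when blending is confined to a band around the transition."""
    lo, hi = 0.5 - window / 2, 0.5 + window / 2
    if lod_fraction <= lo:
        return 0.0              # pure bilinear from mip N
    if lod_fraction >= hi:
        return 1.0              # pure bilinear from mip N+1
    return (lod_fraction - lo) / (hi - lo)  # short linear ramp in between

for f in (0.1, 0.4, 0.5, 0.6, 0.9):
    print(f"lod_frac={f:.1f}  tri={trilinear_blend(f):.2f}  bri={brilinear_blend(f):.2f}")
```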
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Originally posted by: SickBeast
Originally posted by: Shamrock
I didn't tout the benefits of SM 3.0 until I saw their performance and glamour.

? :confused:

Far Cry barely scratches the surface of SM3, and any "glamour" you saw was also possible on ATi's X800 cards with the same boost in framerate.

It's funny that you bring up Far Cry when you're obviously on the side of the nVidia cards; the X800XT *dominates* in Far Cry.


I didn't say anything about ATI dominating in Far Cry or not. I stated that SM 3.0 brought performance and glamour to NV cards -- considering that before the SM 3.0 patch, NV was using 1.1 shaders. And ironically, I don't find it as dominating as you might think.

http://www.ixbt.com/video2/images/nv40-3/fc.png
http://www.ixbt.com/video2/images/nv40-3/fc-aaa.png

As for retracting my statement: while the wattage may be higher, the temperatures are not -- at least not until you can show me an ATI XT under heavy load running under 67°C. Not three minutes ago I saw an NV 6800U running at 67°C after playing a level of Doom 3. Every day I see people reporting temps of about 72°C under load on the NV GT (it's higher because of the single-slot design), and I see ATI over 80°C. I can give you a power supply that outputs 1000W and one that outputs 250W, and I can get the 1000W one running at lower temperatures than your 250W one. It's the cooling that makes the difference, not the wattage consumption.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
What everyone is still not grasping is that Far Cry is not a native SM3.0 game; therefore some of the features were left out.

ATI's SM2.0b incorporates some of SM3.0's features. It just so happens that when the Far Cry 1.2 patch came out, since the game wasn't native SM3.0 and couldn't use everything, it ended up supporting everything (or close to everything) that SM2.0b offers. That is why ATI and Nvidia are seeing similar gains in performance. If the game were natively coded for SM3.0, we would probably see many more gains in Nvidia's favour.

We will just have to wait until SM3.0 becomes mainstream and games are natively coded for it to see the performance increases.

-Kevin
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
iXBT has already shown that the Far Cry 1.2 patch's SM3.0 speed improvements really had "nothing" to do with SM3.0. All they did was collapse several lighting passes into one. That's why they were able to run the patch on the X800 and get large speed increases as well.

There are also some good reasons why they didn't bother with any true SM3.0 features in the Far Cry 1.2 patch. Even in a shader-heavy game like Far Cry, the vast majority of pixel shaders have a very small instruction count. One site counted the shaders (I think on one level) and over 2/3 of them are PS1.1 -- shaders limited to an instruction count of 8 or less. Most of the floating-point PS2.0 shaders are probably quite similar, i.e. a relatively small instruction count, well under 20-30. SM3.0 is not going to do anything performance-wise for all these small shaders. Far Cry apparently has a couple of shaders around 100 instructions long, and maybe SM3.0 could help those a little -- but since roughly 98% of the shaders you're running can't be helped by SM3.0, the overall speed improvement from applying SM3.0 to that couple of shaders is going to be minuscule. NV/Crytek knew they couldn't get any significant speed improvement by toying with the shaders, so they made other speed improvements and passed them off as SM3.0.
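To see why speeding up a couple of long shaders barely moves the needle, here is a toy Amdahl's-law-style estimate. The workload mix and the 1.5x local speedup are assumptions picked purely for illustration, not measured numbers -- only the ~2/3 PS1.1 figure echoes the shader count mentioned above:

```python
# Toy Amdahl-style estimate: if only a tiny fraction of the shader work can be
# sped up by SM3.0 features (dynamic branching, longer programs), the overall
# frame-time improvement stays small. All numbers below are illustrative
# assumptions, not benchmarks.

def overall_speedup(workload):
    """workload: list of (fraction_of_frame_time, local_speedup) pairs."""
    assert abs(sum(f for f, _ in workload) - 1.0) < 1e-9
    new_time = sum(f / s for f, s in workload)
    return 1.0 / new_time

workload = [
    (0.70, 1.0),   # short PS1.1-class shaders (<= 8 instructions): no SM3.0 gain
    (0.28, 1.0),   # short PS2.0 shaders (well under 20-30 instructions): no gain
    (0.02, 1.5),   # a couple of ~100-instruction shaders: assume a generous 1.5x
]

print(f"Overall speedup: {overall_speedup(workload):.3f}x")  # ~1.007x
```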

Even looking to the future, I just don't think there is going to be a lot of use for shaders 100+ instructions long. I mean, what kind of effect needs a shader 200-300 instructions long? Even the limited instruction count of 8 in PS1.1 produces some very good lighting and water effects (like in the 3DMark2001 Nature demo). Game producers are going to be using more and more shaders for effects, but mostly small ones. So while SM3.0 *can* be used for performance gains in very long shaders, the rub is that these long shaders are only ever likely to make up a fraction of the shader effects in use. That scenario doesn't translate into SM3.0 having any significant performance advantage over SM2.0b -- even in the long run.

PS2.0 was the big jump. We had to get to floating-point shaders and get more precision for longer ones with a higher dynamic range. What really counts is being able to run a lot of small shaders fast and have enough precision for high-dynamic-range lighting (PS2.0 24-bit, aka R300), not being able to run really long shaders, which probably have extremely little use in games in general. I suspect ATI's engineers knew this, so they beefed up SM2.0 on the X800 (to 2.0b) rather than going to SM3.0.

I'm not saying SM3.0 is totally useless, but even in a shader-heavy game like HL2, which uses Shader Model 2.0 extensively, the shaders are only 30-40 instructions long -- well short of the 96-instruction limit of even standard PS2.0. Standard PS2.0 was threatening to become a "little bit limited", so we get the extended PS2.0b (up to 1536 instructions in one pass) on the X800. SM3.0 opens things up for NV.

FiringSquad Crytek interview:

FiringSquad: Oh really, you're at 96 instructions with 2.0 now?

Cevat Yerli: Yes, and we're not using more because we don't need more. If you look at the image quality you can achieve with 2.0 -- with 2.0 you can achieve pretty much any quality you want for right now...
 

GoldFiles

Senior member
Jul 27, 2004
209
0
0
This is all true, but none of these cards is going to be king if they never make any of them!! I've been waiting since June 4th (the ship date) to purchase the X800 XT PE, and it has been backordered for months (with months more to come). The 6800 Ultra is in the same boat as ATI when it comes to backorders. I'm tired of waiting, and I'm likely gonna buy the first one that actually goes on sale.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Shamrock

As for retracting my statement: while the wattage may be higher, the temperatures are not -- at least not until you can show me an ATI XT under heavy load running under 67°C. Not three minutes ago I saw an NV 6800U running at 67°C after playing a level of Doom 3. Every day I see people reporting temps of about 72°C under load on the NV GT (it's higher because of the single-slot design), and I see ATI over 80°C. I can give you a power supply that outputs 1000W and one that outputs 250W, and I can get the 1000W one running at lower temperatures than your 250W one. It's the cooling that makes the difference, not the wattage consumption.

Fair enough. Here is another article on power consumption for ATI cards for now. Power Consumption of Contemporary Graphics Accelerators. Part I: Graphics Cards on ATI Chips

"One Thought about Heat Dissipation
Linking power consumption and heat dissipation of a graphics card I follow the law of conservation of energy. Evidently, the graphics card is not a power source for any other PC component, so all the energy it consumes is exuded in the form of heat. Thus, all the power consumption numbers that are listed below can be referred to as heat dissipation." - Tim Tscheblockov, XbitLabs

ATI cards of this generation run just as hot as the previous generation, but are twice as fast. I'm waiting for the follow-up with Nvidia's data.... Of course, the heat issue is a moot point since enthusiasts have exhaust fans, intake fans, and proper cooling, and even if the card ran at 100°C but was stable, what does it matter? But since we were on the topic...
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: RussianSensation
Originally posted by: Shamrock

As for retracting my statement: while the wattage may be higher, the temperatures are not -- at least not until you can show me an ATI XT under heavy load running under 67°C. Not three minutes ago I saw an NV 6800U running at 67°C after playing a level of Doom 3. Every day I see people reporting temps of about 72°C under load on the NV GT (it's higher because of the single-slot design), and I see ATI over 80°C. I can give you a power supply that outputs 1000W and one that outputs 250W, and I can get the 1000W one running at lower temperatures than your 250W one. It's the cooling that makes the difference, not the wattage consumption.

Fair enough. Here is another article on power consumption for ATI cards for now. Power Consumption of Contemporary Graphics Accelerators. Part I: Graphics Cards on ATI Chips

"One Thought about Heat Dissipation
Linking power consumption and heat dissipation of a graphics card I follow the law of conservation of energy. Evidently, the graphics card is not a power source for any other PC component, so all the energy it consumes is exuded in the form of heat. Thus, all the power consumption numbers that are listed below can be referred to as heat dissipation." - Tim Tscheblockov, XbitLabs

I agree with this quote, but a *lot* of people confuse heat dissipation/power draw and temperature. Putting better cooling on a video card (like those 2-slot 6800Us and some 6800GTs) lowers the temperature that it runs at, but the amount of heat it outputs overall (and the amount of power it draws) stays the same. Quoting temperatures for graphics cards is *completely useless* unless both cards use the same (or very similar) cooling and are in the same environmental conditions (case temperature and airflow).
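A crude steady-state model makes the distinction obvious: die temperature is roughly ambient plus power times the cooler's thermal resistance, so two cards dumping exactly the same number of watts can sit at very different temperatures. All of the figures in this sketch (thermal resistances, case ambient, power draw) are invented for illustration:

```python
# Rough illustration of heat vs. temperature: two cards dissipating the SAME
# power reach different temperatures purely because of their coolers.
# die_temp ~= ambient + power * thermal_resistance (a simplified steady-state
# model; all numbers below are invented for illustration).

def die_temp_c(power_watts, thermal_resistance_c_per_w, ambient_c=40.0):
    """Steady-state temperature estimate inside a warm case."""
    return ambient_c + power_watts * thermal_resistance_c_per_w

power = 65.0                 # same heat output for both cards (hypothetical)
single_slot_cooler = 0.55    # C/W -- assumed, smaller heatsink
dual_slot_cooler   = 0.40    # C/W -- assumed, bigger heatsink exhausting out the back

print(f"Single-slot: {die_temp_c(power, single_slot_cooler):.0f} C")  # ~76 C
print(f"Dual-slot:   {die_temp_c(power, dual_slot_cooler):.0f} C")    # ~66 C
```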
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Originally posted by: RussianSensation
Originally posted by: Shamrock

As for retracting my statement: while the wattage may be higher, the temperatures are not -- at least not until you can show me an ATI XT under heavy load running under 67°C. Not three minutes ago I saw an NV 6800U running at 67°C after playing a level of Doom 3. Every day I see people reporting temps of about 72°C under load on the NV GT (it's higher because of the single-slot design), and I see ATI over 80°C. I can give you a power supply that outputs 1000W and one that outputs 250W, and I can get the 1000W one running at lower temperatures than your 250W one. It's the cooling that makes the difference, not the wattage consumption.

Fair enough. Here is another article on power consumption for ATI cards for now. Power Consumption of Contemporary Graphics Accelerators. Part I: Graphics Cards on ATI Chips

"One Thought about Heat Dissipation
Linking power consumption and heat dissipation of a graphics card I follow the law of conservation of energy. Evidently, the graphics card is not a power source for any other PC component, so all the energy it consumes is exuded in the form of heat. Thus, all the power consumption numbers that are listed below can be referred to as heat dissipation." - Tim Tscheblockov, XbitLabs

ATI cards of this generation run just as hot as the previous generation, but are twice as fast. I'm waiting for the follow-up with Nvidia's data.... Of course, the heat issue is a moot point since enthusiasts have exhaust fans, intake fans, and proper cooling, and even if the card ran at 100°C but was stable, what does it matter? But since we were on the topic...


I'll agree there. I have a total of 10 fans in my PC: case fans plus one on the video card, one on the CPU, and one on the chipset. I'm not worried about cooling myself, but some people are. It's a compromise, I guess :)