ATI X800 Pro or XT = Shader 3.0


dguy6789

Diamond Member
Dec 9, 2002
ATI still has better cards IMO.

ATI cards are still faster than nVidia's with AA/AF, not to mention better 2D image quality. ATI also has TruForm 2.0, which I haven't seen in use yet, but it could help a lot.

Not only that, but ATI cards have temporal AA, which costs almost NO performance.

My friend's 9600 Pro loses 3 fps at MOST playing any game using 2x temporal AA over none. And I have not seen any flickering whatsoever anywhere.
 

dguy6789

Diamond Member
Dec 9, 2002
Originally posted by: dguy6789
ATI still has better cards IMO.

ATI cards are still faster than nVidia's with AA/AF, not to mention better 2D image quality. ATI also has TruForm 2.0, which I haven't seen in use yet, but it could help a lot.

Not only that, but ATI cards have temporal AA, which costs almost no performance at 2x and a LOT less at 4x and 6x. Now 6x temporal AA is faster than 4x non-temporal AA.

My friend's 9600 Pro loses 3 fps at MOST playing any game using 2x temporal AA over none. And I have not seen any flickering whatsoever anywhere.
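For anyone wondering how temporal AA manages that: contemporary write-ups describe the driver alternating the MSAA sample pattern from frame to frame, so persistence of vision averages two 2x patterns into something approaching 4x quality at 2x per-frame cost, falling back to a fixed pattern at low framerates to avoid visible flicker. A minimal sketch of the idea (my reconstruction, not ATI driver code; the sample offsets and the 60 fps cutoff are assumptions):

    // Sketch of the idea behind ATI's temporal AA (assumed behavior, not
    // ATI's driver code): alternate between two 2x MSAA sample patterns on
    // successive frames so the eye averages them toward 4x quality at 2x
    // per-frame cost. Below a framerate threshold the alternation shows up
    // as flicker, so the driver reportedly falls back to a fixed pattern.

    struct Offset { float x, y; };          // sub-pixel sample offset
    struct SamplePattern { Offset s[2]; };  // one 2x MSAA pattern

    SamplePattern chooseTemporalPattern(int frameIndex, float currentFps) {
        const SamplePattern patternA = {{{-0.25f, -0.25f}, { 0.25f,  0.25f}}};
        const SamplePattern patternB = {{{-0.25f,  0.25f}, { 0.25f, -0.25f}}};
        const float kFlickerThresholdFps = 60.0f;  // hypothetical cutoff

        if (currentFps < kFlickerThresholdFps)
            return patternA;  // too slow: plain 2x MSAA, no alternation
        return (frameIndex % 2 == 0) ? patternA : patternB;  // flip each frame
    }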
 

Schadenfroh

Elite Member
Mar 8, 2003
ATI is not the best card for me, dguy6789. The 2D IQ superiority is debatable; almost all the video cards by nVidia I have seen in recent years, built by decent manufacturers, have just as good 2D IQ as the Radeons we have at school. I do not use FSAA because it screws up text in Desert Combat (one of the only DX games I play); I have seen the fuzzy text in 1942 with AA on, and it looks clearer on my 4200 than on my friend's 9800 non-Pro. In short, I say the bad 2D IQ on nVidia's side is a myth; people who want great 2D need to look at the Parhelia. I also play some OpenGL games; probably more games in my library are OpenGL than DirectX. So nVidia is the best solution for me.
 

Gamingphreek

Lifer
Mar 31, 2003
Also, for all those saying SM3.0 should have thrown Nvidia into the lead... we have no SM3.0 games today. Yeah, Far Cry has been "extended/modified," if you will, to support it, but it does not natively support SM3.0. And if you haven't noticed, not all the SM3.0 features are there.

I think there is more than meets the eye right now, and SM3.0 is just beginning to stretch its legs. Also remember that the NV40 is a completely new architecture, and therefore the drivers are still immature.

-Kevin
 

kelvin1704

Senior member
Mar 21, 2001
Don't forget Nvidia cards require additional power cords... and take up two slots...

That's a disadvantage versus ATI too, especially for people who prefer SFF; and even for those who don't, it can mean having to spend extra on a power supply.
 

JBT

Lifer
Nov 28, 2001
Originally posted by: kelvin1704
Don't forget Nvidia cards require additional power cords... and take up two slots...

That's a disadvantage versus ATI too, especially for people who prefer SFF; and even for those who don't, it can mean having to spend extra on a power supply.

Only the Ultra has more power cords. And I could give a crap about that. I actually would prefer it if my GT's cooler took up 2 slots. The 2nd power molex isn't required; it is if you're OCing, though. My GT surpasses Ultra speeds no problem, so the extra power isn't needed. I have heard the 2nd one really only supplies about 5 watts anyway.
 

Lebeau

Member
Jul 28, 2004
Originally posted by: kelvin1704
Don't forget Nvidia cards require additional power cords... and take up two slots...

That's a disadvantage versus ATI too, especially for people who prefer SFF; and even for those who don't, it can mean having to spend extra on a power supply.

If you can afford a $400 card, I don't think an $80 PSU will force you to sell your liver.
 

KillaKilla

Senior member
Oct 22, 2003
Originally posted by: Lebeau
Originally posted by: kelvin1704
Don't forget Nvidia cards require additional power cords... and take up two slots...

That's a disadvantage versus ATI too, especially for people who prefer SFF; and even for those who don't, it can mean having to spend extra on a power supply.

If you can afford a $400 card, I don't think an $80 PSU will force you to sell your liver.

Actually, with the relative prices of VCs compared to entire systems, and the performance benefits, some midrange systems have X800/6800s. And a better PSU does not improve performance, so it effectively adds $35-80 in cost to the VC. And I already sold my liver for a 486DX9, so I'm out there. :brokenheart:
 

Matthias99

Diamond Member
Oct 7, 2003
Originally posted by: CarrotStick
Originally posted by: Selso2109
Where are these benchmarks, CarrotStick? Post a link...

LINK

Those are benches with the leaked beta from a year and a half ago. The only conclusion you can draw here is that the NV40 is a lot better than the NV30.
 

rbV5

Lifer
Dec 10, 2000
ATI also has TruForm 2.0, which I haven't seen in use yet, but it could help a lot.

Don't count on TruForm 2.0 to help out at all; it's not an improved method of hardware n-patch support. The lack of SM3.0 won't be a major downside for the X800 cards unless SM3.0-enabled games offer a performance advantage on cards with SM3.0 hardware support, and that has yet to be seen.
 

carni

Member
Mar 22, 2004
By the time Shader 3.0 is implemented in games, that card will be a couple steps down the ladder. I read in an article that they are barely employing the full spectrum of Shader 2.0 in games. Right now, having 3.0 is like owning a Ferrari: sure, it does 200 mph, but where in the heck are you going to be able to go that fast? I do, however, admit that if I were going to upgrade my 9800 XT, I would probably get a 6800 GT. You can easily clock it up to Ultra speeds (which is the only difference between the two) and it costs less than the X800 Pro. Since these cards all perform close, I would go with bang for the buck.
 

Oyeve

Lifer
Oct 18, 1999
Originally posted by: carni
By the time Shader 3.0 is implemented in games, that card will be a couple steps down the ladder. I read in an article that they are barely employing the full spectrum of Shader 2.0 in games. Right now, having 3.0 is like owning a Ferrari: sure, it does 200 mph, but where in the heck are you going to be able to go that fast? I do, however, admit that if I were going to upgrade my 9800 XT, I would probably get a 6800 GT. You can easily clock it up to Ultra speeds (which is the only difference between the two) and it costs less than the X800 Pro. Since these cards all perform close, I would go with bang for the buck.

This is SO true. Look how long it took for DX9 games to come out after DX9 cards were available. I believe it took 3 VC generations with DX9 capabilities before any good DX9 games surfaced. I, for one, am keeping my 9800 Pro for at least another 2 generations, as I am quite sure it will handle Doom 3 and Half-Life 2 just fine. If I can get 30-40 fps at a decent resolution, then I will be just fine.
 

CarrotStick

Member
Jul 22, 2004
Originally posted by: carni
By the time Shader 3.0 is implemented in games, that card will be a couple steps down the ladder. I read in an article that they are barely employing the full spectrum of Shader 2.0 in games. Right now, having 3.0 is like owning a Ferrari: sure, it does 200 mph, but where in the heck are you going to be able to go that fast? I do, however, admit that if I were going to upgrade my 9800 XT, I would probably get a 6800 GT. You can easily clock it up to Ultra speeds (which is the only difference between the two) and it costs less than the X800 Pro. Since these cards all perform close, I would go with bang for the buck.


WRONG. Shader 3.0 is VERY easy to implement over Shader 2.0. You're going to see Shader 3.0 everywhere very soon since it's so easy to implement.
 

JackHawksmoor

Senior member
Dec 10, 2000
This is SO true. Look how long it took for DX9 games to come out after DX9 cards were available. I believe it took 3 VC generations with DX9 capabilities before any good DX9 games surfaced.

It took a long time to get DirectX 8-level stuff, but we got DirectX 9-level stuff almost within a couple of months of the first hardware last year. It happened while the first generation of PS 2.0 hardware was still on the market. Mmm, oh, I forgot about the Radeon 9700 back in 2002. Still, it didn't take very long for PS 2.0 games to come out, and Far Cry and probably Half-Life 2 already use PS 3.0.
 

Matthias99

Diamond Member
Oct 7, 2003
Originally posted by: CarrotStick
Originally posted by: carni
By the time Shader 3.0 is implemented in games, that card will be a couple steps down the ladder. I read in an article that they are barely employing the full spectrum of Shader 2.0 in games. Right now, having 3.0 is like owning a Ferrari: sure, it does 200 mph, but where in the heck are you going to be able to go that fast? I do, however, admit that if I were going to upgrade my 9800 XT, I would probably get a 6800 GT. You can easily clock it up to Ultra speeds (which is the only difference between the two) and it costs less than the X800 Pro. Since these cards all perform close, I would go with bang for the buck.


WRONG. Shader 3.0 is VERY easy to implement over Shader 2.0. You're going to see Shader 3.0 everywhere very soon since it's so easy to implement.

Yes, but they still have to support Shader 2.0 (unless they want to support nothing but NV40-based cards). Few developers will do a full rewrite of their game in SM3.0 for the handful of NV40s out there.

And I still believe there are few situations where SM3.0 is going to provide noticeable performance boosts. But you are correct in that the move from SM2.0 to SM3.0 is trivial -- you can recompile SM2.0 shaders directly to SM3.0, although it probably won't run any faster unless you're using the more advanced SM3.0 features (and even then it might not be faster, just easier to code).
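To make the recompilation point concrete: in Direct3D 9 terms, moving a shader from SM2.0 to SM3.0 can be as small as handing the same HLSL source to the compiler with a different profile string. A rough sketch against the D3DX interface of that era (D3DXCompileShader is the real entry point; the shader source and the compileFor helper are hypothetical, and error handling is trimmed):

    // Same HLSL source, two compile targets.
    #include <d3dx9.h>
    #include <cstring>

    const char* kSource =                    // hypothetical pixel shader
        "float4 main(float2 uv : TEXCOORD0) : COLOR0 {"
        "    return float4(uv, 0, 1);"
        "}";

    LPD3DXBUFFER compileFor(const char* profile) {  // "ps_2_0" or "ps_3_0"
        LPD3DXBUFFER code = NULL, errors = NULL;
        HRESULT hr = D3DXCompileShader(kSource, (UINT)std::strlen(kSource),
                                       NULL, NULL,   // no macros, no includes
                                       "main", profile, 0,
                                       &code, &errors, NULL);
        return FAILED(hr) ? NULL : code;  // on failure, `errors` holds the log
    }

    // A game would ship both blobs and pick at runtime from the device caps:
    // LPD3DXBUFFER sm2 = compileFor("ps_2_0");
    // LPD3DXBUFFER sm3 = compileFor("ps_3_0");  // SM3.0 hardware only

As the post says, the ps_3_0 blob isn't faster on its own; any win comes from features like dynamic branching that the SM2.0 profile can't express.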
 

BJKNR2004

Member
Jul 10, 2004
I am not the smartest person when it comes to talking hardware APIs (DirectX, OpenGL). But I can tell you that in my favorite game running OpenGL, NASCAR Racing 2003 Season, my Ti4600 128MB will beat my Radeon X800 Pro (this is with no AA or AF on the Ti4600, and 2x AA and 8x AF on the X800). But in Direct3D, the X800 Pro is about 130% faster, and with 4x AA and 16x AF enabled, about 60%...

So wouldn't ATI just need to improve its OpenGL drivers to get in the game with the Doom 3 engine? Or is it that ATI hardware cannot run OpenGL the same as nVidia cards because of the architecture? I'm just confused as to why the 6800s and X800 are ABOUT the same in most cases yet some games show huge differences; wouldn't that be all software-based?
 

stickybytes

Golden Member
Sep 3, 2003
Doom 3 makes nvidia look like the king.

But I'll reserve my true judgement on who is the victor until the final edition of Half-Life 2 comes out and is benchmarked. I'll take a wild guess and bet they'll program something into HL2 that will purposely make it slower on nVidia's cards, making the X800s look faster, but we all know the 6800 cards are more technologically advanced. Shader 3.0 and a built-in video processor. What more can I say?
 

Raspewtin

Diamond Member
Nov 16, 1999
Above 50 FPS or so, I personally doubt people will be able to see the difference in gameplay.
 

So

Lifer
Jul 2, 2001
I really am glad that NVIDIA is back on top. I still hate ATI's drivers (yes, I have a current-gen ATI card, and I still hate their drivers). NVIDIA never gave me graphical errors that required a restart whenever I'd hibernated the computer since the last reboot, nor did I ever have an NVIDIA card get into an infinite reboot loop when only a DVI monitor was connected. :|
 

kelvin1704

Senior member
Mar 21, 2001
OK, can someone shed some light?

Does ATI's best board use an extra power socket or not?

And does it use two slots or a big heatsink, which would make it not recommended for SFF?
 

Shamrock

Golden Member
Oct 11, 1999
Originally posted by: fsstrike
It's not a downside at all. ATI still manages to beat nVidia in Far Cry when SM3.0 is enabled. If you don't believe me, go check out the benchmarks @ anandtech. nVidia does not position itself anywhere ahead of ATI even with the advantage of SM3.0, which to me is very pathetic. It is funny how, even though ATI is winning, people claim nVidia is best when the benchmarks clearly show the opposite.

Just in case you don't know how to browse Anandtech:
http://www.anandtech.com/video/showdoc.aspx?i=2102

And it isn't the fact that ATI wins 12 and nVidia wins 10; it is the fact that ATI wins at all. nVidia has a HUGE advantage with SM3.0, and they SHOULD be completely crushing ATI. But this does not happen, which clearly means that ATI is better.

The rumor over at the Far Cry message boards is that Far Cry patch 1.3 will have even MORE SM3.0 shaders and coding in it, so is ATI gonna be faster then? Or will they come up with another software trick so it takes up more CPU cycles? ;) PLUS, NV wasn't designed to be the DirectX king, but it does compete quite well there, surpassing the expectations of even ATI. The reason ATI is winning some benchmarks is that it has a faster core. Faster core = more fillrate.
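For what it's worth, the "faster core" argument is first-order arithmetic: peak pixel fillrate is core clock times pixel pipelines. A quick sketch using the widely published 2004 figures (treat the numbers as approximate; real throughput also depends on memory bandwidth and shader load):

    // Peak pixel fillrate = core clock x pixel pipelines (rough figure only).
    double peakFillrateGpix(double coreClockMHz, int pixelPipes) {
        return coreClockMHz * pixelPipes / 1000.0;  // Gpixels/s
    }
    // X800 XT PE: peakFillrateGpix(520, 16) -> 8.32 Gpixels/s
    // 6800 Ultra: peakFillrateGpix(400, 16) -> 6.40 Gpixels/s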

Originally posted by: Runner20
Originally posted by: fsstrike
It's not a downside at all. ATI still manages to beat nVidia in Far Cry when SM3.0 is enabled. If you don't believe me, go check out the benchmarks @ anandtech. nVidia does not position itself anywhere ahead of ATI even with the advantage of SM3.0, which to me is very pathetic. It is funny how, even though ATI is winning, people claim nVidia is best when the benchmarks clearly show the opposite.

Just in case you don't know how to browse Anandtech:
http://www.anandtech.com/video/showdoc.aspx?i=2102

And it isn't the fact that ATI wins 12 and nVidia wins 10; it is the fact that ATI wins at all. nVidia has a HUGE advantage with SM3.0, and they SHOULD be completely crushing ATI. But this does not happen, which clearly means that ATI is better.

I am an ATI person and would recommend ATI cards to hardcore gamers and non-gamers as well.

Very reliable... I know the X800 XT doesn't have 3.0 support, but it still beats the 6800 in raw speed in a lot of games.


Yeah, at the expense of lowering IQ via the "adaptive algorithm," aka AF cheats... which, btw, CANNOT be turned off by the consumer. It has been hacked, though, and forcing full quality costs approximately 22% in performance. It uses "TRY"linear, not trilinear. I'll give you this, though: NV has the same thing, BUT IT HAS THE OPTION of being turned off.
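For anyone unfamiliar with what's being alleged here: full trilinear filtering blends two adjacent mip levels across the whole fractional LOD range, while a "brilinear"-style optimization only blends inside a narrow band around the mip transition and uses cheaper single-mip bilinear elsewhere. A sketch of the general technique (an illustration only, not ATI's or nVidia's actual logic; the band width is an assumption):

    // Blend weight between two mip levels as a function of fractional LOD.
    float trilinearWeight(float lodFraction) {
        return lodFraction;  // full trilinear: blend across the entire range
    }

    float brilinearWeight(float lodFraction, float band /* e.g. 0.2f */) {
        float lo = 0.5f - band * 0.5f;
        float hi = 0.5f + band * 0.5f;
        if (lodFraction <= lo) return 0.0f;     // pure bilinear, lower mip
        if (lodFraction >= hi) return 1.0f;     // pure bilinear, upper mip
        return (lodFraction - lo) / (hi - lo);  // blend only inside the band
    }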

Originally posted by: JBT
Originally posted by: kelvin1704
Don't forget Nvidia cards require additional power cords... and take up two slots...

That's a disadvantage versus ATI too, especially for people who prefer SFF; and even for those who don't, it can mean having to spend extra on a power supply.

Only the Ultra has more power cords. And I could give a crap about that. I actually would prefer it if my GT's cooler took up 2 slots. The 2nd power molex isn't required; it is if you're OCing, though. My GT surpasses Ultra speeds no problem, so the extra power isn't needed. I have heard the 2nd one really only supplies about 5 watts anyway.

Let's also not forget the ATI card runs HOTTER. And the GT uses one slot; only the Ultra uses 2, but that is just for safekeeping. If you use the X800 XT, you probably don't use the 2nd slot anyway, because the heatsink is so big. If you do use the 2nd slot, I suggest you don't, because you could overheat your card... every video card needs to breathe.

Originally posted by: Matthias99
Originally posted by: CarrotStick
Originally posted by: Selso2109
Where are these benchmarks, CarrotStick? Post a link...

LINK

Those are benches with the leaked beta from a year and a half ago. The only conclusion you can draw here is that the NV40 is a lot better than the NV30.

Look at the bottom of that page. The date on it says "01.07.2004". This website is European, and in Europe that means it was tested on July 1st, 2004. It makes no mention of which build of HL2 they have. Looks like the NV 6800U o/c is beating ATI on their own ball field. The home team isn't winning!

kelvin1704, only the NV Ultra uses the 2-slot design; the 6800 GT still uses 1 slot. By this I mean it uses 1 PCI slot AND power cord. ATI's heatsink is so big, I wouldn't use the 2nd slot on your motherboard either way, so that debate is moot. Unless of course you want to fry your card. Even with my old GeForce 3 Ti200, I didn't use the 2nd slot; I let the fan do its job, not blow back in its face. By using the 2nd slot, the only thing you're doing is blowing the hot air back onto the heatsink, kind of like pointing a hairdryer at it. Now, you wouldn't put a hairdryer on your video card, would you?
 

Shamrock

Golden Member
Oct 11, 1999
I forgot something... why are ATI folks throwing out this HL2 crap? It isn't out yet; D3 is!

This just in: Half Life 2 is expected to ship Sept 15, 2025. About the same time DVD drives will be discontinued
[/sarcasm off]
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Shamrock
I forgot something... why are ATI folks throwing out this HL2 crap? It isn't out yet; D3 is!

It's the same reason the guys touting the benefits of SM3 have (yourself included). The first game to fully utilize that feature will probably be Unreal 3 in 2006.

Far Cry shows the same gains in benchmarks on ATi hardware and nVidia hardware with the 1.2 patch. The 1.3 patch will probably introduce 3Dc support, which only ATi cards have. The R420 cards can run shaders with over 1500 instructions. That's fine for Far Cry; the only disadvantage R420 has is that it can render 3 lights per pass vs. 4 per pass on NV40. This doesn't even show up in the benchmarks, though.
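To put the lights-per-pass figure in perspective: a forward renderer needs roughly ceil(numLights / lightsPerPass) passes over a surface, so 3-per-pass and 4-per-pass only diverge once more than three lights hit the same geometry (my arithmetic, not from the post):

    // Passes a forward renderer needs for N lights at K lights per pass.
    int passesNeeded(int numLights, int lightsPerPass) {
        return (numLights + lightsPerPass - 1) / lightsPerPass;  // ceiling division
    }
    // 3 lights: passesNeeded(3, 3) == 1 and passesNeeded(3, 4) == 1 -- identical.
    // 4 lights: passesNeeded(4, 3) == 2 vs passesNeeded(4, 4) == 1 -- NV40 saves a pass.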
 

Shamrock

Golden Member
Oct 11, 1999
I didn't tout the benefits of SM 3.0 until I saw its performance and glamour. As my 5950U doesn't support it, I stay out of a debate I know nothing about :p
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Shamrock
I didn't tout the benefits of SM 3.0 until I saw its performance and glamour.

? :confused:

Far Cry barely scratches the surface of SM3, and any "glamour" you saw was also possible on ATi's X800 cards with the same boost in framerate.

It's funny that you bring up Far Cry when you're obviously on the side of the nVidia cards; the X800XT *dominates* in Far Cry.