Originally posted by: fsstrike
It's not a downside at all. ATI still manages to beat nVidia in Far Cry when SM3.0 is enabled. If you don't believe me, go check out the benchmarks at Anandtech. nVidia doesn't position itself anywhere ahead of ATI even with the advantage of SM3.0, which to me is very pathetic. It's funny how, even though ATI is winning, people claim nVidia is best when the benchmarks clearly show the opposite.
Just in case you don't know how to browse Anandtech:
http://www.anandtech.com/video/showdoc.aspx?i=2102
And it isn't the fact that ATI wins 12 and nVidia wins 10; it's the fact that ATI wins at all. nVidia has a HUGE advantage with SM3.0, and it SHOULD be completely crushing ATI. But that doesn't happen, which clearly means ATI is better.
The rumor over at the Far Cry message boards is that patch 1.3 will have even MORE SM3.0 shaders and coding in it. Is ATI going to be faster then? Or will it come up with another software trick that eats up more CPU cycles?

PLUS there's the fact that NV wasn't designed to be the DirectX king, yet it still competes quite well, surpassing even ATI's expectations. The reason ATI is winning some benchmarks is that it has a faster core. Faster core = more fillrate (memory bandwidth comes from the memory clock and bus width, which are nearly identical on these cards).
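If you want to sanity-check that, here's a quick back-of-envelope sketch in Python. The specs are the commonly quoted launch numbers for the X800 XT PE and 6800 Ultra, so treat them as approximate; shipping boards varied:

```python
# Back-of-envelope GPU throughput math. The specs below are the
# commonly quoted launch numbers and may not match every board.

def fillrate_mpix(core_mhz, pixel_pipes):
    """Theoretical pixel fillrate in Mpix/s: core clock x pipelines."""
    return core_mhz * pixel_pipes

def bandwidth_gbs(effective_mem_mhz, bus_width_bits):
    """Theoretical memory bandwidth in GB/s: effective data rate x bus width."""
    return effective_mem_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# X800 XT PE: ~520 MHz core, 16 pipes, ~1120 MHz effective GDDR3, 256-bit bus
print(fillrate_mpix(520, 16), bandwidth_gbs(1120, 256))  # 8320 Mpix/s, ~35.8 GB/s

# 6800 Ultra: ~400 MHz core, 16 pipes, ~1100 MHz effective GDDR3, 256-bit bus
print(fillrate_mpix(400, 16), bandwidth_gbs(1100, 256))  # 6400 Mpix/s, ~35.2 GB/s
```

Notice the bandwidth figures come out nearly identical; the core clock gap shows up in fillrate, which is where ATI's advantage actually lives.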
Originally posted by: Runner20
Originally posted by: fsstrike
It's not a downside at all. ATI still manages to beat nVidia in Far Cry when SM3.0 is enabled. [snip]
I am an ATI person and would recommend ATI cards to hardcore gamers and non-gamers alike.
Very reliable... I know the X800 XT doesn't have 3.0 support, but it still beats the 6800 in raw speed in a lot of games.
Yeah, at the expense of lowering IQ via the "adaptive algorithm", a.k.a. the AF cheats... which, by the way, CANNOT be turned off by the consumer. It has been hacked off, though, and forcing full quality costs approximately 22% in performance. It uses "TRY"-linear, not true trilinear. I'll give you this, though: NV has the same optimization, BUT IT HAS THE OPTION of being turned off.
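For anyone curious what that "adaptive" optimization actually does, here's a rough illustrative sketch in Python. This is not ATI's actual driver logic, just the general idea: full trilinear blends the two nearest mip levels across the whole LOD fraction, while the brilinear-style shortcut uses plain bilinear everywhere except a narrow band around the mip transition:

```python
# Illustrative sketch of trilinear vs. "adaptive" (brilinear-style)
# filtering. NOT ATI's actual driver code, just the general idea.

def trilinear_weight(lod_frac):
    """Full trilinear: blend the two nearest mip levels across the
    whole 0..1 LOD fraction."""
    return lod_frac

def adaptive_weight(lod_frac, band=0.2):
    """Brilinear-style: pure bilinear (weight 0 or 1) away from the mip
    transition, blending only inside a narrow band around 0.5.
    Fewer blended samples = faster, but mip boundaries can show."""
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_frac <= lo:
        return 0.0                      # nearer mip only (bilinear)
    if lod_frac >= hi:
        return 1.0                      # farther mip only (bilinear)
    return (lod_frac - lo) / (hi - lo)  # blend only inside the band

def sample(mip_near, mip_far, weight):
    """Linear blend between two already-bilinear-filtered mip samples."""
    return mip_near * (1 - weight) + mip_far * weight

# At lod_frac = 0.30, trilinear blends (weight 0.3); adaptive snaps to 0.0
print(trilinear_weight(0.30), adaptive_weight(0.30))  # 0.3 0.0
```

Skipping the blend outside that band is what buys the speed, and it's why the hacked driver that forces full trilinear gives up that ~22%.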
Originally posted by: JBT
Originally posted by: kelvin1704
Don't forget the Nvidia card requires additional power connectors... and takes up two slots...
That's a disadvantage versus ATI too, especially for people who prefer SFF cases; and whether you do or not, it can mean spending extra on a power supply.
Only the Ultra has the extra power connector, and I couldn't give a crap about that. I'd actually prefer it if my GT's cooler took up 2 slots. The 2nd power Molex isn't required; it is if you're overclocking, though. My GT surpasses Ultra speeds no problem, so the extra power isn't needed. I've heard the 2nd one really only supplies about 5 watts anyway.
Let's also not forget the ATI card runs HOTTER. And the GT uses one slot; only the Ultra uses 2, and that's just for safekeeping. If you use the X800 XT, you probably don't use the adjacent slot anyway, because the heatsink is so big. If you do use that slot, I suggest you don't, because you could overheat your card... every video card needs to breathe.
Originally posted by: Matthias99
Originally posted by: CarrotStick
Originally posted by: Selso2109
Where are these benchmarks, CarrotStick? Post a link...
LINK
Those are benches with the leaked beta from a year and a half ago. The only conclusion you can draw here is that the NV40 is a lot better than the NV30.
Look at the bottom of that page. The date on it says "01.07.2004". This website is European, and in Europe that means it was tested on July 1st, 2004. It makes no mention of what build of HL2 they have. Looks like the overclocked NV 6800U is beating ATI on their own ball field. The home team isn't winning!
kelvin1704, only the NV Ultra uses the 2-slot design; the 6800 GT still uses 1 slot. By this I mean it uses 1 PCI slot AND power cord. The ATI heatsink is so big I wouldn't use the adjacent slot on your motherboard anyway, so that debate is moot. Unless of course you want to fry your card. Even with my old GeForce 3 Ti200, I didn't use the neighboring slot; I let the fan do its job, not blow back in its face. By using that slot, the only thing you're doing is blowing the hot air back onto the heatsink, kind of like pointing a hairdryer at it. Now, you wouldn't put a hairdryer on your video card, would you?