Driver Heaven Smackdown


housecat

Banned
Oct 20, 2004
1,426
0
0
He has a 6800GT now. I looked up the old benchmarks after your first reply: the 5800 does great in DX8 and similar workloads, but in DX9 or shader-intensive games (even though those were very sparse at release) it gets smoked.

I personally waited through the 5800, hoping for NVIDIA to do better, but went with a 9800 Pro instead. I was disappointed by game issues and conflicts, so I went back to my GF4.
Now I'm in limbo, waiting for AMD PCI-E so I can get a 6600GT or 6800GT.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: housecat
He has a 6800GT now. I looked up the old benchmarks after your first reply: the 5800 does great in DX8 and similar workloads, but in DX9 or shader-intensive games (even though those were very sparse at release) it gets smoked.

I personally waited through the 5800, hoping for NVIDIA to do better, but went with a 9800 Pro instead. I was disappointed by game issues and conflicts, so I went back to my GF4.
Now I'm in limbo, waiting for AMD PCI-E so I can get a 6600GT or 6800GT.

I guess it comes down to what you consider the useful life of a card and what you buy a card for. IMO cards are good for a year, two tops. Here we are a year and a half from the 5800U launch, and I own two DX9 games: Far Cry and HL2. Every other game I own would still run fine on a 5800U. That doesn't make it "HORRIBLE".
 

housecat

Banned
Oct 20, 2004
1,426
0
0
I'm not elaborating because, like you said, you don't really want to pull up all the old benchmarks.. but when I said "horrible" I meant: driver issues, the cheats NVIDIA was pulling at the time, degraded image quality compared to ATI, roughly equal performance in DX8 apps, and an obvious performance loss in DX9.

Well, I wouldn't consider Far Cry and HL2 a mere two games.. those are two of the biggest (if not the biggest) blockbusters of the last year! I wouldn't doubt that the 9700 Pro does indeed ultimately defeat the 5800 Ultra in OpenGL Doom 3.. which, if true, would REALLY be saying something, considering that's NV's ballpark.

All this coming from a guy who won't buy ATI products anymore. I could go on and on about why I'd (in most cases) simply recommend staying away from any ATI card, but for me to deny the 9700/9800 would just be an exercise in futility.

Also, I will retract my use of the word "dominant"! :)

I actually tried to buy that 5800 Ultra from my friend dirt-cheap, because I figured they'd have a resale value of 99 cents with the reputation they had.. but it appears they've nearly become a collectible, like a Voodoo1/V2 SLI, an NV Edge 3D or a quad-GPU V5!
Being so recent, it's probably not something most people think about.. but it's guaranteed a place in the video card hall of fame with a name like the "Dustbuster"!





Edit: I should add that I do not hate the 5800.. in fact I generally defend it. The original NV30, hated as it might be, is a card I trumpeted for its forward-looking features.
The NV30 pixel shaders contained both dynamic flow control and dynamic branching.. these are some of the main reasons the NV40 will be more efficient. The R300 and all its offspring (including the R420) will not have these features, since they will not have PS3.0 support.

This is our "DX9+", now spelled out and ratified in the form of DX9.0c.
Predication was also an NV30 hardware feature that is used in PS3.0; it, too, speeds up shaders.
Arbitrary swizzle is the third formerly unused NV30 feature, and yet again an efficiency improvement.
Also, the NV30 had a maximum of 1024 pixel shader instructions (well above the 512 minimum for PS3.0), while the R300 maxed out at 32.
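
For anyone wondering what the branching/predication stuff actually buys you, here's a rough sketch in plain C (not shader code, just my own illustration of the idea): predication evaluates both sides of a condition and picks one result, while real dynamic branching/flow control skips the untaken side entirely.. which is where the efficiency comes from.

#include <stdio.h>

/* stand-ins for a long and a short shading routine */
static float expensive_path(float x) { return x * x * x * 0.25f + 0.5f; }
static float cheap_path(float x)     { return x + 0.5f; }

/* predication: both sides are computed, then one result is selected;
   no branch is taken, but no work is skipped either */
static float shade_predicated(float x, int mask)
{
    float a = expensive_path(x);
    float b = cheap_path(x);
    return mask ? a : b;
}

/* dynamic branching / flow control: the untaken side never runs at all */
static float shade_branched(float x, int mask)
{
    if (mask)
        return expensive_path(x);
    return cheap_path(x);
}

int main(void)
{
    /* same answer either way; the branched version just did less work */
    printf("%f %f\n", shade_predicated(2.0f, 0), shade_branched(2.0f, 0));
    return 0;
}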

As far as the "DX9+" vertex shaders go, they also had dynamic branching, predication and dynamic flow control.. all of which improve performance/efficiency. As well, the NV30 had a maximum of 65K VS instructions (R300 and up have a 1024 max).
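
And since "arbitrary swizzle" is an opaque name: it just means an instruction can read a vector's components back in any order or duplication (v.wzyx, v.xxyy, and so on) for free, instead of spending extra move instructions to reorder them. A rough plain-C stand-in for the idea (again just my own sketch, not shader code):

#include <stdio.h>

typedef struct { float x, y, z, w; } float4;

/* without arbitrary swizzle, reordering components costs explicit copies
   (i.e. extra instructions); with it, the reorder rides along on the operand */
static float4 reorder_wzyx(float4 v)
{
    float4 r;
    r.x = v.w;
    r.y = v.z;
    r.z = v.y;
    r.w = v.x;
    return r;
}

int main(void)
{
    float4 v = { 1.0f, 2.0f, 3.0f, 4.0f };
    float4 r = reorder_wzyx(v);
    printf("%.0f %.0f %.0f %.0f\n", r.x, r.y, r.z, r.w);   /* prints: 4 3 2 1 */
    return 0;
}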

So as you can see, NV30 hardware like what you have has most of the Shader Model 3.0 features.
I believe those older cards might see an extra boost that will surprise most people, who are blindsided by this information. The good thing is, the NV30 cores are missing most of the elements they wouldn't have the power to handle anyway, but they have the Shader Model 3.0 parts that pretty much ONLY improve performance. Some of this stuff may even improve IQ, since developers will likely push the limits further because of the added efficiency.. or it may simply boost all NV30-based scores. But the stuff they are not capable of is out of their league.. and better left to the NV40 generation.

I dunno.. I'm just a nerd. :D
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Looks like DH is never going to burn that card. But they do have the newest Omegas up.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
housecat:
"the cheats NVIDIA was pulling at the time"
You mean like Trylinear on the X800s?

"degraded image quality compared to ATI"
You mean the AF issue that was fixed within a couple of months, as reported here and elsewhere, which gave the NV30 comparable IQ and better AF?

"I wouldn't doubt that the 9700 Pro does indeed ultimately defeat the 5800 Ultra in OpenGL Doom 3.. which, if true, would REALLY be saying something, considering that's NV's ballpark."
You'd be wrong about that. Search for my "Doom3 shocker" thread, in which my benches of a 5800 Ultra show it to be faster than a 9800 Pro at Doom 3. (I think; it was 1-2 fps either way.)

"All this coming from a guy who won't buy ATI products anymore. I could go on and on about why I'd (in most cases) simply recommend staying away from any ATI card, but for me to deny the 9700/9800 would just be an exercise in futility."
It's futile to limit yourself to half the gaming cards? You'll never hear me say I won't buy someone's products unless they're made of people. (Soylent X800, heh)

Anyway, much of what you need to know about the 5800 is here:
http://www.anandtech.com/showdoc.aspx?i=1821
The late release, the AF issue, performance in games. You're right that the NV30 is no DX9 gamer, but there aren't many DX9 games, and the ones we have barely use it and for the most part look the same in DX8.1 (shiny water notwithstanding).

 

housecat

Banned
Oct 20, 2004
1,426
0
0
I said I wouldn't be shocked if a 9700 Pro beat a 5800 Ultra in Doom 3.. but since it doesn't, I'm not shocked that way either. I would HOPE that a 5800 could defeat a 9700 in at least the big OpenGL releases.

Now take any off-the-shelf generic DX9 game... does the 5800 Ultra come close without reverting to FP16?

I think you'd have as hard a time selling Dustbusters to people as NVIDIA did. :D


I don't find it futile to skip ATI. I don't see the advantage. I had driver issues galore; seemingly simple games like COD, based on the age-old Q3A engine, having problems??? I'll pass, even if it was fixed. For me, something like that was just ridiculous.

In fact, I find my computing much LESS futile without ATI! NVIDIA is indeed the only lineup with full top-to-bottom DX9.0c support.
Funny, considering they were also the only ones with full top-to-bottom DX9 support in the last generation.
Rock-solid drivers and compatibility, the best bang for the buck (6600/6800), and in most cases the fastest card available (6800 Ultra).. I really don't see the futility in those choices.

Speaking of an exercise in futility, even my Windows desktop seems to run faster. There was noticeable lag whenever I ran a system with my 9800 Pro, and then even with a GeForce2 MX I'd get a much better running, streamlined desktop. I don't think ATI's drivers try to accelerate the Luna GUI.
I know for a fact NVIDIA's do, because that was one of their big things when XP was released: they had already worked to accelerate the GUI.

It's a minor issue, but when you start working at resolutions like I will be on Monday (1680x1050).. and you have a puny 1700+ like I do.. it does matter.
Some may not have noticed this, but take two freshly formatted machines, one with an ATI card and one with an NV card, then open a few windows and drag one around really fast... see which one has significant lag and which one doesn't.

We've gotten completely off topic, but I'd be interested to hear if anyone else has noticed this GUI acceleration issue. Moral of the story: I never felt I got the completely polished product from ATI that I did from NV... from outrageous Q3A engine problems to a slower-running GUI. And for the money ($200-400), I expect as much shine as possible.