WetWilly-
I had to reread some of this to get my train of thought back.
"I like UT better, which is a subjective choice. So you're saying there's an accurate choice as to whether Quake3 or UT is better? If there were, the only "accuracy" involved would be how "accurately" they comply with subjective criteria."
My point on image quality is that you either rely entirely on subjectivity, or you have a defined standard for accuracy. I would much rather have a standard than rely on other people's subjective views. When we have boards that perfectly render all environments, subjectivity then becomes important. Would you call the Lexus LS400 a quality truck?
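Just to be concrete about what a defined standard could look like: render the same scene with a software reference rasterizer and with the card under test, then count the pixels that differ. A minimal sketch, where the buffer layout and tolerance are my own assumptions rather than any actual test suite's:

    #include <stdlib.h>

    /* Returns the fraction of pixels that deviate from the reference
       render by more than 'tol' on any 8-bit RGB channel. */
    double accuracy_error(const unsigned char *ref, const unsigned char *card,
                          size_t pixels, int tol)
    {
        size_t bad = 0;
        size_t i;
        for (i = 0; i < pixels * 3; i += 3) {
            if (abs(ref[i]     - card[i])     > tol ||
                abs(ref[i + 1] - card[i + 1]) > tol ||
                abs(ref[i + 2] - card[i + 2]) > tol)
                bad++;
        }
        return pixels ? (double)bad / (double)pixels : 0.0;
    }

With something like that, "accuracy" is a number anyone can reproduce instead of a matter of taste.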
"Yes, it would be of service. And like I've said repeatedly, a subjective judgment prefaced by "Maybe it's just us, but we found that ..." certainly qualifies any subsequent judgment as subjective. As for the amount of people offering their POV, the more the better. If you have 30 POVs in video card reviews, chances are you'll likely find consensus more than 30 totally different opinions."
And what should those people use to judge? A torture test or a simplistic scene?
"Ummm ... my point was in response to your comment about FuseTalk. I don't believe FuseTalk uses Dot3 "
But it uses more than 8-bit color too.
"But the broad consensus is that the Radeon, which "cheats" on the sky looks better. As for color saturation, I wouldn't expect a lot of posts because most people don't notice until they see something better like the Radeon."
The better sky and lack of rainbowing in Quake3 turn into missing textures in UT. They used a hack to make things look good in one case and came up short in another. Which is a bigger problem, a lesser-quality sky or completely missing texture maps? Missing textures stand out very clearly without having to see another board.
"but my impression was that 3dfx's "severly" inaccurate rendering as exemplified by Sharky's test was not a major hardship on developers. And besides, at the rate things are going, soon you'll only have to deal with nVidia."
Not now, but should it be tolerated? As for nVidia: the GameCube supports OpenGL and runs on PPC hardware, and the ArtX acquisition could be a boost for ATi on the PC side as well.
"We rely on our eyes, and there's more to quality than accuracy. I'd take a vibrant, sharp, well saturated image with 98% image accuracy over a dull image muddled by bad video card signal filters that's rendered with 100% accuracy. I'm not alone either, as exemplified by many happy V5500 owners."
And because developers know this, they are forced to plan on inaccurate images being displayed. If a developer can't count on a board being accurate, they can't make integral portions of a game reliant on precise image accuracy. That impacts all of PC gaming.
"Then if the "total visual experience" which DOES include more than image accuracy is so important, then why don't developers put test/adjustment tools in their games that lets users adjust their gamma/contrast/brightness to be somewhere in the neighborhood of what the developer wanted the game's environment to be?"
You want to deal with the hate mail from all the 3dfx owners? This isn't a joke. If half the Voodoo loyalists had any idea how bad their boards were, particularly for gamma at default, they would be filling developers' mailboxes with flames. I myself think it would be a good idea; now that 3dfx is gone, perhaps we will see it.
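The kind of in-game adjustment tool I mean wouldn't take much. Here's a rough sketch of the usual approach: draw a strip of gray steps and tell the player to raise gamma until the darkest couple of steps are just barely distinguishable from black. The step count and linear ramp are illustrative, not from any real game:

    #define STEPS 16

    /* Fill a strip of gray values from black to white for an
       on-screen brightness/gamma calibration bar. */
    void fill_calibration_strip(unsigned char gray[STEPS])
    {
        int i;
        for (i = 0; i < STEPS; i++)
            gray[i] = (unsigned char)(i * 255 / (STEPS - 1));
    }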
"I became painfully aware of this a couple of years ago when I was playing a game (think it was the original Turok) and was stuck for an hour. Finally turned to a walkthrough which said to pick up something. Went to the location, but couldn't see it. I reached for something under the monitor and accidentally moved the brightness control up. Guess what? The item showed up."
You weren't running a 3dfx board, I assume. This is an example of developers having to compensate for the flaws of particular hardware. Because of the extremely overbright settings on 3dfx boards, many developers adjusted their game settings. They do think ahead. If memory serves, D3D on 3dfx cards defaults to a gamma of 1.3; try that on an nVidia or ATi board (seriously, take a look) and look at the screen (or even a 3dfx board on the desktop with that setting). Absolutely hideous. With 3dfx the main player in PC gaming back then, the D3D renderer was built to compensate for this. If you had been running in software mode, you would have seen the item's pixelated self no problem. This is actually a good example of what I have been saying, although in this case the game was designed to be run on non-accurate hardware.
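For anyone who hasn't played with gamma ramps, here's a sketch of the math a driver applies (my own illustration, not 3dfx's code). Gamma above 1.0 lifts the midtones; at the 1.3 default mentioned above, mid-gray 128 comes out around 150, which is why a card that already renders at the intended brightness ends up looking washed out:

    #include <math.h>

    /* Apply a gamma correction ramp to one 8-bit channel value.
       apply_gamma(128, 1.3) is about 150: the midtones get lifted. */
    unsigned char apply_gamma(unsigned char in, double gamma)
    {
        return (unsigned char)(255.0 * pow(in / 255.0, 1.0 / gamma) + 0.5);
    }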
"This is again exactly my point - you're talking about a definition that has a particular meaning to a specific population - developers."
I'm not a developer. The definition also holds for 3D artists and CAD/MCAD pros, the people who pushed 3D forward in its infancy, long before gaming was a realistic objective for the technology.
"Most consumers don't even know the difference between SGI and a SDK."
But if people start saying that SDK means "soulful dog kids" (no, I can't think of anything better at the moment), I will not agree with them. It won't matter if the mainstream decides to take that up as their definition; it has already been defined. The Duron issue you bring up is valid; in that instance I would have to cede to the preexisting definition. If I'm talking to a British naval enthusiast, I realize that "Corvette" is a ship; the car came later. The preexisting definition is not replaced just because another one is more popular in the mainstream.
"I'll certainly agree that consumers' level of tolerance for imperfections is high, particularly with poor 2D."
It is much, much higher in 3D than in 2D. If I typed "gbvl upi" and it showed up as "fvck you", I would be pretty ticked with whatever caused it, but this is what happens in 3D. The image fails to be rendered properly and people just smile.
"If you took a poll of V5500 owners and showed them the reference image (not the XOR image) and the V5500-rendered image, do you seriously think that 50% or more would immediately see the differences? Better yet, let them see the image test at low detail running at 60-100 FPS and see if they'd notice."
Randomly select one hundred people off the street, show them the Mona Lisa and ten copies done by different artists, and see how many can tell the difference. Then bring in a group of art professors and do the same. Then try the same groups just walking by. People who care a great deal about quality will likely have very little trouble pointing it out even when moving at 60-100 FPS; those who don't, won't.
"I'll agree that GameBasement is a pretty good site, and I visit there a lot. And you're responsible for that 2nd UT CD actually being taken out of many peoples' jewel case . But you're definitely in the minority, which goes back to audreymi's point for this thread (BTW, what WAS that point ... )"
Thanks for the kind words. I understand that we are in the minority in testing (and the amount we run and never report on is much larger than what gets reported), but I think it should be up to the game sites to report on issues with particular games (and that was the original point: why AT didn't cover this).
Can you tell me of an across-the-board issue where one board wins hands down? Even in FSAA, the Radeon and GeForce retain more texture clarity than the V5, in an area that is thought to be completely in 3dfx's pocket.
Because we deal with games on a game-by-game basis, we can sit down, look long and hard at the graphics, and see what issues there are. We deal with gaming first and foremost; for us, graphics display quality is very important. For Anand, well, a motherboard doesn't affect the sky in Quake3 any.
My point in jumping into this thread is that it isn't the place of hardware sites to deal with every possible issue that could arise with any one game.
In all honesty, if the sky problem had shown up in NOLF instead of Quake3, would we have heard anything at all about it on any hardware site? I highly doubt it (and it is there in 16-bit color; not sure when we will have that covered, though there are no problems at all in 32-bit, which means that if we hadn't been looking for problems by testing settings we wouldn't normally use, we wouldn't have seen it).
Do I think the hardware sites are trying to cover up the problems? No. The only place I can recall seeing the NOLF sky issue discussed is in email between myself and "wumpus", and it is likely one that end users will never see, since it only shows up in 16-bit color (which should look the same on every board due to the compression used, though we haven't been able to test that yet).
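For anyone wondering how a glitch can hide in 16-bit while 32-bit looks perfect: R5G6B5 keeps only 5 or 6 bits per channel, so gradients that are smooth in 32-bit collapse into visible bands. A sketch of the round trip (my own illustration of the common 565 packing, not anything pulled from NOLF itself):

    /* Pack 8-bit RGB into a 16-bit R5G6B5 pixel, discarding low bits. */
    unsigned short pack_565(unsigned char r, unsigned char g, unsigned char b)
    {
        return (unsigned short)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }

    /* Expand the 5-bit red channel back to 8 bits. Every red value from
       200 to 207 comes back as 206, which is where the banding comes from. */
    unsigned char red_from_565(unsigned short c)
    {
        unsigned char r5 = (unsigned char)((c >> 11) & 0x1f);
        return (unsigned char)((r5 << 3) | (r5 >> 2));
    }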
"and, with a few gamma and contrast adjustments, is now almost indistinguishable from the Radeon."
You using a Trinitron monitor? What is your gamma set at (I really wish they had numbers for contrast/brightness as well)?
"Heck, I've even got AGP4X, sidebanding and fast writes enabled on the GeForce on an Abit KT7, for what it's worth."
I do on my K7TPro2A too. In fact, I was curious as to what the he!! kind of problems everyone else was having; it worked with no problems right off for me. <shrugs>
"I was a little disappointed with the Radeon, though, because I got the impression that it's really choked by it's drivers."
Drivers and ATi... <rant> I have an 8MB All-In-Wonder Pro Rage Pro board from early '98 that I dropped over $300 on when new. The board was for a dual-boot system, NT/Win95 (then 98), and the NT drivers, to this day, are absolutely horrible. I knew of the existing problems when I bought the card and figured they would fix them given enough time; they never did. They were more concerned with cheating on ZD's 3D WinBench than with making something the consumer could use. :|:| </rant>
"Ultra anyone?"
Don't forget the Pro. It does seem like they think they are leaders in the performance race when at best they are tied for third, fourth if you are being a bit anal about a couple of FPS.