Robo-
"uh, try Q3 @1024x768 w/ FSAA on, and EVERY viedo card will grind to a standstill"
GeForce3😀
Weyoun-
I want to cover several points you made, but I'm going to take them in random order🙂
"When you really consider 3dmark, it's hardly what I'd call a conclusive benchmark suite. It does have some nice test results like fill rate tests and internal card comparing, but as far as I know, minimum frame rate or 3D image quality is not mentioned at all, let alone FSAA settings."
Minimum framerate would be a nice addition; everything else you mention is already in there (have you DLed 3DMark2K1 yet?).
"Perhaps a test would be to sample a texture at the different levels that the hardware supports, have it dump a frame from the frame buffer and actually do a bit by bit comparison to the same texture sampled by the CPU with 100% clean code, not speed tweaked but to get perfect image quality for that particular level. Then a bit comparison could take place and the card could be given a mark regarding it's texture filtering qualities."
What they do is dump an image from the framebuffer and then show one rendered on a reference rasterizer (software) and have you compare them. Why have you compare instead of software? Because of the way software would "look" at the image. The only way for software to compare is to break the images down bit by bit and check them against each other. Sounds good, but you run into problems quickly with this approach. What if one board is running 32bit color and has the alpha off by one bit for every pixel on screen? For you or me sitting in front of the monitor it would barely, if at all, be noticeable. To a software compare that board scores a zero. Then you could have another board that misses every other texture. Only having every other poly textured could result in a score of 50%, which is a huge advantage over the board with the alpha channel being off. Which is going to look better? The one that scores a zero, by a HUGE margin, but the software has no way of knowing that. Visualization benches do test this quite regularly, but there is only one consumer video card that can score acceptably on them.
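To make that scoring problem concrete, here's a rough Python sketch. The tiny 4x4 "framebuffers" and pixel values are completely made up for illustration, they're not from any real card or benchmark:

```python
# Hypothetical illustration of why naive bit-exact comparison misleads.
# A 4x4 "framebuffer" of RGBA tuples; the reference render is solid gray.
REF = [(128, 128, 128, 255)] * 16

# Board A: alpha off by one bit on EVERY pixel -- visually indistinguishable.
board_a = [(128, 128, 128, 254)] * 16

# Board B: every other pixel missing its texture (rendered black) --
# visually terrible, but half the pixels still match bit-for-bit.
board_b = [REF[i] if i % 2 == 0 else (0, 0, 0, 255) for i in range(16)]

def bit_exact_score(image, reference):
    """Fraction of pixels that match the reference exactly."""
    matches = sum(1 for px, ref in zip(image, reference) if px == ref)
    return matches / len(reference)

print(bit_exact_score(board_a, REF))  # scores 0.0 despite looking perfect
print(bit_exact_score(board_b, REF))  # scores 0.5 despite looking broken
```

The bitwise compare ranks the visually broken board miles ahead of the visually perfect one, which is exactly why you want eyeballs on the reference image instead of software.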
"The same could apply for FSAA, worst case scenarios, best case, lines drawn at 0* (from horizontal), 22.5* and 45*, multisampling and supersampling, and perhaps have again a software implementation written to show the maximum image quality case. Speed would be of no consideration here, as it is giving a card a mark purely based upon it's quality."
Even bigger problems here. Current FSAA (OG or RG), whether 3dfx, ATi or nVidia, is a hack. What you would consider best case on any particular angle is not going to be the best case at another. What's more, the human eye is more sensitive to aliasing when it happens at certain angles. And then you have the fact that the human eye is also more sensitive to the blurring caused by FSAA at those same angles. Which is better? With MSAA it gets easier, as you only have to worry about edge aliasing in terms of "FSAA" and you can rely on anisotropic/trilinear to deal with texture aliasing. Maximum image quality, using anything out now except the GeForce3 (which does use MSAA), is going to rely greatly on what flaws bug you most, jaggies or texture blurring. If you look at MSAA and image quality, well, 3DMark2K1 does have an image quality test for that🙂
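To see why OG vs RG can't both be "best case" at every angle, here's a rough Python sketch. The sample positions are just illustrative 4-sample ordered-grid and rotated-grid layouts I made up, not any card's actual pattern; it sweeps an edge across a pixel and counts how many distinct coverage levels each pattern can produce (more levels = smoother gradient on that edge angle):

```python
import math

# Made-up 4-sample positions inside a unit pixel.
OG = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]  # ordered grid
RG = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]  # rotated grid

def coverage_levels(samples, angle_deg):
    """Sweep an edge of the given angle across the pixel and count
    how many distinct sample-coverage counts show up."""
    m = math.tan(math.radians(angle_deg))
    levels = set()
    for step in range(21):
        c = step / 20 * 2 - 0.5  # edge intercept, swept through the pixel
        covered = sum(1 for x, y in samples if y < m * x + c)
        levels.add(covered)
    return len(levels)

# Near-horizontal edges: OG samples share row positions, so coverage jumps
# two samples at a time; RG spreads them out and gets finer steps.
print(coverage_levels(OG, 0), coverage_levels(RG, 0))
print(coverage_levels(OG, 45), coverage_levels(RG, 45))
```

On horizontal edges the ordered grid only hits coverage counts of 0, 2 and 4, while the rotated grid hits all five, but tilt the edge and the gap changes, which is the whole point: a pattern tuned for one angle gives that advantage back at another.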
"A purely coded software method however, should yield perfect results, and driver 'tricks' could be found to see if any quality has been sacrificed for the new found speed. Also, quirks in drivers and filtering problems, such as those recently found on the Radeon concerning anisotropic filtering can be clarified more..."
Unfortunately, if people did start posting the comparisons more frequently you would hear much louder protests than you do now about MO being nVidia biased. The MS reference rasterizer and nVidia cards are the closest to each other, and have been since the TNT1 days (well, briefly the Matrox G400 fared better than the TNT2, until the launch of the GeForce). ATi uses LOD bias settings that are more aggressive than the reference rasterizer's; 3dfx's are less aggressive (though they can be adjusted). 3dfx lacks not only anisotropic, but also trilinear, for anything current. ATi's anisotropic is kinda screwy at the moment. nVidia's weakest point, their defaulting to 16bit texture interpolation for DXT1, is not something specified by MS, so they wouldn't lose any points there. Would you consider that fair? Not too many people around here do (we have discussed this before; people's personal preferences seem to outweigh their desire for accuracy). I have argued that accuracy should come first with subjectivity a distant second, I'm in the minority around here though🙂
"In my humble opinion a benchmark should include more than just a speed test, but rather other areas considered. I'm not exactly sure on the implementation of this, but people always seem to be complaining as to a particular video card being better as a 'gamer's card' or 'in real life'."
There is no way to gauge this aspect unfortunately. Are you FPS and nothing else? nVidia. Hardcore sim fan? 3dfx. Like watching movies? ATi. 2D at 1920x1440? Matrox. For most people a little bit of everything is closer, even if we limit it strictly to gaming. For Falcon4 I would likely pick a V5 over a GF2Ultra. For Giants I would take a GF2MX over a V5. For Quake3 I would take a GF2U over anything else (outside a GF3😉), one case of the most expensive being the overall best, but that doesn't always hold out. Look at the KyroII review: if you care more about SS than any other game, then the KyroII looks a whole lot better than if you are a MBTR junkie. A "gamer's card" is a term thought up by people who have found a board that does well for the games they like to play, and that doesn't always line up with the popular consensus. Most of the time this is spoken by a particular company zealot; how many times on this board have we seen comments of people having to change to different Detonators just to play a game?? I have never had to change a Det for any game ever. Same goes with ATi and people talking about the Radeon not being able to so much as load a game under Win2K. Company zealotry. Some people do have a serious point to their arguments from their perspective, but I can't see how you could call any one board the true "gamer's card", as to date they all have respective strengths and weaknesses compared to each other. If nothing else, price alone is enough to tilt the field heavily.
"Carmack's engines are the best and the quake 3 engine is in a league of it's own considering that it's now 1.5 years old, and still kinda keeping up with visuals."
Have you by chance seen Giants or Sacrifice yet? In all honesty, after firing up those two I fired up Quake3 and it looked to me like Quake2, only not as good. I would say there is a bigger rift in visuals between the latest games and Quake3 than there is between Quake2 and Quake3. It was real good for its time, but UT with the S3TC textures certainly gives Carmack's last engine a very good run for its money🙂