Robo-
"Ben, a few things....
1) those MDK2 scores are 640x480. 3dMark2000's default is 1024x768. Kinda a weird comparison, no?"
Errr, the quote I was referring to said-
"I have NEVER seen a mx running ANY game at twice the speed"
You also forgot to mention that it was 640x480 32-bit versus 1024x768 16-bit, and with the GF2MX, dropping to 16-bit makes a BIG difference at that res (~25 FPS in Quake3, for instance).
"yeah, and the 5500 was already past due by that time, it was a known fact that the 5500 had 32-bit color, blah blah blah...keep on preaching"
My kids still believe in the tooth fairy too; that doesn't mean the guys who work at MO do🙂
What was the number-one-selling add-in board for the twelve months preceding and roughly six months following the 3DMark2K launch? Why, it was the V3🙂
"if they were really that concerned about "unsupported features", I sincerely doubt they would've included EMBM, considering how many cards supported it at the time (Matrox) and how many were on track to support it (Matrox)"
Must have been their heavy nVidia bias😛 Know how much of a weighting EMBM has on the 3DMark2K score? 0%, the same as hardware T&L and AGP texturing. Of course, more examples of heavy nVidia bias, right?🙂
Orbius
"3dfx's fault? Look everyone with any sense knows that 3DMark has always been hopelessly biased towards Nvidia. Punishing 3dfx by halving their scores because of the lack of 1 feature shows that bias clear as day."
Of course it does. That's why nVidia is using its patented hardware T&L technology that no other board manufacturer can use, right? That's why MS couldn't add hardware T&L to DX7, and why the OpenGL board couldn't add it to OpenGL, because it's an nVidia-exclusive feature, right? That's why MO switched over to a 32-bit default so ATi would continue to trail horribly, right? Since they're so nVidia-biased and all. We had this discussion in another thread.
It's all the money nVidia is charging. See, they need to overcharge for their cards so they can bribe all the benchmark makers, the game developers, and even Intel (can't use a V5 with a PIV) to make their hardware look and work better with their stuff. The cards are dirt cheap to make; nVidia is just paying everyone off to make 3dfx look bad, and they are STILL DOING IT!! That's right, even after 3dfx is dead, game developers and benchmark makers are STILL using hardware T&L and Dot3, and are even planning on using all of those GeForce3 features, all just to punish 3dfx!!! We had this all figured out some time ago: nVidia is bribing everyone they can....😕