From the man who created the SCSI ("scuzzy") interface and is now a game dev:
"I don't have time to read this entire thread, but I will respond to those who are complaining about Valve being the problem.
First, Valve is not the problem. You do not understand how code in DX9 works if you say that.
Here is the quick and dirty. In the game code, at init time, you ask the card what level of shaders (pixel and vertex) it supports. The card comes back and says PS2.0 and VS2.0, and then you go, "Cool, we can use DX9 shaders." At this point in the code, you have no idea what video card is really out there, unless you specifically test for it BEFORE starting DX9 up.
It's not Valve's fault that the NV3x family of parts performs very badly using the shaders they claim to support.
For Valve to fix this problem, they would have to disable all shaders. Well, the user community has wanted dynamic code and has been bitching for years, "Why can't game devs support the high-end cards?"
Now we are doing it and you are bitching. Folks, the NV3x cards suck at DX9 shaders. That is the simple truth. Deal with it. It's not Valve's problem and I, for one, am glad they are taking the stance they are. Why?
Well, it might just make the farkin video card companies take notice that we, the devs, are not going to go quietly into the night anymore and take heat from gamers about features that are not being used. Maybe, just maybe, it might make the video card companies stand up and take notice that if you put a piece of crap out in the market, we will expose it."
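For context, the capability check he describes above boils down to something like this in Direct3D 9. This is a minimal sketch of a hypothetical init routine, not Valve's actual code: the driver answers the "what shaders do you support?" question with version numbers only, which is exactly why a card can claim PS2.0/VS2.0 and still run it slowly.

    #include <d3d9.h>
    #include <cstdio>

    int main()
    {
        // Create the D3D9 object -- this happens before any device exists.
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d)
            return 1;

        // Query the capability bits the driver reports for the default adapter.
        D3DCAPS9 caps;
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        // "What level of shaders do you support?" -- the answer is a version
        // number, not a performance number.
        if (caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0) &&
            caps.VertexShaderVersion >= D3DVS_VERSION(2, 0))
        {
            printf("Card reports PS2.0/VS2.0 -- enabling the DX9 shader path.\n");
        }

        // To know which card is really out there, you have to ask separately,
        // e.g. by adapter description and vendor ID -- the shader caps alone
        // won't tell you.
        D3DADAPTER_IDENTIFIER9 id;
        d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
        printf("Adapter: %s (vendor 0x%04lX)\n", id.Description, id.VendorId);

        d3d->Release();
        return 0;
    }

The point of the sketch: nothing in the caps query distinguishes a fast DX9 part from a slow one, so a dev who trusts the reported shader versions ends up on the DX9 path regardless of how the card actually performs there.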
The ARB2 codepath on ATI delivers higher IQ (image quality) and runs faster with the optimized Catalyst drivers than it does on the NV3x.
rogo