Note that the plan file is dated 2001, so it predates everything that's happened since the NV3x was launched. However, the general feeling he seems to have is that it's bad for drivers to be specifically hard-coded for games and/or to make assumptions about a game that may not hold true in future patches.
If it's non-conformant, then certainly. If it's a conformant optimization, then there is absolutely nothing to worry about, as any changes would simply revert the driver to the default rendering state.
The issue in question here is not whether to optimize, it's the nature of the optimizations. If they don't follow a set of rules then they're simply hard-coded cheats - not optimizations - since they don't boost performance in a realistic or generic fashion, and they can easily break because they rely on assumptions to hold.
Take the latest round of optimizations as an example: what ended up broken when things were changed around? Nothing that I can see at all. I'm not arguing with you on this point, simply using the current round of optimizations as an example of a more conformant one, with the prior round being very non-conformant. Making an assumption is one thing; coding in "if x, y, z, a, b, c, d and e are doing this, then you can do this" (to oversimplify) is something else entirely. If one of the variables is altered, which is something that could happen in an update, then the optimization wouldn't kick in and the driver would fall back to its default state. This way you eliminate the chance of breaking anything.
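The guarded-fallback pattern described above can be sketched in C. Everything here is hypothetical - the state fields, the preconditions, and the "fast path" itself are made up purely to show the shape of a conformant optimization, not any vendor's actual driver code:

```c
#include <stdbool.h>

/* Hypothetical render state; the fields are illustrative only. */
struct render_state {
    bool alpha_test_enabled;
    bool depth_write_enabled;
    int  texture_stages;
};

/* Slow but always-correct default path. */
static int render_default(const struct render_state *s)
{
    (void)s;
    return 10; /* stand-in for "frames per second" */
}

/* Fast path that is only valid under specific assumptions. */
static int render_optimized(const struct render_state *s)
{
    (void)s;
    return 20;
}

/* The optimization only kicks in when every assumption it relies on
 * actually holds; otherwise the driver takes the default path.  A
 * game patch that changes one of these variables therefore can't
 * break rendering - it just loses the speedup. */
int render(const struct render_state *s)
{
    if (!s->alpha_test_enabled &&
        s->depth_write_enabled &&
        s->texture_stages <= 2)
        return render_optimized(s);
    return render_default(s);
}
```

The point of the guard is that it fails safe: the worst case of a changed assumption is default-speed correct rendering, never broken output.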
If the likes of PowerVR (or anyone else who does it) need to tailor their drivers for each game, then one can expect inconsistent performance and compatibility in the games they don't happen to look at, for whatever reason. Also, if a game gets patched and changes one or more of the assumptions the drivers make, then the user can expect problems.
It depends on how the optimizations are handled though. Using the example of PowerVR, there actually were some issues that they needed to work on in terms of their optimizations (which, I might add, they quickly responded to me about whenever I asked, something I can't say for nVidia (takes weeks) or ATi (I've been waiting roughly nine months to hear back on my last questions)).
In addition, when you see benchmark results you're not seeing how well the hardware performs; you're simply seeing how well the driver developers are able to hard-code cheats that work only in the benchmarked games. If you then look at the games that weren't targeted by the developers, you'll see a very different picture.
This is true all around though. Not to harp on it, but the R9800 was running even with the FX5200 in Doom3. Hell, the Ti4200 is regularly besting the R9800XT and FX5950 in some instances (NWN, no idea why either). All of the companies are doing it, and they always have. If we expected them to eliminate all their optimizations, we would significantly reduce the performance of every board on the market in most games. I have long been a proponent of using a much larger selection of games (IIRC you, Wingz, Robo and I had a lengthy discussion about this a few years back), both for the reason you stated and because it would force driver teams to optimize for more titles.
Basically what you then have is drivers that are about as stable as jelly - as long as you don't move the plate too much it'll stay up, but start to wobble it and it'll come crashing down.
The company that is taking the most heat for its optimizations still has the most stable drivers, so I don't see the two as directly related. I can see what you're saying if you're talking about optimizations that have the potential to break something, but not all of them are like that. As a generic example, the filtering hacks/optimizations everyone is doing now don't break anything; they simply reduce IQ by a given amount and boost performance by a given amount.
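A filtering optimization of this kind is a pure quality/speed trade-off rather than a bet on application behavior, which is why it can't break anything. A toy sketch, with a made-up cost model and a made-up anisotropy cap (neither reflects any real driver):

```c
#include <stdbool.h>

/* Toy sketch of a filtering "optimization": the driver silently caps
 * anisotropic filtering at 4x.  Nothing here depends on assumptions
 * about the application, so there is nothing to break - it only
 * trades image quality (fewer samples) for speed (less work/pixel). */
int samples_per_pixel(int requested_aniso, bool optimize)
{
    /* Hypothetical cap of 4x when the optimization is active. */
    int aniso = (optimize && requested_aniso > 4) ? 4 : requested_aniso;
    return aniso * 2; /* made-up cost: 2 samples per aniso level */
}
```

The trade-off is deterministic: any app requesting 8x gets 4x-quality filtering at roughly half the sampling cost, in every game, patched or not.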