Software is often optimized for a specific video card, as can be seen on consoles, which have a fixed hardware base. On the PC, however, there are hundreds of video cards. It would be a daunting task to optimize software for each model: X1900GT, X1950GT, X1950Pro, X1950XT, X1950XTX, X1900XT, X1900XTX, X1900 AIW, and so on. Given the shared architecture, it makes more sense for ATI to ship a driver that boosts X1900-series performance in, say, Doom 3 (for example, by taking advantage of the series' 512-bit ring-bus memory controller). Now consider that the GF8, GF7, and X1900 series have completely different designs: optimizing per video card family, never mind per model, would most likely require separate code paths to take full advantage of each one's shaders, memory bandwidth, etc.
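To make that concrete, here is a minimal sketch of family-level path selection, the kind of branching a multi-vendor engine does at startup. The path names are hypothetical, though Doom 3 really did ship multiple back-ends (ARB, NV20, R200, ARB2) chosen in roughly this way:

```cpp
#include <cstdio>
#include <cstring>

// Hypothetical render paths. In a real engine the vendor string would
// come from glGetString(GL_VENDOR) on an active OpenGL context.
enum RenderPath { PATH_GENERIC_ARB, PATH_NV_FRAGMENT, PATH_R200_FRAGMENT };

RenderPath ChooseRenderPath(const char* vendor) {
    // The driver only identifies the vendor/family, not the exact model,
    // which is why engines optimize per family rather than per card.
    if (vendor && std::strstr(vendor, "NVIDIA"))
        return PATH_NV_FRAGMENT;    // GeForce-specific fragment path
    if (vendor && (std::strstr(vendor, "ATI") || std::strstr(vendor, "AMD")))
        return PATH_R200_FRAGMENT;  // Radeon-specific fragment path
    return PATH_GENERIC_ARB;        // lowest common denominator
}

int main() {
    printf("chosen path: %d\n", ChooseRenderPath("ATI Technologies Inc."));
    return 0;
}
```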
Bugs are not always caused by video cards either. Dual-core CPUs tend to freeze games and create problems. And then there is Windows...

Since PC hardware tends to surpass console hardware, consoles lean on heavy optimization at a fixed resolution: the PS3 needs serious optimization to run Oblivion on what is essentially a 7900GT, while a PC user can simply vary resolution and image-quality settings. Likewise, when an Xbox 360 game runs at 1280x720 and the developers target an average of 60 FPS (a racing game, say), they have already chosen to reduce shadows, draw distance, bloom, etc. to hit that objective on the R500 GPU. How would you tune all of those settings to achieve comfortable performance at 1920x1200 as well? Console developers never have to worry about that. For instance, in QW:ET, MegaTexture streaming lets the 320MB 8800GTS run the game smoothly, but in other games without such an implementation the card tanks. If you knew that 100% of PC gamers were going to use the 320MB 8800GTS for the next 4-5 years, you would always work around its texture-memory limits. But when only 3% of PC gamers own that card, are you going to rewrite your code or introduce a new texture compression scheme just for them? NO.
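On PC the engine has to guess a sensible default instead. A toy sketch, with invented VRAM thresholds, of the kind of auto-detection that stands in for the choice a console developer makes exactly once for one known card:

```cpp
#include <cstdio>

// Hypothetical auto-detect: derive a default texture quality from VRAM.
// The thresholds are invented for illustration.
enum TexQuality { TEX_LOW, TEX_MEDIUM, TEX_HIGH };

TexQuality DefaultTextureQuality(int vramMB) {
    if (vramMB >= 640) return TEX_HIGH;
    if (vramMB >= 320) return TEX_MEDIUM;  // a 320MB 8800GTS gets a safer default
    return TEX_LOW;
}

int main() {
    int vramMB = 320;  // would come from a driver/OS query in a real engine
    printf("default texture quality: %d\n", DefaultTextureQuality(vramMB));
    return 0;
}
```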
Don't forget that the user is given optimization options in the game's menu. Giving players this freedom to adjust settings takes some of the weight off the shoulders of developers, who would otherwise have to spend countless months and dollars optimizing what can be done at home in seconds. I suppose it's a compromise: in PC gaming, graphical progress is more or less linear year over year, while in console gaming you see major jumps with each new generation of hardware. As a result you can't really afford to optimize as much for PC, because in 12 months what was a phenomenal-looking game will only be great, in 6 more months good, and in 6 more months just OK. With consoles, games look fairly similar within 1-2 years of a system's release (or the difference isn't as significant).
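Those menu options usually map straight onto console variables saved in a config file, so the "optimization" really is a one-line change for the user. An illustrative id Tech 4-style snippet (the cvar names follow Doom 3's convention; the values here are just examples):

```
seta r_mode "5"              // resolution index
seta image_anisotropy "4"    // anisotropic filtering level
seta image_downSize "1"      // shrink textures for low-VRAM cards
seta g_showPlayerShadow "0"  // drop the player shadow for speed
```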
Development costs do matter when it comes to your bottom line. When Microsoft sells $170 million worth of a game, it can afford to spend a lot more money on development and optimization than, say, the makers of Prey, which won't ever sell as well. In addition, key titles are necessary to build awareness and the brand name of the console, which attracts new users. In PC gaming it's every developer for themselves; developers don't really try to make PC gaming itself more attractive, since they would rather develop across many platforms to sell more product. And since the PC market is smaller, a fixed optimization budget is spread over far fewer copies, so the optimization cost per copy sold is a much larger share of total development expenses.
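A back-of-the-envelope calculation, with invented numbers, shows why the same optimization budget stings more on PC:

```cpp
#include <cstdio>

// Invented figures for illustration only: the same $2M optimization
// budget spread over a 5M-unit console hit vs. a 500k-unit PC release.
int main() {
    double optimizationCost = 2000000.0;
    double consoleCopies    = 5000000.0;
    double pcCopies         =  500000.0;
    printf("console: $%.2f per copy\n", optimizationCost / consoleCopies); // $0.40
    printf("pc:      $%.2f per copy\n", optimizationCost / pcCopies);      // $4.00
    return 0;
}
```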
Finally, a lot comes down to the game engine and the type of game. Doom 3 is still enjoyable at 1024x768, and HL2 looked great on GeForce 6600 hardware. On the other hand, World in Conflict beats the crap out of graphics cards but doesn't necessarily look better for it (at least not relative to the increase in graphics card power it demands).