Wouldn't it work better if software were optimized for the video card?

SunnyD

Belgian Waffler
Jan 2, 2001
www.neftastic.com
Seriously, why do video card manufacturers put out drivers that consistently have issues, simply because they are optimizing various little paths just for the new software or game that's coming out, to eke out a fraction of a percentage point of framerate or whatnot?

If software vendors simply wrote to the API spec instead of counting on the driver developers to tweak the video hardware, there would be a whole hell of a lot fewer bugs, crashes, etc. And certainly a lot less whining when a new driver comes out that breaks one little feature of one title on a particular configuration.
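Something like this is all I'm asking for. Rough sketch of the idea (made up for illustration, not code from any real game), in C++ against plain OpenGL, assuming a current GL context; the extension name is just one example of a capability to check:

// Decide a render path from what the API itself reports,
// not from the vendor string.
#include <GL/gl.h>
#include <string>

bool HasExtension(const char* name) {
    // Classic pre-GL3 extension query; a naive substring match is fine for a sketch.
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext && std::string(ext).find(name) != std::string::npos;
}

void ChooseRenderPath() {
    if (HasExtension("GL_ARB_fragment_program")) {
        // shader path - any card whose driver exposes the extension gets it
    } else {
        // fixed-function fallback
    }
    // What I'm arguing against: branching on glGetString(GL_VENDOR)
    // and hand-tuning a special path per card or per title.
}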
 

RussianSensation

Elite Member
Sep 5, 2003
Software is often optimized for a specific videocard, as can be seen on consoles that have a "fixed" hardware component base. However, on a PC there are hundreds of videocards. It would be a daunting task to optimize software for each graphics card like the X1900GT, X1950GT, X1950Pro, X1950XT, X1950XTX, X1900XT, X1900XTX and X1900 AIW. Instead, considering the architectural design, it is better in this instance for ATI to develop a driver that boosts X1900 series performance in, say, Doom 3 (i.e. taking advantage of its 512-bit ring memory bus to improve performance). Now consider that the GF8, GF7 and X1900 series have completely different designs; optimizing for videocard families, never mind specific models, would most likely require separate code to take full advantage of the shaders, memory bandwidth, etc.

Bugs are not always caused by videocards either. Dual-core CPUs tend to freeze games and create problems. And then there is Windows... Since the PC tends to surpass consoles in hardware, you might need serious optimization to run Oblivion on the PS3's 7900GT-class GPU at a fixed resolution, but a computer user can vary resolution and image quality settings. So for the Xbox 360, when a game runs at 1280x720 and the developers target a 60 FPS average (i.e. a racing game), they already choose to reduce shadows, draw distance, bloom, etc. in order to achieve this objective on the R500 videocard. How would you optimize all those settings and achieve comfortable performance at 1920x1200? They don't have to worry about that. For instance, in QW:ET, megatextures allow the 320MB 8800GTS to run the game smoothly. But in other games without such an implementation, it tanks. If you knew that 100% of PC gamers were going to use the 8800GTS 320MB for the next 4-5 years, you'd always optimize for the texture issue. But when 3% of PC gamers own the 8800GTS 320MB, are you going to write all new code or introduce a new texture compression technique? NO.


Don't forget the user is given optimization options in the game's menu. Giving this freedom to adjust settings takes some of the weight off the shoulders of developers, who would otherwise have to spend countless months and $ to optimize what can be done at home in seconds. I suppose it's a compromise, since in PC gaming the graphical progress is more or less linear year over year, while in console gaming you see major jumps with every new generation of consoles. As a result you can't really afford to optimize as much for PC, because in 12 months what was a phenomenal game will only be great, in 6 more months good, and in 6 more months just OK. With consoles, games look fairly similar within 1-2 years of release (or the difference isn't as significant).
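Just to illustrate what I mean by putting the knobs in the menu (a made-up C++ sketch, not from any real engine; the preset values are invented):

// Hypothetical quality presets exposed in a game's options menu.
// The player picks the trade-off; the developer doesn't hand-tune per card.
struct GraphicsSettings {
    int   resolutionX, resolutionY;
    int   textureDetail;   // 0 = low, 1 = medium, 2 = high
    int   shadowQuality;   // 0 = off, 1 = low, 2 = high
    bool  bloom;
    float drawDistance;    // in world units
};

const GraphicsSettings kPresets[] = {
    { 1024,  768, 0, 0, false,  500.0f },  // "Low"
    { 1280, 1024, 1, 1, true,  1000.0f },  // "Medium"
    { 1920, 1200, 2, 2, true,  2000.0f },  // "High"
};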

Development costs do matter when it comes to your bottom line. When Microsoft sells $170 million worth of a game, they can afford to spend a lot more $ on development and optimization vs. say Prey, which won't ever sell as well. In addition, key titles are necessary to improve awareness and the brand name of the console, which will attract new users. In PC gaming, it's everyone for themselves. Game developers don't really try to make PC gaming more attractive, since they would most likely rather develop across many platforms to sell more product. Since the PC market is smaller, your optimization cost per copy of the game sold is therefore a larger % of the total development expenses incurred by the company.

Finally, a lot has to do with the game engine and the type of game. Doom 3 can still be enjoyable at 1024x768, and HL2 looked great on GeForce 6600 hardware. On the other hand, World in Conflict beats the crap out of graphics cards but doesn't necessarily look better (at least not relative to the increase in graphics card power required).
 

SunnyD

Belgian Waffler
Jan 2, 2001
www.neftastic.com
You read far too much into my rant. The manufacturer will obviously support their own cards within their drivers. But the manufacturer should NOT be expected to optimize their driver for a given title. The title's developers should optimize their software for whatever API they're writing to (OpenGL, DirectX, whatever) and not count on NVIDIA or ATI writing a specific rendering path for their title. Period.
 

RussianSensation

Elite Member
Sep 5, 2003
Originally posted by: SunnyD
You read far too much into my rant. The manufacturer will obviously support their own cards within their drivers. But the manufacturer should NOT be expected to optimize their driver for a given title. The title's developers should optimize their software for whatever API they're writing to (OpenGL, DirectX, whatever) and not count on NVIDIA or ATI writing a specific rendering path for their title. Period.

But what exactly does the API have to do with level of detail, shader complexity, texture size...? Those aspects kill videocard performance. You can write the game in OpenGL or DirectX, but if your texture size is 32K x 32K, you'd need to use some kind of compression or texture optimization technique to make it run smoothly given a fixed amount of RAM on the graphics card (which can vary from 256MB to 1GB on modern cards). How do you expect the developer to optimize that? You are saying they should go and pick the best texture size for 256, 320, 512, 640, 768MB and 1GB graphics cards and automatically choose that texture size when such a graphics card is detected? :laugh:
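If you really did want to automate it, it would look something like this (hypothetical C++ sketch; the thresholds are pure guesses, not anything a shipping engine uses, and on D3D9 the memory figure could come from something like IDirect3DDevice9::GetAvailableTextureMem()):

// Hypothetical: pick a texture budget from available video memory.
struct TextureBudget {
    int  maxTextureSize;   // longest edge in texels
    bool useCompression;   // e.g. DXT/S3TC
};

TextureBudget PickTextureBudget(unsigned vramMB) {
    if (vramMB >= 768) return { 4096, false };  // plenty of memory, full-size textures
    if (vramMB >= 512) return { 2048, true  };
    if (vramMB >= 320) return { 1024, true  };  // the 320MB GTS case
    return { 512, true };                       // 256MB and below
}

And even then the cut-offs would need tuning per title, which is exactly the cost nobody wants to pay for 3% of the market.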

And since NV performs better in OpenGL than ATI, should every game be coded in OpenGL rather than DirectX, then?
 

Munky

Diamond Member
Feb 5, 2005
I'd rather the devs not optimize the game for certain hardware, unless it's a console game. PC game devs use standard APIs like OpenGL and DirectX, and it's the responsibility of the video card manufacturer to make sure their hardware supports those APIs with acceptable compatibility and performance. It's hard enough as it is to make games work well across different platforms due to the drivers and/or the hardware not supporting certain features of the standard APIs, and it would be a nightmare to make the devs resort to various convoluted hacks and tricks to overcome the deficiencies of the driver or hardware.
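For example, with Direct3D 9 a dev can just ask the API what the card claims to support and fall back cleanly; the specific checks below are only an illustration (hypothetical C++ sketch, not from any real engine):

// Illustrative only: query capabilities through the standard API (D3D9 here)
// and fall back cleanly, instead of hacking around one vendor's driver.
#include <d3d9.h>

void ChooseShaderPath(IDirect3DDevice9* device) {
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0)) {
        // Shader Model 3.0 path
    } else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) {
        // Shader Model 2.0 path
    } else {
        // fixed-function fallback
    }
}

If the driver lies about a cap, that's the manufacturer's bug to fix, not the dev's.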