Video technology advancing too fast for its own good?

450R

Senior member
Feb 22, 2005
319
0
0
Is it possible that PC hardware is advancing so fast that game developers aren't taking full advantage of what's already available to them?

Look at the PS2, for example. More recent games are graphically superior, yet the hardware never changed. That's because developers have had time to become intimate with its advantages and shortcomings and have learned to code for the hardware's full potential.

PC game developers may not have this time...? It seems like they're pushed to include new technologies and buzzwords ... SM3, HDR, etc. So here's an honest, sarcasm-free question: has DX8 even been pushed to its full potential? Or are they rushing to put "DX9, featuring HDR!" stickers on their product boxes?

There just seems to be a huge lack of efficiency, but that may be my ignorance showing. Insight, please!
 

Mik3y

Banned
Mar 2, 2004
7,089
0
0
I would give you a detailed explanation, but that would take far longer than I have time for. In short, it's your ignorance showing. There's a reason for everything, and once someone comes around to explain it to you, you'll be more enlightened.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Video technology advancing too fast for its own good?
Nevar!

Is it possible that PC hardware is advancing so fast that game developers aren't taking full advantage of what's already available to them?
It's possible, but the PC market is a different animal than the console market. Consoles are not only more stable platforms, they're also gateways to much higher sales. So console devs may have more financial incentive, and more wherewithal, to get as close to the hardware as possible and exploit as much of it as they can, in addition to better tools for doing so. PC devs ride a different cost/benefit slope.

It's true that PC devs may target a lowest common denominator (to reach the largest audience possible) and may not be able to get as close to the hardware as console folks, both b/c of product cycles and platform heterogeneity, but I'm not sure that's impeding gamers. Heck, look how long the R300, or Radeon 9700 Pro, architecture lasted: three long SM2 years with basically only speed bumps (albeit large ones) to speak of, and at least some games have been pretty quick to use SM3 to their (marketing) advantage. The key to that may be ease of use, thanks to D3D9's high-level shading language, which lets them compile the same code to both SM2 and SM3 targets, but also an expanded precision and feature set that either simplifies some techniques or simply makes them feasible (e.g., FP blending for HDR, which is actually above and beyond the SM3 spec but is common to all SM3 GPUs).
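To make that concrete, here's a rough sketch (purely my own illustration, not any shipping game's code; the file and entry-point names are made up) of how one HLSL source can feed both shader-model targets through the D3DX compiler:

```
// Hypothetical sketch: compiling one HLSL pixel shader to either an SM2
// or SM3 target with the D3DX helper library. "hdr.fx" and "MainPS" are
// invented names for illustration only.
#include <d3dx9.h>

LPD3DXBUFFER CompileForTarget(const char* profile)
{
    LPD3DXBUFFER code = NULL;
    LPD3DXBUFFER errors = NULL;

    // Same source file either way; only the target profile string
    // ("ps_2_0" vs. "ps_3_0") changes.
    HRESULT hr = D3DXCompileShaderFromFile(
        "hdr.fx",    // one HLSL source for both paths
        NULL, NULL,  // no #defines, no custom include handler
        "MainPS",    // entry-point function in the source
        profile,     // compile target
        0,           // no compile flags
        &code, &errors, NULL);

    if (FAILED(hr))
    {
        // errors->GetBufferPointer() would hold the compiler log here.
        if (errors) errors->Release();
        return NULL;
    }
    if (errors) errors->Release();
    return code; // caller feeds this bytecode to CreatePixelShader
}

// Usage: pick the best profile the player's card supports.
// LPD3DXBUFFER ps = CompileForTarget(cardDoesSM3 ? "ps_3_0" : "ps_2_0");
```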

But realize that only a literal handful of PC games feature HDR. It's buzz-worthy, but not quite driving devs in droves away from good ol' SM2.
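Part of the reason is that FP blending, which most HDR implementations lean on, has to be detected at runtime, since it isn't promised by the SM3 spec. Something like this check (again, just my own sketch of the idea) would gate the feature:

```
// Hypothetical sketch: asking D3D9 whether the card can both render to
// and alpha-blend into a 16-bit floating-point surface, the capability
// that FP16 HDR rendering leans on.
#include <d3d9.h>

bool SupportsFP16Blending(IDirect3D9* d3d, D3DFORMAT displayFormat)
{
    // Can we create an FP16 render-target texture at all?
    if (FAILED(d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, displayFormat,
            D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE,
            D3DFMT_A16B16G16R16F)))
        return false;

    // Can the card blend (post-pixel-shader) into that format?
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, displayFormat,
        D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING, D3DRTYPE_TEXTURE,
        D3DFMT_A16B16G16R16F));
}
```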

Newer D3D versions bring with them fewer restrictions. Devs may optimize the poop out of a DX8 card, but they'd probably prefer to flex some DX9 muscle instead to show off a fancy new effect made possible, or at least easier, with newer and faster hardware. It's a compromise necessitated not only by the speed of innovation, but also by the size of the potential audience. So, yeah, it's a question of efficiency: game dev time/cost efficiency vs. player money/framerate efficiency. They manage it pretty well.

I'd guess DX8 has been milked as much as any desktop API, thanks to the strength and longevity of the GeForce 3 and 4 cards and the fact that a modified GF3 made it into the Xbox. Similarly, the fact that (very roughly) SM3 parts lurk in both the X360 and PS3 may mean SM3 desktop cards will be lavished with attention for a decent period of time. So the PC space has had its share of three-year platforms, which isn't too bad.

I'm rambling. Really, Ben and maybe others here are better equipped to answer this than I am (and to correct my mistaken assumptions). Me, I'm off to play some many-years-old CS:S with brand-new "HDR" (bloom!) features on my two-year-old 9800P. :)
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
I think recent games looking better on the PS2 has more to do with artwork than anything else. Launch titles were rushed out the door, and they probably didn't even scratch the surface in terms of graphics quality.
 

five40

Golden Member
Oct 4, 2004
1,875
0
0
Consoles and PCs are two totally different worlds. When something new is released for the PC, it's not a completely new way of coding that has to be learned; new releases (e.g., SM3) mostly just make existing tasks easier. A console launch, on the other hand, comes with a major learning curve, which is why games look better by the second or third generation on a given console. Take the new consoles out or coming out, for example: they use Cell, which is going to require a completely new way to code since it's a totally different architecture. x86 has been around for more than a little while, DX is nothing too new, and OpenGL is no new puppy either.
 

Crescent13

Diamond Member
Jan 12, 2005
4,793
1
0
Originally posted by: xtknight
I think the recent games looking better on the PS2 have more to do with artwork than anything else. They were rushed to get out launch titles, which probably didn't even scratch the surface in terms of graphics quality.


But look at BLACK. It has physics that surpass those of almost every game out there for the PC. Why? The people who made it had worked on the Burnout series for the PS2 for three years. They know the hardware inside and out.
 

The Sly Syl

Senior member
Jun 3, 2005
277
0
0
I believe the influx of better PC graphics card technology can't improve things too much, for the simple reason of standardization.

Every new series of graphics cards brings with it a new series of budget graphics cards, and the better the low-end (most common) cards get, the better graphics and physics get for everyone.

Crescent: Black uses *entirely* scripted events, from what I've heard. Likewise, the RenderWare engine it uses also works on the PC (and every other home console); GTA is a good example of that, and Burnout 2 shipped on all three home consoles. Burnout Revenge is almost out for the 360, and it still has the same basic engine (though with some quickly added extra prettiness).
 

Reznick

Junior Member
Jan 26, 2006
21
0
0
I agree with 450R: developers learn to max out the capabilities of the hardware they're coding for. That applies to console development, and it's why the quality and sophistication of console games improve over time after the launch of a new console.

PCs are an entirely different matter. New video cards hit the market on roughly a six-month cycle, and games like FEAR still bring the X1900, the latest and greatest card you can buy, to its knees (with soft shadows on).

It is changing too fast for its own good, but there is nothing you can do about it. Here is the thing: with consoles, as we've said, games tend to improve graphically over time, using the same hardware. With PCs, games tend to look worse over time when using the same video card/PC hardware, since newer games are designed for newer cards. You have to keep updating your hardware to keep pace with the software.

I don't care that things are changing quickly. I just care that I have to pay a whole lot more money for the privilege of playing a few games with a mouse and keyboard. With the enthusiast market making SLI/CrossFire, and even quad-GPU rigs, the new standard, fewer people can afford to stay with the program. The costs are too high when you consider what you actually get for the investment. This trend will only get worse, I think. So in some regards, it is advancing too fast for its own good if it can't find a way to make the technology affordable to a larger population.
 

Unkno

Golden Member
Jun 16, 2005
1,659
0
0
The main reason graphics look better on consoles is that each console generation has one fixed set of hardware, so developers know exactly what to code for. On the PC, developers have to code for a wide variety of hardware, so they can't fully take advantage of any one configuration. For example, NVIDIA is going with pipelines while ATI is going with vertex shaders (correct me if I'm wrong on the terms), which means developers either specialize their game to run better on NVIDIA or ATI hardware, or (in most cases) make it run equally well on both, which still won't use the maximum capability of either.
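To give a feel for how that plays out in code, here's a rough D3D9 sketch (my own illustration, not any particular game's code) of detecting a card's shader support at startup and picking a render path, rather than hand-tuning for one vendor:

```
// Hypothetical sketch: query device caps once at startup and choose a
// shader path, instead of assuming one vendor's architecture.
#include <d3d9.h>

enum RenderPath { PATH_FIXED_FUNCTION, PATH_SM2, PATH_SM3 };

RenderPath ChooseRenderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return PATH_FIXED_FUNCTION; // no usable hardware device

    // PixelShaderVersion packs the supported major/minor version.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        return PATH_SM3; // e.g. GeForce 6/7, Radeon X1000 series
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_SM2; // e.g. Radeon 9700-class hardware
    return PATH_FIXED_FUNCTION; // DX8-class or older
}
```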
 

theHate

Banned
Feb 7, 2006
29
0
0
Originally posted by: 450R
Is it possible that PC hardware is advancing so fast that game developers aren't taking full advantage of what's already available to them?

Look at the PS2, for example. More recent games are graphically superior, yet the hardware never changed. That's because developers have had time to become intimate with its advantages and shortcomings and have learned to code for the hardware's full potential.

PC game developers may not have this time...? It seems like they're pushed to include new technologies and buzzwords ... SM3, HDR, etc. So here's an honest, sarcasm-free question: has DX8 even been pushed to its full potential? Or are they rushing to put "DX9, featuring HDR!" stickers on their product boxes?

There just seems to be a huge lack of efficiency, but that may be my ignorance showing. Insight, please!

ATI and NVIDIA each like to release cards every 8-9 months or so, but since they don't release them at the same time, we're seeing a new card every 3-5 months. The worse thing they're doing now is releasing the highest-end cards first and the mid-range cards later: the 7800 GTX before the 7800 GT, and the X1900 XT before the X1900 XL. This screws up a lot of people's buying decisions once they realize they could have paid a lot less for nearly the same performance.

The worst case is when people realize they paid 800 dollars for a 512 MB GTX only to end up with the slower card. Or the other way around: they paid 600 dollars for an XTX, maybe only to get beaten by a 7900 GT.
 

nib95

Senior member
Jan 31, 2006
997
0
0
It's advancing too slowly, imo.
Either that or game devs are getting sloppy.

I believe the top-end card at the moment of a game's release should be able to run it adequately at max settings, or at least at a resolution of 1600x1200. That has not been the case with games such as COD2 and FEAR, which even now (a few months down the line from their initial release) don't run perfectly playably at max settings, even on an X1900 XT.

I think the problem is that too many cards are being released, too often, too soon, and at far too high a price. I wish they'd stick to the every-six-months time span but ensure the card that got released was truly spectacular, as opposed to just slightly better than the last gen.
 

RelaxTheMind

Platinum Member
Oct 15, 2002
2,245
0
76
It's the simple rule of "supply and demand."

Keywords like NEW!, SPACE AGE!, HIGHLY ADVANCED!, NEVER BEFORE SEEN!, and BILLIONS PER SECOND! sell prefab brand-name computers and games, all the way down to our component level.

Believe it or not, there are people who look specifically for those types of keywords before purchasing.

Practically the same business structure fits just about every other venue...

Games such as DOOM 3 and FEAR pretty much exploded the video card market for the sake of quality/framerates/bragging rights (in that order).

So basically, if the charts say to release a game or video card on a given date, they release it. It's nothing but marketing: all those pushed-back release dates and early release dates, test batches sent out to see how the market "bites."

Sorry... I used to be a marketing director for a private company... hehe