New games = lazy code? Perhaps?

inveterate

Golden Member
I'm looking at all these new games coming out. With FEAR and Oblivion now "stressing the graphics card," I can't help but wonder if it's simply lazy coding. When HL2 came out, the mid-range cards of the time were more than sufficient to run the game at 70+ frames with 2x/4x, and high-end would do 4x/8x. Doom 3 came out, looked great, and was locked at 60 but still ran very well on the 6800 Ultras. Now we are almost FORCED to buy SLI just to get above 60. Anyone else thinking, WTF? HL2 looked just as good as FEAR imo, maybe not in technical features, but in terms of the overall FEEL the picture is just as nice, with a lot less hardware needed.
 
No, not really. Each generation is adding polys, adding lighting, adding shaders; it all creates additional stress on the card. How much effort they put into tuning the game for lower-end hardware is probably more a function of money and market research than anything else.
 
Are you kidding? Doom 3 brought mid-level machines to their knees. HL2 with HDR takes a good video card to run too.
 
Oblivion runs poorly on the 360 and supposedly it has been highly optimized including using all three processor cores.
 
Imo gaming isn't a mid-level thing. But FEAR/Oblivion are bringing the HIGHEST end to its knees; that is the problem I'm talking about.
 
Originally posted by: inveterate
Imo gaming isn't a mid-level thing. But FEAR/Oblivion are bringing the HIGHEST end to its knees; that is the problem I'm talking about.

Mid-level machines are just 1-2 year old high-level machines. When D3 came out, I was running a P4 2.8, 1GB, 9800 Pro. It was able to run HL2 and Far Cry beautifully at 1280x1024.
 
Like I said before, gaming has never been a mid-level machine's task. Gaming is like war in the real world: the weapons are always the most modern, or at least should be. The less tech-savvy the army, the more it will lose. Basically I'm saying "fook mid-level." Not that there isn't a range of gaming machines, just that it isn't as wide as you're making it out to be. Please take no offense.
When D3 came out, it could be played WELL on A64s and 6800 Ultras (available at the time) at 55+ fps with 4x/8x. Oblivion/FEAR are out now, barely drawing 60+ on the ABSOLUTE HIGHEST-end comps.

I used to have a 9800 Pro and a P4; Far Cry/HL2 were NOT beautiful.
 
I think Oblivion runs amazingly well, considering all the moving grass and trees combined with the lighting effects, etc. There are just some things that are better left off or turned down for now, such as grass shadows, in order to keep the fps above 30.

FEAR isn't as well optimized as it could be, but it certainly doesn't bring my 7900GT to its knees, unless maybe I tried some insane resolution like you widescreen LCD freaks use 😛 I can max out that game at 1280 with 4xAA, no problem.
 
Originally posted by: inveterate
I'm looking at all these new games coming out. With FEAR and Oblivion now "stressing the graphics card," I can't help but wonder if it's simply lazy coding. When HL2 came out, the mid-range cards of the time were more than sufficient to run the game at 70+ frames with 2x/4x, and high-end would do 4x/8x. Doom 3 came out, looked great, and was locked at 60 but still ran very well on the 6800 Ultras. Now we are almost FORCED to buy SLI just to get above 60. Anyone else thinking, WTF? HL2 looked just as good as FEAR imo, maybe not in technical features, but in terms of the overall FEEL the picture is just as nice, with a lot less hardware needed.

Half-Life 2's graphics were decent, but they cheated heavily on the lighting, which is a major part of Oblivion and FEAR. Turn off all the lighting and shadowing in those games, and you'll be able to run them on a midrange system too. Get the HDR version of HL2, and you'll see how well it runs.

Oblivion is coded very well... you can tell by how quickly the game loads and how quickly you transition between zones.

And the AI in both Oblivion and FEAR is far superior to HL2's. The physics in HL2 were great at the time too, but FEAR's and Oblivion's physics are superior as well.
 
Comps aren't expensive. Interest rates, taxes, shipping, energy costs, the operating system, internet service: all very expensive, and they add up even more over time, all because of the Bush administration. Do you realize how little the CPI would be if we still had a buck a gallon? Just look at the Core CPI vs. the CPI.

I don't think it was cheating in any way for the developers to do what they did. Those games LOOKED great on the powerful cards of the time, at release, at a popularly agreed playable 60+ fps.

FEAR/Oblivion don't do that, which is why I'm posting.
 
Originally posted by: inveterate
Like I said before, gaming has never been a mid-level machine's task. Gaming is like war in the real world: the weapons are always the most modern, or at least should be. The less tech-savvy the army, the more it will lose. Basically I'm saying "fook mid-level." Not that there isn't a range of gaming machines, just that it isn't as wide as you're making it out to be. Please take no offense.
When D3 came out, it could be played WELL on A64s and 6800 Ultras (available at the time) at 55+ fps with 4x/8x. Oblivion/FEAR are out now, barely drawing 60+ on the ABSOLUTE HIGHEST-end comps.

I used to have a 9800 Pro and a P4; Far Cry/HL2 were NOT beautiful.

I played Far Cry and HL2 at 1280x1024 with close to max settings, and I could even turn on a bit of FSAA and AF in HL2. Also, Valve did a great job with the animation, which really helped the game visually.

Mid-level gaming machines are quite common, whether you'd like to believe it or not. The 6 month release cycle that nVidia and ATI try to follow is way too fast for most people to keep up with. Your analogy comparing gaming to war is absolutely ridiculous though. A war against whom?

Oblivion's performance is pretty bad, but Morrowind also had terrible performance issues on higher end machines.
 
Oblivion isn't coded badly... I'm running an X800 GTO2 OC at almost all high settings (except no AA, bloom, or AF; medium water; no shadows on grass or shadow filtering) at 1680x1050, and I usually get around 20-30 fps. Considering the resolution I'm running at, I'd say that's pretty decent.
 
Dude, you're settling for 50-60 fps with no AA/AF, and 45-60 with low AA/AF, and there's a huge difference between max settings and "less than max settings." I've played on a 9800 Pro and it's NOT sweet, and NOT great.
If you settled for that at the time when 6800 Ultras were AVAILABLE, then that was your decision, but anything less than a vanilla 6800 I wouldn't consider a gaming comp at the release of D3.

Mid-range machines are not gaming machines. They might have been once, but no longer; that's why they're mid-range.

In my comparison of comps to war tech, it's very logical in that the better the weapon (comp), the better your game. I'm not talking about war as in killing, just the push/necessity for the most advanced technology.
 
I dunno, I have Oblivion turned all the way up (put FEAR all the way up too), and my system is over a year old. I don't have any problems. I certainly don't have SLI, and never will. SLI is a waste of money unless you have a big display and run at high resolutions.
 
Originally posted by: Malak
I dunno, I have Oblivion turned all the way up (put FEAR all the way up too), and my system is over a year old. I don't have any problems. I certainly don't have SLI, and never will. SLI is a waste of money unless you have a big display and run at high resolutions.



What are your specs? Whatever they are, you don't have problems; no one does. It's just NOT AS good.

And I didn't say these games had PROBLEMS, only that they ARE problems, in that they don't run as well as other comparable games that came out alongside high-end cards.



Nice website, by the way, Malak. Hope you didn't rip code off other sites like my mom does. She just copies all the HTML from other sites and changes the values. Imo that is harder to do, but she'd rather do that than learn HTML.
 
Originally posted by: inveterate
I don't think it was cheating in any way for the developers to do what they did. Those games LOOKED great on the powerful cards of the time, at release, at a popularly agreed playable 60+ fps.

It depends on how you define cheating. I'm not saying they broke any rules or anything, but their lighting was nothing compared to Doom's. Most of the game was outdoors, but when you went indoors, you could definitely see where it lacked.

FEAR/Oblivion don't do that, which is why I'm posting.

That's not why. Your original post assumes that Oblivion and FEAR are coded badly. They're just more GPU- and CPU-dependent because they do a lot more than HL2 did. Get that HDR HL2 demo and see how well your system runs it. If you had been here when it was released, you would have seen all the threads saying it ran like crap.
 
Originally posted by: inveterate
Originally posted by: Malak
I dunno, I have Oblivion turned all the way up (put FEAR all the way up too), and my system is over a year old. I don't have any problems. I certainly don't have SLI, and never will. SLI is a waste of money unless you have a big display and run at high resolutions.



What are your specs? Whatever they are, you don't have problems; no one does. It's just NOT AS good.

Of course it's 'not as good'... it does a lot more than HL2 does.
 
Sure, you can say it does a lot more. Doom 3 did a lot more than xxx, HL2 did a lot more than xxx... NEITHER did SO much, at such a cost to performance, that a BRAND spanking new card couldn't handle it.
That is bad code, and/or too-forward thinking: laziness on their part to optimize. It just isn't right to release a game that doesn't hold up to commonly deemed playable standards even on the most powerful comp.
 
It is a matter of taste. Some people might like all of the pre-rendered lighting in HL2 (I do), and I think it works really well. There's actually a few things with way too much detail in Counter-Strike: Source. For instance, turn on wireframe mode and check out how many polys are wasted on a door handle, and the gun shells have a full shape too, which seems wasteful to me. As for poor coding, less of this is happening than we might think. A lot of the poor performance has to do with resources used up on things you might not notice.
 
Originally posted by: inveterate
Dude, you're settling for 50-60 fps with no AA/AF, and 45-60 with low AA/AF, and there's a huge difference between max settings and "less than max settings." I've played on a 9800 Pro and it's NOT sweet, and NOT great.
If you settled for that at the time when 6800 Ultras were AVAILABLE, then that was your decision, but anything less than a vanilla 6800 I wouldn't consider a gaming comp at the release of D3.

Mid-range machines are not gaming machines. They might have been once, but no longer; that's why they're mid-range.

In my comparison of comps to war tech, it's very logical in that the better the weapon (comp), the better your game. I'm not talking about war as in killing, just the push/necessity for the most advanced technology.

Perhaps I'm an old-school gamer, but I don't consider FSAA and AF important settings. Just like 3D positional sound, they're extra features that you get if your system can handle them. If you want to win this war you're fighting, try setting your details down a bit and just enjoy the game.

My current system is an X2 4200+, 2GB, 7900GT. Oblivion doesn't run perfectly, but it runs well enough that I can play it and enjoy it. I have max settings for 1280x1024 except for a few of the draw distances, which are set to around 1/2 or 3/4. I think I also turned off grass shadows. I honestly don't notice any difference, because I haven't run into any zones large enough for there to be major pop-up.

Your idea of max settings is relative from game to game. D3 didn't have shadows on grass or heavy AI calculations. It didn't have such an interactive world either. If Oblivion were coded on the D3 engine, performance might be even worse, and that's hardly an outdated engine. So perhaps medium-high settings in one game could be considered high in another when you factor in the other elements.
 
Originally posted by: inveterate
Sure, you can say it does a lot more. Doom 3 did a lot more than xxx, HL2 did a lot more than xxx... NEITHER did SO much, at such a cost to performance, that a BRAND spanking new card couldn't handle it.
That is bad code, and/or too-forward thinking: laziness on their part to optimize. It just isn't right to release a game that doesn't hold up to commonly deemed playable standards even on the most powerful comp.

Um, what systems can't play it? I play at around 1600x1000 on my laptop with the default High settings it detected, and my laptop only has a 2GHz Pentium M, 2GB RAM, and a 6800 Ultra Go.

If you're only looking at SLI performance, that problem is a bug in the drivers; SLI performance is actually worse than single cards atm.
 
Playable generally means 4x/8x and 60+.

Oblivion/FEAR won't hit that on any single card at max visual effects.

This excludes the new 1.7-volt-modded, 800MHz 7900GTs.
 