Wolfenstein VGA Performance and IQ

MODEL3

Senior member
Jul 22, 2009
528
0
0
http://www.hardocp.com/article...eplay_performance_iq/1

Graphical updates to id Tech 4 running in DX9 for Wolfenstein include: Depth of Field, Soft Shadows, Post-Processing effects (HDR), and Havok physics. These effects are very typical of DX9 games and should provide at least that familiar level of graphics seen in current DX9 titles, even though it is running on a dated engine.

Given that it is based on five-year-old graphics technology, its visuals are presented well and appear surprisingly up-to-date. Performance was spectacular almost across the board, and though the clumsy menu is clearly geared toward console controllers, the game stands up well in spite of its flaws.


 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Seems to like NV cards a lot :)

Nice to see the 4890 being able to pull off 2560x1600 with maximum in-game settings. I remember my 8500 barely managed 15 fps in Doom 3 at 640x480.

"Wolfenstein comes to us as yet another entry in a growing catalog of games which bare distinct signs of being console-focused titles. The lightweight graphics, memory, and processor requirements, the nonfunctional anti-aliasing, the clumsy menu system, and the mere handful of customizable graphics options all combine to show us that this chapter of B.J.?s World War II exploits was geared for the console and only adapted for the PC. While it can be argued that it makes sense from a business perspective for developers to focus on console development first, it does leave PC games wondering what exactly is next for us." :brokenheart:
 

faxon

Platinum Member
May 23, 2008
2,109
1
81
LOL I can't believe he didn't know about the dev console fix to change the FPS cap in id Tech 4 games. The engine still discards the extra frames rendered past 60 FPS since the engine refresh is locked to 60 Hz, but it allows you to benchmark past 60 FPS rofl
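
If anyone wants to try it, it's just a couple of cvars in the console or cfg, something along these lines (from memory, so double-check the exact cvar names and values for your game):

seta com_unlockFPS "1"
seta com_unlock_timingMethod "0"
seta com_unlock_maxFPS "200"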
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: faxon
LOL I can't believe he didn't know about the dev console fix to change the FPS cap in id Tech 4 games. The engine still discards the extra frames rendered past 60 FPS since the engine refresh is locked to 60 Hz, but it allows you to benchmark past 60 FPS rofl

Yep, I saw that too (at first when I posted I was also pointing to the part of the article about how close the GTS 250 was to the GTX 275, but after 5 minutes I edited it lol)

What is going on with HardOCP?
I thought it was one of the most successful tech sites? (Do they often make mistakes like that?)

I checked some other sites about the anti-aliasing, and for multiplayer, users say it exists!
For single-player I don't know about ATI, but many users are saying you can force it through the NV control panel!

 

Forumpanda

Member
Apr 8, 2009
181
0
0
I was going to make a long writeup about something similar but I decided it would be a waste of time.

When max fps is unlocked, using average fps as a baseline for benchmarking is a poor idea.
If a card can produce really high FPS in some parts of the game but falls to a lower minimum FPS in other parts, then it might still have a higher average FPS, but it will generally feel more 'laggy' and provide less value in actual use.

This capping of max fps (intentional or not) is actually, in my opinion, the preferred way to rank graphics cards.
Still not optimal, but at least better.
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: Forumpanda
I was going to make a long writeup about something similar but I decided it would be a waste of time.

When max fps is unlocked, using average fps as a baseline for benchmarking is a poor idea.
If a card can produce really high FPS in some parts of the game but falls to a lower minimum FPS in other parts, then it might still have a higher average FPS, but it will generally feel more 'laggy' and provide less value in actual use.

This capping of max fps (intentional or not) is actually, in my opinion, the preferred way to rank graphics cards.
Still not optimal, but at least better.

I agree with the part in bold; that's why many sites also report min fps (when the benchmark allows).

The problem in this case is that HardOCP is also reporting min fps but kept the cap. (This can lead to strange results and give a wrong impression of the performance differences between cards; for example, two cards that would average, say, 90 and 70 fps uncapped will both show roughly 60 fps average with the cap, so they look equal.)


 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
Just played through this game; at 12x10 I put it on max settings and my 4850s seem to work fine with it. But I think most of the time id's engines do better on NV cards.

Personally I like this game; the first 1/3 is too easy, then when they turn up the heat it gets better for me. I'm debating playing it again with a guide to find all the tombs, since I only found 2 the first time around. How do you guys like this one?
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,230
2
0
Originally posted by: Keysplayr
Originally posted by: dguy6789
It's my opinion that minimum fps is the most important number always.

This :thumbsup:

Yes, but you can't see that from benchmarks, unless they use a system like FEAR's, which states the % of time the card spends at min/avg/max

Let's say we have Card A, which plays through the game at an average of 45-55 most of the time, but has occasional 1-2 second dips to 20 fps in heavy parts

Card B, on the other hand, averages 35-45, but doesn't have those dips

Which one would you pick? It's not as simple as "min frames is the most important number"

For all I know, the card could be working at 100 fps for 2 hours and have a 1-second dip to 30 fps... The min fps would show 30 and would be very misleading

The HardOCP graphs are actually very useful here, since they show just how many times the fps drops occurred, etc. It's nice
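
Just to illustrate, here's a quick Python sketch with made-up numbers (the frame times below are invented, not from any real run):

# Rough sketch: invented frame times (in ms) for two hypothetical cards.
# Card A: fast most of the run but with brief heavy dips. Card B: slower but steady.
card_a = [20.0] * 580 + [50.0] * 20   # ~50 fps most of the time, short 20 fps dips
card_b = [25.0] * 600                 # steady 40 fps

def summarize(frame_times_ms, slow_fps=30.0):
    fps = [1000.0 / t for t in frame_times_ms]
    total_time = sum(frame_times_ms)
    slow_time = sum(t for t in frame_times_ms if 1000.0 / t < slow_fps)
    return {
        "avg_fps": round(len(frame_times_ms) * 1000.0 / total_time, 1),
        "min_fps": round(min(fps), 1),
        "pct_time_below_30fps": round(100.0 * slow_time / total_time, 1),
    }

print("Card A:", summarize(card_a))   # higher average, but ~8% of the time spent below 30 fps
print("Card B:", summarize(card_b))   # lower average, never below 30 fps

A single min or avg number hides that difference; the FEAR-style "% of time below a threshold" makes it visible.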
 

faxon

Platinum Member
May 23, 2008
2,109
1
81
Originally posted by: ShadowOfMyself
Originally posted by: Keysplayr
Originally posted by: dguy6789
It's my opinion that minimum fps is the most important number always.

This :thumbsup:

Yes, but you can't see that from benchmarks, unless they use a system like FEAR's, which states the % of time the card spends at min/avg/max

Let's say we have Card A, which plays through the game at an average of 45-55 most of the time, but has occasional 1-2 second dips to 20 fps in heavy parts

Card B, on the other hand, averages 35-45, but doesn't have those dips

Which one would you pick? It's not as simple as "min frames is the most important number"

For all I know, the card could be working at 100 fps for 2 hours and have a 1-second dip to 30 fps... The min fps would show 30 and would be very misleading

The HardOCP graphs are actually very useful here, since they show just how many times the fps drops occurred, etc. It's nice

Yeah, I liked the graphs for that when the cards weren't hitting the FPS cap, but since I plan on replacing my 9800 GTX with a 5870 X2, they didn't tell me jack about what to expect as far as dual-GPU scaling goes, since the 4870 X2 and the GTX 295 rammed up against the cap most of the time anyway. What annoys me even more is that the only gameplay "benchmark" I have seen of the HD 58xx cards (demoed at QuakeCon) is them playing Wolfenstein at some unknown resolution and graphical quality at 70 FPS, which, in my mind, makes the benchmarks worth about as much as an article on SemiAccurate

Edit: here's what Mark has to say:

I am actually aware of the commands to eliminate the FPS cap. Removing it, however, would not have changed the results or shown anyone anything more interesting than the existing data already shows.

Thanks for your concern, though.

Cheers,

Mark Warner
Video Card Editor
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,409
2,443
146
Lolz, I know in ETQW the frame rate is locked at 30 by default. BTW, that game is much better (online, at least).
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: ShadowOfMyself
Originally posted by: Keysplayr
Originally posted by: dguy6789
It's my opinion that minimum fps is the most important number always.

This :thumbsup:

Yes, but you can't see that from benchmarks, unless they use a system like FEAR's, which states the % of time the card spends at min/avg/max

Let's say we have Card A, which plays through the game at an average of 45-55 most of the time, but has occasional 1-2 second dips to 20 fps in heavy parts

Card B, on the other hand, averages 35-45, but doesn't have those dips

Which one would you pick? It's not as simple as "min frames is the most important number"

For all I know, the card could be working at 100 fps for 2 hours and have a 1-second dip to 30 fps... The min fps would show 30 and would be very misleading

The HardOCP graphs are actually very useful here, since they show just how many times the fps drops occurred, etc. It's nice

Of course, more data always gives a better understanding of performance. But I'd still argue that if you were given a choice between seeing only the minimum, average, or max fps numbers, the minimum would be the most useful.
 

psolord

Golden Member
Sep 16, 2009
1,920
1,194
136
Originally posted by: faxon
LOL i cant believe he didnt know about the dev console fix to change the FPS cap in ID-Tech 4 games. the engine still discards the extra frames rendered past 60FPS since the engine refresh is locked to 60Hz, but it allows you to benchmark past 60FPs rofl

Hello.

I am trying to unlock the frame rate. I added the following lines to the wolf.cfg file, right after the last seta com_xxxxxxx entry:

seta com_unlockFPS "1"
seta com_unlock_timingMethod "0"
seta com_unlock_maxFPS "200"

It still runs locked at 60 fps! What am I doing wrong? Is there a benchmarking tool out there?

Thanks!