Once And For All...


Chaotic42

Lifer
Jun 15, 2001
34,643
1,814
126
Originally posted by: ArchAngel777
I am already GREATLY disappointed with the fact that EA pushed them to have it ready by a certain date and thus dropped Co-Op support... :-( What a mistake...
Man, that sucks. That was one of the big things I was looking forward to in this game. It's so rare to actually find in an FPS.

 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: BFG10K
but I'll have to disagree on STALKER. At least at 1680x1050 (my monitor's native res) my 8800GTS with CPU stock at 2.13GHz (STALKER is CPU limited, I know, I've tested)
That's because 1680x1050 is a low resolution so it's not at all surprising to find you're CPU limited. Try 1920x1440 or higher and your card will squirm.

Also Stalker doesn't allow hardware AA so it relieves the card of the burden it would otherwise have.

1680x1050 is low? Since when? I can see 10x7 and even 12x10/14x9 being considered to be low. Actually, most folks with widescreens these days run 14x9, don't they? Unless they opt for larger 20-22"+ widescreens that run higher native res. Look at what is mainstream, not what is max. Sure, there are many people running 24"+ monitors at 19x12 to 25x16 or whatever the highest supported res is that will hurt any graphics card.

I would think that the average enthusiast gamer has a monitor that can support anywhere from 12x9 to 16x12. Extreme gamers go for the big monitors, usually those with pocket change to spare. (Am I stereotyping? Maybe. Maybe not.)

So, some flexibility is required here for your standards. 19x12 is not for everyone or within their means. 12x9 thru 16x12 is commonplace, CRT or LCD.

For Stalker, 19x12 at max settings probably would make an 8800GTS squirm, but unfortunately, I didn't have the means to test that resolution here with me.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: keysplayr2003
Originally posted by: BFG10K
but I'll have to disagree on STALKER. At least at 1680x1050 (my monitor's native res) my 8800GTS with CPU stock at 2.13GHz (STALKER is CPU limited, I know, I've tested)
That's because 1680x1050 is a low resolution so it's not at all surprising to find you're CPU limited. Try 1920x1440 or higher and your card will squirm.

Also Stalker doesn't allow hardware AA so it relieves the card of the burden it would otherwise have.

1680x1050 is low? Since when? I can see 10x7 and even 12x10/14x9 being considered to be low. Actually, most folks with widescreens these days run 14x9, don't they? Unless they opt for larger 20-22"+ widescreens that run higher native res. Look at what is mainstream, not what is max. Sure, there are many people running 24"+ monitors at 19x12 to 25x16 or whatever the highest supported res is that will hurt any graphics card.

I would think that the average enthusiast gamer has a monitor that can support anywhere from 12x9 to 16x12. Extreme gamers go for the big monitors, usually those with pocket change to spare. (Am I stereotyping? Maybe. Maybe not.)

So, some flexibility is required here for your standards. 19x12 is not for everyone or within their means. 12x9 thru 16x12 is commonplace, CRT or LCD.

For Stalker, 19x12 at max settings probably would make an 8800GTS squirm, but unfortunately, I didn't have the means to test that resolution here with me.


You are fighting a lost cause... BFG sees it this way.

2560 X 1600 = High
1900 X 1200 = 'Middling'
1600 X 1200 = 'Low'

Of course it is ridiculous to assert that... Because, then the following would be...

1400 X 900 = lowerer
1280 X 800 = lowererer
1024 X 768 = lowerererer
800 X 600 = lowererererer
640 X 480 = lowerererererer

Yeah, sure...

 

Skott

Diamond Member
Oct 4, 2005
5,730
1
76
FarCry was revolutionary in design and performance. I see Crysis as more evolutionary in design and performance than revolutionary. Yes, it'll tax the high end cards if you want everything maxed out with high resolutions. That's common these days as mentioned. But I don't think those running 1600x1200 or less will have much trouble with an 8800GTS or GTX. That's just my gut feeling though. Those running 1900 res (24"+ LCDs) or higher are gonna need dual card setups or run with lower game settings I think. I suppose the industry standard is still technically 1024x768 but I consider 1600x1200 and 1680x1050 the new standard now. Again, this is just my own personal feeling.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
1680x1050 is low? Since when?
2560x1600 is a high resolution; 1680x1050 has less than half the pixels so it's a stretch to even call it a middling resolution.

So, some flexibility is required here for your standards. 19x12 is not for everyone or within their means. 12x9 thru 16x12 is commonplace, CRT or LCD.
What you, I, or most people run or don't run is irrelevant to the fact I outlined above.

2560x1600 is the highest resolution available in consumer space so that's our metric for comparison against. A resolution can only be classed as high depending on how many pixels it has compared to that standard.

A resolution doesn't magically become high just because a certain number of people run it.
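As a minimal sketch of the rule being argued here (the label follows purely from pixel count relative to the 2560x1600 consumer ceiling): the cut-offs below are my own illustrative assumption, since the thread never pins down exact thresholds, but they happen to reproduce the high/middling/low labels used in it.

```python
# Sketch of the "pixel count vs. the consumer ceiling" labelling rule.
# The 75% / 50% cut-offs are assumptions for illustration only.
BASELINE_PIXELS = 2560 * 1600  # highest consumer resolution at the time

def label(width: int, height: int) -> str:
    share = (width * height) / BASELINE_PIXELS
    if share >= 0.75:
        return "high"
    if share >= 0.50:
        return "middling"
    return "low"

for w, h in [(2560, 1600), (1920, 1200), (1680, 1050), (1600, 1200), (1280, 1024)]:
    print(f"{w}x{h}: {label(w, h)}")
```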

For Stalker, 19x12 at max settings probably would make an 8800GTS squirm, but unfortunately, I didn't have the means to test that resolution here with me.
Here you go. And again Stalker doesn't have to worry about hardware AA unlike other modern games.

I've benchmarked the game at 1920x1440 with full dynamic on my 8800 GTS and to get a reasonably playable score (58 FPS average) I need to run shadows and vegetation on low. Also even with that score it's going to be a slideshow in places like the Yantar swamps.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: BFG10K
1680x1050 is low? Since when?
2560x1600 is a high resolution; 1680x1050 has less than half the pixels so it's a stretch to even call it a middling resolution.

So, some flexibility is required here for your standards. 19x12 is not for everyone or within their means. 12x9 thru 16x12 is commonplace, CRT or LCD.
What you, I, or most people run or don't run is irrelevant to the fact I outlined above.

2560x1600 is the highest resolution available in consumer space so that's our metric for comparison against. A resolution can only be classed as high depending on how many pixels it has compared to that standard.

A resolution doesn't magically become high just because a certain number of people run it.

For Stalker, 19x12 at max settings probably would make an 8800GTS squirm, but unfortunately, I didn't have the means to test that resolution here with me.
Here you go. And again Stalker doesn't have to worry about hardware AA unlike other modern games.

I've benchmarked the game at 1920x1440 with full dynamic on my 8800 GTS and to get a reasonably playable score (58 FPS average) I need to run shadows and vegetation on low. Also even with that score it's going to be a slideshow in places like the Yantar swamps.

As you can see BFG, the Xbit STALKER 1600x1200 graphs for the 8800GTS640 seem incorrect to me as I look at my own benches. Xbit is also outgunning me with CPU power using an X6800. Never, not once, with every setting maxed (full dynamic lighting and AA not working as you say) did the GTS 640 get anywhere NEAR a 19fps minimum. In fact, my minimum is higher than Xbit's average fps at 1600x1200. So you tell me, who's right and who's not? I know I can trust my own benches, and you know my procedure for acquiring those benches.

As for "low-mid-high" resolutions, you say "2560x1600 is the highest resolution available in consumer space so that's our metric for comparison against."

For how many years now has the resolution of 2560x1600 been the "technical" maximum for graphics cards? Yet a couple of years ago, you wouldn't be caught dead benching at that resolution. You're just looking at the "technical" max resolution. Nobody plays there. See, the trouble is that all you see are numbers; you're not paying attention at all to what is really going on in the gaming world as far as what resolutions most gamers play at, regardless of what cards they own. Find that happy medium. Don't call all gamers "irrelevant" if they don't play at 19x12 or above. They all count.

Humanization is VERY much needed when discussing this stuff. If you were a robot, I'd understand. But you're not.

This quote from Xbit on STALKER burns me a little: "You may try to get an acceptable speed on a GeForce 8600 GTS by disabling the dynamic lighting model, yet the game will look much poorer as the consequence."

I don't need to disable a damn thing at 16x12. I'm fairly certain I would have to if I played at 19x12.

Xbit doesn't give details about their benches that we need.

How many times did they run the bench for each specific test? Did they eliminate disk thrashing and allow textures to load by cold running the bench twice? I did.

What demo did they use? buildings timedemo? Their own? Who knows, they didn't say.
At least I did not see where they said they had.
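For what it's worth, the procedure being asked about here (a discarded cold pass so disk thrashing and texture loading don't skew things, then several measured passes) is easy to write down. A rough sketch; run_timedemo is a hypothetical stand-in for however the demo is actually launched and an average FPS collected (FRAPS, console log, etc.):

```python
import random
import statistics
from typing import Callable

def benchmark(run_timedemo: Callable[[], float], runs: int = 3) -> float:
    """Discard one cold pass, then average several warm passes of the same timedemo."""
    run_timedemo()                                  # cold run: disk thrash / texture loading, result thrown away
    scores = [run_timedemo() for _ in range(runs)]  # warm runs are what gets reported
    print("warm runs:", [round(s, 1) for s in scores])
    return statistics.mean(scores)

# Stand-in launcher so the sketch executes; replace with a real demo runner.
fake_run = lambda: random.uniform(60.0, 75.0)
print("average FPS:", round(benchmark(fake_run), 1))
```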
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: keysplayr2003

Humanization is VERY much needed when discussing this stuff. If you were a robot, I'd understand. But you're not.

Agreed.

However, I do think BFG10K is a knowledge bot. That man isn't human because he never tires, has a wealth of knowledge and... Yeah, he must be artificial intelligence :D
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
IMO game benchmarks only need to be run at the following resolutions:

2560 X 1600 (ultra)
1900 X 1200 (high)
1680 X 1050 (medium)
1280 X 1024 (low)

CRTs aren't even sold anymore in retail and it seems as though most people have moved away from them, or will do so in the near future.

In any event, those are the only resolutions that mainstream LCDs are sold at, so they should be the focus. I suppose the problem is that maybe 1% of computers have the high/ultra resolutions.

 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: SickBeast
IMO game benchmarks only need to be run at the following resolutions:

2560 X 1600 (ultra)
1900 X 1200 (high)
1680 X 1050 (medium)
1280 X 1024 (low)

CRTs aren't even sold anymore in retail and it seems as though most people have moved away from them, or will do so in the near future.

In any event, those are the only resolutions that mainstream LCDs are sold at, so they should be the focus. I suppose the problem is that maybe 1% of computers have the high/ultra resolutions.

I would tend to agree with you except that you are mixing 16:10 with 5:4 resolutions.

2560 X 1600 = 16:10 (4.1 MP) 100% (Highest)
1920 X 1200 = 16:10 (2.3 MP) 56% (High)
1680 X 1050 = 16:10 (1.8 MP) 43% (Med)
1440 X 900 = 16:10 (1.3 MP) 32% (Low)
1280 X 800 = 16:10 (1.0 MP) 25% (Lowest)

These all follow the 16:10 WS format and seem to be the most correct way to benchmark. So, I would like to see reviews include the lowest, the medium and the highest; the rest we can just use a calculator to figure out the performance difference (for the most part).
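A quick way to reproduce (or sanity-check) the megapixel and percentage figures above for the standard 16:10 panels:

```python
# Megapixels and share of the 2560x1600 pixel count for common 16:10 panels.
base = 2560 * 1600

for w, h in [(2560, 1600), (1920, 1200), (1680, 1050), (1440, 900), (1280, 800)]:
    pixels = w * h
    print(f"{w} x {h}: {pixels / 1e6:.1f} MP, {100 * pixels / base:.0f}% of 2560x1600")
```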
 

Cheex

Diamond Member
Jul 18, 2006
3,123
0
0
Originally posted by: ArchAngel777
Originally posted by: keysplayr2003

Humanization is VERY much needed when discussing this stuff. If you were a robot, I'd understand. But you're not.

Agreed.

However, I do think BFG10K is a knowledge bot. That man isn't human because he never tires, has a wealth of knowledge and... Yeah, he must be artificial intelligence :D

Seconded.
 

Cheex

Diamond Member
Jul 18, 2006
3,123
0
0
Originally posted by: ArchAngel777
Originally posted by: SickBeast
IMO game benchmarks only need to be run at the following resolutions:

2560 X 1600 (ultra)
1900 X 1200 (high)
1680 X 1050 (medium)
1280 X 1024 (low)

CRTs aren't even sold anymore in retail and it seems as though most people have moved away from them, or will do so in the near future.

In any event, those are the only resolutions that mainstream LCDs are sold at, so they should be the focus. I suppose the problem is that maybe 1% of computers have the high/ultra resolutions.

I would tend to agree with you except that you are mixing 16:10 with 5:4 resolutions.

2560 X 1600 = 16:10 (4.1 MP) 100% (Highest)
1920 X 1200 = 16:10 (2.3 MP) 56% (High)
1680 X 1050 = 16:10 (1.8 MP) 43% (Med)
1440 X 900 = 16:10 (1.3 MP) 32% (Low)
1280 X 800 = 16:10 (1.0 MP) 25% (Lowest)

These all follow the 16:10 WS format and seem to be the most correct way to benchmark. So, I would like to see reviews include the lowest, the medium and the highest; the rest we can just use a calculator to figure out the performance difference (for the most part).

Agreed.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: ArchAngel777
Originally posted by: keysplayr2003

Humanization is VERY much needed when discussing this stuff. If you were a robot, I'd understand. But you're not.

Agreed.

However, I do think BFG10K is a knowledge bot. That man isn't human because he never tires, has a wealth of knowledge and... Yeah, he must be artificial intelligence :D

Yes, he is very knowledgeable.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: Genison
I remember the same speculation with Half Life 2 before it came out. They ensured that it would be possible to play it even on the lowest end of the spectrum with DirectX 8. I know Crytek has made it backwards compatible with DirectX 9, so I'm thinking it will run great on a wide range of systems. Some people will have to disable features. Heck, my brother still has a 9700 Pro and an Athlon XP 2200+ proc. He was able to play STALKER at 800x600 fine. Yeah, crappy resolution, but it worked fine and he lost no enjoyment of the gameplay.

It will be fine. They have set a standard for good programming and forethought about the audience they are trying to attract.

Duh, it's gonna be playable on low end. Just turn down all the pretty effects and it'll run on an Intel IGP.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
As you can see BFG, the Xbit STALKER 1600x1200 graphs for the 8800GTS640 seem incorrect to me as I look at my own benches. Xbit is also outgunning me with CPU power using an X6800. Never, not once, with every setting maxed (full dynamic lighting and AA not working as you say) did the GTS 640 get anywhere NEAR a 19fps minimum. In fact, my minimum is higher than Xbit's average fps at 1600x1200. So you tell me, who's right and who's not?
How do you know XBit's tests are incorrect? Do you know if they ran the same demo as you? Do you know if they used the same methodology as you?

No?

Then how can you possibly make a claim of incorrectness based on a comparison to your scores?

I ran the Cordon & Agroprom demos at 1600x1200 using everything maxed (except AA was off since it makes no difference with the dynamic path):

Cordon: min 16.39 FPS, average 69.30 FPS.
Agroprom: min 9.77 FPS, average 72.55 FPS.

So my minimums are low like theirs (low minimums don't concern me since they're usually benchmarking noise and seldom translate to actual gameplay) but my averages are higher than theirs. But again that doesn't mean much since I doubt they ran the same demos I ran. They could've run through Yantar like I was alluding to earlier, for example.
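That point about single-frame minimums mostly being benchmarking noise is easier to see from frame-time data: one hitch drags the absolute minimum down while a percentile floor barely moves. A small sketch assuming FRAPS-style per-frame times in milliseconds; the toy data is made up, not from any run in this thread:

```python
import statistics

def fps_summary(frame_times_ms):
    """Summarise per-frame times (ms) as minimum / 1st-percentile / average FPS."""
    fps = sorted(1000.0 / t for t in frame_times_ms)
    return {
        "min FPS": round(fps[0], 1),                        # single worst frame
        "1% low FPS": round(fps[int(0.01 * len(fps))], 1),  # a more robust floor
        "average FPS": round(statistics.mean(fps), 1),
    }

# Toy data: a steady ~70 FPS run with a single 100 ms hitch.
times = [14.3] * 500 + [100.0]
print(fps_summary(times))
```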

For how many years now has the resolution of 2560x1600 been the "technical" maximum for graphics cards? Yet a couple of years ago, you wouldn't be caught dead benching at that resolution. You're just looking at the "technical" max resolution. Nobody plays there.
Again who plays or doesn't play at that resolution has no bearing on the fact that pixel count is the sole metric for a high resolution. As soon as a GPU + display + game combination was available in consumer space that could do 2560x1600 that's when it became our baseline for what constitutes a high resolution. If a new combination comes out that doubles that resolution then 2560x1600 will become a middling resolution.

Don't call all gamers "irrelevant" if they don't play at 19x12 or above. They all count.
I never said gamers were irrelevant because I wasn't even talking about gamers.

What I said was how many people use a given resolution is irrelevant to its label. 1680x1050 is not a high resolution whether everyone or no one uses it. To claim it's a high resolution on the basis of how many people use it is an appeal to popularity logical fallacy.

It's not a high resolution because it has less than half the pixels of 2560x1600. 2560x1600 is our reference point for a high resolution by virtue of it having the highest available pixel count in consumer space, regardless of how many people use it.

If everyone gamed at 2560x1600 it would still be a high resolution, again by virtue of its pixel count relative to other resolutions available in consumer space. Likewise even if everyone used 1680x1050 that still wouldn't make it a high resolution.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
"How do you know XBit?s tests are incorrect? Do you know if they ran the same demo as you? Do you know if they used the same methodology as you? This is one of my points. They didn't give these sort of details. It would be great if they offered a downloadable link of the timedemo they used, a note on what they did to create it or if it was a preloaded bench. But as usual with most review sites, we are in the dark.

No? No, I don't. That is one of my complaints

Then how can you possibly make a claim of incorrectness based on a comparison to your scores?"
Because I can play STALKER at 16x12 with everything on and still NEVER see numbers as low as Xbit and they have MORE power in the CPU dept. So, that tells me something is not right. I'm talking regular gameplay as well as my timedemo.

Furthermore: Your position on resolution is fine, from a completely technical perspective. However, it does not translate very well to real world gaming and what people are using the most.

Technical high res: 25x16
Real world high: 16x12 breaking into 19x12

You're technical and only technical and want nothing to do with what really goes on out there.
I'm a little of both and can understand both sides of the story.

This was the flexibility I was talking about.


"I never said gamers were irrelevant because I wasn't even talking about gamers."
But gamers are what it's all about, my friend. Without any gamers, none of this matters, nor would it exist.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Here are my STALKER benches with 7.6 Cats:
[<< Highlighted is 2900XT / regular is 8800GTS 640MB, benched on system in rig]


STALKER v1.0003
All sliders maxed to right, FDL

VistaBuildings Demo
14x9 - 11.73 min / 86.51 av / 641.60 max
16x12 -11.40 min / 77.79 av / 598.51 max


VistaBuildings Demo
14x9 - 11.96 min / 86.75 av / 707.04 max
16x12 -12.17 min / 68.88 av / 672.50 max


WinXPBuildings Demo
14x9 - 11.58 min / 75.45 av / 171.66 max
16x12 -11.22 min / 72.70 av / 172.64 max


WinXPBuildings Demo
14x9 - 12.07 min / 87.47 av / 629.20 max
16x12 -12.01 min / 71.75 av / 467.98 max


Vista Short Demo
14x9 - 22.34 min / 82.08 av / 633.10 max
16x12 -16.06 min / 65.72 av / 658.43 max
<<Vista Short Demo
14x9 - 23.81 min / 79.34 av / 602.17 max
16x12 -15.58 min / 70.89 av / 604.63 max


WinXPShort Demo
14x9 - 26.94 min / 65.14 av / 164.24 max
16x12 -23.89 min / 63.76 av / 165.46 max


WinXPShort Demo
14x9 - 24.80 min / 82.64 av / 532.43 max
16x12 -21.91 min / 65.87 av / 503.95 max


it runs with a better minimum than in these demos ... actually playing at 16x12 there are no slowdowns [below 30FPS] while running FRAPS
 

andybird

Junior Member
Jul 19, 2007
5
0
0
The makers have said that it will run with MOST (but not all) settings on max @ 1280x1024 on a GTX, so expect to need at least that level of performance. An overclocked GTS-640 SLI setup would probably run it with max settings, and presumably 2900XT CrossFire would be around the same.

Personally I'll be waiting for the nv 9800GTX/GTS and then scraping together my pocket money (or 1 month's disposable income) and getting one of those (assuming that it is indeed out for the rumoured November release date).
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
This is one of my points. They didn't give these sorts of details. It would be great if they offered a downloadable link to the timedemo they used, a note on what they did to create it, or whether it was a preloaded bench. But as usual with most review sites, we are in the dark.
So again how can you say XBit's results are incorrect? You have absolutely no basis to make that inference. You can state they don't match your scores and are perhaps lower than what you expect in typical gameplay but you can't say they're incorrect unless you specifically know what is wrong with their testing method.

Besides, you didn't give us any details either, nor did you provide a link for your demo:

This time-demo was made by me. It is FAR better than the buildings_timedemo used previously. In other words, it simulates actual gameplay, not somebody running around staring at the sky or floating underground. These results are more like it.

The above doesn't tell us anything except that you didn't use the buildings_timedemo, which could mean anything given there are a lot of buildings in the game and anyone could've created one for download with buildings in it.

You didn't even tell us what level of the game you played it on, nor did you tell us what driver settings you used, unlike XBit which listed all of their driver settings.

Because I can play STALKER at 16x12 with everything on and still NEVER see numbers as low as Xbit's, and they have MORE power in the CPU dept.
Oh really? So what is your minimum in the Yantar swamp for example? Or how about in the depths of the Brain Scorcher dungeon when scaling the large inner chamber?

In order for you to claim you NEVER see numbers that low you must've played the entire game from start to finish on your 8800 GTS and checked the framerate in every section. So again I'll ask: what are your minimum scores in the above two situations?

Having said that, I would tend to agree with you that their scores look lower than expected. If they aren't benchmarking a demanding section of the game I would almost suspect they've recycled some of their scores from older drivers. I've noticed nVidia put serious driver optimizations into Stalker and it runs much faster than it did when I first tried it a few months ago.

Furthermore: Your position on resolution is fine, from a completely technical perspective.
Last time I checked, facts require technical information to back them up; without technical merit it's simply popularity. You can say 1680x1050 is a commonly used resolution but it doesn't magically become high just because a lot of people use it. Likewise a coffee mug doesn't become a high-end drinking device just because most people use them.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: keysplayr2003
"How do you know XBit?s tests are incorrect? Do you know if they ran the same demo as you? Do you know if they used the same methodology as you? This is one of my points. They didn't give these sort of details. It would be great if they offered a downloadable link of the timedemo they used, a note on what they did to create it or if it was a preloaded bench. But as usual with most review sites, we are in the dark.

No? No, I don't. That is one of my complaints

Then how can you possibly make a claim of incorrectness based on a comparison to your scores?"
Because I can play STALKER at 16x12 with everything on and still NEVER see numbers as low as Xbit and they have MORE power in the CPU dept. So, that tells me something is not right. I'm talking regular gameplay as well as my timedemo.

Furthermore: Your position on resolution is fine, from a completely technical perspective. However, it does not translate very well to real world gaming and what people are using the most.

Technical high res: 25x16
Real world high: 16x12 breaking into 19x12

You're technical and only technical and want nothing to do with what really goes on out there.
I'm a little of both and can understand both sides of the story.

This was the flexibility I was talking about.


"I never said gamers were irrelevant because I wasn't even talking about gamers."
But gamers are what it's all about, my friend. Without any gamers, none of this matters, nor would it exist.

What makes you think that you are better at benchmarking a game than a professional, trusted site such as X-bit labs? What makes you think that just because your results are different, it is they who are making some sort of error or fabrication, and not you? I can tell you that, while I don't actually run FRAPS, I'm pretty sure I'm getting considerably less than 36 FPS in some parts of the game... so, I guess you either made up your benchmarks or made an error. Of course, that's not a fair way to compare benchmarks - we have different setups and I'm not running the same area as you are for your benchmark. But if that's the way you're going to question X-bit labs' benchmarks, then we can question yours in the same way.

As for what is a "high resolution", there are two ways to consider it, as you say. Technically, 2560x1600 is a high resolution, meanwhile something like 1680x1050 is a medium resolution, by pure pixel count. However, in terms of what most gamers are playing at, 1680x1050 is a pretty high resolution - I believe the most popular resolution is still 1280x1024. I game at 1680x1050.... I like to look at the 1920x1200 and 2560x1600 results to see how powerful the cards are, but the bottom line for me is how a card performs at 1680x1050.
 

Cheex

Diamond Member
Jul 18, 2006
3,123
0
0
I will always trust the advice I get from ATers. I will definitely wait and see.

My target resolution however is 1680x1050 (resolution of the monitor I want to get).

I think getting more memory is a much better option for me right now.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: ArchAngel777
Originally posted by: keysplayr2003
Originally posted by: BFG10K
but I'll have to disagree on STALKER. At least at 1680x1050 (my monitor's native res) my 8800GTS with CPU stock at 2.13GHz (STALKER is CPU limited, I know, I've tested)
That's because 1680x1050 is a low resolution so it's not at all surprising to find you're CPU limited. Try 1920x1440 or higher and your card will squirm.

Also Stalker doesn't allow hardware AA so it relieves the card of the burden it would otherwise have.

1680x1050 is low? Since when? I can see 10x7 and even 12x10/14x9 being considered to be low. Actually, most folks with widescreens these days run 14x9, don't they? Unless they opt for larger 20-22"+ widescreens that run higher native res. Look at what is mainstream, not what is max. Sure, there are many people running 24"+ monitors at 19x12 to 25x16 or whatever the highest supported res is that will hurt any graphics card.

I would think that the average enthusiast gamer has a monitor that can support anywhere from 12x9 to 16x12. Extreme gamers go for the big monitors, usually those with pocket change to spare. (Am I stereotyping? Maybe. Maybe not.)

So, some flexibility is required here for your standards. 19x12 is not for everyone or within their means. 12x9 thru 16x12 is commonplace, CRT or LCD.

For Stalker, 19x12 at max settings probably would make an 8800GTS squirm, but unfortunately, I didn't have the means to test that resolution here with me.


You are fighting a lost cause... BFG sees it this way.

2560 X 1600 = High
1900 X 1200 = 'Middling'
1600 X 1200 = 'Low'

Of course it is ridiculous to assert that... Because, then the following would be...

1400 X 900 = lowerer
1280 X 800 = lowererer
1024 X 768 = lowerererer
800 X 600 = lowererererer
640 X 480 = lowerererererer

Yeah, sure...
uh oh (looking at lowerer settings on 19" monitor).

apop or keys, have you guys compared your cat 7.6 benchies to cat 7.7 to see if they've made any improvements? The 2900xt seems to get better with each driver update, but it'd be nice to see actual comparisons.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: bryanW1995
apop or keys, have you guys compared your cat 7.6 benchies to cat 7.7 to see if they've made any improvements? The 2900xt seems to get better with each driver update, but it'd be nice to see actual comparisons.
i'm sorry but i was expecting Cat 7.7 next week :eek:

and i am finally back to PLAYING games ... i have a literal "baker's dozen" of unplayed, unfinished games ... and OverLord is the MOST FUN i have had since Fable:LC ... it is exactly like Fable mixed with B&W2 ... instead of a "creature", you have 4 classes of "minions" - orcs - with their own specialties and weaknesses - that act as your "arm" of enforcement. And you are a "corrupt" or "not corrupt [read: 'good']" 'Sauron' who has to wipe out the disgusting Evil Halflings ... you get to interact with Elves, Dwarves and Men as you set about to rebuild your Dark Tower.

so ... later ... maybe tonight :p

 

videogames101

Diamond Member
Aug 24, 2005
6,783
27
91
Once and for all...

It scales from 6-series all the way to SLIed 8800ultra's and then some. Whatever your current system spec, WHATEVER your current spec (including 8800ultra's), you will need something better to max all settings out at higher resolutions.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I guess I could just run some benchmarks myself for those legions of fans lining up to game with an x1950xt at 1440x900 ;)
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: bryanW1995
uh oh (looking at lowerer settings on 19" monitor).

apop or keys, have you guys compared your cat 7.6 benchies to cat 7.7 to see if they've made any improvements? The 2900xt seems to get better with each driver update, but it'd be nice to see actual comparisons.

In my testing, the 2900XT loses anywhere from 1-5fps @ 1920x1200.

I think this was just the driver to introduce the edge-detect AA.