Age of Conan Gameplay Performance and IQ @ Hardocp


AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: njdevilsfan87
I don't know about you but the 8800GT does not "stomp" the 9600GT. The highest playable on the 8800GT is 34.9FPS. Use those same exact settings on the 9600GT, and you get just under 30FPS. So the 8800GT performs about 20% better than the 9600GT - as expected.

Also, my guess is if you compared the "highest playable" 9600GT settings and used the same on the 8800GT, the comparison would show the 8800GT being about 20% better... again as expected.

At 30fps this game is not playable according to HardOCP, which is why they lowered settings and resolution to deem it playable. I'm just wondering what kind of frame rates the 9600GT would spit out @ 1680x1050 at the same settings the 8800GT ran @ 1920x1200, once you start turning effects off just to get the same frame rate as the 8800GT.

20% might not sound like much, but if that's the difference between playable and unplayable, it very much is a stomp. I'm sorry.
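For what it's worth, the percentage arithmetic being argued over here is easy to sanity-check. A minimal sketch, using only the 34.9 and 29.4 FPS figures quoted from the review in this thread (the card names and numbers are just those, nothing else is assumed):

```python
# Rough sketch of the relative-performance arithmetic in this thread.
# FPS figures are the ones quoted from HardOCP's apples-to-apples table.
fps = {"8800GT": 34.9, "9600GT": 29.4}

def percent_faster(a, b):
    """How much faster card `a` is than card `b`, in percent."""
    return (fps[a] / fps[b] - 1) * 100

delta = percent_faster("8800GT", "9600GT")
print(f"8800GT is {delta:.1f}% faster than 9600GT")  # roughly 19%
```

So "about 20% better, as expected" holds up on these two numbers; whether 19% is a "stomp" is the subjective part.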
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: Azn

At 30fps this game is not playable according to HardOCP, which is why they lowered settings and resolution to deem it playable. I'm just wondering what kind of frame rates the 9600GT would spit out @ 1680x1050 at the same settings the 8800GT ran @ 1920x1200, once you start turning effects off just to get the same frame rate as the 8800GT.

20% might not sound like much, but if that's the difference between playable and unplayable, it very much is a stomp. I'm sorry.

You do know you are taking the most subjective parts of this review and using them to support your claim that the 9600 is an inferior card, right?

Which options enable the best gameplay is subjective, and what counts as a playable frame rate is definitely subjective. I could say that I consider 29 FPS playable and then claim, from the apples-to-apples comparison, that the 9600GT definitely "stomps" the 3870 because it's about 25 percent faster. In fact, it out-"stomps" the 3870 by 5% more than it is itself stomped by the 8800GT.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: golem
Originally posted by: Azn

At 30fps this game is not playable according to HardOCP, which is why they lowered settings and resolution to deem it playable. I'm just wondering what kind of frame rates the 9600GT would spit out @ 1680x1050 at the same settings the 8800GT ran @ 1920x1200, once you start turning effects off just to get the same frame rate as the 8800GT.

20% might not sound like much, but if that's the difference between playable and unplayable, it very much is a stomp. I'm sorry.

You do know you are taking the most subjective parts of this review and using them to support your claim that the 9600 is an inferior card, right?

Which options enable the best gameplay is subjective, and what counts as a playable frame rate is definitely subjective. I could say that I consider 29 FPS playable and then claim, from the apples-to-apples comparison, that the 9600GT definitely "stomps" the 3870 because it's about 25 percent faster. In fact, it out-"stomps" the 3870 by 5% more than it is itself stomped by the 8800GT.

HardOCP's conclusion is this. Let me quote it for you if you didn't read it the first time.

This time, the ATI Radeon HD 3870 outperformed the GeForce 9600 GT. Not only was it able to run at a higher resolution, it was able to do so with higher gameplay settings. At 1920x1200 and 16X AF, we had to turn the overall draw distance down to 2800m, and the high quality draw distance down to 75%. But we were able to enable parallax mapping, giving us better lighting effects on flat surfaces than the GeForce 9600 GT was able to do.

Subjective, sure, but most people's assessment of what's playable and what isn't is about the same. Of course there are some freaks out there who say chugging frame rates in the 20s are playable. You must be counting on the freaks to make your point.
 

titan131

Senior member
May 4, 2008
260
0
0
It's pretty much impossible to determine a clear-cut winner from these benchmarks...

From the highest-playable-settings benchmarks the 3870 looks to be the better card, but I think(?) that chizow is saying that the shadow quality setting being set to 'everything' on the 9600GT has a big negative impact on performance; the 3870 only has this setting set to 'characters'.

I personally have no idea how much difference that one setting would make to the performance of a card, but it would have to be pretty colossal if chizow's argument is correct, because the 3870 is at a higher resolution, higher shadow resolution, higher 3D ambient occlusion, and a larger viewing distance, has parallax mapping enabled, and both have no AA and 16x AF (where did 8x AA for the 9600GT come from, Azn?). In fact, the only setting the 9600GT has higher than the 3870 is shadow quality, which is set to 'everything'.

As golem said, the apples-to-apples comparison shows the 9600GT stomping all over the 3870. If all settings are set to low or disabled in this benchmark, then it could be that the 9600GT is faster only when the effects are on low or disabled, while the 3870, with its extra shader power, is able to handle them better (this, I think, is Azn's argument). Or it could be that, now that the most crippling setting (according to chizow) has been disabled or set to a lower setting, the 9600GT is being allowed to realise its true potential.

So, I dunno, I'm a little more swayed by Azn's argument, but without more benchmarks, who the hell really knows :p
 

golem

Senior member
Oct 6, 2000
838
3
76
Again, you're willing to let HardOCP's decision on what the best combination of options is be the final word (as long as it supports the point you are trying to make). What if I like 'everything' shadows at the expense of resolution? By that criterion, every card but the 3870 is acceptable, which is obviously wrong.

Exaggerating to try and prove your point again? The only real apples-to-apples comparison in the whole review shows the 9600GT faster than the 3870, but you disregard this fact (cuz it conflicts with your point again) by saying HardOCP considers it unplayable. It's 0.6 frames below 30, which is generally considered playable by most people. But if someone were to say that's okay, then you'd call them a freak? And even then, the settings for the apples-to-apples comparison are based on what they consider playable for the 3870X2. The whole review is just based on the reviewer's opinions, without other data sets from which to draw your own conclusions.

Would you even have brought up this review if half of it (the most subjective half, by the way) didn't agree with your predetermined conclusion?
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: titan131

As golem said, the apples-to-apples comparison shows the 9600GT stomping all over the 3870. If all settings are set to low or disabled in this benchmark, then it could be that the 9600GT is faster only when the effects are on low or disabled, while the 3870, with its extra shader power, is able to handle them better (this, I think, is Azn's argument). Or it could be that, now that the most crippling setting (according to chizow) has been disabled or set to a lower setting, the 9600GT is being allowed to realise its true potential.

So, I dunno, I'm a little more swayed by Azn's argument, but without more benchmarks, who the hell really knows :p

Take a look at the numbers the 3870X2 scores in its individual test and the numbers it scores in the apples-to-apples. They're the same. So unless HardOCP changed the settings between cards in the apples-to-apples comparison too (which would be stupid, since then how would it be apples to apples?), those scores are not with all settings lowered, but with them pretty much maxed out.

So the 9600GT beats the 3870 at the 3870X2's best playable settings. Azn is disregarding this because 1) it conflicts with his conclusion, and 2) 29.4 FPS is unplayable (cuz HardOCP says so) and should be ignored, unless you're a freak and consider 29.4 playable.

Actually, if you look at the apples-to-apples and the individual scores, the 3870 is the only card for which they lower the shadows from 'everything' to 'characters', and when they do, its scores increase drastically. Though the other thing that might have caused this is the water setting. We don't know, because HardOCP doesn't tell us.

If they had also tested high settings at a lower resolution, that would be more data from which to draw a conclusion, but they didn't.

In the future, the 3870 may be better than the 9600GT, but this review really doesn't prove it.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
This is one of the many reasons why I consider HOCP reviews garbage. Different settings affect performance in different ways, and if the reviewer arbitrarily decides to adjust settings for each video card, it doesn't give me any useful information. Games should be tested with maximum load on the video card, as long as the performance is playable. If not, then the settings should be adjusted the same for every card, so it's still a controlled experiment.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: munky
This is one of the many reasons why I consider HOCP reviews garbage. Different settings affect performance in different ways, and if the reviewer arbitrarily decides to adjust settings for each video card, it doesn't give me any useful information. Games should be tested with maximum load on the video card, as long as the performance is playable. If not, then the settings should be adjusted the same for every card, so it's still a controlled experiment.

Yep... Been said so many times. HardOCP is a garbage site, it even looks like garbage. They should get a new web designer.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: golem
Again, you're willing to let HardOCP's decision on what the best combination of options is be the final word (as long as it supports the point you are trying to make). What if I like 'everything' shadows at the expense of resolution? By that criterion, every card but the 3870 is acceptable, which is obviously wrong.

Exaggerating to try and prove your point again? The only real apples-to-apples comparison in the whole review shows the 9600GT faster than the 3870, but you disregard this fact (cuz it conflicts with your point again) by saying HardOCP considers it unplayable. It's 0.6 frames below 30, which is generally considered playable by most people. But if someone were to say that's okay, then you'd call them a freak? And even then, the settings for the apples-to-apples comparison are based on what they consider playable for the 3870X2. The whole review is just based on the reviewer's opinions, without other data sets from which to draw your own conclusions.

Would you even have brought up this review if half of it (the most subjective half, by the way) didn't agree with your predetermined conclusion?

I didn't call anyone a freak, first of all. I'm just saying that the fact that some people think 20fps, or even lower, is playable doesn't mean it's playable by most people's assessment.

I brought up this review because I read HardOCP on occasion, along with many other sites. It doesn't change the fact that shader counts have been rising steadily over the last year, or why the 9600GT isn't a good idea over the 8800GT when the 8800GT is only $10 more with 75% more shader power. And there are other alternatives that are cheaper and just as good.



 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: titan131
It's pretty much impossible to determine a clear-cut winner from these benchmarks...

From the highest-playable-settings benchmarks the 3870 looks to be the better card, but I think(?) that chizow is saying that the shadow quality setting being set to 'everything' on the 9600GT has a big negative impact on performance; the 3870 only has this setting set to 'characters'.

I personally have no idea how much difference that one setting would make to the performance of a card, but it would have to be pretty colossal if chizow's argument is correct, because the 3870 is at a higher resolution, higher shadow resolution, higher 3D ambient occlusion, and a larger viewing distance, has parallax mapping enabled, and both have no AA and 16x AF (where did 8x AA for the 9600GT come from, Azn?). In fact, the only setting the 9600GT has higher than the 3870 is shadow quality, which is set to 'everything'.

As golem said, the apples-to-apples comparison shows the 9600GT stomping all over the 3870. If all settings are set to low or disabled in this benchmark, then it could be that the 9600GT is faster only when the effects are on low or disabled, while the 3870, with its extra shader power, is able to handle them better (this, I think, is Azn's argument). Or it could be that, now that the most crippling setting (according to chizow) has been disabled or set to a lower setting, the 9600GT is being allowed to realise its true potential.

So, I dunno, I'm a little more swayed by Azn's argument, but without more benchmarks, who the hell really knows :p

Shadows basically double the amount of textures the card needs to draw the screen. At lower resolutions the 3870 could potentially have enough fillrate to stay playable at high quality, while it takes a nosedive at 1920x1200. I understand what chizow is saying, but unplayable is unplayable. I really doubt people buy a 9600GT or a 3870 to play at 1920x1200 anyway.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: munky
This is one of the many reasons why I consider HOCP reviews garbage. Different settings affect performance in different ways, and if the reviewer arbitrarily decides to adjust settings for each video card, it doesn't give me any useful information. Games should be tested with maximum load on the video card, as long as the performance is playable. If not, then the settings should be adjusted the same for every card, so it's still a controlled experiment.

I understand some cards have strengths and weaknesses, but in the end you are shooting for the highest playable settings, or smooth gameplay with max graphics settings, and an apples-to-apples benchmark can never give us that perspective. I don't totally agree with their way of benchmarking, but I do have an idea of what they are aiming for.
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: Azn

I didn't call anyone a freak, first of all. I'm just saying that the fact that some people think 20fps, or even lower, is playable doesn't mean it's playable by most people's assessment.

I brought up this review because I read HardOCP on occasion, along with many other sites. It doesn't change the fact that shader counts have been rising steadily over the last year, or why the 9600GT isn't a good idea over the 8800GT when the 8800GT is only $10 more with 75% more shader power. And there are other alternatives that are cheaper and just as good.

Then I apologize if I made this too personal. But really, if you look at the numbers, they can be interpreted in many ways. I could say they prove the opposite of what you are saying by just focusing on certain areas. If you use the apples-to-apples numbers as a baseline, the 3870's frame rates increase drastically when you lower the shadow and water effects settings. If those are the most shader-intensive areas, then it could be concluded that the 3870 has really weak shaders, since resolution was kept constant. I don't think this is the correct conclusion, but you could argue it if you ignore other data.

But yeah, getting a 9600GT instead of an 8800GT for a $10 difference would not be wise. I also don't disagree that as game shaders get more complex or numerous, the 3870 might start pulling ahead of the 9600GT, but it might not. This review is really too open to interpretation to say.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Well, chizow has an argumentative problem with me. You could have guessed that by now. We've been like water and fire ever since we had a long thread about why I think the G92 is bandwidth-deprived and he thinks it isn't.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I thought the most important feature of the 9600GT is that it starts with a 9... which is bigger than the 8 on the 8800GT.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
This is where 8800gt stomps the 9600gt all over the place. 3870 seems to be stomping the 9600gt as well.
Eh? With Apples vs Apples settings it goes 8800 GT > 9600 GT > 3870.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I don't understand where they are getting their numbers from anyway...

The minimum frames for both the 9600GT and the 3870 in the tables don't line up with the included graph.

http://enthusiast.hardocp.com/...NlTFlDVEpfNF8zX2wuZ2lm

The min fps for the HD 3870 dips below the reported min of 14 fps a few times, and the 9600GT dips way below 15 fps close to the left side of the graph. HardOCP indicates that the only card out of the three that is playable is the 8800GT, but interestingly they fail to actually recognize why. You would think they would notice that the 9600GT and HD 3870 dip to between 0 and 5 fps on occasion, and reflect that in the min/avg/max table properly.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: nitromullet
I don't understand where they are getting their numbers from anyway...

The minimum frames for both the 9600GT and the 3870 in the tables don't line up with the included graph.

http://enthusiast.hardocp.com/...NlTFlDVEpfNF8zX2wuZ2lm

The min fps for the HD 3870 dips below the reported min of 14 fps a few times, and the 9600GT dips way below 15 fps close to the left side of the graph. HardOCP indicates that the only card out of the three that is playable is the 8800GT, but interestingly they fail to actually recognize why. You would think they would notice that the 9600GT and HD 3870 dip to between 0 and 5 fps on occasion, and reflect that in the min/avg/max table properly.

The sharp dips to <5 fps aren't really significant; the game was probably loading something from the hard drive. If those dips had lasted longer than a few milliseconds, they would have been recorded as the minimum framerate.
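Neither Fraps nor HardOCP documents exactly how the reported minimum is derived, so the following is only a hypothetical sketch of the kind of filtering schneiderguy is describing: a dip counts toward the minimum only if it persists across consecutive samples, so a one-sample hitch is ignored.

```python
def sustained_min_fps(samples, window=2):
    """Minimum over a sliding-window maximum: a dip only registers if it
    persists for a full `window` of consecutive samples, so a one-sample
    hitch (e.g. a hard-drive load) is filtered out."""
    maxes = [max(samples[i:i + window])
             for i in range(len(samples) - window + 1)]
    return min(maxes)

# Hypothetical one-second, Fraps-style FPS log with a single loading hitch:
log = [31, 29, 30, 4, 28, 30, 33]
print(min(log))                # raw minimum: 4
print(sustained_min_fps(log))  # sustained minimum: 28
```

This is one way a table could report 14 fps as the minimum while the graph still shows momentary drops toward zero.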


 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
This is where 8800gt stomps the 9600gt all over the place. 3870 seems to be stomping the 9600gt as well.
Eh? With Apples vs Apples settings it goes 8800 GT > 9600 GT > 3870.

Eh well at unplayable 1920x1200 settings according to H. If you read the whole thread I think I mention where that performance negligence might happen with taxation of shadows in this game. At lower resolutions that taxes of texture fillrate becomes much lower which H doesn't give the performance result of.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: schneiderguy
Originally posted by: nitromullet
I don't understand where they are getting their numbers from anyway...

The minimum frames for both the 9600GT and the 3870 in the tables don't line up with the included graph.

http://enthusiast.hardocp.com/...NlTFlDVEpfNF8zX2wuZ2lm

The min fps for the HD 3870 dips below the reported min of 14 fps a few times, and the 9600GT dips way below 15 fps close to the left side of the graph. HardOCP indicates that the only card out of the three that is playable is the 8800GT, but interestingly they fail to actually recognize why. You would think they would notice that the 9600GT and HD 3870 dip to between 0 and 5 fps on occasion, and reflect that in the min/avg/max table properly.

The sharp dips to <5 fps aren't really significant; the game was probably loading something from the hard drive. If those dips had lasted longer than a few milliseconds, they would have been recorded as the minimum framerate.

Exactly, and spot on.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Azn
Originally posted by: schneiderguy
Originally posted by: nitromullet
I don't understand where they are getting their numbers from anyway...

The minimum frames for both the 9600GT and the 3870 in the tables don't line up with the included graph.

http://enthusiast.hardocp.com/...NlTFlDVEpfNF8zX2wuZ2lm

The min fps for the HD 3870 dips below the reported min of 14 fps a few times, and the 9600GT dips way below 15 fps close to the left side of the graph. HardOCP indicates that the only card out of the three that is playable is the 8800GT, but interestingly they fail to actually recognize why. You would think they would notice that the 9600GT and HD 3870 dip to between 0 and 5 fps on occasion, and reflect that in the min/avg/max table properly.

The sharp dips to <5 fps aren't really significant; the game was probably loading something from the hard drive. If those dips had lasted longer than a few milliseconds, they would have been recorded as the minimum framerate.

Exactly, and spot on.

Ok... few questions for you smart guy...

1) Probably loading something from the HD, eh? Why is it that the faster 8800GT magically doesn't suffer from these "load times"?

2) Notice that the peak at 49fps for the Radeon doesn't cover any more time than the dips below 10fps close to the 141 and 281 second marks. Why is the max indicated as 49fps and the minimum as 14, when the Radeon dips below 14fps just as often, and for just as long, as it sustains 49fps?

3) Why would anyone bother to graph something in increments that were too small to be meaningful? Doesn't that defeat the purpose of the graph?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Eh well at unplayable 1920x1200 settings according to H.
It doesn't matter if it's playable or not; what matters is that the settings are the same. If the settings are different then you can't really compare cards directly.

When the settings are the same the cards rank 8800 GT > 9600 GT > 3870.

If you read the whole thread I think I mention where that performance negligence might happen with taxation of shadows in this game. At lower resolutions that taxes of texture fillrate becomes much lower which H doesn't give the performance result of.
I can't even parse this properly, let alone infer it's somehow relevant to what you quoted.

The fact is you can't make a performance inference about the cards if they aren't running at the same settings. You certainly can't infer the 3870 is faster, especially not when apples vs apples confirms it isn't.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: nitromullet
Originally posted by: Azn
Originally posted by: schneiderguy
Originally posted by: nitromullet
I don't understand where they are getting their numbers from anyway...

The minimum frames for both the 9600GT and the 3870 in the tables don't line up with the included graph.

http://enthusiast.hardocp.com/...NlTFlDVEpfNF8zX2wuZ2lm

The min fps for the HD 3870 dips below the reported min of 14 fps a few times, and the 9600GT dips way below 15 fps close to the left side of the graph. HardOCP indicates that the only card out of the three that is playable is the 8800GT, but interestingly they fail to actually recognize why. You would think they would notice that the 9600GT and HD 3870 dip to between 0 and 5 fps on occasion, and reflect that in the min/avg/max table properly.

The sharp dips to <5 fps aren't really significant; the game was probably loading something from the hard drive. If those dips had lasted longer than a few milliseconds, they would have been recorded as the minimum framerate.

Exactly, and spot on.

Ok... few questions for you smart guy...

1) Probably loading something from the HD, eh? Why is it that the faster 8800GT magically doesn't suffer from these "load times"?

2) Notice that the peak at 49fps for the Radeon doesn't cover any more time than the dips below 10fps close to the 141 and 281 second marks. Why is the max indicated as 49fps and the minimum as 14, when the Radeon dips below 14fps just as often, and for just as long, as it sustains 49fps?

3) Why would anyone bother to graph something in increments that were too small to be meaningful? Doesn't that defeat the purpose of the graph?

Why thanks mullet

1. The test systems were using Vista 64. If you've used Vista, you know the hard drive is always doing something. If they didn't do multiple runs, the frame rates could spike. With 2 gigs on Vista 64, CCC uses more RAM than Nvidia's drivers, and those sudden drops could be witnessed when RAM runs out playing an MMO like Conan.

2. Well, it's freaking Fraps. It also has overhead.

3. You should ask H
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
Eh well at unplayable 1920x1200 settings according to H.
It doesn't matter if it's playable or not; what matters is that the settings are the same. If the settings are different then you can't really compare cards directly.

When the settings are the same the cards rank 8800 GT > 9600 GT > 3870.

If you read the whole thread I think I mention where that performance negligence might happen with taxation of shadows in this game. At lower resolutions that taxes of texture fillrate becomes much lower which H doesn't give the performance result of.
I can't even parse this properly, let alone infer it's somehow relevant to what you quoted.

The fact is you can't make a performance inference about the cards if they aren't running at the same settings. You certainly can't infer the 3870 is faster, especially not when apples vs apples confirms it isn't.

Are you going to cry about it now because HardOCP said the 3870 is faster than the 9600GT for this game too? :laugh: