Company of Heroes DirectX 10 comparison

Woofmeister

Golden Member
Jul 18, 2004
1,385
1
76
Bit-tech's comparison here. Big surprise: DX10 gives you some subtle effects with water, smoke, and explosions at the cost of a large performance hit.

I'm not complaining mind you, COH is a good enough game in DX9 to make it worth your money. Adding a DX10 layer for free is icing on the cake.
 

ND40oz

Golden Member
Jul 31, 2004
1,264
0
86
I cranked everything up last night except AA on COH and got an average of 29fps in DX10, with a low of 8 and a high of 47, at 1920x1200 with dual 2900s in the built-in benchmark. With its default DX10 (ultra) settings I didn't drop below 30fps though, and it was definitely playable and ran smoothly.
 

superbooga

Senior member
Jun 16, 2001
333
0
0
That's just how things work, you know. At some point you get diminishing returns (in visual quality) at a huge cost in performance. It's like a 10MP vs. 5MP camera for 3x4 prints.
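To put rough numbers on that analogy (my own back-of-the-envelope figures, not from the thread): a 3x4-inch print at a typical 300 DPI only needs about 1.1 megapixels, so both a 5MP and a 10MP camera capture far more detail than the print can show.

```python
# Back-of-the-envelope: megapixels actually needed for a 3x4-inch print.
# 300 DPI is an assumed "photo quality" print density.
PRINT_W_IN, PRINT_H_IN = 3, 4
DPI = 300

needed_px = (PRINT_W_IN * DPI) * (PRINT_H_IN * DPI)  # 900 x 1200 pixels
needed_mp = needed_px / 1e6                          # ~1.08 MP

for camera_mp in (5, 10):
    surplus = camera_mp / needed_mp
    print(f"{camera_mp}MP camera: ~{surplus:.1f}x more pixels than the print can show")
```

Past that point, extra pixels are pure overkill for the output size, which is the diminishing-returns shape of the DX10 eye-candy tradeoff.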
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: ND40oz
I cranked everything up last night except AA on COH and got an average of 29fps in DX10, with a low of 8 and a high of 47, at 1920x1200 with dual 2900s in the built-in benchmark. With its default DX10 (ultra) settings I didn't drop below 30fps though, and it was definitely playable and ran smoothly.

My GTX was getting 15 FPS minimum, 30 average, and 57 high. Pretty strange how performance comparisons between ATI and NVIDIA in this game are reversed between DX9 and DX10. :confused:
 

ND40oz

Golden Member
Jul 31, 2004
1,264
0
86
Originally posted by: Nightmare225
Originally posted by: ND40oz
I cranked everything up last night except AA on COH and got an average of 29fps in DX10, with a low of 8 and a high of 47, at 1920x1200 with dual 2900s in the built-in benchmark. With its default DX10 (ultra) settings I didn't drop below 30fps though, and it was definitely playable and ran smoothly.

My GTX was getting 15 FPS minimum, 30 average, and 57 high. Pretty strange how performance comparisons between ATI and NVIDIA in this game are reversed between DX9 and DX10. :confused:

Were you benching at 1920x1200 or at the 2005WFP's 1680x1050?
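The distinction matters for a fill-rate-bound comparison; a quick calculation (my own, not from the posts) shows how different the two workloads are:

```python
# Pixel-count difference between the two resolutions in question.
res_a = 1920 * 1200  # 2,304,000 pixels
res_b = 1680 * 1050  # 1,764,000 pixels

print(f"1920x1200 pushes {res_a / res_b:.2f}x the pixels of 1680x1050")
```

About 31% more pixels per frame at 1920x1200, so FPS numbers from the two resolutions aren't directly comparable.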
 

Woofmeister

Golden Member
Jul 18, 2004
1,385
1
76
Originally posted by: superbooga
That's just how things work, you know. At some point you get diminishing returns (in visual quality) at a huge cost in performance. It's like a 10MP vs. 5MP camera for 3x4 prints.
With all due respect, I disagree. The drop in performance is chiefly the result of several factors:

(1) The fact that COH is a native DX9 game with a DX10 overlay versus native DX10
(2) The newness of DX10 itself
(3) The problems with Vista drivers that both Nvidia and ATI are still experiencing
(4) The fact that Vista itself exacts a performance hit versus XP.

The last three factors are going to be addressed sooner or later, and then it's pretty likely that those of us with the current crop of DX10 cards will be able to run native DX10 games at playable frame rates. At least, I hope so.
 

superbooga

Senior member
Jun 16, 2001
333
0
0
Originally posted by: Woofmeister
Originally posted by: superbooga
That's just how things work, you know. At some point you get diminishing returns (in visual quality) at a huge cost in performance. It's like a 10MP vs. 5MP camera for 3x4 prints.
With all due respect, I disagree. The drop in performance is chiefly the result of several factors:

(1) The fact that COH is a native DX9 game with a DX10 overlay versus native DX10
(2) The newness of DX10 itself
(3) The problems with Vista drivers that both Nvidia and ATI are still experiencing
(4) The fact that Vista itself exacts a performance hit versus XP.

The last three factors are going to be addressed sooner or later, and then it's pretty likely that those of us with the current crop of DX10 cards will be able to run native DX10 games at playable frame rates. At least, I hope so.

Those are fair points. However, the developers (Relic) had a blog post or something where they talked about DX10 vs. DX9. They mentioned that the shaders in DX10 had to work something like six times as hard for certain effects. Also... PER-PIXEL LIGHTING!
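A rough way to see why per-pixel lighting costs so much (illustrative numbers of my own, not Relic's): per-vertex lighting evaluates the lighting equation once per vertex and interpolates the result across each triangle, while per-pixel lighting evaluates it for every shaded fragment.

```python
# Illustrative cost comparison: per-vertex vs. per-pixel lighting.
# All numbers are assumptions for illustration; real scenes vary widely.
vertices = 100_000          # visible vertices in a scene (assumed)
width, height = 1920, 1200  # resolution mentioned in the posts above
overdraw = 2.0              # average times each pixel gets shaded (assumed)

per_vertex_evals = vertices
per_pixel_evals = int(width * height * overdraw)

ratio = per_pixel_evals / per_vertex_evals
print(f"per-vertex lighting: {per_vertex_evals:,} evaluations")
print(f"per-pixel lighting:  {per_pixel_evals:,} evaluations (~{ratio:.0f}x more)")
```

With made-up but plausible inputs like these, the lighting equation runs tens of times more often per frame, which is the kind of shader-workload jump the Relic comments were describing.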
 

cm123

Senior member
Jul 3, 2003
489
2
76
I personally think that when all the smoke clears and DX10 games really get going, our 8800s and 2900s are not going to cut it at all if you want the eye candy turned on...

...sure, they support DX10, just not with everything turned up. A few of us at work have gotten to play a bit with a couple of DX10 games now on good systems (4GB, 3 Raptors, an 8800GTX in one and a 2900XT 1GB in the other, X6800 CPU, X-Fi, Killer NICs), and none of the games was playable cranked up. You'd think systems that Vista gives top performance ratings across the board could handle DX10 games cranked...

...just my two cents. It's been years now that we've all had Vista (beta, etc.), and so have ATI/AMD, Nvidia, Creative, and the rest... if it's drivers, why the problems (OK, AMD gets a tiny break, new product)?

With Nvidia having had an easy time of things the last year or more, it shows... or is showing some now.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: cm123
I personally think that when all the smoke clears and DX10 games really get going, our 8800s and 2900s are not going to cut it at all if you want the eye candy turned on...

...sure, they support DX10, just not with everything turned up. A few of us at work have gotten to play a bit with a couple of DX10 games now on good systems (4GB, 3 Raptors, an 8800GTX in one and a 2900XT 1GB in the other, X6800 CPU, X-Fi, Killer NICs), and none of the games was playable cranked up. You'd think systems that Vista gives top performance ratings across the board could handle DX10 games cranked...

...just my two cents. It's been years now that we've all had Vista (beta, etc.), and so have ATI/AMD, Nvidia, Creative, and the rest... if it's drivers, why the problems (OK, AMD gets a tiny break, new product)?

With Nvidia having had an easy time of things the last year or more, it shows... or is showing some now.

It's only in the last few months that nVidia/ATI have even begun to optimize for DX10. I think nVidia's Vista drivers got DX10 capability in February, and there weren't any real DX10 applications until mid-May with Lost Planet. Now we have 3 DX10 applications, and only now is there a real reason for nVidia or ATI to care about DX10. The 158.45s showed big increases in DX10 performance, and I'm sure that future nVidia releases / ATI Catalyst 7.6 will show some increases as well. I think that drivers are holding back BOTH ATI's HD 2900XT and all of nVidia's 8800 cards.
 

pcslookout

Lifer
Mar 18, 2007
11,959
156
106
I think we will need the next generation of Nvidia cards with 1GB of video RAM to be able to run most DX10 games with all details on high, 4x FSAA, and 16x AF at a pretty high resolution. Mainly because of LCDs: if we could run at a lower resolution on our LCD monitors with the same image quality as the native resolution, we wouldn't have as much of a problem.