[UPDATED] DirectX 10 Gaming Performance Review - High End GPUs

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Updated with FiringSquad's DX10 investigation.

FS: DirectX 10 Performance Update: Is DX10 Really Worth It?

And just how does the latest high-end DX10 hardware stack up in today's DX10 content? Unfortunately as it stands right now, we aren't seeing the true potential of AMD's DX10 hardware due to immature drivers, so we can't even speculate on how the Radeon HD 2900 XT performs in comparison to the GeForce 8800 from NVIDIA. Quite simply, NVIDIA's GeForce 8800 line takes the DX10 performance crown unchallenged. And as our tests show, only one DX10 app scales with CrossFire: Lost Planet. This means that AMD's driver team not only has to tweak their DX10 driver for more performance, but CrossFire needs to be implemented as well if they're going to mount an effective challenge to NVIDIA's SLI.

Fortunately AMD knows this and they've told us that they're hard at work on addressing these issues. We should see the first fruits of this in upcoming drivers later this year, hopefully as soon as Catalyst 7.10. We also know AMD has some interesting developments in the works in terms of CrossFire: AMD has already demonstrated 3-way CrossFire on their upcoming RD790 platform, with 4-way CrossFire in the works for that chipset as well. If everything goes according to plan, we should see this debut later this year.

Of course, NVIDIA won't be resting on their laurels. Their nForce 680i chipset has boasted 3 PCI Express graphics slots for quite some time now, and the second SLI connector present on their GeForce 8800 GTX/Ultra has been sitting there untapped for nearly a year now. It has also been an awfully long time since we've seen a new GPU from NVIDIA, a company which is well known for establishing the 6-month product cycle in desktop graphics.

Link

Final Thoughts

At the outset of this review I wanted to help you decide which of the high-end graphics cards currently available were the best for playing the first wave of DX10 game titles, and I think we have easily done that. The NVIDIA GeForce 8800 GTX is our pick for anyone looking to play the likes of BioShock, Company of Heroes, Call of Juarez, Lost Planet or World in Conflict in DX10 modes. The power of the G80 GPU and the time NVIDIA's driver development team has put into Vista are evident in our benchmark and gaming experience results.

The NVIDIA GeForce 8800 GTS 640MB makes a great second option though for users with a slightly smaller budget. This card still does very well under Vista with the DX10 games, and keeping another $150 in your pocket never hurt anyone. The letdown for me here was the Radeon HD 2900 XT from AMD; the card was only able to climb off the bottom of our results in a single test (Call of Juarez), and the multi-GPU results of CrossFire were abysmal when compared to what NVIDIA's SLI was able to accomplish. AMD needs to put some work into the GPU segment if they want to maintain any kind of technological parity with NVIDIA's products, whether that requires software fixes or hardware re-spins. And there is no time like the present...

With other HUGE titles like Crysis and Unreal Tournament 3 just over the horizon, DX10 gaming is going to rocket-jump this holiday season. Let the fun begin!

I thought the review was pretty interesting: it used the highest in-game settings, DX10 and so on, plus the latest Vista drivers. Also, showing min/max/avg fps was good.

The review was 8800 GTX/SLI vs. 8800 GTS 640MB/SLI vs. 2900 XT/CrossFire.

 

alcoholbob

Diamond Member
May 24, 2005
6,371
437
126
They all seem to share some similarly bad minimum fps numbers though. I'm guessing there's some kind of bottleneck in all modern video cards that just can't be gotten around yet. Average fps of 50 or 60, with peaks of over 100, but at the same time you get lows of 5 fps.

Some games really bring out this bottleneck (like Oblivion, lol).
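To illustrate the point, here is a rough sketch (with made-up frame times, not numbers from any of these reviews) of why min/avg/max fps can look so lopsided: a handful of long frames barely move the average but completely set the minimum.

```python
# Purely illustrative numbers: how a few slow frames tank the minimum fps
# while barely denting the average. Frame times (ms) for a mostly steady
# ~60 fps run with three 200 ms hitches mixed in.
frame_times_ms = [16.7] * 600 + [200.0] * 3

fps_per_frame = [1000.0 / t for t in frame_times_ms]

total_time_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_time_s  # frames divided by wall time
min_fps = min(fps_per_frame)                  # worst single frame
max_fps = max(fps_per_frame)                  # best single frame

print(f"min {min_fps:.0f} / avg {avg_fps:.0f} / max {max_fps:.0f} fps")
# -> min 5 / avg 57 / max 60 fps: the hitches are nearly invisible in the
#    average but dominate the minimum, which matches the pattern above.
```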
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Meh... I am not really too interested in DX10 performance for these games, with the exception of BioShock. It seems the devs still have a long way to go in releasing patches to make these titles playable in DX10 mode; at least the upcoming WiC patch aims to 'improve performance in DX10'. Likewise for NV/ATI drivers. I'd like to see some more UT3-based games arrive for benching.
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Leaving all bias behind, let's focus on AMD/ATI

In Company of Heroes, the AMD 2900XT scored (16x12 4xAA,8xAF)

max - 57
avg - 27.2
min - 5

Crossfire scored
max - 53
avg - 25.8
min - 4

HUH!?!? With Crossfire it was LESS? It also does it with 1920x1200. Why? In Bioshock, using Crossfire only gained 1 fps?

Apoppin? can you confirm this? (I know you have crossfire)
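As a quick sanity check on those Company of Heroes numbers, here is a tiny back-of-the-envelope calculation using just the averages quoted above (the interpretation is mine, not something the review states):

```python
# Back-of-the-envelope scaling check using the Company of Heroes averages
# quoted above (HD 2900 XT at 16x12, 4xAA/8xAF).
single_avg = 27.2     # single card, avg fps
crossfire_avg = 25.8  # CrossFire, avg fps

scaling = crossfire_avg / single_avg
print(f"CrossFire scaling: {scaling:.2f}x ({(scaling - 1) * 100:+.0f}%)")
# -> CrossFire scaling: 0.95x (-5%); ideal dual-GPU scaling would be close
#    to 2.00x, so the second card is actually costing a little performance
#    instead of adding any.
```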
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
I didn't know CF was that bad in DX10. Looks like Nvidia's driver team gets a thumbs up.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
World in Conflict Update
Update #001

The update will be released on Monday, October 8th.

This update introduces several new features that allow players to customize their profiles, and it improves the DX10 performance on all graphic cards. The update also includes many smaller fixes, tweaks and balancing improvements.

Bug Fixes:
- Performance on ATI DX10 cards has been improved.
- Performance on DX10 has generally been improved.

I guess Nvidia working 'closely' with developers pays off on release.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Shamrock
Leaving all bias behind, let's focus on AMD/ATI

In Company of Heroes, the AMD 2900XT scored (16x12 4xAA,8xAF)

max - 57
avg - 27.2
min - 5

Crossfire scored
max - 53
avg - 25.8
min - 4

HUH!?!? With Crossfire it was LESS? It also does it with 1920x1200. Why? In Bioshock, using Crossfire only gained 1 fps?

Apoppin? can you confirm this? (I know you have crossfire)

no you don't :p

i decided *against* X-Fire for this very reason ... crap scaling in DX10 games ... i don't need another GPU to play ANY maxed-out DX9 game at 16x10 with 4xAA/16xAF
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
So far DX10 has been a disaster... I never did believe the hype that DX10 would play the same DX9-like content 6 times faster! I remember seeing some people tout that. Yeah, it might support unified shaders or whatnot, but power is still power... I don't believe DX10 has anything, performance-wise, on DX9, and I don't think it ever will. The only thing it might have is extra extensions and features, but those can be added in anyway... I dunno, I don't program DX, but to me it seems like it was one big pile of BS in an attempt to get gamers to upgrade to Vista.

I hope I am wrong about my position, but I doubt I will be... So far DX10 has been a huge turn-off; despite the fact that I can use it, I don't.
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Originally posted by: apoppin
Originally posted by: Shamrock
Leaving all bias behind, let's focus on AMD/ATI

In Company of Heroes, the AMD 2900XT scored (16x12 4xAA,8xAF)

max - 57
avg - 27.2
min - 5

Crossfire scored
max - 53
avg - 25.8
min - 4

HUH!?!? With Crossfire it was LESS? It also does it with 1920x1200. Why? In Bioshock, using Crossfire only gained 1 fps?

Apoppin? can you confirm this? (I know you have crossfire)

no you don't :p

i decided *against* X-Fire for this very reason ... crap scaling in DX10 games ... i don't need another GPU to play ANY maxed-out DX9 game at 16x10 with 4xAA/16xAF

Ahh, okie doke. I thought you were looking for a 2900 pro...my Apoppogies HAHAHA
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Originally posted by: ArchAngel777
So far DX10 has been a disaster...

I don't know about "disaster," but it's certainly a disappointment. You're right on point about it being a way to market Vista to gamers though. They knew we'd be looking at XP vs. Vista benchmarks. Why upgrade? Because of what you'll experience with DX10.

We can't solely blame Microsoft, or bad ATI & nVidia drivers. Anyone who's anyone knew DX10 overpromised. Throw any game developer who's hyped a DX10 title under that same bus.

But here we are with our $500+ graphics cards and our $250+ motherboards bitching about frame rates. We're scanning the horizon daily for new product. Will the industry make it all better? Sure they will.

I hate to sound like a conspiracy nut, but anyone who doesn't see the game being played isn't awake. Microsoft, Intel, AMD, nVidia, Asus, EVGA, and several others all have their tops in the same room, on the same side of the chess board. We consumers are on the other side. We think we're smart with our benchmarks and reviews, but they will always win because of our weakness. We are consumers...
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Shamrock
Leaving all bias behind, let's focus on AMD/ATI

In Company of Heroes, the AMD 2900XT scored (16x12 4xAA,8xAF)

max - 57
avg - 27.2
min - 5

Crossfire scored
max - 53
avg - 25.8
min - 4

HUH!?!? With Crossfire it was LESS? It also does it with 1920x1200. Why? In Bioshock, using Crossfire only gained 1 fps?

Apoppin? can you confirm this? (I know you have crossfire)

Needs DX10-specific optimization in the driver for specific games in CrossFire. I don't think they have gotten to that yet.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Sylvanas
World in Conflict Update
Update #001

The update will be released on Monday, October 8th.

This update introduces several new features that allow players to customize their profiles, and it improves the DX10 performance on all graphic cards. The update also includes many smaller fixes, tweaks and balancing improvements.

Bug Fixes:
- Performance on ATI DX10 cards has been improved.
- Performance on DX10 has generally been improved.

I guess Nvidia working 'closely' with developers pays off on release.

I've said this for a while now. It's all because ATI chose to go with a completely different architecture, one so different from past cards and from Nvidia's 8800 series that you have to do specific things to get any performance out of it, either in the game's code somewhere or in the driver.

In this case the developer took it upon themselves to rework a few things to help us out. I still believe that with more time to work on drivers and whatnot, you will see the HD 2900 cards' performance keep going up. The problem is that by the time that really happens, nobody will care, because the game that gets the big boost will not be new anymore. It's sad, and I hope that ATI learned a lesson here and goes back to making a card that is easier to take advantage of. Tessellation engine? Nobody will use that. 320 unified stream processors? We haven't seen any real benefits from that (yet).

I'd really like to see another X1900-style release that is very competitive from the get-go, which IMO is better for us consumers because you have more of a choice.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Updated with FiringSquad's DX10 performance review.

They use a LOT of cards, in single/SLI/CF, DX9/DX10, etc.