Sad Crossfire Performance

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
I'm using Cat 6.5 and did a fresh install, enabled crossfire. Ran 3DMark06, got happy. But it was all downhill from there.

3dmark06 I got 9,637

All my games are up to date too, btw. Latest BIOS for the A8R32, 2GB OCZ RAM @ 250, Opteron 170 @ 2.75GHz, X1900XTX + X1900 CF @ 650/1550, X-Fi Fatal1ty. Everything is well cooled too...

Was I expecting too much?

I know it's tempting to think that I'm a noob and don't know what the hell I'm doing... but seriously, I've tried many things and still get crap performance with CrossFire.

I know to use high AA and AF. I always use 16xHQAF.

Like I said, the real problem is the minimum framerates, which still suck. On a 4xAA 16xAF bench in FEAR I got:

Resolution 1680x1050

Single x1900xtx:
21 - minimum
36 - average
60 - max
---------
7% - <25fps
62% - 25-40 fps
31% - >40 fps

With Crossfire:
22 - minimum
41 - average
61 - max
--------
3% - <25fps
46% - 25-40 fps
51% - >40 fps
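For anyone not familiar with FEAR's built-in benchmark output, the min/avg/max figures and the three percentage buckets above can be reproduced from a raw per-second FPS trace. A minimal sketch (the exact boundary handling at 25 and 40 fps is an assumption; FEAR doesn't document it):

```python
def frame_stats(fps_samples):
    """Summarize a per-second FPS trace the way FEAR's benchmark does:
    min/avg/max plus the share of samples below 25 fps, between 25 and
    40 fps, and above 40 fps."""
    n = len(fps_samples)
    below = sum(1 for f in fps_samples if f < 25)
    mid = sum(1 for f in fps_samples if 25 <= f <= 40)
    above = sum(1 for f in fps_samples if f > 40)
    return {
        "min": min(fps_samples),
        "avg": sum(fps_samples) / n,
        "max": max(fps_samples),
        "<25fps": 100 * below / n,
        "25-40fps": 100 * mid / n,
        ">40fps": 100 * above / n,
    }
```

The <25 fps bucket is the one that matters for playability, which is why the minimum barely moving while the average climbs is such a letdown.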

Same for GRAW and Oblivion, where I notice absolutely NO difference. The FPS difference isn't substantial enough to be noticeable or worth it. I'm still getting a crappy 17-20 fps in heavily grassed areas in Oblivion. BUT at least I'm getting 200 fps indoors -_-

My biggest complaint is the minimum FPS still sucks. My average went up like 10 fps and the highs went up like 200 fps... but they were already @ >60 fps.
 

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
What resolution are you running at?

Look at the AnandTech review of the 7900 series where they compare CrossFire to it. Are your numbers in line with those?
 

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
Then your numbers are off I think...

click

Even at 1600x1200 4xAA the average is 53/54 fps for CrossFire and a single X1900XT got 41 fps. I don't know what AF they used though. Do you have vsync and maybe adaptive AA on?

Look at this review too.
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Well, it turns out I did have vsync on. I've not tried FEAR again, but renaming Oblivion.exe and GRAW.exe to AFR-FriendlyD3D.exe doubled my FPS O_O

The thing is, man, how come I had to figure that out instead of ATI doing it in the drivers? -_-

Well, I'm relieved now, but there's still an issue in Oblivion where the shadows flicker when it's renamed to AFR-FriendlyD3D.exe, though the average FPS was 56 O_O
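For anyone wanting to try the same trick, here's a small sketch of the copy step. The install path is hypothetical; copying instead of renaming keeps the original exe and your shortcuts intact:

```python
import os
import shutil

def make_afr_friendly(game_dir, exe_name):
    """Copy a game's executable to AFR-FriendlyD3D.exe in the same
    folder, so the Catalyst driver applies its generic alternate-frame-
    rendering CrossFire profile when you launch the copy."""
    src = os.path.join(game_dir, exe_name)
    dst = os.path.join(game_dir, "AFR-FriendlyD3D.exe")
    shutil.copy2(src, dst)  # copy, don't rename: original exe still works
    return dst

# Hypothetical install path -- adjust to your own:
# make_afr_friendly(r"C:\Games\Oblivion", "Oblivion.exe")
```

Only one game per folder can carry the name at a time, so swap the copy out when you switch games.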
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: hardwareking
I suggest you turn vsync off, 'cause it can sometimes have a negative impact on frame rate.

This is true, but it also tends to add quite a bit to visual quality. I for one dislike tearing quite a bit, and I enable v-sync whenever I can. FEAR has a number of areas where there is a single flickering light source in a room, and without v-sync enabled the top half of the screen will often have the light in a different state than the bottom half. When this happens, it looks terrible and kills the effect that the developer was going for.

Originally posted by: gersson
Well, it turns out I did have vsync on. I've not tried FEAR again, but renaming Oblivion.exe and GRAW.exe to AFR-FriendlyD3D.exe doubled my FPS O_O

The thing is, man, how come I had to figure that out instead of ATI doing it in the drivers? -_-

Well, I'm relieved now, but there's still an issue in Oblivion where the shadows flicker when it's renamed to AFR-FriendlyD3D.exe, though the average FPS was 56 O_O

I thought they fixed the CrossFire Oblivion issues in the 6.5's... weird.
 

JohnAn2112

Diamond Member
May 8, 2003
4,895
1
81
Originally posted by: gersson
I'm using Cat 6.5 and did a fresh install, enabled crossfire. Ran 3DMark06, got happy. But it was all downhill from there.

3dmark06 I got 9,500+

All my games are up to date too, btw. Latest BIOS for the A8R32, 2GB OCZ RAM @ 250, Opteron 170 @ 2.75GHz, X1900XTX + X1900 CF @ 650/1550, X-Fi Fatal1ty. Everything is well cooled too...

My biggest complaint is the minimum FPS still sucks. My average went up like 10 fps and the highs went up like 200 fps... but they were already @ >60 fps.

Are both cards X1900XTXs, or is it an X1900XTX with an X1900 like you posted? If what you posted is correct, I believe the performance will drop to the level of the X1900, not the X1900XTX.
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Performance for CrossFire is not like SLI, where the faster card is throttled back to the slower card's speed. You can run separate clocks for each card. I'm still testing to see what I can change.

BTW, the CrossFire card was OCed to see if there would be an improvement. There was only a very minute change.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Try it again without vsync and see what you get. I got a pretty large increase in F.E.A.R. Actually, I just reinstalled it to play it again, now that I can actually play it at my native res.

Originally posted by: nitromullet

I thought they fixed the CrossFire Oblivion issues in the 6.5's... weird.

It was. I didn't have to rename it, and there's also a new Chuck patch that works with the 6.5s.