3870X2 on 2560x1600

Page 4

apoppin

Lifer
Mar 9, 2000
Originally posted by: nitromullet
Originally posted by: apoppin
I am quite certain that Crysis *would* be playable on Tri-SLI ... if they could optimize the game. Look at how long it took Far Cry to become really optimized ... I am expecting another year - at a minimum. 64-bit should also give a much better experience than 32-bit, if FC is any indication.

Maybe true, but who's gonna buy 3 GTX/Ultras, a 780i motherboard, and a 1000W+ PSU so they can play Crysis next year? Did Far Cry really get more 'optimized' or did hardware just get better?

Yes, it did ... FC was an even buggier mess than the Crysis demo out of the box ... I guess anyone could try it ... just don't patch it. I remember trying to run it with my Radeon 8500 and just gave up ... I upgraded to a 9800 XT especially for Thief: Deadly Shadows and was also able to play FC, as it had been patched a couple of times by then. And of course they optimized it for HDR and added 64-bit support later on.

I d/led all the patches for HGL ... and it looks great fully maxed in DX10 @ 16x10 ... and I will try for *more* ... What is the best tool for OC'ing my Pro in Xfire? It needs to be 64-bit compatible and able to keep settings without re-enabling them every time I boot up.
Maybe RivaTuner ... or ATI Tray Tools?
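(If a tool won't save its clocks between boots, one common workaround is a little startup script that re-applies them. Here's a minimal sketch of the idea in Python - note that `setclocks.exe` and its flags are purely hypothetical stand-ins, not the actual RivaTuner or ATI Tray Tools command line:)

```python
# Hypothetical startup script: re-apply GPU overclocks after every boot.
# "setclocks.exe" and its --gpu/--core/--mem flags are placeholders, NOT a
# real RivaTuner or ATI Tray Tools interface -- swap in whatever command
# line your actual tool exposes, if it has one.
import subprocess
import sys

CLOCK_TOOL = r"C:\Tools\setclocks.exe"  # stand-in path for a CLI clock utility

# (gpu index, core MHz, memory MHz) for each card in the Crossfire pair;
# the clock values below are examples only.
CARDS = [
    (0, 743, 828),  # e.g. the 2900 XT
    (1, 600, 800),  # e.g. the 2900 Pro
]

def apply_clocks(gpu: int, core: int, mem: int) -> bool:
    """Shell out to the (hypothetical) tool; return True on success."""
    result = subprocess.run(
        [CLOCK_TOOL, "--gpu", str(gpu), "--core", str(core), "--mem", str(mem)],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        print(f"GPU {gpu}: failed ({result.stderr.strip()})", file=sys.stderr)
    return result.returncode == 0

if __name__ == "__main__":
    ok = all([apply_clocks(*card) for card in CARDS])  # list, so every card is tried
    sys.exit(0 if ok else 1)
```

(Point a Startup-folder shortcut, or a Task Scheduler logon task, at a script like this and the settings "stick" without re-enabling anything by hand.)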

... and no idea about the Pro, BryanW ... I just remember noting its release as a cheap replacement for the 512MB version ... I just hope it works for me
 

CP5670

Diamond Member
Jun 24, 2004
Originally posted by: apoppin
I am quite certain that Crysis *would* be playable on Tri-SLI ... if they could optimize the game. Look at how long it took Far Cry to become really optimized ... I am expecting another year - at a minimum. 64-bit should also give a much better experience than 32-bit, if FC is any indication.

I don't know if Far Cry ever worked completely with SLI. The last time I played it that way (mid-2006), it still had problems. Things seemed to work well until about halfway through the game, when you first pick up the night-vision goggles. Turning them on inexplicably caused the framerate to drop to the level of a single card. There was also some rendering glitch I can't remember now. Far Cry was one of the few major titles to use SFR, which didn't scale well in general.
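(Roughly why SFR scales worse than AFR: both cards still do most of the per-frame geometry/setup work, so only the pixel work is actually split. A toy Amdahl-style model - the duplicated-work fractions below are illustrative assumptions, not measurements from Far Cry:)

```python
# Toy model of 2-GPU scaling under SFR (split-frame rendering).
# Assumption (illustrative only): a fraction "dup" of each frame's work
# (geometry/CPU-side setup) is duplicated on both GPUs, and only the
# rest (pixel shading/fill) is actually divided between them.

def sfr_speedup(dup: float, gpus: int = 2) -> float:
    """Amdahl-style speedup when 'dup' of the work is not parallelized."""
    return 1.0 / (dup + (1.0 - dup) / gpus)

for dup in (0.0, 0.2, 0.4, 0.6):
    print(f"{dup:.0%} duplicated work -> {sfr_speedup(dup):.2f}x speedup")

# Output:
# 0% duplicated work -> 2.00x speedup
# 20% duplicated work -> 1.67x speedup
# 40% duplicated work -> 1.43x speedup
# 60% duplicated work -> 1.25x speedup
```

(AFR sidesteps the duplicated work by giving each GPU whole alternating frames, which is why it usually scaled better than SFR.)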
 

nitromullet

Diamond Member
Jan 7, 2004
Originally posted by: apoppin
Yes, it did ... FC was an even buggier mess than the Crysis demo out of the box ... I guess anyone could try it ... just don't patch it. I remember trying to run it with my Radeon 8500 and just gave up ... I upgraded to a 9800 XT especially for Thief: Deadly Shadows and was also able to play FC, as it had been patched a couple of times by then. And of course they optimized it for HDR and added 64-bit support later on.

Oh, I remember how buggy it was, I just don't ever recall it being any other way... :) Wasn't the HDR patch the one they had to recall because it was so bad? I guess I lost interest before it got better. I played the whole game through on an FX5900 and never really got into the MP. By the time I got my 6800GT, it was all about Doom 3 and HL2.

I did play the first few levels just recently, though (in anticipation of Crysis), on a 7900GS; it ran pretty well with 4xAA/16xAF @ 1920x1200, but a 7900GS is A LOT more video card than an FX5900.
 

CP5670

Diamond Member
Jun 24, 2004
I think it was the 1.4 beta patch that was removed, which provided HDR+AA on the X1900 cards. The final 1.4, which came out over a year later, was reportedly even buggier. Some fans made a patch based on 1.3 that included some of the 1.4 features.

I would like to play through this game again sometime at higher settings with the extended content pack, whenever I get around to upgrading my card. It would benefit a lot from a higher resolution, given its vast open spaces. (I played most of it at 1024x768 without AA on a 6800GT, although I had to drop down to 800x600 for a few levels towards the end.)
 

nitromullet

Diamond Member
Jan 7, 2004
Originally posted by: CP5670
I think it was the 1.4 beta patch that was removed, which provided HDR+AA on the X1900 cards. The final 1.4, which came out over a year later, was reportedly even buggier. Some fans made a patch based on 1.3 that included some of the 1.4 features.

It was way before the X1900 series came out. It was the 1.2 patch, back in July 2004.

http://www.techspot.com/news/1...12-patch-recalled.html

...unless they recalled more than one patch....
 

CP5670

Diamond Member
Jun 24, 2004
Originally posted by: nitromullet
...unless they recalled more than one patch....

That seems to be the case. I hadn't heard about this one either.

I guess the 1.4 beta was always just that, a beta, but it was taken off their site at one point due to the bugs in it, and the final 1.4 (which came out several months later) didn't support HDR+AA.
 

nRollo

Banned
Jan 11, 2002
Originally posted by: nitromullet
Maybe true, but who's gonna buy 3 GTX/Ultras, a 780i motherboard, and a 1000W+ PSU so they can play Crysis next year? Did Far Cry really get more 'optimized' or did hardware just get better?

I uninstalled, reinstalled, and patched Crysis, and can now play at 25x16 (2560x1600) with 0xAA/0xAF, all medium settings, pretty well.

Not really what I bought a big monitor for, but at least there are no flashing textures anymore. (And I can play at lower resolutions with higher settings, of course.)

Crysis is definitely a unique animal in the world of PC gaming.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: nRollo
I uninstalled, reinstalled, and patched Crysis, and can now play at 25x16 (2560x1600) with 0xAA/0xAF, all medium settings, pretty well.

Not really what I bought a big monitor for, but at least there are no flashing textures anymore. (And I can play at lower resolutions with higher settings, of course.)

Crysis is definitely a unique animal in the world of PC gaming.

*medium* :p
:Q

:roll:

What FPS do you get with "very high"?
 

heyheybooboo

Diamond Member
Jun 29, 2007
Originally posted by: apoppin
Originally posted by: heyheybooboo
I used the AMD GPU Clock Tool. Temps never got above 49C-66C on the sensors on either card. The cards were actually quiet. I think there's some OC headroom, but I've only got a 650W PSU and don't want to press my luck much further.

Hey, heyheybooboo - Are you using 2 bridge interconnects? And just for laughs, what is your 3DMark06 score? ... and what is the *rest* of your rig like?


2 bridge interconnects? Uhhhh, yeah. Can you only use one? - lol

This 'optimize' thing is really frustrating. I ran the 2900s maxed out at the highest 'optimized' rez I could run (1280x960) and got 55% higher fps from Crossfire in the FEAR test (versus a 30% Crossfire increase at 1366x768 'unoptimized'). Disabling 'soft shadows' didn't really help or hurt ... but this &^#! 'optimized' BS is starting to 'p' me off.

MSI K9A2 Platinum 790FX, X2 5400+ @ 2.8GHz w/ Opty copper-heatpipe cooler, 2GB Kingston DDR2-800, 80GB Maxtor IDE, Enlight ATX mid-tower, 2x 80mm & 1x 120mm case fans, loud-ass NEC DVD, Silencer 610 PSU, Westy W4207 42-inch HD monitor, audio optical out to a Yamaha SS receiver.

Single card I got 9,152 in 3DMark06 at somewhere around 750/840. I really need to find a bigger PSU with some 2x4 connectors before I get carried away and screw something up :) I don't think it will help much in 3DMark Crossfire since I'm not 'optimized' anyway.

Originally posted by: bryanW1995
When did they come out with a 256-bit 2900 Pro? It makes sense, of course, since the 512-bit bus was way overkill and 256-bit is cheaper to make; I've just never seen any spec sheets on one. I just re-checked Sapphire's site and couldn't find any links to it, either.

The Egg links to the Sapphire 2900 Pro 512-bit page, but it clearly states 256-bit edition on the box. 'Not quite' a 3870, but it is definitely pushing into stock 2900 XT/8800 GTS territory when OCed. FEAR test, maximum settings, 4xAA/16xAF, at 1280x960 (highest 'optimized' rez I can run):

Crossfire
min - 57
avg - 145
max - 386

Single
min - 37
avg - 92
max - 226
(96% of frames above 40 fps)


1680x1050 actually scored about the same as, if not a little less than, 1920x1080, but neither of those was 'optimized'.
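(For reference, a quick sanity check of the Crossfire scaling in those FEAR numbers - just arithmetic on the figures quoted above, nothing else assumed:)

```python
# Crossfire scaling from the FEAR results above (1280x960, max, 4xAA/16xAF).
single = {"min": 37, "avg": 92, "max": 226}
crossfire = {"min": 57, "avg": 145, "max": 386}

for key in ("min", "avg", "max"):
    gain = crossfire[key] / single[key] - 1.0
    print(f"{key}: {single[key]} -> {crossfire[key]} fps  (+{gain:.0%})")

# Output:
# min: 37 -> 57 fps  (+54%)
# avg: 92 -> 145 fps  (+58%)
# max: 226 -> 386 fps  (+71%)
```

(The ~58% gain on the average roughly matches the "55% higher fps in Crossfire" figure quoted above.)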
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: heyheybooboo
2 bridge interconnects? Uhhhh, yeah. Can you only use one? - lol

This 'optimize' thing is really frustrating. ...

Single card I got 9,152 in 3DMark06 at somewhere around 750/840. ...

Thanks for the reply.

Yeah, I only have one bridge interconnect. CCC says it is "not functioning at optimal performance" but shows nothing in diagnostics. One bridge came with my VT 2900 XT, but G-D Sapphire is too cheap to include one ... so I ordered one that will be here tomorrow.

Explain this "optimized" vs. unoptimized thing. Almost everything I play is at 16x10, unless I need to break out my CRT for DX10 games :p

Single card, my 2900 XT is around 10,500 in 3DMark06 [Vista64], and 12,500 with the 2900 Pro running in tandem ... but RivaTuner shows my 2900 XT running at its stock clock and my 2900 Pro running at some ungodly low clock.

My problem is that the OCs don't "stick" ... I can't save them, and I am *hoping* that it has to do with having only one bridge interconnect. I'll let you know tomorrow in the "Vista showdown" thread - or, if you reply, here.