Originally posted by: keysplayr2003
Originally posted by: apoppin
No, you didn't have the GTS640 yet; otherwise why would I have asked you for your card? My goal was to compare the 2900XT to the 8800GTS640. If you already had both in hand, why would I bother asking you for the XT when you could have done the comparison yourself?
Makes no sense.
And I believe your rig should perform far below 2900XTs CrossFired. Number one, you say overclocking the Pro gave zero improvement in the 4x slot, so in reality you have 2900Pro CrossFire. This is why I have my doubts about your performance claims. I'm sure there are improvements over a single 2900XT, but not to the extent you are describing.
So you have the performance of 2900Pros CrossFired for the price of a 2900XT and a 2900Pro. How is this making you happy?
i think you may be right ... i would have been without a video card for a time ... we were going to possibly 'temporarily trade', right? i don't remember the details ... sorry. i think your asking may have prompted me to order a GTS - also from Best Buy - to compare them side-by-side.
You still have "doubts" .. what benchmarks would you like me to post? i have the performance of overclocked Pros - a solid +30[+]% performance increase over a single XT for only $150 more ... a second new XT would have been far more expensive and not 'that much faster' in my rig.
EDIT:
you say overclocking the pro had zero improvement when in the 4x slot
i don't think i said that .. i certainly didn't mean to imply it ... there is a *solid* improvement over a stock Pro ... about 1000 marks in 3DMark06: from 10.5K with a single XT, to 12K by *adding* a stock Pro in Xfire, to 13K with the Pro O/C'd ... and i *know* i can get more ... i bet i can break 14K pretty easily in Vista 32.
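to put rough percentages on those marks, here's a quick back-of-the-envelope calc [Python, purely illustrative - the only inputs are the three scores i just quoted]:
[code]
# Rough percentage gains from the 3DMark06 scores quoted above.
# 10.5K = single XT, 12K = + stock Pro in Xfire, 13K = Pro O/C'd.
single_xt = 10_500
xfire_stock = 12_000
xfire_oc = 13_000

def gain(new, base):
    """Percent gain of `new` over `base`."""
    return (new - base) / base * 100

print(f"+ stock Pro: +{gain(xfire_stock, single_xt):.1f}%")  # ~ +14.3%
print(f"+ Pro O/C'd: +{gain(xfire_oc, single_xt):.1f}%")     # ~ +23.8%
[/code]
so the O/C'd Pro is worth roughly +24% in '06 over the single XT - and breaking 14K would push that to about +33%.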
NP. It was a while back.
Ok, let's see. If you can push 14K in 3DMark06, that is great ... but the R6xx cores traditionally score much higher than, say, a peer G80 or G92, and that doesn't really translate to gaming performance.
If you could, I would like to see benches of your choice, as long as you know they scale well in CrossFire - for example, anything that can be compared to an online bench of a single 8800GTS 512. I'm not certain of the variety of games you have, so pick two or three games, or more if you have the time, and we'll take a look at your scores/settings/drivers etc. and compare with the latest online reviews of the 8800GTS 512 and Ultra - and, if we can find them, fairly recent scores from 2900Pros Xfired.
Sounds good ... remember that 3DMark06 is useful for tracking *changes* in a single system ... so that 'bump' from 10.5 to 13.1 represents pretty well what i am experiencing ... in Hg:L [Hellgate: London] it was *completely unplayable* before at 16x10, all settings 'max' or 'extreme' ... not to mention that the max AA/AF allowed in-game *killed* it completely ... right now i am running with maxed-out settings including AA/AF and getting 20s at the bottom and 30s regularly ... 'playable' enough to appreciate the graphics, and completely playable with [only] AA/AF backed down.
that 'alone' makes it worthwhile ... i am going to test Hg:L in Vista 32 right now ... i think it is faster still ... and i will post FEAR Perseus Vista 32 results ... here are some results i already have with FEAR and Cat 8.1 [single XT]:
16x10, everything maxed, 0xAA/16xAF - SS [soft shadows] on
Vista 32 - 31 Min/58 Avg/114 Max
here is FEAR with Cat 8.2 and Xfire:
Vista 32 - 35 Min/88 Avg/281 Max
=================
are we a little off topic?
i'd say that whatever the comparison to a GTS/GTX/Ultra shows, i have a SOLID boost from a $150 2nd GPU
i'll keep going . . .
====================
The Crysis demo is not so well optimized [for Xfire], i think - DX10, 1680x1050, no AA, 32-bit test, Quality: Very High:
Crysis CPU benchmark [demo, 32-bit Vista], run 1:
Single 2900XT - Average FPS: 11.59, Min FPS: 4.95, Max FPS: 14.62
CrossFire ----- Average FPS: 13.89, Min FPS: 4.46, Max FPS: 21.68
Crysis CPU benchmark [demo, 32-bit Vista], run 2:
Single 2900XT - Average FPS: 10.60, Min FPS: 0.74, Max FPS: 17.16
CrossFire ----- Average FPS: 13.96, Min FPS: 1.98, Max FPS: 17.60
===========================
Call of Juarez DX10 benchmark
Vista 64 - 16x10 - High Shadows/Shader Map - 2048x2048
2900XT - 15.9 Min/20.7 Avg/49.3 Max
Pro/XT - 15.0 Min/38.5 Avg/82.8 Max
Vista 32 - 16x10 - High Shadows/Shader Map - 2048x2048
2900XT - 14.7 Min/24.9 Avg/52.3 Max
Pro/XT - 14.4 Min/38.5 Avg/85.5 Max
================================
HL2 Lost Coast built-in benchmark
Vista 64 - Min, Max, Avg
2900XT - 39, 190, 90.42
Pro/XT - 43, 214, 101.22
Vista 32 - Min, Max, Avg
2900XT - 62, 226, 106.32
Pro/XT - 68, 283, 141.05
========================
Lost Planet: Extreme Conditions - full retail game, built-in demo. DX10 / everything fully maxed in-game / 1680x1050 / 4xAA-16xAF
Vista 64 - XT/Pro -- Snow 30.0 / Cave 29.0
Vista 64 - 2900XT - Snow 19.6 / Cave 28.1
Vista 32 - XT/Pro -- Snow 30.8 / Cave 30.2
Vista 32 - 2900XT - Snow 19.6 / Cave 28.0
===============================================
PREY ... you can guess which one is CrossFire. Both runs used identical settings:
Resolution: 1680 × 1050 (Custom)
Demo: HWzone_co_il.demo
Shader Detail: Highest
Aspect Ratio: [16:9]
Antialiasing: 4×
Anisotropic filtering: 16×
Graphics BOOST: enabled

Run 1 - Scores: 90 / 52 / 54 FPS - Average = 65 FPS
Run 2 - Scores: 41 / 54 / 54 FPS - Average = 49 FPS
Are we clear on the increase?
and both the PREY and FEAR demos have a single NASTY *chug* where FPS drop to nearly zero with Crossfire ... fortunately, i haven't experienced it in actual play ... but i think the min and average would both be a lot higher without it. No matter how you cut it, it is a SOLID increase.
Just look at the number summary - in each pair, line 1 = single XT, line 2 = XT/Pro Xfire:

FEAR [Vista 32]:
31 Min/58 Avg/114 Max
35 Min/88 Avg/281 Max

Crysis demo [Vista 32]:
Avg 11.59, Min 4.95, Max 14.62
Avg 13.89, Min 4.46, Max 21.68
Avg 10.60, Min 0.74, Max 17.16
Avg 13.96, Min 1.98, Max 17.60

Call of Juarez [Vista 64, then Vista 32]:
15.9 Min/20.7 Avg/49.3 Max
15.0 Min/38.5 Avg/82.8 Max
14.7 Min/24.9 Avg/52.3 Max
14.4 Min/38.5 Avg/85.5 Max

HL2 Lost Coast [Vista 32 - Min, Max, Avg]:
62, 226, 106.32
68, 283, 141.05

Lost Planet [Snow / Cave; Vista 64, then Vista 32]:
19.6 / 28.1
30.0 / 29.0
19.6 / 28.0
30.8 / 30.2

PREY [Avg FPS]:
49
65
do you not see the pattern i experience in EVERY game [so far]? .. no oddities .. no micro stutter
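if you want hard numbers on that pattern, here is the same kind of back-of-the-envelope calc [Python again, purely illustrative - the pairs are just the single-XT and Xfire averages from the summary above]:
[code]
# CrossFire scaling from the average FPS pairs in the summary above.
# Each entry: (single 2900XT avg, XT/Pro Xfire avg).
results = {
    "FEAR":           (58.0,   88.0),
    "Crysis run 1":   (11.59,  13.89),
    "Crysis run 2":   (10.60,  13.96),
    "CoJ Vista 64":   (20.7,   38.5),
    "CoJ Vista 32":   (24.9,   38.5),
    "HL2 Lost Coast": (106.32, 141.05),
    "PREY":           (49.0,   65.0),
}

for game, (single, xfire) in results.items():
    gain = (xfire - single) / single * 100
    print(f"{game:<15} +{gain:.0f}%")
[/code]
that works out to about +20% [Crysis run 1] at the low end and +86% [CoJ Vista 64] at the high end, with most titles landing around +30-55% ... right in line with the '+30[+]%' i claimed.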
... and .. after really THINKING about it ... the head of nvidia is wrong to criticize dual AMD GPUs, imo
LAST THING ... you said i was "giddy"
well ... as you may know, i have been stuck on dial-up for 8 years [except for 3 terrible months with Flashbyte Digital Wireless] ... and for the LAST 28 Days [which would make a good movie title] ... i have been attempting to pair my Pantech C150 with my PC via Bluetooth and use its DUN [dial-up networking] to access AT&T's data stream
no luck ... in fact, i helped their tech guys write up the folder on my issue and change the way they feature the C150 ... finally, a Regional Manager of their Data TS and i agreed it was a 'vista issue' ... and maybe MS would care to resolve it ... we watched vista lose and change settings that XP had no problem with.
At ANY rate, i got a refurbished Samsung A437 for $20 from AT&T online ... got ripped off at Best Buy for a $20 10' USB cable [i NEEDED it NOW] and set the damn dial-up networking up successfully in 15 minutes flat ... 10 minutes with Vista 64 - and it works pretty well, except i need to have my cell phone out on the patio ... but i am getting D/Ls about 3-4 times faster than 56K dial-up, and it will probably top out with d/ls in the low 200s [kbps] ... which sucks, but is way better than dial-up's 40 kbps ... and i can connect anywhere my cell phone gets a signal ... for $40 a month, unlimited ... damn cool .. and when 3G gets here in a year i will have 1.3Mb/s
... *case closed* ... very satisfied to be able to use my phone again [i set my cell to forward to my landline while it is on the data stream]
--except i need a cell phone antenna or booster for max signal