3870X2 on 2560x1600


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Blacklash
Originally posted by: apoppin
Originally posted by: Blacklash


The HD 3870 X2 is a disappointment to me. I am not holding onto it because selling it later will be harder than doing it now while hype and interest surround it.

Have you actually tested and benchmarked it? You only had it a couple of days :p

I tend to play games more than benchmark. I actually praised the HD 3870 X2 for its 3dmark06 score. It put up 18.5k supported by a 3.8GHz Q6600 G-0.
http://service.futuremark.com/compare?3dm06=4997365

Considering those couple of days were off days, I had time to use it quite a bit. I don't need months to recognize the card is too hot, too loud under load, and too unpredictable in performance to keep. I intended to get a 2560x monitor and use the HD 3870 X2 to drive it. I changed my mind because I don't like the idea of being stuck with a little less than single HD 3870 performance @ that res in some cases.

I have no idea why Crossfire causes a performance hit in a few titles, stutter in others, and flashing textures in some. I consider things like how smooth a title plays, IQ, and whether there are graphic glitches.

With the card I tried; "Crysis" DX10 and 9 path, "The Witcher", "Tomb Raider: Legend", "Oblivion" with all expansions, "UT3", "Hellgate London", "Timeshift", "Jericho", "WiC", "DiRT" and "NWN 2".

I do not feel it's worth $449 and will reiterate that most folks will be better served by a cheaper 8800GT 512MB or 8800GTS 512MB. If you're at 2560x, go for an 8800GTX and overclock it. If you're @ 1440x or 1280x you could likely do fine with an overclocked HD 3850 256MB using medium filtering.

all i wanted to know is if you felt you spent sufficient time with it ... i wasn't expecting a full history and half a page as a report
:D

and i am evaluating CrossFire for myself - for the first time right now [but not 'properly' 'till i get my 2nd bridge interconnect and OC my 2900p] ... and after benching and playing a few games i am underwhelmed [also] ... Vista 64 has nasty flashing textures for Crysis Demo - Vista 32 was ok, but exactly the same FPS as with one GPU; Stalker - surprisingly - did worse; HL2 Flew with Xfire; Lost Planet was better with Xfire [but then it is crap with a single GPU] and CoJ remained the same ... FEAR the same or worse ... Far Cry got a nice boost with Xfire ... from 60 to 70 FPS as an average in Vista32 - it actually looked 'smooth' - but not as good as a single GPU with Vista 64 [i am curious to see what Vista 64 and Crossfire will bring to FC] ... but i am going to play Hg:L and the Witcher ... for a couple of days ...

bad news ... my Bridge is coming to Cali from NJ :p
-so Thursday

i may just eat the $30 restocking fee and chalk it up to experience. i really don't need anything AtM that is faster than my 2900xt for 16x10 DX9 games - although it would be nice to get a little more AA. And i guess i need to wait for a MUCH faster GPU to play Crysis.

Of course, i need to get to about ten more games before really deciding.
- but that's just me :p
 

Blacklash

Member
Feb 22, 2007
181
0
0
I could have spent more time with the card, but Newegg doesn't take that item back so I'm stuck selling. I really didn't want to wait for later revisions of the card to come out, like the one below-

http://www.rage3d.com/board/showthread.php?t=33915340

Or the 9800 GX2 and then try to sell. I'd agree a HD 2900XT is plenty of card for 16x. As far as the "report" goes, I gave you the bonus plan. :p

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Blacklash
I could have spent more time with the card, but Newegg doesn't take that item back so I'm stuck selling. I really didn't want to wait for later revisions of the card to come out, like the one below-

http://www.rage3d.com/board/showthread.php?t=33915340

Or the 9800 GX2 and then try to sell. I'd agree a HD 2900XT is plenty of card for 16x. As far as the "report" goes, I gave you the bonus plan. :p

no prob ... i also gave you my detailed reasons about CrossFire ... thanks!
:)

i can see that you wanted to try it and it didn't meet expectations ... and you are playing with $500 ... i'm just messing with $150 ... if i had to decide *today* - based on 24 hours with Xfire, i'd send it back [also]
--the difference is i have a little time and i want to see what a solid OC on the Pro will do with both bridge interconnects
[i want to get my $30 worth out of the testing IF i send it back; NewEgg will take it back with a restocking fee]
:clock:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ArchAngel777
Originally posted by: Syntax Error
SLI and "bang-for-your-buck" should not be in the same sentence.

QFT and signature worthy.
that doesn't necessarily apply to CrossFire ... AMD seems to 1) have bad control of their stock, or 2) design it with a budget gamer in mind [or (3) both]

case in point: "me"

i bought my 2900xt back in May for $330 with OB ... and it did its job very well for me @16x10; now -with the latest games - i feel a desire to add a few FPS to the bottom and perhaps a little more AA [to 4x]. For $150, i was able to add a brand new second GPU [2900p] that *should* carry me over to the next - next gen GPUs
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Interesting that both Blacklash and apoppin would mention flickering textures when none of the review sites mention it. It seems to me that most sites were really pleased with the stability and smooth operation, even if they weren't overwhelmed by performance.

from http://www.hardocp.com/article...w4LCxoZW50aHVzaWFzdA==

The entire feel of the video card and supporting software felt very "solid" and "finished." From the moment we took the video card out of the anti-static bag it just felt like it was going to work well, and it did. It has been a long time since we've had such a confident experience in an ATI Graphics video card.

from http://anandtech.com/video/showdoc.aspx?i=3209&p=13

...during our testing it was very easy to forget that we were dealing with a multi-GPU board since we didn't run into any CrossFire scaling or driver issues.

I can't find a mention of flickering textures in any of the reviews. Don't get me wrong, I'm not challenging either of your assessments. I just noticed that both of them have the flickering textures in common and that none of the review sites mention it. I'm seriously considering a 3870X2 for what would be my first completely non-NVIDIA build in years... Not even my Athlon64 X2 4400+ Crossfire rig had that distinction, since the SB was ULi. Flickering textures would be a deal breaker though.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nitromullet
Interesting that both Blacklash and apoppin would mention flickering textures when none of the review sites mention it. It seems to me that most sites were really pleased with the stability and smooth operation, even if they weren't overwhelmed by performance.

from http://www.hardocp.com/article...w4LCxoZW50aHVzaWFzdA==

The entire feel of the video card and supporting software felt very "solid" and "finished." From the moment we took the video card out of the anti-static bag it just felt like it was going to work well, and it did. It has been a long time since we've had such a confident experience in an ATI Graphics video card.

from http://anandtech.com/video/showdoc.aspx?i=3209&p=13

...during our testing it was very easy to forget that we were dealing with a multi-GPU board since we didn't run into any CrossFire scaling or driver issues.

I can't find a mention of flickering textures in any of the reviews. Don't get me wrong, I'm not challenging either of your assessments. I just noticed that both of them have the flickering textures in common and that none of the review sites mention it. I'm seriously considering a 3870X2 for what would be my first completely non-NVIDIA build in years... Not even my Athlon64 X2 4400+ Crossfire rig had that distinction, since the SB was ULi. Flickering textures would be a deal breaker though.
i thought i had sufficient disclaimers :p
:confused:

The *only* flickering textures i remember - and mentioned were only with Vista 64 and Crysis Demo ... i am hoping Crysis retail would have that fixed ... and Vista32 did not have that problem - no flickering

Make sure that you take my *preliminary* Crossfire results with a grain ... it is not "optimum" - i have only one Interconnect Bridge - the 2nd one should be here Thursday ... then i'll post more solid results and continue on with my Vista 32 vs 64 testing

for what LITTLE time i have with it, i can say it too about my own CrossFire [except Vista64 and Crysis] ... "during [my] testing it was very easy to forget that [i was] dealing with a multi-GPU [setup] since i didn't run into any CrossFire scaling or driver issues"

either it scaled or it did not - no "issues" ... my problem is that it doesn't currently scale far enough
[i need the 2nd bridge and then i will OC the pro - report this weekend!]
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: nitromullet
Interesting that both Blacklash and apoppin would mention flickering textures when none of the review sites mention it. It seems to me that most sites were really pleased with the stability and smooth operation, even if they weren't overwhelmed by performance.

from http://www.hardocp.com/article...w4LCxoZW50aHVzaWFzdA==

The entire feel of the video card and supporting software felt very "solid" and "finished." From the moment we took the video card out of the anti-static bag it just felt like it was going to work well, and it did. It has been a long time since we've had such a confident experience in an ATI Graphics video card.

from http://anandtech.com/video/showdoc.aspx?i=3209&p=13

...during our testing it was very easy to forget that we were dealing with a multi-GPU board since we didn't run into any CrossFire scaling or driver issues.

I can't find a mention of flickering textures in any of the reviews. Don't get me wrong, I'm not challenging either of your assessments. I just noticed that both of them have the flickering textures in common and that none of the review sites mention it. I'm seriously considering a 3870X2 for what would be my first completely non-NVIDIA build in years... Not even my Athlon64 X2 4400+ Crossfire rig had that distinction, since the SB was ULi. Flickering textures would be a deal breaker though.

Multi card and flickering textures is a strange beast.

I played Crysis the other day with my 3 way SLi box, got flickering textures. I don't always, but I do sometimes. I know other people run it without the problem.

OTOH I'm playing Jericho and UT3 fine, no issues. Same for ETQW, FEAR, and Lost Planet.

Crysis and 3 way have not been kind to me.

Maybe Apoppin and Blacklash lucked into a similar situation.

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Were you guys using the latest drivers? Blacklash, specifically, since I don't know that the new driver that delayed the 3870x2 is a boon to 2900xt/pro xfire setups. Apoppin, I am seriously intrigued by your setup. I might just need to go get a 3850 512MB card for xfire... you never know when you might want 1,812 fps in Risk... well, I can't say that, I do play civ4 and bg2 a lot, and I even own kotor and nwn2.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: Rollo
If you need to go single card, an 8800GTX OCd 10% would cost approximately the same, perform approximately the same, and offer that level of performance in every game, without profiles.

I'm basing this on performance numbers for the 3870X2 vs the 8800Ultra, which is essentially a 8800GTX clocked 10% higher.

http://techreport.com/articles.x/13967

There are several more powerful dual GPU sets than the 3870X2 as well which might provide the longevity you seem to be looking for.

You STILL haven't learned to read?

Originally posted by: lopri
Why do I feel like I'm reading a rehearsed Q/A?

Because you are.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Thanks for the follow ups.

Originally posted by: apoppin
The *only* flickering textures i remember - and mentioned were only with Vista 64 and Crysis Demo ... i am hoping Crysis retail would have that fixed ... and Vista32 did not have that problem - no flickering

Well, I'm running Vista x64. The demo didn't run any differently for me than the retail game did, but it's possible that it's different code. Plus, there is a patch. I also do realize that you are talking about Crossfire with an HD2900XT and an HD2900Pro and not a 3870X2, so I know it's not a total apples-to-apples comparison. I'll be interested to see what happens when you get the second bridge. Also curious to see how running Crossfire in a PCIe 16x/4x configuration will treat you. With Crossfire does the XT run at Pro speeds and/or stream processors, or are they completely independent? (it's really been a while since I've looked into anything ATI with more than one GPU - since X1800XTX)

Originally posted by: nRollo
Multi card and flickering textures is a strange beast.

I played Crysis the other day with my 3 way SLi box, got flickering textures. I don't always, but I do sometimes. I know other people run it without the problem.

OTOH I'm playing Jericho and UT3 fine, no issues. Same for ETQW, FEAR, and Lost Planet.

Crysis and 3 way have not been kind to me.

Maybe Apoppin and Blacklash lucked into a similar situation.

Odd, I never had any flickering texture issues with any of my SLI rigs before. Maybe it's just Crysis... Btw... are you running Vista x86 or x64?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: bryanW1995
Were you guys using the latest drivers? Blacklash, specifically, since I don't know that the new driver that delayed the 3870x2 is a boon to 2900xt/pro xfire setups. Apoppin, I am seriously intrigued by your setup. I might just need to go get a 3850 512MB card for xfire... you never know when you might want 1,812 fps in Risk... well, I can't say that, I do play civ4 and bg2 a lot, and I even own kotor and nwn2.

Of course :p

latest Certified drivers ... no beta anything while i test.

Out of ten ... well, more than 10 [i am d/l'ing the update for Hg:L now ... done by morning :roll:] games, i only got flashing textures in Crysis demo ... and that was only with Vista 64 - NOT Vista 32

i don't think they patched the Crysis demo - at least i don't have it. Crysis is a strange beast ... they haven't tamed it yet and i certainly don't want to buy it till i can run DX10's very high [heck, i will break out my CRT and play 10x7] ... by then it will be $20 - or less and awesome - along with Far Cry - for benchmarking. i am certain they will release another demo or optimize this one for "everything" - after they fix the retail game completely [in a year or so].

i would not judge CrossFire or SLi by Crysis for a long time. Now i AM surprised that Lost Planet is almost playable 'maxed out' on my rig in DX10 - certainly at 10x7 - AMD was waaaaay behind - it was a buggy slideshow in May with ONE 2900xt. CoJ is another game that is starting to look playable fully maxed out in DX10

i can't wait for that Interconnect ... and of course, NewEgg is shipping it from 'Jersey :p



========

Well, I'm running Vista x64. The demo didn't run any differently for me than the retail game did, but it's possible that it's different code. Plus, there is a patch. I also do realize that you are talking about Crossfire with an HD2900XT and an HD2900Pro and not a 3870X2, so I know it's not a total apples-to-apples comparison. I'll be interested to see what happens when you get the second bridge. Also curious to see how running Crossfire in a PCIe 16x/4x configuration will treat you. With Crossfire does the XT run at Pro speeds and/or stream processors, or are they completely independent? (it's really been a while since I've looked into anything ATI with more than one GPU - since X1800XTX)
my 2900xt is slowed down to whatever i can OC the Pro to ... both have 512MB vRAM so that is good ... the only other issue is that my Pro is 256-bit while my XT is 512-bit .... i would say since i am *limited* to 16x10 [or 16x12 if i want to feel nostalgic and break out my CRT for 4:3 gaming] that should be no problem

EDIT: i see you asked about PCIe x16/x4 ... the x4 slot apparently limits a 2900xt's bandwidth by ~10-25% ... i am "figuring" that a 256bit OC'd 2900Pro's bandwidth should be limited by about half that. If so, it might be a good "match up" for my LCD/rig ... to "add" a few FPS to the bottom and allow me to add 2x AA to my current gaming. if i am extremely lucky, it will allow me to play DX10 games fully maxed out [except Crysis] on my CRT. Win-win ... for ~$150 ... a "poor man's" upgrade indeed.
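For reference, a quick back-of-envelope sketch of the raw slot bandwidth in question, assuming standard PCIe 1.x rates (250 MB/s per lane, per direction); the ~10-25% in-game penalty quoted above is an empirical figure, not something this arithmetic predicts:

```python
# Rough PCIe 1.x host-link bandwidth per slot width
# (250 MB/s per lane, per direction -- the standard PCIe 1.x effective rate)
PCIE1_MB_PER_LANE = 250

def slot_bandwidth_mb(lanes):
    """Aggregate one-direction bandwidth in MB/s for a slot with `lanes` lanes."""
    return lanes * PCIE1_MB_PER_LANE

print(slot_bandwidth_mb(16))  # 4000 MB/s in the x16 slot
print(slot_bandwidth_mb(4))   # 1000 MB/s in the x4 slot -- a quarter of the host link
```

Whether a quarter of the host link translates into the smaller in-game hit apoppin is "figuring" for the 256-bit Pro is exactly what the upcoming bridge-interconnect testing would show.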

i live for things like this
:D
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I always heard that the 2900 pro was EXACTLY like an xt, just with lower clocks. I just checked out newegg, and sure enough, the 2900 pro is listed as 256 bit. However, I still felt that this was incorrect, so I went to sapphire's web site. They list the 2900 pro as 512 bit. Whom to believe? I would go with sapphire just because newegg is full of crap in many of their product descriptions, plus it does stand to reason based upon tons of reviews that the 2900 pro is really just a lower-clocked 2900xt.


http://www.sapphiretech.com/us...iew.php?gpid=190&grp=3

I'm not going to include the newegg link, but trust me, it says 256 bit.

edit: btw, sapphire does list the 2900gt as being 256 bit, which is what I originally heard was one of the main differences between the gt and the pro.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
mine is *definitely* the 256-bit version of the Pro ... Sapphire makes them .. it is VERY clear on the box
- as i said, i don't think it will make *any* difference at 16x10 in the 4x slot

i already know it is about 20% improvement in games that scale ... and it is "not optimal" ... not even close [yet]
- i am looking for +33% as a sort of baseline "worth keeping it" to me.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: nitromullet

Odd, I never had any flickering texture issues with any of my SLI rigs before. Maybe it's just Crysis... Btw... are you running Vista x86 or x64?

Vista 64, Crysis patched, latest WHQL, all SLi related MS Hotfixes installed. Crysis has been my "Achilles Heel" with 3 way, I'll probably just play it with SLi disabled if a reinstall doesn't take care of it.

 

lopri

Elite Member
Jul 27, 2002
13,329
709
126
Originally posted by: nitromullet
Originally posted by: tuteja1986
Message moved to Personal Forum Issues.

Anandtech Moderator - KP

http://forums.anandtech.com/me...=2147233&enterthread=y
:Q

That was a bit too much drama for a morning coffee (I work nights), but thanks anyway. I rarely go to the Forum Issues section so I had no idea there was so much activity going on there. (Just count the number of 'Lifer's in that thread alone)

If anything it changed my viewpoint on Mr. Derek Wilson quite a bit. In a better way, of course.

Back on Topic..

1. 3870X2 is good enough for 30" today, except on some obvious occasions (Crysis..)
2. 8800 series are quite capable at that as well. I don't think it's wise to purchase a GTX/Ultra at this point in time, however. G92 variants make more sense, or you could wait for the 9800GX2.
 

uclaLabrat

Diamond Member
Aug 2, 2007
5,632
3,046
136
Originally posted by: lopri
Originally posted by: nitromullet
Originally posted by: tuteja1986
Message moved to Personal Forum Issues.

Anandtech Moderator - KP

http://forums.anandtech.com/me...=2147233&enterthread=y
:Q

That was a bit too much of drama for a morning coffee (I work nights) but thanks anyway. I rarely go to Forum Issues section so I had no idea there were so many activities going on there. (Just count the number of 'lifer's in that thread alone)

If anything it changed my viewpoints on Mr. Derek Wilson quite a bit. In a better way, of course.

Back on Topic..

1. 3870X2 is good enough for 30" today, except on some obvious occasions (Crysis..)
2. 8800 series are quite capable at that as well. I don't think it's wise to purchase a GTX/Ultra at this point in time, however. G92 variants make more sense, or you could wait for the 9800GX2.

That was my dilemma, take the plunge on the X2 or go the GTS route. I don't want to be limited to Nvidia chipsets with SLI (heard bad things about 680i, not sure about 780i but it seems quite expensive) so the X2 seemed like a good compromise for a single card solution that could give a little more juice than the GTS.

However, I won't be building anytime soon, so I'll probably be well into the new gen cards coming out before it's a possibility.

Thanks for the discussion so far, guys.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: nRollo
Originally posted by: nitromullet

Odd, I never had any flickering texture issues with any of my SLI rigs before. Maybe it's just Crysis... Btw... are you running Vista x86 or x64?

Vista 64, Crysis patched, latest WHQL, all SLi related MS Hotfixes installed. Crysis has been my "Achilles Heel" with 3 way, I'll probably just play it with SLi disabled if a reinstall doesn't take care of it.

That's really disappointing to hear (I'm sure for you more than me)... Crysis seems like it would be the killer app for 3-way SLI. While improved fps in the one game that kills everything is not the best justification for buying three GTXes/Ultras, a 780i motherboard, and a 1000W+ PSU, at least it's some sort of justification. If that doesn't work, what's the point other than to help you heat up those cold Wisconsin nights?

I played Crysis all the way through on a single GTX on a 1920x1200 LCD, and I can assure it's gonna suck with a single GTX on a 30" LCD.
 

Blacklash

Member
Feb 22, 2007
181
0
0
If some want to attack me for my opinions so be it. I said what I meant and meant what I said.

The reason to own an HD 3870 X2 is to push a 2560x monitor, and its unreliable, unpredictable performance makes that a wash. My intent was to get one and upgrade the 1680x monitor on my second rig.

For people that have a 16x or lower resolution monitor there are much better price vs performance choices than the HD 3870 X2. I've already shown where overclocked 8800GT cards can even push 1920x well.

http://www.driverheaven.net/reviews/8800GTs/index.php

In addition to performance that is too unpredictable and unreliable across a wide variety of titles at 2560x, the card was also too hot and loud for my tastes.

The end result is the 1680x monitor is staying on my second rig until better options appear for 2560x.
 

heyheybooboo

Diamond Member
Jun 29, 2007
6,278
0
0
Sapphire 2900pro 512mb 256-bit - 600/800

FEAR test - 1366x768x32 - max settings (AAx4 - AFx16 - soft shadows)
OC 750/840

Crossfire
min - 41
avg - 102
max - 283
100% ^ 40 FPS

Single
min - 31
avg - 78
max - 194
94% ^ 40 FPS

Now, it gets kinda ugly ...

FEAR test - 1920x1080x32 - max settings (AAx4 - AFx16 - soft shadows)
OC 750/840 - FPS

Crossfire
min - 22
avg - 55
max - 117

Single
min - 17
avg - 51
max - 194

XP / dx9 / MSI 790fx / x2 5400 @ 2.8GHz / Westy W4207 42-inch 720p HD Monitor (for whatever reason it will run 1920x1080 ...)

Since the native rez is 1366x768 I'm pretty happy with the 30% or so increase in Crossfire - especially when FEAR 'noted' that the rez 'was not optimized' (whatever that means). For $290 (if the rebates come in :) ) it's a dang good deal, crossfire or not.
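As a sanity check on the figures posted above (a sketch using only the avg FPS numbers from this post):

```python
# Crossfire scaling from the posted FEAR averages
def scaling_pct(crossfire_fps, single_fps):
    """Percent FPS gain of Crossfire over a single card."""
    return (crossfire_fps / single_fps - 1) * 100

# 1366x768, max settings: 102 avg (Crossfire) vs 78 avg (single)
print(round(scaling_pct(102, 78)))  # 31 -- the "30% or so" increase

# 1920x1080, max settings: 55 avg vs 51 avg
print(round(scaling_pct(55, 51)))   # 8 -- scaling mostly evaporates at the higher rez
```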

I used the AMD GPU Clock Tool. Temps never got above 49C-66C on the sensors of either card. The cards were actually quiet. I think there's some OC headroom, but I've only got a 650W PSU and don't want to press my luck much further.

If I get the time this evening I'll disable soft shadows and run the benchies again ....
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nitromullet
Originally posted by: nRollo
Originally posted by: nitromullet

Odd, I never had any flickering texture issues with any of my SLI rigs before. Maybe it's just Crysis... Btw... are you running Vista x86 or x64?

Vista 64, Crysis patched, latest WHQL, all SLi related MS Hotfixes installed. Crysis has been my "Achilles Heel" with 3 way, I'll probably just play it with SLi disabled if a reinstall doesn't take care of it.

That's really disappointing to hear (I'm sure for you more than me)... Crysis seems like it would be the killer app for 3-way SLI. While improved fps in the one game that kills everything is not the best justification for buying three GTXes/Ultras, a 780i motherboard, and a 1000W+ PSU, at least it's some sort of justification. If that doesn't work, what's the point other than to help you heat up those cold Wisconsin nights?

I played Crysis all the way through on a single GTX on a 1920x1200 LCD, and I can assure it's gonna suck with a single GTX on a 30" LCD.

i am quite certain that Crysis *would* be playable on Tri-SLi ... if they could optimize the game. Look at how long it took FarCry to become really optimized ... i am expecting another year - at a minimum. 64-bit should also give a much better experience than 32-bit if FC is an indication.

==================
I used the AMD GPU Clock Tool. Temps never got above 49c -66c on the sensors on either card. The cards were actually quiet. I think there's some OC headroom but I've only got a 650w PS and don't want to press my luck much further.
Hey, heyheybooboo - are you using 2 bridge interconnects? And just for laughs, what is your 3DMark06 score? ... and what is the *rest* of your rig like?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: apoppin
i am quite certain that Crysis *would* be playable on Tri-SLi ... if they could optimize the game. Look at how long it took FarCry to become really optimized ... i am expecting another year - at a minimum. 64-bit should also give a much better experience than 32-bit if FC is an indication.

Maybe true, but who's gonna buy 3 GTX/Ultras, a 780i motherboard, and a 1000W+ PSU so they can play Crysis next year? Did Far Cry really get more 'optimized' or did hardware just get better?

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
when did they come out with a 256 bit 2900 pro? It makes sense, of course, since the 512 bit was way overkill and 256 bit is cheaper to make, I've just never seen any spec sheets on one. I just re-checked sapphire's site and couldn't find any links to it, either.