New 9800gx2/9800gtx/HD3870 X2 benchies


superbooga

Senior member
Jun 16, 2001
333
0
0
Originally posted by: Extelleron
Regardless, in the past buying an ATI has always been a better decision than buying an nVidia card. Just look at the X1950 Pro vs 7900GS... they seemed like good competitors a year or so ago at the $199 price point... now the X1950 Pro is close to 2X faster in games like Crysis.

True, but your gain = ATI's loss.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: superbooga
Originally posted by: Extelleron
Regardless, in the past buying an ATI has always been a better decision than buying an nVidia card. Just look at the X1950 Pro vs 7900GS... they seemed like good competitors a year or so ago at the $199 price point... now the X1950 Pro is close to 2X faster in games like Crysis.

True, but your gain = ATI's loss.

Not always. X1900 was way bigger than G71, so you are right there... G71 was 196mm^2 and R580 was around 314mm^2.

R520 was smaller than G70 and R420 was smaller than the GeForce 6 though, so it hasn't always been that way.

R600 was also smaller than G80 (408mm^2 vs ~500mm^2) and RV670 is way smaller than G92 (192mm^2 vs 324mm^2). So I think ATI is getting away from the larger, more expensive chip... and nVidia is now the one with a huge die.
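
To put rough numbers on that, here is a quick back-of-envelope check in Python using only the die areas quoted above (the ~500mm^2 figure for G80 is the approximation from this post, not an official spec):

# Die-area ratios from the figures quoted above (all in mm^2; G80 uses the ~500 estimate).
pairs = {
    "R580 vs G71":  (314, 196),
    "R600 vs G80":  (408, 500),
    "RV670 vs G92": (192, 324),
}
for name, (ati, nv) in pairs.items():
    print(f"{name}: ATI die is {ati / nv:.2f}x the size of the NVIDIA die")
# -> R580 ~1.60x G71, R600 ~0.82x G80, RV670 ~0.59x G92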
 

aussiestilgar

Senior member
Dec 2, 2007
245
0
0
I'm seeing a major flaw (disregarding that this could all be FUD): those 3DMark scores were done on a P965 board, so they don't even factor in PCI Express 2.0.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Extelleron
Originally posted by: superbooga
nVIDIA has been far better at actually making money than ATI since the x800/6800 era. In the past, ATI was constantly late and overdesigned the chip, possibly offering better performance for the future but not enough to justify its costs. That is not a good thing in the semiconductor industry. You have to factor in what's feasible and the costs required, then build the best possible chip given those constraints. ATI has never really been able to perfect that.

Regardless, in the past buying an ATI has always been a better decision than buying an nVidia card. Just look at the X1950 Pro vs 7900GS... they seemed like good competitors a year or so ago at the $199 price point... now the X1950 Pro is close to 2X faster in games like Crysis.

With gaming video cards, the "future" is largely irrelevant.

By the time it gets here the cards are pretty much useless.
 

Blacklash

Member
Feb 22, 2007
181
0
0
Most of the time 8800GT SLi beats the X2 badly. If you factor in games where Crossfire works poorly causing the X2 to perform slower than a single HD 3870, that card is even less attractive. It's amazing to me people simply ignore this.

Titles I am aware this occurs in: SupCom, Gothic 3, NWN2, Need for Speed: Pro Street, WiC on the DX10 path with AA, Lost Planet: Extreme Condition, Tomb Raider: Legend, Hitman: Blood Money, Hellgate: London, and Jericho with edge smoothing active. There may be more.

If the GX2 turns out to be roughly equal to GTS 512Mb SLi it won't even be a contest 95% of the time, particularly with AA or "edge smoothing" enabled.
 

Giacomo

Junior Member
Jan 29, 2008
16
0
0
Originally posted by: Blacklash
Most of the time 8800GT SLi beats the X2 badly. If you factor in games where Crossfire works poorly causing the X2 to perform slower than a single HD 3870, that card is even less attractive. It's amazing to me people simply ignore this.

8800GT SLI beating the HD3870 X2 is not a clever point. You're putting two cards vs one, and yes, the 3870 X2 may still be a "crossfire" solution, but you can stick it in one PCIe slot and you don't need an nForce mainboard to enjoy it. Pretty different flexibility, don't you agree?

The HD3870 X2 is a real two-GPUs-on-one-PCB card that works, and it should only be encouraged. People like nRollo take that as a drawback just because love for (or working for) their company makes them back the company line. NVIDIA recently criticized the ATI solution (and I'll skip comments on this...), maybe because they haven't figured out how to build such a solution yet. Maybe their GPU size doesn't help, nor do their power requirements.

Multiple, programmable stream processors are one step towards multicore GPU technology, but as with CPUs, moving another step in that direction (with multiple GPUs) should only be encouraged. Yes, the earliest attempts may end up as slightly inelegant, less than 100% efficient solutions (the onboard PCIe bridge is not a really elegant solution, far from a true "dual-GPU" and more like "dual-VGA-on-a-single-PCB"), but folks, at least someone is trying.

I hope that the ones who criticize the X2 won't even try to comment on the GX2, which can't even put the GPUs on a single board.

Once drivers are mature enough, CrossFire (and SLI) will be almost seamless, and then I hope that many of you will start realizing that this, even if we can't talk about a "single die" yet, is the future.

Giacomo
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Blacklash
Most of the time 8800GT SLi beats the X2 badly. If you factor in games where Crossfire works poorly causing the X2 to perform slower than a single HD 3870, that card is even less attractive. It's amazing to me people simply ignore this.

Titles I am aware this occurs in: SupCom, Gothic 3, NWN2, Need for Speed: Pro Street, WiC on the DX10 path with AA, Lost Planet: Extreme Condition, Tomb Raider: Legend, Hitman: Blood Money, Hellgate: London, and Jericho with edge smoothing active. There may be more.

If the GX2 turns out to be roughly equal to GTS 512Mb SLi it won't even be a contest 95% of the time, particularly with AA or "edge smoothing" enabled.

You are SO out of date ... so like 2 months ago:p

:roll:

AMD has fixed these games and this is the THIRD time in a month i have had to correct you here

Make up some NEW FUD, please
- the old stuff is getting really tiresome


 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: BFG10K
3DMark or not, I really don't have much hope for the 9800 GTX to be honest. To me it looks like nothing more than a respin in order to support Tri-SLI.

QFT
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: nRollo
Originally posted by: jaredpace
hah nice post, check my edit! :)

Still would like to see the tests ran on the same cpu. pretty gay amd wanted a phenom on that test.


Forgot to add this part too:
price (for video cards):

Ati: $860.00

Nvidia: $1840.00


but hey 1000 or 2000 on video cards! Whats the difference? :)

Heh- It's all good- I would've been pretty shocked if they could wrangle 3 way beating performance out of RV670s.


In my eyes, the main problem with the Quadfire vs 3 way SLi question (or even high-end 2 way SLi) is that when the scaling isn't happening, a single RV670 is about the LAST thing you'd want driving a 25x16 monitor, while a GTX or Ultra can handle it pretty well.

Oh yeah- and on the $1000 vs $2000 thing:

Only people living in vans by the river or refrigerator boxes can't afford $2000 for video cards!

LOL- I used to love to say that as a joke here. Seriously, obviously 3 way Ultras is a "rare air, well off buyer" solution and can only be compared to QuadFire on a "flagship to flagship" basis. (these rigs wouldn't be competing except for the people who don't care about $1000)

that price comparison is a valid point right now. Unless you're a .com multimillionaire you almost certainly would prefer to spend ~850 for kickass performance instead of ~1850 for slightly better performance. Now, I make good money and have the disposable income to get a brand new tri-sli rig every six months if I deem it worth the price, but I sure as hell don't want to spend $1850 on a system upgrade when $850 will do. In fact, I would say that most people I know who are in a similar or better financial position to mine would be EXTREMELY concerned about wasting $1000. I'm not saying that it wouldn't be an option for a well-to-do buyer, I'm just saying that it had damned well better have a VERY LARGE performance increase for more than double the cost.

Now, having said all of that, tri-sli is going to be a GREAT match for 9800gtx. Same/better performance to current tri-sli, less than $1200 for the cards, and a fraction of the heat produced. Affluent buyers will be much more tempted by THAT.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
another bonus of the $840 is you get to keep the intel chipsets. I consider most nv chipsets a drawback, except for the elusive 790i. Nv is very competitive at the extreme high end now that you can buy 3 gtx's for $900.

$850 for extreme graphics, w/ intel chipset, accept any crossfire non-scaling limitations - single card support is 3870x2, multi gpu w/ multi monitor support.

or

$900 for extreme graphics, w/ nvidia chipset, accept any sli non-scaling limitations - single card is 8800 gtx, Linkboost (jk)


No doubt tri sli 9800 gtx's will wear the performance crown. crossfire x 4870x2's might match/exceed it 3 months later.

Quad SLI on 9800GX2? Who knows on that one. I say dont get your hopes up.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: nRollo

With gaming video cards, the "future" is largely irrelevant.

By the time it gets here the cards are pretty much useless.

Only if you have unlimited funds or get your hardware for free...

I guess I have to get this off of my chest. I am real tired of bias. A true computer enthusiast should be hoping for improvement, and should be objective, no matter what company produces the best product. Sadly, instead of computer enthusiasts, we have nVidia enthusiasts and AMD enthusiasts. To each their own, but I cannot help but think how short-sighted it is.

I have noticed that you are willing to concede the small, irrelevant things to AMD so as to appear objective. It doesn't fool me though. Give credit where credit is due. Do that and I might just gain some respect for you and others on this forum.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I don't know who's better, but 3dmark06 sure is pretty on my 3870 ;)

seriously, archangel, you're just now getting this off your chest? rollo doesn't pretend to be impartial. he does give some minor lip service to amd, but he is pretty consistently in favor of all things nvidia. having said that, amd/ati hasn't done much in the past few years to give an impartial 3rd party a reason to think that they are much competition for nvidia, either. rv670/680 have been a good step in the right direction, but amd was flat out lucky to get them out as quickly as they did. I remember reading something on AT about "first silicon" being used... hopefully, r700/rv770 will put them back on a more even footing.
 

angry hampster

Diamond Member
Dec 15, 2007
4,232
0
0
www.lexaphoto.com
Originally posted by: bryanW1995


seriously, archangel, you're just now getting this off your chest? rollo doesn't pretend to be impartial. he does give some minor lip service to amd, but he is pretty consistently in favor of all things nvidia..


Can you blame him? :laugh:

I think I'd be willing to endorse a company as well if they gave me thousands of dollars in high-end computer parts.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: angry hampster
Originally posted by: bryanW1995


seriously, archangel, you're just now getting this off your chest? rollo doesn't pretend to be impartial. he does give some minor lip service to amd, but he is pretty consistently in favor of all things nvidia..


Can you blame him? :laugh:

I think I'd be willing to endorse a company as well if they gave me thousands of dollars in high-end computer parts.

There is a moral line that one must walk to maintain integrity. You can work for a company and still be objective. That is the only way you can keep your integrity. If you use double standards to place one product over another, then that is wrong, period.

As for 'just finding this out', no. I have always known. I have been at AT a good long time.

Edit ** Please do not reply to this or discuss this further on account of my posts. I don't want to derail a thread.

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
that's more what I meant. I don't "blame you" for saying it, I'm just surprised that you're so offended NOW. You were here when rollo was on his tear, right? I wasn't, but I've heard enough from others and seen enough posts from rollo to know what he is. He's not my "enemy" per se, but he is clearly extremely biased. At least we know him for what he is. I would pick a visible foe any day, even a dangerous one, over a hidden foe. Besides, we have apoppin to properly represent AMD and everyone and his dog sniffs at rollo's every post, so rollo is unlikely to unfairly bias a n00b into a poor nvidia purchase.

ps: of course, it's hard to blame somebody for purchasing nvidia cards these days, anyway. with the 9600gt at 160 and 8800gt at 190AR amd isn't left with much wiggle room...
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: bryanW1995
it's hard to blame somebody for purchasing nvidia cards these days, anyway. with the 9600gt at 160 and 8800gt at 190AR amd isn't left with much wiggle room...

You are correct. Some of the prices of the nVidia cards are stellar. I certainly don't blame anyone for purchasing one of them. But I also don't think the people who purchased the 3850's or 3870's made a bad decision. I think they are both great products, with AMD/ATi behind nVidia at this point, at least in the high-end. But on price/performance I really have to call it a tie between the two, honestly.
 

superbooga

Senior member
Jun 16, 2001
333
0
0
We should all be glad there are finally sub-$200 cards that can handle most games with the details maxed out on a 20" - 22" display.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
no, price/performance is currently favoring nvidia. how much less than $190 would a 3870 have to be to put it on even price/performance with an 8800gt? about $30, right? I don't see any 3870's for $160. When I bought my 3870 for $225 shipped it compared very favorably price/performance with 8800gt, so I bought it. I love the card, it maxes out everything I play except crysis, and I fully expect to keep it for quite a while unless I upgrade to something greater than 14x9. However, if I was buying a new card today I would get the msi 8800gt OC with the dual slot cooler for $190AR. That card is simply a price/performance beauty queen. It should suck up all video card purchases in the $150-$400 price range until at least the 9800 series comes out, and even then it might STILL own that category. The only exception would be intel mobo owners who realistically expect to go xfire; for those guys a 3870 for $170 or even $180 would make sense.
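
A rough sketch of that parity math, for anyone who wants to see where the ~$160 comes from (the 19% performance lead assumed for the 8800GT over a single 3870 is purely illustrative, not a benchmark result from this thread):

# Back-of-envelope price/performance parity check (Python).
# The 19% performance lead is an assumed, illustrative number.
gt_price = 190            # 8800GT street price mentioned above ($, after rebate)
gt_perf_advantage = 0.19  # assumed average lead over a single HD 3870
parity_price = gt_price / (1 + gt_perf_advantage)
print(f"HD 3870 price for equal price/performance: ${parity_price:.0f}")  # ~$160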
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: bryanW1995
that's more what I meant. I don't "blame you" for saying it, I'm just surprised that you're so offended NOW. You were here when rollo was on his tear, right? I wasn't, but I've heard enough from others and seen enough posts from rollo to know what he is. He's not my "enemy" per se, but he is clearly extremely biased. At least we know him for what he is. I would pick a visible foe any day, even a dangerous one, over a hidden foe. Besides, we have apoppin to properly represent AMD and everyone and his dog sniffs at rollo's every post, so rollo is unlikely to unfairly bias a n00b into a poor nvidia purchase.

ps: of course, it's hard to blame somebody for purchasing nvidia cards these days, anyway. with the 9600gt at 160 and 8800gt at 190AR amd isn't left with much wiggle room...

jeeze ... cut it out .. i was playing a video game and my ears were burning ... i had to log back in to see why :p
:Q

i am not sure how to put it, but there really are no "enemies" here ... well except for trolls bent on tearing apart the forums - they don't last long. We are all tech-oriented people with our own PoVs. We can often see and experience the same thing with a completely different conclusion.
--That is partly why we have such passionate disagreements amongst ourselves. And i think it is OK, many precious metals are refined by fire and diamonds are made out of carbon with pressure and heat.

The problem occurs when companies try to take advantage of this "hot mix" by injecting "company people" - secretly - with a motive into our forums. These people have access to info not generally available and ESPECIALLY "dirt" on their competitor's product.

Now nvidia had AEG do their viral marketing and ATi went with High Road for theirs originally; both companies have moved their programs "in-house" by now AFAIK. So we have BOTH companies well represented "virally" here along with MS and intel. ADD to this 'hotter mix' "fanboys" with their OWN agenda and it can be pretty difficult to navigate thru hostile waters to figure out the truth of any matter.
--it amazes me that we manage to do it and we do it probably better than any other forum. i know there are more 'technical forums' that get along but i don't want to be discussing "white papers" at WayBeyondDullsTechnical - or much worse at a company fan site forum.

So ... my POINT is that you do not need me to "represent AMD" ... i hope you do not think i am paid by AMD to do it; i am not compensated by anybody for my posting. Let the truth be known - i pretty much picked ATi for the *sole reason* that it was the perpetual underdog and i always saw it as an excellent "alternative" to nvidia GPUs for my rig beginning with ATI's Rage Fury32. And by my personal 'timing' or by comparing pricing, i have only owned 3 nvidia GPUs since then - but it has not stopped me from defending nvidia when i think they are unfairly attacked [as recently with their supposedly "shady" practice in O/Cing the PCI bus]

As a matter of fact, if i didn't get treated SO BADLY by nvidia fanboys when i first joined, i might just as well have been a nvidia fan. i like both AMD and NVIDIA's products and i tend to tell it like i think it is without holding back. i think i gave a very honest assessment of 6800GS-OC vs x850xt and x1950p/512M and also 2900xt vs. 8800GTS-OC .. and the reasons for my own [sometimes peculiar] choices.

So ... just remember we are all "people" .. same species ... same feelings that are easily hurt. Just remember to regard everything with a little healthy skepticism and make sure of what we believe. For ourselves.





 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
I see the 8800GTs as excellent cards for the money, and if I had a build with two x16 slots that's probably the direction I'd head in (except for SLI not supporting multiple displays, which kinda takes away the point of it for me). I really like the 3870x2 concept because it gives me a clear upgrade path. Also, the dollar-for-dollar comparison has to be looked at. In that situation, the 9600GT looks like the clear winner over everybody. Just my two kopeks
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
More benches

At a 730MHz overclock it scores 16100 in 3DMark06


As we showed you last week, Geforce 9800GX2 has the look. You can check it here and here.

Of course you all want to know how it scores, and on a high-end system powered by a 3GHz quad core and Vista 64-bit we can report some nice numbers.

The first test was done at a 600MHz core, 2000MHz memory and 1500MHz shader clock, and at 3DMark05 default settings we scored 17600.

In 3DMark06 the card at the same clocks scored 14400, and this is the normal default score.

If you push the card to 730MHz core and 2080MHz memory and leave the shaders at 1500MHz, you end up with 19400 in 3DMark05 and about 16100 in 3DMark06.

We are about to go and join CeBIT, the biggest IT trade show of the year. We didn't have much more time for more numbers, but don't be surprised if we show you a few more scores before the end of the show.
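
For context, those quoted figures work out to roughly a 22% core overclock buying a 10-12% 3DMark gain; a quick Python sketch (using only the numbers from the post above) reproduces the percentages:

# Percentage gains from the quoted overclock (stock -> overclocked values).
figures = {
    "core MHz":   (600, 730),
    "memory MHz": (2000, 2080),
    "3DMark05":   (17600, 19400),
    "3DMark06":   (14400, 16100),
}
for name, (stock, oc) in figures.items():
    print(f"{name}: {stock} -> {oc} (+{(oc - stock) / stock * 100:.1f}%)")
# -> core +21.7%, memory +4.0%, 3DMark05 +10.2%, 3DMark06 +11.8%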
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: apoppin
Originally posted by: Blacklash
Most of the time 8800GT SLi beats the X2 badly. If you factor in games where Crossfire works poorly causing the X2 to perform slower than a single HD 3870, that card is even less attractive. It's amazing to me people simply ignore this.

Titles I am aware this occurs in: SupCom, Gothic 3, NWN2, Need for Speed: Pro Street, WiC on the DX10 path with AA, Lost Planet: Extreme Condition, Tomb Raider: Legend, Hitman: Blood Money, Hellgate: London, and Jericho with edge smoothing active. There may be more.

If the GX2 turns out to be roughly equal to GTS 512Mb SLi it won't even be a contest 95% of the time, particularly with AA or "edge smoothing" enabled.

You are SO out of date ... so like 2 months ago:p

:roll:

AMD has fixed these games and this is the THIRD time in a month i have had to correct you here

Make up some NEW FUD, please
- the old stuff is getting really tiresome

To be fair though Apoppin- look at the vintage of some of those games they just fixed.

n7 and others here are posting that NVIDIA's "out of box" Sli experience is only slightly better, and I can't remember the last time a big game like these didn't have a SLi profile when it launched.

For that matter, the AT review said the games just worked with the X2; IMO, that's because the games they tested had been out a while.

To me that's a big difference- a lot of the time people buy a big game the day it launches. They don't want to wait weeks or months for ATi to make a profile.

CF is way better than it was, but ATi's dev relations and profile generation are still lacking IMHO.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: bryanW1995
Originally posted by: nRollo
Originally posted by: jaredpace
hah nice post, check my edit! :)

Still would like to see the tests ran on the same cpu. pretty gay amd wanted a phenom on that test.


Forgot to add this part too:
price (for video cards):

Ati: $860.00

Nvidia: $1840.00


but hey 1000 or 2000 on video cards! Whats the difference? :)

Heh- It's all good- I would've been pretty shocked if they could wrangle 3 way beating performance out of RV670s.


In my eyes, the main problem with the Quadfire vs 3 way SLi question (or even high-end 2 way SLi) is that when the scaling isn't happening, a single RV670 is about the LAST thing you'd want driving a 25x16 monitor, while a GTX or Ultra can handle it pretty well.

Oh yeah- and on the $1000 vs $2000 thing:

Only people living in vans by the river or refrigerator boxes can't afford $2000 for video cards!

LOL- I used to love to say that as a joke here. Seriously, obviously 3 way Ultras is a "rare air, well off buyer" solution and can only be compared to QuadFire on a "flagship to flagship" basis. (these rigs wouldn't be competing except for the people who don't care about $1000)

that price comparison is a valid point right now. Unless you're a .com multimillionaire you almost certainly would prefer to spend ~850 for kickass performance instead of ~1850 for slightly better performance. Now, I make good money and have the disposable income to get a brand new tri-sli rig every six months if I deem it worth the price, but I sure as hell don't want to spend $1850 on a system upgrade when $850 will do. In fact, I would say that most people I know who are in a similar or better financial position to mine would be EXTREMELY concerned about wasting $1000. I'm not saying that it wouldn't be an option for a well-to-do buyer, I'm just saying that it had damned well better have a VERY LARGE performance increase for more than double the cost.

Now, having said all of that, tri-sli is going to be a GREAT match for 9800gtx. Same/better performance to current tri-sli, less than $1200 for the cards, and a fraction of the heat produced. Affluent buyers will be much more tempted by THAT.

????

I agreed with you on the price thing, so I guess I will again. Current 3way isn't a "bang for buck" solution, it's a niche high end solution with some big advantages, and one disadvantage- cost.



 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nRollo
Originally posted by: apoppin
Originally posted by: Blacklash
Most of the time 8800GT SLi beats the X2 badly. If you factor in games where Crossfire works poorly causing the X2 to perform slower than a single HD 3870, that card is even less attractive. It's amazing to me people simply ignore this.

Titles I am aware this occurs in: SupCom, Gothic 3, NWN2, Need for Speed: Pro Street, WiC on the DX10 path with AA, Lost Planet: Extreme Condition, Tomb Raider: Legend, Hitman: Blood Money, Hellgate: London, and Jericho with edge smoothing active. There may be more.

If the GX2 turns out to be roughly equal to GTS 512Mb SLi it won't even be a contest 95% of the time, particularly with AA or "edge smoothing" enabled.

You are SO out of date ... so like 2 months ago:p

:roll:

AMD has fixed these games and this is the THIRD time in a month i have had to correct you here

Make up some NEW FUD, please
- the old stuff is getting really tiresome

To be fair though Apoppin- look at the vintage of some of those games they just fixed.

n7 and others here are posting that NVIDIA's "out of box" Sli experience is only slightly better, and I can't remember the last time a big game like these didn't have a SLi profile when it launched.

For that matter, the AT review said the games just worked with the X2; IMO, that's because the games they tested had been out a while.

To me that's a big difference- a lot of the time people buy a big game the day it launches. They don't want to wait weeks or months for ATi to make a profile.

CF is way better than it was, but ATi's dev relations and profile generation are still lacking IMHO.

what you are doing is called "nitpicking" .. it is the very same thing n7 and others are doing to you.

i agree that IMPATIENT people expect a game to work on Day 1 ... these same people are the ones who *always* clog the forums with "my game doesn't work" threads .. and i usually find that both NVIDIA and AMD have issues for 9/10 new games that usually take at least one patch and one driver revision to fix. :p

IF you are here to point out the *obvious* - that NVIDIA has a massive and hugely expensive program to attempt to round up 100% of the devs into their twiimtbp program - i will agree with you. What surprises me is that AMD is able to catch up within a month, almost 100% of the time. i DID buy BioShock on DAY ONE [only because i got it on sale] ... AMD released a "hot fix" within 2 days that made the game totally playable for me and i finished it over the first 3-day weekend it was out. STALKER was a disaster for both AMD and NVIDIA ... i was lucky, my own particular configuration played it surprisingly well OoB ... better than many NVIDIA GPU configurations that took forever to enable dynamic lighting. The Witcher ran badly on BOTH amd and nvidia cards ... same with Hellgate .. and NWN2 .. Lost Planet ran better on nvidia and CoJ runs better with AMD HW - to this day! Jericho plays OK on my crossfire rig fully maxed out - it is a demanding performance hog

so ... to answer you .. "not really" an advantage choosing nvidia over amd.

.. and although ATi started out a year late - behind nvidia - with multi GPU solutions, they have generally surpassed them in every way [imo]

My reply was directed specifically at Blacklash who has posted the *exact same FUD* 3-times-in-a-row ... even though i was behind him with a sweeper and garbage can every time and i will continue to correct deliberate misinformation. It is very strange to me that Blacklash would buy an HD3870x2 and keep it for a SINGLE DAY with the apparent sole purpose of dissing it
:thumbsdown: