3870x2 reviews are here and where to buy


bluehaze013

Junior Member
Jan 29, 2008
16
0
0
Originally posted by: ghost recon88
Ok, just wondering something. Let's say you have an X38 board, so you can run each PCI-E slot at 16x. Wouldn't it be better to get 2x 3870s? With each 3870 the RAM is stock clocked 600MHz higher than on the 3870X2, plus you wouldn't have the bottleneck of trying to run two GPUs on one single PCI-E 16x slot. You're not gonna max out a 16x slot with a single 3870.

Not necessarily, because the 3870X2 has a hardware bridge chip that does all the CrossFire work onboard, meaning the application/CPU doesn't have to do the work. Because the dual processors are invisible to the OS and applications, you are totally dependent on drivers for proper performance. This is good and bad: with proper driver support from ATI this card should conceivably be faster than a conventional CrossFire arrangement, but with improper support it could quite possibly end up with worse performance.

It's kind of the risk you take buying new technology. A two-card CrossFire setup can benefit from application/OS optimizations, but I don't think the 3870X2 would as much, because the OS and applications don't even see it as dual GPU. With proper support, though, the 3870X2 should technically be faster than a conventional CrossFire setup due to lower latencies on the card itself.
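To picture why the driver matters so much: CrossFire normally splits work by alternate frame rendering (AFR), with the driver handing whole frames to each GPU behind the application's back. Here is a toy Python sketch of that idea; it is purely illustrative (the GPU objects and frame list are made up), not ATI's actual scheduler.

```python
from collections import deque

class ToyGPU:
    """Stand-in for one of the two GPUs on the card (illustrative only)."""
    def __init__(self, name):
        self.name = name
        self.frames_rendered = []

    def render(self, frame_id):
        self.frames_rendered.append(frame_id)

def afr_dispatch(frame_ids, gpus):
    """Round-robin whole frames across GPUs, the way an AFR driver would.
    The game above this layer still thinks it is talking to one device."""
    ring = deque(gpus)
    for frame in frame_ids:
        ring[0].render(frame)
        ring.rotate(-1)  # next frame goes to the other GPU

gpus = [ToyGPU("GPU0"), ToyGPU("GPU1")]
afr_dispatch(range(8), gpus)
for gpu in gpus:
    print(gpu.name, gpu.frames_rendered)
# GPU0 [0, 2, 4, 6]
# GPU1 [1, 3, 5, 7]
```

If the driver mishandles a title and effectively feeds only one GPU, the game can't do anything about it, which is exactly the downside described above.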
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: DerekWilson

We do test gameplay performance in these games, and we will include real gameplay numbers in our graphics benchmarks for the foreseeable future.

It shouldn't be an issue though.

I'll still stand by the position that it is not necessary to look at gameplay situations in order to build an accurate picture of the relative performance of a graphics card. You just have to keep in mind that it's a relative ranking of each card's ability to render the graphics the game generates, not an indication of the performance you will get when you actually play the game.

In fact, it's not hard to make a case that timedemos and graphics-only benchmarks are a better indication of relative worst-case performance. In a situation where graphics is the largest bottleneck in the system (which is not a rare occurrence), a graphics-only or timedemo benchmark will predict relative performance even more accurately.
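A toy illustration of that "relative ranking" argument, with entirely hypothetical fps numbers: while the GPU is the bottleneck, a timedemo and real gameplay tend to agree on the ordering of the cards even though the absolute numbers differ; once something else caps the frame rate, the ordering compresses and the timedemo stops predicting what you actually see.

```python
def rank(results):
    """Order card names from fastest to slowest by fps."""
    return sorted(results, key=results.get, reverse=True)

# Hypothetical GPU-bound timedemo numbers (fps)
timedemo = {"Card A": 62, "Card B": 48, "Card C": 35}

# Hypothetical gameplay numbers: lower absolute fps (AI, physics,
# streaming), but the ordering holds while the GPU stays the bottleneck.
gameplay_gpu_bound = {"Card A": 51, "Card B": 40, "Card C": 29}

# If the CPU caps everything around ~45 fps, the top of the ranking
# compresses and the timedemo no longer reflects real play.
gameplay_cpu_bound = {"Card A": 45, "Card B": 44, "Card C": 35}

print(rank(timedemo))            # ['Card A', 'Card B', 'Card C']
print(rank(gameplay_gpu_bound))  # ['Card A', 'Card B', 'Card C']
print(rank(gameplay_cpu_bound))  # same order, but squeezed together at the top
```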

"Crysis was probably the game that gave us the most headaches during our testing, as AMD released a new driver just before launch that claimed to drastically improve performance in the title?I mean, by almost 60 percent. However, no matter what we did, the game didn?t perform much better than it did with the older driver in our demo (around 1.5 fps better to be specific).

We decided giving the Crysis GPU benchmark a go to see whether that was where the performance advances were and, low and behold, there was an almost 60 percent performance increase between the two drivers. We even went so far as to manually play through the same section of the game that we?d benchmarked, just to check that it wasn?t something on our end that was the problem. The result was pretty much exactly the same and the difference between the two drivers was roughly two frames per second in favour of the newer driver." - Bit-Tech.net
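To put rough numbers on that discrepancy (the baseline fps below is hypothetical; Bit-Tech only gives the ~60 percent claim and the ~1.5-2 fps observed gain):

```python
def pct_gain(old_fps, new_fps):
    """Percentage improvement going from old_fps to new_fps."""
    return (new_fps - old_fps) / old_fps * 100

baseline = 25.0  # hypothetical gameplay fps on the older driver

# What a genuine 60% driver improvement would look like:
claimed = baseline * 1.60
print(f"claimed:  {baseline:.1f} -> {claimed:.1f} fps")

# What Bit-Tech actually measured in real gameplay: roughly 2 fps more.
observed = baseline + 2.0
print(f"observed: {baseline:.1f} -> {observed:.1f} fps "
      f"({pct_gain(baseline, observed):.0f}% gain)")
# The canned GPU benchmark showed the ~60% jump; actual play showed ~8%.
```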
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Midnight Rambler
And it's released with plenty of stock, at MSRP, despite some people with no clue clamoring that it was a paper launch last week.
And now we know why supply of single 3870 cards has been tight ... :roll:

Needs to be ~$400 maybe, since right now an extra $150-200 over the GTS 512MB isn't worth it unless you just play the games in which it does far better.
Show me where you can get an 8800GTS 512MB for $300, let alone $250 ... best I've seen is $320.

8800GTS 512 - $270
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: nitromullet
Just looked at the DriverHeaven review... Wonder why one heatsink is aluminum and the other copper... My initial thought was to balance out the heat distribution. I guess they don't want the heat from the first core to heat up the sink on the second core, so they put a sink with less thermal conductivity closer to the fan. Just a guess really...

"...enhanced weight management means is that the company [AMD] opted for a combination of copper and aluminium to keep the Radeon HD 3870 X2?s GPUs cool. There are a few parts to the heatsink ? the main portion is basically the one-piece extruded aluminium design that cools everything except the two GPUs. The GPUs are then cooled by two inserts ? the GPU nearest to the fan is cooled by an aluminium heatsink, while the other is cooled by a lump of copper.

Aluminium is lighter than copper (thus helping to keep the weight of the card down) and it also dissipates heat better than copper. Meanwhile copper is a better conductor of heat, thanks to its higher specific heat capacity, which basically means that it can ?hold? more heat than aluminium ? this is good because the second GPU is likely to be hotter than the first GPU, as it is cooled second by what is ultimately warmer air straight from the first GPU's heatsink."

Bit-tech.net
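For reference on the copper vs. aluminium trade-off the quote is describing, here's a quick back-of-the-envelope comparison using approximate textbook room-temperature material properties (the heatsink geometry itself isn't modelled):

```python
# Approximate room-temperature properties (textbook values)
materials = {
    #            conductivity  density   specific heat
    #            W/(m*K)       kg/m^3    J/(kg*K)
    "copper":    (401,         8960,     385),
    "aluminium": (237,         2700,     897),
}

for name, (k, rho, cp) in materials.items():
    vol_heat_cap = rho * cp / 1e6  # MJ per cubic metre per kelvin
    print(f"{name:10s} conducts {k} W/(m*K), "
          f"stores {vol_heat_cap:.2f} MJ/(m^3*K), "
          f"density {rho} kg/m^3")

# copper     conducts 401 W/(m*K), stores 3.45 MJ/(m^3*K), density 8960 kg/m^3
# aluminium  conducts 237 W/(m*K), stores 2.42 MJ/(m^3*K), density 2700 kg/m^3
```

So copper both conducts heat better and soaks up more heat per unit volume, which is why it sits on the rear GPU that gets pre-warmed air, while the much lighter aluminium on the front GPU and main body keeps the card's weight down.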
 

jedisoulfly

Member
Jul 2, 2007
61
0
0
Originally posted by: Sylvanas
Originally posted by: GonePlaid
Well guys, I'm going to go for one. Not sure whether to get the HIS one or the Asus; both are pretty much the same in specs. Stay away from some of the others.

The Diamond has a core clock of 775MHz; the HIS, ASUS and others are clocked at 825MHz. Not too sure if those are overclocked.
Stay away from the Sapphire one; it's listed as having a 256-bit memory interface while the others list 512-bit (256x2). Not sure if it's a typo.


I'm sure that most of the issues are with the drivers; Tom's Hardware used the beta drivers, for god's sake. Also, PCIe 2.0 is just a marketing tool at this point. Nothing made today even bottlenecks PCIe 1.1.

I'd go with the HIS as you get a cool magnetic LED screwdriver with a spirit level :p, at least that's what TechPowerUp got with theirs (I got two of these with my recent CrossFire setup).

It seems there will be some drivers appearing soon, and even one listed by Sapphire; how new it is I don't know.

IMO the standard response of 'well, nothing saturates PCI-E 1.1' is coming to a close. When running a CrossFire/SLI setup on PCI-E 16x 1.1 there are gains to be had from increasing the PCI-E frequency (just check XS) and thus the bandwidth of that bus, so there's no reason that shouldn't come into play with a PCI-E 1.1 bridge chip either. In this thread there are also some graphs that supposedly demonstrate the difference between the two interfaces with the 3870X2; how reliable that is I don't know, but it does give food for thought.

Good to see Danger Den is quick out of the blocks with an FC waterblock :thumbsup:
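On the bandwidth point in the quote above, a quick back-of-the-envelope calc using the standard per-lane figures (roughly 250 MB/s per lane per direction for PCIe 1.x and 500 MB/s for PCIe 2.0); raising the PCIe base clock, as the overclockers on XS do, scales the 1.1 number roughly linearly:

```python
def pcie_bandwidth_gbs(lanes, mb_per_lane, base_clock_mhz=100):
    """Rough one-direction bandwidth in GB/s for a PCIe link.

    mb_per_lane: ~250 MB/s for PCIe 1.x, ~500 MB/s for PCIe 2.0.
    base_clock_mhz: raising it above the stock 100 MHz scales
    bandwidth proportionally.
    """
    return lanes * mb_per_lane * (base_clock_mhz / 100) / 1000

print(pcie_bandwidth_gbs(16, 250))       # PCIe 1.1 x16 at stock:  ~4.0 GB/s
print(pcie_bandwidth_gbs(16, 250, 120))  # same slot at 120 MHz:   ~4.8 GB/s
print(pcie_bandwidth_gbs(16, 500))       # PCIe 2.0 x16 at stock:  ~8.0 GB/s
```

Since the bridge chip on the X2 is PCIe 1.1 (per the quote above), whatever crosses that bridge stays in the ~4 GB/s class no matter how fast the slot itself is.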

Referring to the Danger Den blocks:
8800GTX is $135
IONE 8800 G92 GT & GTS is $110
2900XT is $145

I'm curious to see how much the new 3870X2 block will be.
I also wonder if it will have a plate for the backside, like the block for the 2900XT did for the RAM on the back; I assume that is why the 2900XT block was $10 more than the 8800GTX one.
Strange that the G92 block is $25 less than the GTX one. Maybe Danger Den's R&D/CAD programmer is getting better.
 

GonePlaid

Member
Jan 12, 2008
39
0
0
Phew, good thing I put that order through with Newegg last night; they are now out of stock of the HIS. I sent them the money order today by express mail. If everything goes the way it usually does with Newegg, I'll have the card by next Wednesday at the latest.
 

jedisoulfly

Member
Jul 2, 2007
61
0
0
Originally posted by: Sylvanas
Originally posted by: GonePlaid
Well guys, I'm going to go for one. Not sure whether to get the HIS one or the Asus; both are pretty much the same in specs. Stay away from some of the others.

The Diamond has a core clock of 775MHz; the HIS, ASUS and others are clocked at 825MHz. Not too sure if those are overclocked.
Stay away from the Sapphire one; it's listed as having a 256-bit memory interface while the others list 512-bit (256x2). Not sure if it's a typo.


I'm sure that most of the issues are with the drivers; Tom's Hardware used the beta drivers, for god's sake. Also, PCIe 2.0 is just a marketing tool at this point. Nothing made today even bottlenecks PCIe 1.1.

I'd go with the HIS as you get a cool magnetic LED screwdriver with a spirit level :p, at least that's what TechPowerUp got with theirs (I got two of these with my recent CrossFire setup).

It seems there will be some drivers appearing soon, and even one listed by Sapphire; how new it is I don't know.

IMO the standard response of 'well, nothing saturates PCI-E 1.1' is coming to a close. When running a CrossFire/SLI setup on PCI-E 16x 1.1 there are gains to be had from increasing the PCI-E frequency (just check XS) and thus the bandwidth of that bus, so there's no reason that shouldn't come into play with a PCI-E 1.1 bridge chip either. In this thread there are also some graphs that supposedly demonstrate the difference between the two interfaces with the 3870X2; how reliable that is I don't know, but it does give food for thought.

Good to see Danger Den is quick out of the blocks with an FC waterblock :thumbsup:

The Diamond website says the core clock is 825MHz. It seems they also have a 512MB version; I have not seen the 512MB version yet.

http://www.diamondmm.com/HD3800.php

The Sapphire site lists it as 2 x 256-bit.

http://www.sapphiretech.com/us...iew.php?gpid=209&grp=3
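On the "512-bit vs. 2 x 256-bit" point: each GPU on the X2 has its own 256-bit bus to its own memory pool, so the per-GPU figure is what matters rather than the sum. A rough calc follows; the memory clocks are taken as assumptions from the launch reviews' spec tables (~1800MHz effective GDDR3 on the X2, ~2250MHz effective GDDR4 on a single 3870), so double-check them against your vendor's page.

```python
def mem_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s for one GPU's memory bus."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# Assumed effective memory clocks, per the launch reviews:
print(mem_bandwidth_gbs(256, 1800))  # per X2 GPU:      ~57.6 GB/s
print(mem_bandwidth_gbs(256, 2250))  # single HD 3870:  ~72.0 GB/s
# The "512-bit" spec is just 2 x 256-bit; neither GPU ever sees a single
# 512-bit bus or the other GPU's memory pool.
```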
 

GonePlaid

Member
Jan 12, 2008
39
0
0
Originally posted by: jedisoulfly

Referring to the Danger Den blocks
8800gtx is $135
IONE 8800 G92 GT & GTS is $110
x2900xt is $145

Ouch. I was thinking about maybe doing water cooling one day, if I end up having to clean out this new case every week because of its uber airflow, but I didn't think blocks were that expensive.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
"To measure performance when playing The Witcher we ran FRAPS during the game's first major cutscene at the start of play. "
This is the second example. Our Derek said that AnandTech didn't test cutscenes...
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: Rusin
"To measure performance when playing The Witcher we ran FRAPS during the game's first major cutscene at the start of play. "
This is the second example. Our Derek said that AnandTech didn't test cutscenes...

Nobody cares anymore; we've seen this is the best single-card solution, and if 2 of the 9 games AT tested used a 'cutscene' run, it's not as if that discounts the performance figures shown.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Rusin
"To measure performance when playing The Witcher we ran FRAPS during the game's first major cutscene at the start of play. "
This is the second example. Our Derek said that AnandTech didn't test cutscenes...

I *bet* it's a TYPO ... change "during" to "after"

note he said " ... at the start of play. "

even Derek makes typos :p