Official ATI 4870 X2 reviews thread.


SSChevy2001

Senior member
Jul 9, 2008
774
0
0
I was expecting a $500 launch price and, after a couple of months, a drop to about $450, which is the most I'll pay for one. Just going to wait it out till Nvidia launches the 55nm parts and see how the prices look by then. This is one super nice card though, and ATI really improved the idle power, which I'm glad to see.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: SteelSix
Ah screw it, I'm gonna pick one up. Wouldn't be the first time I've paid >$500 for a graphics card.

I like your style. :)

I think I might grab one of these myself, but I'm going to wait a week or two before making my final decision. First, I want to see how people like theirs, and second maybe (just maybe) they'll drop in price a tad after the initial rush.

 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
AMD ecstatic over R700 success



AMD IS PLEASED as punch about regaining the graphics crown from Nvidia, but has said it won't be relaxing now that it has achieved that.

Speaking to The INQUIRER, Ian 'the happiest man in the graphics market today' McNaughton said, "today we may be able to plant the flag as the fastest enthusiast graphics card on the planet, but trust me when I say we're not going to rest on those laurels."

"Today's success is the result of a couple of years of effort and a change in strategy we started two years ago."

He pointed out that the R700 was by design rather than by accident, and that although there are some naysayers in the market, the product reviews of the new cards will show that AMD is on to a winner.

"It's a good day for enthusiasts and it's a good day for AMD, but maybe not such a good day for our competitors. But I'll never count out Nvidia, they are very good at what they do and competition is good," added McNaughton.

He stressed that, rather than reacting to the competition or playing catch up, AMD has little idea what others may be developing and that it is just working on roadmaps based on what its engineers are capable of.

"Nvidia, like us, is very good at keeping things secret and we all have our own Cadbury's secret way of getting the caramel in the chocolate bar," McNaughton said, piling on the syrup.

Although AMD is keeping a watchful eye on Intel's upcoming Larrabee chip, McNaughton is withholding judgement one way or the other until actual hardware hits the ground, at which point he reckons that the market will judge for itself.

McNaughton said this launch is the second step in the company's three-step process. The first step was to bring the high-end and mainstream 4800 series to market. The second step was to capture the enthusiast market with the launch of the R700.

The third, upcoming step will see the rounding out of the platform with the launch of the RV710 and RV730, the value and mainstream market products. Although he wouldn't give an exact timetable, McNaughton did say that this was 'imminent'.

"Hopefully there are lots of Geforce owners in the market who will see the light and go and buy Radeon graphics cards," he said.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: nitromullet
Originally posted by: SteelSix
Ah screw it, I'm gonna pick one up. Wouldn't be the first time I've paid >$500 for a graphics card.

I like your style. :)

I think I might grab one of these myself, but I'm going to wait a week or two before making my final decision. First, I want to see how people like theirs, and second maybe (just maybe) they'll drop in price a tad after the initial rush.
Good news is we should see 40+ of these based on that poll we had a while back. ;) But then you'll actually have to filter the user feedback as well....chewing up Crysis Very High @ 1920 with 4xAA ya......at 17FPS?

Originally posted by: Hunt3rj2
I don't know anyone who wouldn't like to use AA on their games, especially at lower resolutions where aliasing is more likely to happen, so ATI wins there, Chizow. I don't know what you think, but 4870 CF on a single PCB works for people who have nvidia motherboards. I personally think the real problem with any CF/SLI/X2 setup is that the performance gains need drivers with game profiles to achieve that. It should appear as 1 card/GPU to the drivers, so ideally there needs to be a scheduling device to direct which frames each GPU should render, and the RAM needs to also be shared so that both GPUs can access it without needing to have 2 copies and the GPUs working on their own.
No doubt AA is great, but is it worth it for 8xAA instead of 4xAA or 16xQ? Or worth it for 2x the money for a bit more AA? Or worth it to deal with the potential problems with multi-GPU for a bit more AA?
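For reference, the driver-profile dependency Hunt3rj2 mentions comes from how alternate-frame rendering (AFR) actually divides work. A toy sketch (Python, function name hypothetical) of the round-robin frame assignment a CF/SLI profile typically sets up:

```python
def assign_frames_afr(frame_ids, num_gpus=2):
    """Alternate-frame rendering: the driver deals whole frames out to
    each GPU in turn, which is why each GPU needs its own full copy of
    the working set in its own memory rather than sharing one pool."""
    schedule = {gpu: [] for gpu in range(num_gpus)}
    for f in frame_ids:
        schedule[f % num_gpus].append(f)
    return schedule

# Six frames dealt across the X2's two GPUs:
print(assign_frames_afr(range(6)))  # {0: [0, 2, 4], 1: [1, 3, 5]}
```

The single-GPU abstraction with shared RAM he describes would remove the need for per-game profiles, but no shipping CF/SLI implementation works that way.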

 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Decided to grab one off Ebay since they still have their 25% off deal. I think it came to $450 total which is a reasonable price for me. My single HD 4870 just wasn't cutting it for Crysis and Stalker.

I'll post up my impressions when I get the card. I'm hoping the PCI-E 1.0 isn't going to bottleneck the card.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: Elfear
Decided to grab one off Ebay since they still have their 25% off deal. I think it came to $450 total which is a reasonable price for me. My single HD 4870 just wasn't cutting it for Crysis and Stalker.

I'll post up my impressions when I get the card. I'm hoping the PCI-E 1.0 isn't going to bottleneck the card.

I take it your mobo or PSU wouldn't work with just adding another 4870, which is the same thing?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Ocguy31
Originally posted by: Elfear
Decided to grab one off Ebay since they still have their 25% off deal. I think it came to $450 total which is a reasonable price for me. My single HD 4870 just wasn't cutting it for Crysis and Stalker.

I'll post up my impressions when I get the card. I'm hoping the PCI-E 1.0 isn't going to bottleneck the card.

I take it your mobo or PSU wouldn't work with just adding another 4870, which is the same thing?
Not quite the same; there are the dual 1GB buffers on the X2. Even though 4870 CF edges out the 4870X2 in some games (probably due to the shared x16), it's not sink-or-swim like 512MB vs. 1GB in the settings/resolutions where you'd actually need a 4870X2.
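To put rough numbers on the 512MB vs. 1GB point, a back-of-the-envelope sketch (Python; assumes 32-bit color and depth and a simple multisampled target layout, so real driver allocations will be higher) of what just the render targets cost at 2560x1600:

```python
def render_target_mib(width, height, aa_samples, bytes_per_pixel=4):
    """Rough VRAM for a multisampled color + depth target plus the
    resolved back buffer; textures and geometry come on top of this."""
    color = width * height * bytes_per_pixel * aa_samples
    depth = width * height * 4 * aa_samples
    resolve = width * height * bytes_per_pixel
    return (color + depth + resolve) / 2**20

print(round(render_target_mib(2560, 1600, 4)))  # ~141 MiB before any textures
print(round(render_target_mib(2560, 1600, 8)))  # ~266 MiB at 8xAA
```

With a quarter of a 512MB card eaten by render targets alone at 8xAA, high-res textures are what push those settings over the edge.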

That Ebay/MS deal is great though, sweet price on the card for sure.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
do any sites have comparisons of pci-e 1.0 vs. 1.1 vs. 2.0, or p35 vs. p45 vs. x38 vs. x48 for the 4870x2?
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: MichaelD
I just read AT's review and am kind of surprised. The impression I came away with was that the reviewers were pretty much underwhelmed and well, bored, for lack of a better word.

IMO, here's a summary of the review:

Yes, it's the fastest single-card solution out, but the performance scaling above the $300 mark doesn't justify its price. We're really upset about the sideport thing. Same old software solution instead of hardware. Card's really fast, but if the game has no XFire profile, it's a waste. Great card but not worth the money to us.

I'm being totally serious about my summary. The AT PREVIEW of the card completely smothered the card in praise and superlatives but the REVIEW of the shipping card showed the reviewers pretty much telling people that the card is not worth the money. "Until we have x-feature and y-feature, we'll just have to suffer, etc" type comments. :confused: Every time they would state a "plus" for the card, a "negative" would follow it.

What's going on here? I've been around long enough to know a "negative slanted review" when I see one.

I got this impression as well. Also, for those picking one of these up, there's a new GPU-Z out that supports all the temp monitoring etc.
 

solofly

Banned
May 25, 2003
1,421
0
0
Had a chance to get a Diamond by tomorrow but I went with Visiontek instead. Will have it next week...
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: solofly
Had a chance to get a Diamond by tomorrow but I went with Visiontek instead. Will have it next week...

hope you're putting it on a 4ghz cpu
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Originally posted by: Ocguy31
Originally posted by: Elfear
Decided to grab one off Ebay since they still have their 25% off deal. I think it came to $450 total which is a reasonable price for me. My single HD 4870 just wasn't cutting it for Crysis and Stalker.

I'll post up my impressions when I get the card. I'm hoping the PCI-E 1.0 isn't going to bottleneck the card.

I take it your mobo or PSU wouldn't work with just adding another 4870, which is the same thing?

Like chizow said, I wanted the extra memory of the x2. 512MB just isn't enough for 2560x1600. Plus the fact that I'd have to buy another motherboard if I wanted the full performance of Crossfire.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: chizow
It performs pretty much as expected, 4870CF on a single card. What was interesting was how the X2 still seemed to trade wins with the 4870CF. I suppose it could be the shared x16 slot vs. 1GB buffer balancing each other out, but it seemed the differences were in favor of the X2 in situations where frame buffer caused performance to tank on the 512MB parts. It certainly would be the card to own at 2560, but for anything else it looks like we'll probably need more intensive games or faster CPUs before it justifies its price over slower solutions. In the meantime, you'll get a lot of "free AA", whether or not that's worth the premium is up to you.

i am not sure what to do. i "only" game at 19x12, so another 4870 might be enough for me

BUT, IF i run a 2nd 4870, my P35's 4x [2nd] slot is gonna hold back the performance like 20% if some reviews are to be believed. So i would also need to buy an X48 Intel MB [$250]

IF i get a 4870x2, it will not be held back much by the PCIe 1.0 [x16 slot], BUT i will have an "extra" 4870/512MB to stick in the 4x slot .. for who knows what "extra" performance it will give

i am looking for suggestions. A CPU upgrade to Quad is in order after i try to find my e4300's max OC [probably ~3.5GHz]. Not sure which reasonably priced QC pretty routinely hits 4GHz either
:confused:


wait might be my best option
 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
What 3rd party coolers will work on the x2?

I have no other cards in my system other than the video card so I want to slap the biggest heatsink I can find with a low-noise fan to reduce the sound level.
 

mrmessyhair

Junior Member
Aug 12, 2008
5
0
0
Wow I'm definitely taking advantage of 25% off through ebay.

What's a better option, Sapphire or Powercolor?
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: SteelSix
At the end of their preview, AT did have a strong tone of caution though, saying that ATI needs to get it right (speaking mainly in terms of CF & drivers). I think if anything, there's a general disappointment in CF sideport being disabled. The reason sounds like a lazy/lack-of-time or know-how driver issue, and if so, AT cautioned. There's been enough time to get this thing 100% right IMO..
If you want to piss off a hardware reviewer, ship them preview parts so that they generate some early hype, then remove features before the real product ships and be dodgy about the whole thing so that the reviewers look particularly bad. I imagine that Anand and Derek are quite unhappy with the situation, and if AMD is keeping up its track record of lousy PR, that's not helping things.
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Originally posted by: Hunt3rj2
Sapphire, Powercolor is notorious for cheaping out on quality of parts.

Oh?

How about the fact you have to pay a $15 "handling fee" + shipping if you ever have to RMA thru Sapphire?

Considering Powercolor isn't doing anything other than stamping stickers onto the cards (some of which i believe are assembled by Sapphire), i have a hard time agreeing with you.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: chizow
Originally posted by: nitromullet
Originally posted by: SteelSix
Ah screw it, I'm gonna pick one up. Wouldn't be the first time I've paid >$500 for a graphics card.

I like your style. :)

I think I might grab one of these myself, but I'm going to wait a week or two before making my final decision. First, I want to see how people like theirs, and second maybe (just maybe) they'll drop in price a tad after the initial rush.
Good news is we should see 40+ of these based on that poll we had a while back. ;) But then you'll actually have to filter the user feedback as well....chewing up Crysis Very High @ 1920 with 4xAA ya......at 17FPS?

Originally posted by: Hunt3rj2
I don't know anyone who wouldn't like to use AA on their games, especially at lower resolutions where aliasing is more likely to happen, so ATI wins there, Chizow. I don't know what you think, but 4870 CF on a single PCB works for people who have nvidia motherboards. I personally think the real problem with any CF/SLI/X2 setup is that the performance gains need drivers with game profiles to achieve that. It should appear as 1 card/GPU to the drivers, so ideally there needs to be a scheduling device to direct which frames each GPU should render, and the RAM needs to also be shared so that both GPUs can access it without needing to have 2 copies and the GPUs working on their own.
No doubt AA is great, but is it worth it for 8xAA instead of 4xAA or 16xQ? Or worth it for 2x the money for a bit more AA? Or worth it to deal with the potential problems with multi-GPU for a bit more AA?

yeah, I have to question sourthing's postings when he claims to be "chewing up" crysis at 19x12, very high, and 4xAA...
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: apoppin
Originally posted by: chizow
It performs pretty much as expected, 4870CF on a single card. What was interesting was how the X2 still seemed to trade wins with the 4870CF. I suppose it could be the shared x16 slot vs. 1GB buffer balancing each other out, but it seemed the differences were in favor of the X2 in situations where frame buffer caused performance to tank on the 512MB parts. It certainly would be the card to own at 2560, but for anything else it looks like we'll probably need more intensive games or faster CPUs before it justifies its price over slower solutions. In the meantime, you'll get a lot of "free AA", whether or not that's worth the premium is up to you.

i am not sure what to do. i "only" game at 19x12, so another 4870 might be enough for me

BUT, IF i run a 2nd 4870, my P35's 4x [2nd] slot is gonna hold back the performance like 20% if some reviews are to be believed. So i would also need to buy an X48 Intel MB [$250]

IF i get a 4870x2, it will not be held back much by the PCIe 1.0 [x16 slot], BUT i will have an "extra" 4870/512MB to stick in the 4x slot .. for who knows what "extra" performance it will give

i am looking for suggestions. A CPU upgrade to Quad is in order after i try to find my e4300's max OC [probably ~3.5GHz]. Not sure which reasonably priced QC pretty routinely hits 4GHz either
:confused:


wait might be my best option

just keep the 1x4870 for now. get a crossfire mobo when bloomfield comes out.
 

Hunt3rj2

Member
Jun 23, 2008
84
0
0
Originally posted by: chizow
Originally posted by: nitromullet
Originally posted by: SteelSix
Ah screw it, I'm gonna pick one up. Wouldn't be the first time I've paid >$500 for a graphics card.

I like your style. :)

I think I might grab one of these myself, but I'm going to wait a week or two before making my final decision. First, I want to see how people like theirs, and second maybe (just maybe) they'll drop in price a tad after the initial rush.
Good news is we should see 40+ of these based on that poll we had a while back. ;) But then you'll actually have to filter the user feedback as well....chewing up Crysis Very High @ 1920 with 4xAA ya......at 17FPS?

Originally posted by: Hunt3rj2
I don't know anyone who wouldn't like to use AA on their games, especially at lower resolutions where aliasing is more likely to happen, so ATI wins there, Chizow. I don't know what you think, but 4870 CF on a single PCB works for people who have nvidia motherboards. I personally think the real problem with any CF/SLI/X2 setup is that the performance gains need drivers with game profiles to achieve that. It should appear as 1 card/GPU to the drivers, so ideally there needs to be a scheduling device to direct which frames each GPU should render, and the RAM needs to also be shared so that both GPUs can access it without needing to have 2 copies and the GPUs working on their own.
No doubt AA is great, but is it worth it for 8xAA instead of 4xAA or 16xQ? Or worth it for 2x the money for a bit more AA? Or worth it to deal with the potential problems with multi-GPU for a bit more AA?

Are you retarded? Being able to have extremely high anti-aliasing and better image quality without sacrificing any performance is great. Even if Crossfire doesn't scale at ALL in a game you still get GTX 260 performance, which is quite respectable. Crossfire is a bit more hit or miss with scaling, but when it does scale, it scales very well. Even the 9800 GX2 doesn't scale consistently.

Also, about microstutter: you have to realize that it is NOT such a big issue, as the timing of the frames has been significantly improved in the 48xx series. Even if the amount of microstutter was 100 percent and present in all frames, a game running at 60 FPS average would be seen at 30 FPS, which is already very smooth and enjoyable. Crysis does not count as a game and no one plays it for anything but graphics.

Microstutter is always present in all games, because even the monitor can cause a bit of it. Just make sure the game doesn't dip below 60 fps. Most people cannot even perceive microstutter; it only appears as a slower frame rate.
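For what it's worth, the "60 FPS average seen as 30" claim can be made concrete. A quick sketch (Python, names hypothetical) comparing the average FPS against the rate implied by the longest frame gap, which is closer to what uneven AFR pacing feels like:

```python
def fps_stats(frame_intervals_ms):
    """Average FPS vs. the effective FPS set by the worst frame gap."""
    avg_interval = sum(frame_intervals_ms) / len(frame_intervals_ms)
    return 1000.0 / avg_interval, 1000.0 / max(frame_intervals_ms)

# Perfect pacing: every frame takes ~16.7 ms, so both numbers say ~60 FPS.
print(fps_stats([16.7] * 8))
# Same ~60 FPS average, but frames land in uneven pairs (5 ms / 28.3 ms):
# the counter still reports 60, while the long gaps pace it like ~35 FPS.
print(fps_stats([5.0, 28.3] * 4))
```

The fps counter only sees the average, which is why microstutter shows up as a game that "feels" slower than its benchmark number.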
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Hunt3rj2
Originally posted by: chizow
Originally posted by: nitromullet
Originally posted by: SteelSix
Ah screw it, I'm gonna pick one up. Wouldn't be the first time I've paid >$500 for a graphics card.

I like your style. :)

I think I might grab one of these myself, but I'm going to wait a week or two before making my final decision. First, I want to see how people like theirs, and second maybe (just maybe) they'll drop in price a tad after the initial rush.
Good news is we should see 40+ of these based on that poll we had a while back. ;) But then you'll actually have to filter the user feedback as well....chewing up Crysis Very High @ 1920 with 4xAA ya......at 17FPS?

Originally posted by: Hunt3rj2
I don't know anyone who wouldn't like to use AA on their games, especially at lower resolutions where aliasing is more likely to happen, so ATI wins there, Chizow. I don't know what you think, but 4870 CF on a single PCB works for people who have nvidia motherboards. I personally think the real problem with any CF/SLI/X2 setup is that the performance gains need drivers with game profiles to achieve that. It should appear as 1 card/GPU to the drivers, so ideally there needs to be a scheduling device to direct which frames each GPU should render, and the RAM needs to also be shared so that both GPUs can access it without needing to have 2 copies and the GPUs working on their own.
No doubt AA is great, but is it worth it for 8xAA instead of 4xAA or 16xQ? Or worth it for 2x the money for a bit more AA? Or worth it to deal with the potential problems with multi-GPU for a bit more AA?

Are you retarded? Being able to have extremely high anti-aliasing and better image quality without sacrificing any performance is great. Even if Crossfire doesn't scale at ALL in a game you still get GTX 260 performance, which is quite respectable. Crossfire is a bit more hit or miss with scaling, but when it does scale, it scales very well. Even the 9800 GX2 doesn't scale consistently.

Also, about microstutter: you have to realize that it is NOT such a big issue, as the timing of the frames has been significantly improved in the 48xx series. Even if the amount of microstutter was 100 percent and present in all frames, a game running at 60 FPS average would be seen at 30 FPS, which is already very smooth and enjoyable. Crysis does not count as a game and no one plays it for anything but graphics.

Microstutter is always present in all games, because even the monitor can cause a bit of it. Just make sure the game doesn't dip below 60 fps. Most people cannot even perceive microstutter; it only appears as a slower frame rate.

Yeah, that's true, but microstutter doesn't make you see the game running at 30fps when it's running at 60. Microstuttering produces a very small pause at a constant rate; not enough to make it look like 30, but enough to look erratic and bothersome. You can search YouTube for microstutter demonstrations, or simulate one yourself. How? Play an FPS-capped game like Quake 4 or Doom 3, set the refresh rate to 85Hz in the CCC and use V-Sync in the game. You'll see that the game runs smoothly at 60fps, but since the refresh rate is at 85Hz, it will stutter constantly at an exact interval. A bit annoying for us CRT users, but LCD users shouldn't have issues as long as the panel uses its default 60Hz refresh rate. In this case the microstutter happens because the monitor cannot synchronize its refresh rate with the video card's FPS, since the game engine is capped at 60. Something similar happens across 2 GPUs when they cannot synchronize, but I'm just too sleepy to explain it... Sorry for any incoherence written above...
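The 85Hz-vs-60fps example above is easy to simulate. With V-Sync, a frame can only appear on a refresh tick, so a 60fps-capped engine on an 85Hz monitor shows frames for an irregular mix of 1 and 2 refresh intervals. A sketch (Python, integer math to avoid float rounding; a model, not actual driver behavior):

```python
def display_intervals(fps_cap, refresh_hz, n_frames):
    """Refresh ticks each frame stays on screen under V-Sync.
    Frame i is ready at time i/fps_cap and appears at the next tick,
    i.e. tick ceil(i * refresh_hz / fps_cap), done here as integer
    ceiling division."""
    shown = [(i * refresh_hz + fps_cap - 1) // fps_cap
             for i in range(n_frames)]
    return [b - a for a, b in zip(shown, shown[1:])]

print(display_intervals(60, 85, 10))  # irregular 2/1/2/1... pattern = judder
print(display_intervals(60, 60, 10))  # evenly matched: all 1s, smooth
```

The repeating 2/1 pattern is exactly the "stutter at an exact interval" described: some frames sit on screen for two refreshes, others for one.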
 

MichaelD

Lifer
Jan 16, 2001
31,528
3
76
Originally posted by: ViRGE
Originally posted by: SteelSix
At the end of their preview, AT did have a strong tone of caution though, saying that ATI needs to get it right (speaking mainly in terms of CF & drivers). I think if anything, there's a general disappointment in CF sideport being disabled. The reason sounds like a lazy/lack-of-time or know-how driver issue, and if so, AT cautioned. There's been enough time to get this thing 100% right IMO..
If you want to piss off a hardware reviewer, ship them preview parts so that they generate some early hype, then remove features before the real product ships and be dodgy about the whole thing so that the reviewers look particularly bad. I imagine that Anand and Derek are quite unhappy with the situation, and if AMD is keeping up its track record of lousy PR, that's not helping things.

I assume you're speaking about the Sideport?

I agree that if this was the case, it would not be cool. But it was never verified that the Sideport ever worked anyway, nor was it ever specified (IIRC) what the thing did to begin with.

From the AT review:
RV770 has a CrossFire X Sideport...we assume that the two RV770s on a single R700 board somehow connect Sideports and make fast. AMD hasn't told us how yet.
It is not clear how extensive this communication will be, what information will be shared, or how much bandwidth requirements are increased because of this feature.

Should AMD have enabled it? Maybe. But if they knew or had an inkling that they would not be able to enable it (whatever it is/does), then they should've taken it off that block diagram/chart. But still, I don't think that's anything to beat them up over.

XFire works with the X2; other review sites have gotten two X2 cards working in XFire w/o a problem.

There's little doubt in my mind that AT's review is a "negatively worded backhand slap positive review." Sort of like slapping someone while saying "Hey, you did a great job!" :confused:

I think if a reviewer has an axe to grind, they should come right out and state "I am disappointed/mad b/c of THIS REASON" instead of going about it the way they did. I.e. "It gets 85FPS in this game at 19x12...but that's only 7% faster than a GTX280" and "Until we get this feature working, we'll just deal with it I guess" and other negatively-worded positives.