Official ATI 4870 X2 reviews thread.


Narynan

Member
Jul 9, 2008
188
0
0
I am actually awake FOR the card, and I have yet to see it available.

Hope I don't fall asleep at the keyboard.
 

MichaelD

Lifer
Jan 16, 2001
31,528
3
76
I'm very surprised that the major etailers don't have the card online yet. I remember the 8800GT launch; Newegg had it online exactly two minutes after midnight on the release day. :confused: I know b/c I bought one.
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
I don't think a die shrink and higher clocks will help nvidia at all this time around.

The X2 is just too much to handle. :thumbsup:
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
This is damn good news for AMD but realistically I'm waiting for the 55 nm GTX280 as my next upgrade.
 

Narynan

Member
Jul 9, 2008
188
0
0
zzzz where the hell ...... zzzzzz but mom the monkeys have the 4870 and won't put it up for sale.....zzzzzzz

*Snort* Huh? Oh. This isn't a PS3 launch? Alright, I'm off to bed.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Sweet, it is a monster as predicted. Also waiting for it to become available with some full-coverage waterblocks...
 

darXoul

Senior member
Jan 15, 2004
702
0
0
Undoubtedly, an impressive card. I'm still "on hold" with my upgrade, currently with a venerable X2 4200+ OCed to 2.64 GHz on s939, 2 GB RAM and an 8800 GTX under the hood, as well as a Samsung 226BW LCD.

The upcoming upgrade will be pretty radical, probably featuring the new S-IPS LG W2600HP monitor (19*12 res, 26", nicely readable 0.287 mm pixel pitch, pretty much no input lag and very good response time), E8600 CPU, 4 GB DDR2 along with a new mobo, and... a brand new video card. Will it be the 4870X2? I doubt it. I'm very happy that there is competition and intensity in the graphics market again but I could very well end up getting another "green camp" card for my PC. Why? Simple reasons:

1. The 4870X2 only seems to make real sense for 30" LCD users, and I don't want one of those beasts because the pixels are IMO too small and virtually no model on the market has good response time and no input lag (the Dell 3007 could be it, but it's very hard to get here). At 19*12, even with eye candy, the GTX 280 or HD 4870 can maintain playable FPS as well - at a lower cost.

The game that really challenges all cards at 19*12 without being CPU-limited is Crysis, but benchmarks are highly inconsistent here - the 4870X2 doesn't look like a convincing winner in this game, and it doesn't suddenly let us play with full eye candy and enjoy smooth frame rates.

2. I'm still not a fan of multi-GPU solutions. As computerbase.de proves, there is still apparent microstuttering on the 4870X2 - the frame distribution graph is as uneven as it can be (see the frame-time sketch after this post). Yes, it's rare that you can bring the card to its knees to the point where this phenomenon shows up, but it's there nonetheless. Also, noise levels and power consumption are kind of scary. Not to mention the poor 2D/windowed-mode performance, lowish minimum FPS in some games, 100% reliance on XF profiles, etc.

The 4870X2 is tempting for sure, also thanks to its relatively moderate price tag. However, I guess I'll keep waiting for a few weeks more to see GT200b 55nm performance before making a decision. If nVidia can improve 280's performance by even 10% and sell the cards at launch for decent prices, I'm sold.
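
(For the "frame distribution" point above: a minimal sketch of how a per-frame timing log - e.g. a FRAPS-style frametimes capture - can be summarized to show the unevenness those graphs plot. The file name and the one-cumulative-timestamp-per-line layout are assumptions for illustration, not computerbase.de's actual tooling.)

```python
# Minimal sketch (assumed inputs): summarize a frame-time capture to see how
# uneven the frame distribution is. "frametimes.csv" and its layout (one
# cumulative timestamp in milliseconds per line) are hypothetical.

def load_intervals(path):
    """Return per-frame intervals (ms) from a cumulative-timestamp log."""
    with open(path) as f:
        stamps = [float(line) for line in f if line.strip()]
    return [b - a for a, b in zip(stamps, stamps[1:])]

def summarize(intervals):
    avg = sum(intervals) / len(intervals)
    worst_jump = max(abs(b - a) for a, b in zip(intervals, intervals[1:]))
    print(f"average FPS             : {1000.0 / avg:.1f}")
    print(f"shortest / longest frame: {min(intervals):.1f} / {max(intervals):.1f} ms")
    # A big frame-to-frame jump despite a healthy average FPS is exactly
    # what the uneven "frame distribution graph" is showing.
    print(f"worst frame-to-frame jump: {worst_jump:.1f} ms")

if __name__ == "__main__":
    summarize(load_intervals("frametimes.csv"))
```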
 

SniperDaws

Senior member
Aug 14, 2007
762
0
0
Woohoo, another great graphics card released, nice one. Erm, now if we could just get some decent games to play, it might be worth buying one.

The only games worth playing are World in Conflict and BF2.
 
NullSubroutine

Apr 27, 2004
32
0
0
Here is what Guru3d's owner says about MS and the R700 (whom I trust more than some random German site). Also...

Stuttering != micro stutter

No microstuttering.

The little microstuttering that ATI caught was related to different power states (power management) and clock-gating of the GPUs according to ATI. They took a peek at that with R700 and previous models. For the final R700 products (which we'll test tomorrow) it should really be a non-issue. I did look into it.

However the R700 sample that PCG used was an early R700, the same that I have here. There is no power-management, clock-gating or any control in these cards, it has an early .04 BIOS and thus the control software is missing completely. The final products will have a .07 BIOS with everything enabled.

Fact is that even single GPU based cards stutter every now and then due to bottlenecks, framebuffers, thermal control, heck even HDD activity is commonly confused and labeled as micro-stuttering these days. I'm not saying that there is no micro-stuttering on multi-gpu solutions. I'm just saying I have an awful hard time replicating them and stopped wasting my time hunting a ghost.

If you actually need to measure it scientifically in milliseconds .. well in my book it doesn't count. E.g. -- I have had no problems with it, as I can't see / experience it.

Thus sayeth the Hilbert http://forums.guru3d.com/showthread.php?t=270113
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
I am somewhat surprised the X2 didn't launch with the 8.8 Catalysts myself. Oh well, those should be out soon.
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
I'm kinda turned off by the price. Yes it's where they said it would be; but when I can buy a GTX 280 OC2 for $439 w/free shipping and a $30 rebate, plus step-up option, $550 seems pricey all of a sudden.

Yea yea I know, but I was really thinking they'd launch for $499...
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: JPB
I don't think a die shrink and higher clocks will help nvidia at all this time around.

The X2 is just too much to handle. :thumbsup:

Disagree, I don't even think they need a die shrink to compete, and a die shrink will only help.

First I have to say kudos to ATi for coming out with the fastest single-card solution; very nice performance indeed.

However, here's why I think NVIDIA doesn't need "help":

1. Multi GPU is a niche market, and $500+ multi GPU is a smaller niche within it. When you have two discrete cards your options for disposal are greater; you can sell both or just one.

2. There are reasons some people won't do multi GPU at all: microstutter, AFR input lag, variable scaling, and with Crossfire, the hidden profiles.

3. The VAST majority of the market does not have 25X16 monitors. If you have a 16X10 or 19X12 monitor, what is a person's reason to put up with the items in #2 and pay $150 more for them?

4. 444W at load is a big difference from 299W at load, and may require a PSU update as well.

5. ATi cards have no PhysX, and that is the biggest leap forward in game immersion in a long time.

6. Only one ATi vendor offers a lifetime warranty that I know of (Visiontek) and none offer transferable warranty, step up, or modification warranty.

7. No custom cooling for parts like this that I know of.

8. People with SLi motherboards can get similar performance at a similar price with two GTX260s and have more flexible multi GPU. (create/edit profiles, force three render modes instead of one)

9. Last, the market for $550 cards is much smaller than the market for $400 cards.

So I don't think NVIDIA really has to do anything at all, and doesn't really need "help".


Oh not this again. This is an ATI review thread, and you know that. Enjoy a week off.

-ViRGE
 
NullSubroutine

Apr 27, 2004
32
0
0
1. Multi-GPU is a 'niche market'? That's why Nvidia made the 9800 GX2 one of their flagship products? Why would you sell one of your cards and not both?

2. You mean like the microstutter that exists in Nvidia graphics cards?

3. You mean pay $150 more for more performance? Or like how you could buy an 8800 Ultra for $150 more than the GTX just so you could get Tri-SLI (insert all the multi-GPU problems you cite)?

4. 444W at load means you can still use a 600-700W PSU, which a good number of people buying a $400-600 video card already have.

5. You mean the developers being helped by Nvidia? Nvidia doesn't have PhysX working on ATI cards, and it is not going to allow PhysX to work with 8x00/9x00 cards in a system that has an ATI card as its primary GPU (meaning using an 8600 GT for PhysX in an ATI system).

6. Why do you need step up if you don't release a new rebranded graphics card every other month?

7. It's pretty common for both ATI and Nvidia to launch their products based on reference design and later AIB partners release their own versions.

8. Except Nvidia's chipsets aren't as good as the Intel/AMD ones, and most people prefer the latter. AMD cards in CF work in both AMD and Intel systems.

9. Really? So why has Nvidia been releasing their graphics cards at $600+ for the past few years? How much was the GTX 280 at launch?
 

darXoul

Senior member
Jan 15, 2004
702
0
0
Just to comment on the microstutter issue: the aforementioned guy's opinion isn't worth much to me. Numbers and "scientific evidence" are of course useless if they don't reflect real world conditions but they're at least objective, while his perception is most definitely far from it. There were people who claimed more than 24 FPS was overkill, people who claim they don't experience multi-GPU stuttering, people who say there is no input lag on their 24" Dells, etc.

This "random German site" is: a) way more credible in my eyes than some guy from some forums because it's a professional review site b) not the only one investigating microstutter and claiming it's a problem (PCGH being another example, by coincidence also German).

Anyway, most of us will probably never experience MSing on the new Radeon, simply because it's pretty hard to force the card to run under 60 FPS, which is where MS appears.
 

ajaidevsingh

Senior member
Mar 7, 2008
563
0
0
Anyone want to buy my 4850s in CF!!

No, this is epic I guess: 4870 X2 = 5870 OCed. "Just a thought."

So I am getting a 4850 X2 for sure.. the 4870 X2 is OK but a bit too expensive next to the 4850 X2 ("$150")!!

Besides, 4850 X2 ~ GTX 280, and on top of that it will equal the next-gen cards in performance, and I can CF them later...!!!

When is the 4850 X2 gonna release anyhow????
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Looks like the 4870X2 is proving to be quite the FPS monster, especially at higher res. I'm sure we all would have been happier to see it debut at a $499 price point, but at $550 it's still cheaper than 4870s in CrossFire and has 1GB of memory per GPU rather than the 512MB on current 4870s.

All in all, it looks like it should be a good seller for the "cutting edge" enthusiasts. Personally, I'm going to wait for the 1GB 4870 and 4850X2 reviews before I decide what to replace my 4850 with.
 

hclarkjr

Lifer
Oct 9, 1999
11,375
0
0
The AnandTech review is up on the main page. Unfortunately, AnandTech is having server problems at the moment and is loading really slowly, so it will take a while to read it.
 
NullSubroutine

Apr 27, 2004
32
0
0
Originally posted by: darXoul
Just to comment on the microstutter issue: the aforementioned guy's opinion isn't worth much to me. Numbers and "scientific evidence" are of course useless if they don't reflect real world conditions but they're at least objective, while his perception is most definitely far from it. There were people who claimed more than 24 FPS was overkill, people who claim they don't experience multi-GPU stuttering, people who say there is no input lag on their 24" Dells, etc.

This "random German site" is: a) way more credible in my eyes than some guy from some forums because it's a professional review site b) not the only one investigating microstutter and claiming it's a problem (PCGH being another example, by coincidence also German).

Anyway, most of us will probably never experience MSing on the new Radeon, simply because it's pretty hard to force the card to run under 60 FPS, which is where MS appears.

Some random guy on some forums? Uh, that's the owner of Guru3d.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Ocguy31

For those with an X-fire board, what are the advantages of this over two 4870s, which you can get cheaper? They pretty much bench the same (there is no reason they wouldn't, other than the VRAM).

I guess it still leaves you an upgrade path...?

"In comparison to the Radeon HD 4870 CrossFire setup, the Radeon HD 4870 X2 really shined at 2560x1600 in games like BioShock, Company of Heroes, and Lost Planet. In these cases the 4870 CrossFire setup just didn't have enough memory and the 4870 X2 was able to pull away by a factor of 3-5X!! (In Company of Heroes the margin was just 27% at 2560x1600, although it also outran the 4870 CrossFire rig at 16x12 and 1920x1200 by 8-10%). If 8xAA gaming isn't a priority for you, these margins would obviously tighten up quite a bit, but considering the pricing ATI is offering on the 4870 X2 it's probably a better deal than buying two Radeon 4870 cards separately. And once again, you don't need a CrossFire motherboard if you get the X2." - Firingsquad
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
After reading AT's review, I just had a thought. If the Sideport feature is disabled, then doesn't this mean that all of the inter-device communication is occurring over the PCIe bus? Isn't this akin to running CrossFire without a CF bridge?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RussianSensation
Originally posted by: Ocguy31

For those with an X-fire board, what are the advantages of this over two 4870s, which you can get cheaper? They pretty much bench the same (there is no reason they wouldn't, other than the VRAM).

I guess it still leaves you an upgrade path...?

"In comparison to the Radeon HD 4870 CrossFire setup, the Radeon HD 4870 X2 really shined at 2560x1600 in games like BioShock, Company of Heroes, and Lost Planet. In these cases the 4870 CrossFire setup just didn't have enough memory and the 4870 X2 was able to pull away by a factor of 3-5X!! (In Company of Heroes the margin was just 27% at 2560x1600, although it also outran the 4870 CrossFire rig at 16x12 and 1920x1200 by 8-10%). If 8xAA gaming isn't a priority for you, these margins would obviously tighten up quite a bit, but considering the pricing ATI is offering on the 4870 X2 it's probably a better deal than buying two Radeon 4870 cards separately. And once again, you don't need a CrossFire motherboard if you get the X2." - Firingsquad

There is no reason for me to sell my 4870/512MB and get a 4870X2/2GB instead of just getting another 4870/512MB for Crossfire; at least, not at 19x12, just to get a paltry 8% more when it is already running plenty fast with everything maxed.

However, if I were gaming at 25x16, I'd definitely consider it. And I am RE-considering getting an X2 to pair with my X1 - for a CrossfireX3 solution - as the 2GB card would effectively be neutered to sharing 512MB of vRAM. Of course it would be like 4870/512MB x3 - blisteringly fast - but still somewhat limited in some circumstances compared to a true 4870 X3 with all GPUs sharing maximum vRAM at the highest resolution and with maxed-out settings.
It might be fun to test, however - and I could demonstrate how FPS would increase when changing motherboard platforms: going from the 4870X2 in my 16x PCIe slot paired with my current 4870 in the 4x PCIe slot of my P35 Crossfire motherboard, and then upgrading to an X48 solution for 16+16 ultimate performance. X3 looks to be an offbeat way to get some extra FPS without going for X4, which does not seem to scale.
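
(A tiny sketch of the arithmetic behind "neutered to sharing 512MB": under AFR CrossFire each GPU mirrors the same working set, so the usable pool is the smallest per-GPU framebuffer, not the sum. The figures are just the ones from the post above.)

```python
# Effective VRAM in a mixed CrossFire X3 setup: with AFR each GPU keeps its
# own copy of textures/buffers, so the usable pool is min(per-GPU memory),
# not the sum of what is soldered onto the cards.

per_gpu_vram_mb = {
    "4870 X2, GPU 0": 1024,
    "4870 X2, GPU 1": 1024,
    "4870 512MB":     512,
}

print(f"memory on the cards : {sum(per_gpu_vram_mb.values())} MB")
print(f"effective VRAM pool : {min(per_gpu_vram_mb.values())} MB")
```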
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: NullSubroutine

Here is what Guru3d's owner says about MS and the R700 (whom I trust more than some random German site).
Just curious, do you trust Guru3D more because their testing methodology was better or because they're saying what you want to hear?

Based on Guru3D's comments it doesn't appear they understand what micro-stutter is. The fact is it's mathematically provable that an AFR system will micro-stutter if it becomes sufficiently GPU bound.

As for millisecond measurements not counting, I guess he just prefers to bury his head in the sand and pretend the issue doesn't exist. I mean, if someone can't notice more than 30 FPS, is it okay for them to claim a 3850 is equal to a GTX280 since measuring frames 'don't count'? :roll:

And yeah, anyone that understands micro-stutter (the variance between frames) knows it's not the same thing as regular stuttering.
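
(A toy numeric illustration of that point, not BFG10K's actual math: two GPUs in AFR, each fully GPU-bound, presenting frames only a few milliseconds apart because submission isn't paced. All numbers are made up, but they show how average FPS can look fine while the frame-to-frame variance is large.)

```python
# Toy AFR micro-stutter illustration (all numbers assumed, not measured).
RENDER_MS = 40.0   # per-GPU render time when fully GPU-bound
OFFSET_MS = 5.0    # how far apart the two GPUs finish their frames

# GPU 0 presents at 40, 80, 120, ... ms; GPU 1 presents OFFSET_MS later.
present = sorted(
    [i * RENDER_MS for i in range(1, 11)] +
    [i * RENDER_MS + OFFSET_MS for i in range(1, 11)]
)
intervals = [b - a for a, b in zip(present, present[1:])]

avg = sum(intervals) / len(intervals)
var = sum((x - avg) ** 2 for x in intervals) / len(intervals)

print("frame intervals (ms):", [round(x, 1) for x in intervals])
print(f"average FPS       : {1000.0 / avg:.0f}")   # looks like a healthy ~50 FPS
print(f"interval variance : {var:.1f} ms^2")       # the 5 ms / 35 ms alternation is the micro-stutter
# A single GPU at the same average FPS would deliver roughly equal ~20 ms
# intervals with near-zero variance, which is why averages alone hide the effect.
```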
 

darXoul

Senior member
Jan 15, 2004
702
0
0
Originally posted by: NullSubroutine
Some random guy on some forums? Uh, that's the owner of Guru3d.

Okay, whatever. It's a reviewer's word and "perception/experience" vs. a reviewer's word and mathematical proof, then. Plus the video from PCGH, etc. Still very lopsided if you ask me.