My experience with 5600fx


5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: walk2
ATI is notorious for gimping features to improve their performance (or to cover their half-assed hardware implementations of DX features). I suppose they might have improved in the last 6 months? I'm speaking from working in the QA dept of a major games company for the last 7 years - ATI cards are ALWAYS the ones that give us trouble. They just don't support DX features like they are supposed to, and it drives our DX programmers CRAZY!


You're either trolling or have very crappy programmers at whatever half-assed company you work at.
 

PowerMacG5

Diamond Member
Apr 14, 2002
7,701
0
0
Originally posted by: 5150Joker
Originally posted by: walk2 Oh 1 more thing. The 5600 looks like a great card for the price. I'm about to buy a 5600-Ultra 128MB for $189. It should definitely be a lot faster than the GF4 MX. I don't feel like spending $500 for a 5900 so there you go. ATI is totally out of the question (sorry, personal thing for me, seen far too many issues with them - sure the benchmark #'s are good, but they cut too many corners to get those).
What kind of crack are you smoking? ATi cuts corners to get high numbers?? You seem to be confusing them with nVidia seeing as how it's nVidia that's been "cutting corners" (cheating) to get higher numbers in games and applications.

Alright, let's all put down the cheating flamethrower. Everyone has done it. Both ATI and NVidia have done it.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
walk2, do more research. The 5600 requires extra power just like the 5800--you have to plug a four-pin power connector from the PSU into both cards. The 5800 does NOT have the infamous FX Flow--it uses a more reasonable cooling solution, and one that lets it OC very well (most seem to OC to Ultra speeds using nVidia's own OC'ing tool). It's probably bigger and hotter than a 5600U, I'll grant you that. But that extra heat wouldn't deter me from its superior price/performance ratio. $220 for a 5800 is a better deal than $190 for a 5600U.

I didn't know ATi had problems with DX. I always thought they were better at DX, while nVidia was better at OGL. Huh.

Rollo, I don't think NV30's delay can be entirely ascribed to TSMC's problems. I've heard a few people say nVidia themselves were somewhat at fault with the design, due to the way nVidia people described the delay in conference calls. Whatever the problem was, nV seems to have fixed it, as the 5800 seems to OC very well.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Perhaps Pete, but I don't know that it was a "rushed design" as Otis notes.
Didn't they have to switch the design at the last minute because TSMC couldn't deliver on some process that was supposed to reduce thermal migration, and that's one of the reasons they ended up on the mega layer pcb?
(I've got the fragments of a memory rattling around in my head)

In any case, having used a 5800 for ~5 weeks, I think I can say with good authority it works well and should make any 5600 look like last generation stuff.
 

walk2

Member
Jul 25, 2003
82
0
0
Oh well, I ordered one. If I don't like it I will try the 5800, or maybe even the 5900. Truth be known, I'd rather stick with my GF3 Ti500, but I ordered a new computer and figured it would be silly not to upgrade the video card. In any case the 5600-U ought to be faster than the GF3, eh? :) I don't think the 5600 needs extra power, and I know the HSFs are small, simple, quiet designs. Maybe I'm confusing the 5800 with the 5800-Ultra, but I know most of the cards I looked at had HUGE HSFs, some even with 2 or THREE?! fans on them... or they were these big fat things that took up 2 slots. Like

this or THIS? <- LOL what in the hell is that monster all about ??

This is more like what I ordered...
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Your Asus V9900 is what they pretty much all look like, and while they take up the adjacent slot, they aren't loud or hot. The slot next to your agp is a small price to pay for double the fill rate.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Rollo, I think the only rushing on the NV30 was to accommodate the fab failures, as you said (I think it was the low-K dielectric or copper process that didn't come through).

walk2, every single FX 5600 card I remember seeing has an extra power connector. I guess nVidia was able to modify the design or reduce power consumption (or I'm experiencing selective memory) with the flip chip version, as it seems current cards have eliminated the need for extra power.
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Originally posted by: Pete
Rollo, I think the only rushing on the NV30 was to accommodate the fab failures, as you said (I think it was the low-K dielectric or copper process that didn't come through).

walk2, every single FX 5600 card I remember seeing has an extra power connector. I guess nVidia was able to modify the design or reduce power consumption (or I'm experiencing selective memory) with the flip chip version, as it seems current cards have eliminated the need for extra power.

IIRC the 5600U reference design includes the extra 4 pin power connector. The standard 5600 (which is very similar to the 5200 reference design) does not include the 4 pin power connector. Regardless, AIB makers have the option to follow the reference design or not so there will be exceptions to the rule.


 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Pete, it was the low K dielectric. I remembered it started with "di" but not the full name or whole story. Thanks.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: KraziKid
Originally posted by: 5150Joker
Originally posted by: walk2 Oh 1 more thing. The 5600 looks like a great card for the price. I'm about to buy a 5600-Ultra 128MB for $189. It should definitely be a lot faster than the GF4 MX. I don't feel like spending $500 for a 5900 so there you go. ATI is totally out of the question (sorry, personal thing for me, seen far too many issues with them - sure the benchmark #'s are good, but they cut too many corners to get those).
What kind of crack are you smoking? ATi cuts corners to get high numbers?? You seem to be confusing them with nVidia seeing as how it's nVidia that's been "cutting corners" (cheating) to get higher numbers in games and applications.

Alright, let's all put down the cheating flamethrower. Everyone has done it. Both ATI and NVidia have done it.

No, the difference is ATi had a slight 2% optimization (compared to nVidia's image-quality-degrading 20%+ cheat) in 3DMark03 that they got rid of, and nVidia STILL has cheats in over 70 applications/games. If you aren't aware of Unwinder's script, the 3DMark03 texture compression cheat, or the UT2003 fiasco, then you need to do a little research.
 

chilled

Senior member
Jun 2, 2002
709
0
0
Originally posted by: 5150Joker
No, the difference is ATi had a slight 2% optimization (not a cheat) in 3DMark03 that they got rid of, and nVidia STILL has cheats in over 70 applications/games. If you aren't aware of Unwinder's script, the 3DMark03 texture compression cheat, or the UT2003 fiasco, then you need to do a little research.

Could you please provide a source for the 70 games claim? Maybe a list of the games if possible.

The reason I ask is that I want to buy a new gfx card soon and am trying to decide on my own based on the facts (for an unbiased opinion).
 

touchmyichi

Golden Member
May 26, 2002
1,774
0
76
I wouldn't even bother with any cards in the 5200/5600/5800 series. The ATI 9500/9600/9700/9800 have all of those beat at similar prices. Buying a 5600 is absolutely ridiculous when a much more powerful 9700 is available for just a tad more. The 9800 non-Pro is also becoming quite popular. For only 270 dollars, it slices the 5800 in half. Really, the only cards Nvidia has right now that are worth buying are the GeForce 4 Ti series and the 5900.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: chilled
Originally posted by: 5150Joker
No, the difference is ATi had a slight 2% optimization (not a cheat) in 3DMark03 that they got rid of, and nVidia STILL has cheats in over 70 applications/games. If you aren't aware of Unwinder's script, the 3DMark03 texture compression cheat, or the UT2003 fiasco, then you need to do a little research.

Could you please provide a source for the 70 games claim? Maybe a list of the games if possible.

The reason I ask is that I want to buy a new gfx card soon and am trying to decide on my own based on the facts (for an unbiased opinion).

Sure: http://www.beyond3d.com/misc/random/antidetector/

There's a lot more information to be found in the Beyond3D forums, so I'd encourage you to look there after you read that article. It was recently discovered that with the 44.03 drivers, if you renamed a program to UT2003.exe, the drivers would automatically disable full trilinear filtering and switch to a partial bilinear mode for faster performance, despite quality AF being checked in the driver control panel. nVidia claims that's a bug, but common sense should tell you that's a lie nobody believes. It's also been reported recently by Digit-Life that nVidia is now compressing certain textures in 3DMark03 to gain extra performance again, which is outside the parameters of "optimization"--but this is still being discussed. I'd advise everyone curious about these issues to explore the Beyond3D forums; they offer much more in-depth information than anandtech or most other forums/websites.
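To see why renaming an executable flips the driver's behavior, here's a minimal sketch of exe-name-based application detection. Everything in it (the table contents, the mode names, the function) is hypothetical illustration, not actual driver internals; only the idea that the driver keys special cases off the process name comes from the reports above.

```python
# Hypothetical sketch: a driver keeps a table of "detected" application
# names and applies a shortcut whenever a running process matches one.
# Renaming any binary to a detected name (e.g. "UT2003.exe") triggers
# the shortcut; renaming a detected binary to something else avoids it,
# which is exactly how the anti-detector testing worked.

DETECTED_APPS = {
    "ut2003.exe": "partial_bilinear",     # reduced filtering despite AF setting
    "3dmark03.exe": "compressed_textures",
}

def select_filtering(process_name: str, requested: str) -> str:
    """Return the filtering mode the driver would actually use."""
    override = DETECTED_APPS.get(process_name.lower())
    return override if override is not None else requested

# A detected name gets the override, no matter what the user asked for:
assert select_filtering("UT2003.exe", "trilinear") == "partial_bilinear"
# An undetected name gets the requested mode:
assert select_filtering("quake3.exe", "trilinear") == "trilinear"
```

The asserts show the renaming trick both ways: the behavior follows the file name, not the actual program.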
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
For only 270 dollars, it slices the 5800 in half. Really, the only card that Nvidia has right now thats worth buying is the geforce 4 ti series and the 5900.
Touchmyichi, the 5800 is now $229 at Googlegear (Chaintech retail). It does not get "sliced in half" by 9800NPs; it loses at some AA/AF settings in some games. I've used a 5800 for the last 5 weeks and can tell you it's a LOT of card for $229.

http://www.nordichardware.com/reviews/graphiccard/2003/Abit_FX5800/index.php?ez=5
Equal to the 9700 Pro in UT2003 in this review--of course, this is using drivers known to have "optimizations". (read: cheats)

http://www.nordichardware.com/reviews/graphiccard/2003/Abit_FX5800/index.php?ez=4
Q3 numbers aren't bad either, definitely playable, especially the 2X4X settings.

Another review with lots of benches
One thing to note here: it's one of 5 reviews I saw that all said they could safely clock their 5800 at Ultra speeds, which makes it effectively as fast as a 9700 Pro. My own has been the same: nVidia even includes a utility in the drivers to test your highest safe OC speed, and there's a built-in safety feature that turns the card back to 300/300 to cool down if it ever reaches a temperature they deem unsafe. Mine never got to HALF of that temperature.
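The safety fallback described above amounts to something like the following sketch. Only the 300/300 fallback clocks come from the post; the trip-point temperature, the function, and the polling idea are made-up illustration, not nVidia's actual logic.

```python
# Rough sketch of a thermal-throttle check: if the core temperature
# crosses a safety threshold, the clocks drop to 300/300 until the
# card cools off; otherwise the current (possibly overclocked) speeds
# are kept. Threshold value is hypothetical.

SAFE_CLOCKS = (300, 300)   # (core MHz, memory MHz) fallback from the post
THRESHOLD_C = 140          # illustrative trip point, not a real spec

def adjust_clocks(temp_c: float, current: tuple) -> tuple:
    """Return the clocks the card should run at for this temperature."""
    if temp_c >= THRESHOLD_C:
        return SAFE_CLOCKS   # throttle back to cool down
    return current           # safe: keep the user's OC

assert adjust_clocks(150, (400, 400)) == (300, 300)
assert adjust_clocks(70, (400, 400)) == (400, 400)
```

"Never got to HALF of that temperature" just means the first branch never fired.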

What about D3? Ooops. Looks like the 5800 is going to own that:
Why is ATI so slow at this game?

Admit it Touchy: You've never used a 5800, seen a 5800, come within throwing distance of a 5800. You don't really know what you're talking about here.
I used a 9700Pro for 8 months and the 5800 for the last month and a half, so I can at least speak about it from experience. You're just pitching a tent because you just went from a really old, slow card (Ti200) to a kind of old, fast card (9700Pro).
Just because you've done that doesn't mean you have to proclaim yourself "King Video Guru" and declare all other cards feeble.
You could make a pretty strong price/performance argument for the 5800 at $229 compared to $350 5900s. If you're an fps gamer and not all about high AA/AF, it's a pretty good card to have.



 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Rollo
For only 270 dollars, it slices the 5800 in half. Really, the only card that Nvidia has right now thats worth buying is the geforce 4 ti series and the 5900.
Touchmychi, the 5800 is now $229 at Googlegear. (Chaintech retail) It does not get "sliced in half by 9800nps" it loses at some AA/AF settings in some games. I've used a 5800 for the last 5 weeks and can tell you it's a LOT of card for $229.

http://www.nordichardware.com/reviews/graphiccard/2003/Abit_FX5800/index.php?ez=5
Equal to the 9700 Pro at UT2003 at this review, of course this is using drivers known to have "optimizations". (read:cheats)

http://www.nordichardware.com/reviews/graphiccard/2003/Abit_FX5800/index.php?ez=4
Q3 numbers aren't bad either, definitely playable, especially the 2X4X settings.

Another review with lots of benches
One thing to note here: it's one of 5 reviews I saw that all said they could safely clock their 5800 at Ultra speeds, which makes it effectively as fast as a 9700Pro. My own has been the same: nVidia even includes a utility in the drivers to test your highest safe OC speed, and there's a built in safety feature that turns the card back to 300/300 to cool down if it ever gets up to a temperature they deem unsafe. Mine never got to HALF of that temperature.

What about D3? Ooops. Looks like the 5800 is going to own that:
Why is ATI so slow at this game?

Admit it Touchy: You've never used a 5800, seen a 5800, come within throwing distance of a 5800. You don't really know what you're talking about here.
I used a 9700Pro for 8 months, and the 5800 for the last month and a half, so I can at least speak about it from experience. You're just pitching a tent because you just went from a real old slow card (Ti200) to a kind of old fast card. (9700Pro).
Because you've done that doesn't mean you have to proclaim yourself "King Video Guru" and declare all other cards feeble.
You could make a pretty strong price/performance ratio argument for the 5800 at $229 compared to $350 5900s. If you're a fps gamer and not all about high AA/AF, it's a pretty good card to have.


First off, the Nordic Hardware review used the 44.03 drivers, which have an AF cheat in UT2003, not to mention 3DMark03 and a host of other applications. As for D3, the game isn't out yet, and at the time of the review ATi had no time to optimize for the game like nVidia had ahead of time. When the game comes out, THEN we'll see how they stack up. It's funny you leave out HL2, which will most likely bring NV30/NV35 cards to their knees in shader performance compared to any R3xx-based card. There won't be a driver cure for that either...
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
It's funny you leave out HL2 which will most likely bring nv30/nv35 cards to their knees in shader performance compared to any R3xx based card.
It's actually pretty easy for me to leave HL2 out, 5150ATIemployee. You see, I haven't seen any benchmarks for that one. Could you direct me to some?

BTW- still limping along with your 9700Pro/Audigy 1 combo? ;)

One more thing- there's more to a video card than raw benchmark numbers. Some people always used to ask me why I didn't run 6X/16X with my 9700Pro in UT2003. I didn't because it wasn't smooth at those settings in the big maps, even though the benchmark numbers were good.
That is one thing the 5800 EXCELS at: offering silky smooth animation and good mouse feel, even when the fps dip lower. Why or how that is beats me. Ex-3dfx'ers will know what I'm talking about when I say it's possible for a card to have lower benchmarks but better feel; I don't think it's any coincidence that the NV30 is the first card with 3dfx tech and collaboration in it since the buyout.


 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Rollo
It's funny you leave out HL2 which will most likely bring nv30/nv35 cards to their knees in shader performance compared to any R3xx based card.
It's actually pretty easy for me to leave HL2 out, 5150ATIemployee. You see, I haven't seen any benchmarks for that one. Could you direct me to some?

Sure, go take a look at the shader benchmarks for the NV30/NV35 cards in VS/PS 2.0 operations and you'll see they perform MISERABLY. HL2 will automatically call PS 2.0/VS 2.0 when it detects a DX9 card, hence the solid belief that the FX series will perform horribly in shader-heavy situations. The game benchmarks haven't surfaced yet, but the evidence pointing in that direction is readily available. As for the snide "ATiEmployee" comment, that's funny coming from an nVidiot like yourself who swallowed everything [T]ardOCP had to say about the UT2003/NV35 AF fiasco without first examining the flaws in the article.


BTW- still limping along with your 9700Pro/Audigy 1 combo? ;)


Since when did a 9700 Pro/Audigy 1 become low end? But to answer your question, I have a R9800 Pro with a Canterwood system at 3.6 GHz.




 

nRollo

Banned
Jan 11, 2002
10,460
0
0
5150ATIemployee:
The game benchmarks haven't surfaced yet but the evidence pointing in the direction of that is readily available.
Well, since you asked why I left out HL2 performance, I would say the absence of any benchmarks is a pretty good reason?


As for the snide "ATiEmployee" comment, that's funny coming from an nVidiot like yourself that swallowed everything [T]ardOCP had to say about UT2003/nV35 AF fiasco without first examining the flaws in the article.
You mean an nVidiot like me who sold his nVidia based card and bought a 9800Pro, just because they didn't respond to my email? Pretty slavelike devotion to nVidia there!


Since when did a 9700 pro/audigy 1 become low end? But to answer your question, I have a R9800 Pro with a canterwood system at 3.6 ghz
A. So you just forgot to update your "My Rig" page here?
B. Does ATI give you an employee discount, or the RMAs that work for free?
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Rollo
5150ATIemployee:
The game benchmarks haven't surfaced yet but the evidence pointing in the direction of that is readily available.
Well, since you asked why I left out HL2 performance, I would say the absence of any benchmarks is a pretty good reason?

Pretty dense, aren't you? Let me repeat this: there need not be benchmarks; the fact that the NV30/NV35 have 2-5x slower shader performance than the R3xx series is well documented.


As for the snide "ATiEmployee" comment, that's funny coming from an nVidiot like yourself that swallowed everything [T]ardOCP had to say about UT2003/nV35 AF fiasco without first examining the flaws in the article.
You mean an nVidiot like me who sold his nVidia based card and bought a 9800Pro, just because they didn't respond to my email? Pretty slavelike devotion to nVidia there!

Awww poor baby, nVidia hurt your feewings? Just because you bought a 9800 Pro after realizing how bad the 5800 you had sucked doesn't mean you aren't an nVidiot--btw you don't need to lie about the e-mail bit, we both know you did it because you realized the 5800 was garbage.

Since when did a 9700 pro/audigy 1 become low end? But to answer your question, I have a R9800 Pro with a canterwood system at 3.6 ghz
A. So you just forgot to update your "My Rig" page here?
B. Does ATI give you an employee discount, or the RMAs that work for free?

I haven't gotten around to updating the Rigs page, gee I guess I better get right on it!
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Pretty dense aren't you? Let me repeat this: There need not be benchmarks, the fact that nv30/nv35 have 2-5x slower shader performance than the R3XX series is well documented.
No, I've just been around a while and have seen cases where the card that should be the fastest at a game on paper doesn't turn out to be. For example, you'd think a 9800 Pro would mop the floor with a 5800 Ultra at D3, but that doesn't seem to be the case, does it?
BTW, are you guys at ATI scrambling since Carmack shut you out after you leaked the demo he gave you? I notice nVidia didn't repeat your mistake. ;)

Awww poor baby, nVidia hurt your feewings?
No, not really. What would you do if you bought something, the company needlessly de-valued it, and wouldn't respond to your customer service request? (and remember, it has to be something other than video cards, I realize you can't flame your employer)

because you bought a 9800 Pro after realizing how bad the 5800 you had sucked doesn't mean you aren't an nVidiot--btw you don't need to lie about the e-mail bit, we both know you did it because you realized the 5800 was garbage.
See above. Beyond that, I buy most video cards that come out in one form or another. I like to try new video cards, even ones by companies with lots of driver and hardware issues that have been operating in the red for years due to poor sales. If the other ATI employees are as nasty as you, I can see why you're not building up any customer base.

I might have a 5900 by the end of the year if I feel like it.

I haven't gotten around to updating the Rigs page, gee I guess I better get right on it!
I think the board is more interested in the deal you get! Is it the employee discount, or the RMAs that work for free? What do you do for them anyway? (Besides pimp their stuff like your life depended on it here and at Rage3D?)