STICKY: ATi 5xxx pre-release thread

Page 4
Originally posted by: toyota
Originally posted by: cusideabelincoln
Originally posted by: Tempered81
Originally posted by: cusideabelincoln
Mmm, a new AA/AF algorithm. I'll be very tempted to get one of these 5000 series cards. At 1280x1024, I need all the AA I can get, and in newer games the 3850 just can't handle high AA with high details.

Wow man, a 5870 could probably do every game at high quality 8xAA 16xQAF, max details at that resolution, and still give playable framerates. The only ones you might dip below 40-50 fps on would be GTA IV, Crysis, ARMA 2, & Clear Sky.

I'm not looking to get the 5870; I was hoping for the sub- or around-$200 5800 series, but I won't get a new card unless I come across some cash.

At $400 the 5870 will cost more than half of what my computer cost when I built it.

Your CPU will even bottleneck the crap out of a current 4870, so a 5850 or 5870 would be a ridiculous waste anyway, especially at 1280. You wouldn't even get 50% of what a card like that is capable of in many newer games. Even at 3.0 GHz your X2 CPU is only like a current Core 2 at 2.0-2.2, so please get a new platform if you are considering a card like the 5850 or so.

I'm well aware of bottlenecks, and quite frankly if I want to use high amounts of AA then a faster card is going to bring benefits. For example, if I wanted to play Crysis maxed out at 1280x1024, a CPU upgrade is not going to do jack shit to help my cause. I'll need a fast video card, like the 5850 (if I planned to get one). Even current generation high-end cards have trouble getting 30 FPS in Crysis at 1280x1024 (source for Crysis).

Don't advise me to change platforms, because it totally isn't necessary. Stop acting like I would be doing some huge injustice if I were to put a 5850 into my system. Why do you care? If I do it, then I'll do so for my own reasons.

Oh and since you didn't bother to read anything I wrote: I NEVER SAID I WAS GETTING A 5850. In fact I stated that I would not be getting one, as early rumors put the 5850 at $300, and right now I would not buy, and have never bought, a video card over $200. I positively hate people bringing up irrelevant information and then asking (telling) people what they should or should not do.
 
Originally posted by: cusideabelincoln
Don't advise me to change platforms, because it totally isn't necessary. Stop acting like I would be doing some huge injustice if I were to put a 5850 into my system. Why do you care? If I do it, then I'll do so for my own reasons.

Oh and since you didn't bother to read anything I wrote: I NEVER SAID I WAS GETTING A 5850. In fact I stated that I would not be getting one, as early rumors put the 5850 at $300, and right now I would not buy, and have never bought, a video card over $200. I positively hate people bringing up irrelevant information and then asking (telling) people what they should or should not do.

I said it would be a waste "anyway"; that means IF you were to do it. And the last time I looked, this was a forum, not some place where you just make a comment and nobody replies to it. I was just trying to give you suggestions, since I run benchmarks and test games more than I play them. Of course a better GPU will bring improvements, and I didn't say buy a CPU instead of a new GPU, now did I? I said IF you are going to get something really fast then you would need a better CPU ALSO.
 
Memory bandwidth looks like a problem at first sight. The 4870 already had less than the equivalent GeForces; now the shaders requiring access to that memory have doubled, but the memory itself is hardly any faster. While I suppose the design will have been worked out to minimise the hit from this, it's still going to suffer when put up against a G300, which is alleged to have a 512-bit bus and GDDR5 (i.e. more like 300GB/sec to the 5870's 150GB/sec).
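For reference, those figures are just bus width times per-pin data rate (a quick Python sketch of the arithmetic; the 4.8 Gbps GDDR5 rate is my assumption, chosen so the totals land near the quoted 150/300 GB/sec):

    # peak bandwidth (GB/s) = bus width in bits x data rate in Gbps / 8 bits per byte
    def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
        return bus_width_bits * data_rate_gbps / 8

    print(bandwidth_gb_s(256, 4.8))  # rumored HD 5870: 256-bit GDDR5 -> 153.6 GB/s
    print(bandwidth_gb_s(512, 4.8))  # alleged G300: 512-bit GDDR5 -> 307.2 GB/s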
 
ATi appears to have done what I predicted: take the 4xxx series, add DX11 and beef up the existing execution units by a factor of two. This was the smartest thing to do as they simply cannot afford a failure of 2900XT proportions by risking a new architecture. Up to twice the performance of the 4890 is quite a lot to be excited about, and it's a safe move on ATi's part.
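A back-of-the-envelope check on that "up to twice" figure (a Python sketch under stated assumptions: the 5870 is assumed to keep the 4890's 850 MHz clock, and each SP is counted as one multiply-add, i.e. 2 FLOPs, per clock):

    def peak_gflops(sp_count, clock_ghz):
        # each stream processor issues one MADD (2 FLOPs) per clock
        return sp_count * 2 * clock_ghz

    print(peak_gflops(800, 0.85))   # HD 4890: 1360 GFLOPS
    print(peak_gflops(1600, 0.85))  # doubled units at the same clock: 2720 GFLOPS, exactly 2x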

Originally posted by: Scholzpdx

Nice job appopin on placing an AlienBabelTech watermark on SOMEONE ELSE'S photo!
That information was leaked to ABT from a reliable source, including the image as I understand it. Appopin watermarked it, but the image was likely also leaked to other websites that didn't watermark it.
 
HD5870 Crysis Benchmark from Chiphell.

CPU:AMD Phenom II X4 955BE
Win 7 RTM
VGA:HD5870 1GB

Crysis 1920x1200 4AA+16AF DX10 Very High
min:30.**
avg:43.**
max:54.**


A comparison from hartware.de:


Crysis (very high) - FSAA 4x / AF 8x
Intel Core 2 Duo 3.33 GHz, 2 GByte, Intel X38, Windows Vista

last table:

GeForce GTX 295 - 34fps
Radeon HD 4870 X2 - 31fps
GeForce GTX 285 - 23fps
GeForce GTX 280 - 21fps
GeForce GTX 260 - 17fps
Radeon HD 4870 - 17fps
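Taking the two sets of numbers at face value, the implied ratios are easy to work out (a rough Python sketch only; the Chiphell run used a different CPU, OS, and 16xAF rather than 8xAF, so treat these as ballpark):

    chiphell_5870_avg = 43  # fps, Crysis Very High, 4xAA
    hartware = {
        "GeForce GTX 295": 34,
        "Radeon HD 4870 X2": 31,
        "GeForce GTX 285": 23,
        "GeForce GTX 280": 21,
        "GeForce GTX 260": 17,
        "Radeon HD 4870": 17,
    }
    for card, fps in hartware.items():
        print(f"HD 5870 vs {card}: {chiphell_5870_avg / fps:.2f}x")
    # e.g. ~1.26x a GTX 295 and ~2.5x a single HD 4870, if the numbers hold up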
 
Originally posted by: Janooo
HD5870 Crysis Benchmark from Chiphell.

Crysis 1920x1200 4AA+16AF DX10 Very High
min:30.**
avg:43.**
max:54.**


If that bench is accurate, and taking into account the min framerates with 4xAA+16xAF, that's very nice - faster than GTX 285 SLI, the GTX 295, and tri-GTX 285 SLI. :thumbsup:
 
If that is to be believed it is astonishing. But we shall find out for certain in the coming week(s).
 
Originally posted by: BFG10K
ATi appears to have done what I predicted: take the 4xxx series, add DX11 and beef up the existing execution units by a factor of two. This was the smartest thing to do as they simply cannot afford a failure of 2900XT proportions by risking a new architecture. Up to twice the performance of the 4890 is quite a lot to be excited about, and it's a safe move on ATi's part.

Considering that DX11 is an evolutionary step up from DX10.1, I don't see a reason to have a radically different architecture. AFAIK nVidia is doing the same with the GT300, so we shall see if both vendors will make a new architecture by DX12 :Q
 
Originally posted by: Janooo
Hands on: ATI Eyefinity review by techradar.com

The power of the technology that supplied Eyefinity was clear - this was a meaty rig indeed to provide stutter-free six monitor action (6 x 2,560x1,600 resolution, in fact) but it wasn't so pricey that it would be beyond the means of an enthusiast gamer.

I was wondering when the dots would be connected. Six monitors for an X2 card; not bad, AMD. They've got to do something to up the pixel count so there's a reason for folks to feel the need to buy $600 video cards. I'm not complaining; I feel the monitor-upgrade itch coming on myself, for nothing more than the desktop real-estate increase.
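For scale, the pixel counts in that demo add up fast (a quick Python sketch of the arithmetic):

    panels, w, h = 6, 2560, 1600
    total_pixels = panels * w * h
    print(total_pixels)                  # 24,576,000 pixels, ~24.6 megapixels
    print(total_pixels / (1920 * 1200))  # ~10.7x the pixels of one 1920x1200 monitor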
 
Originally posted by: Scholzpdx
Originally posted by: yacoub
And no doubt an 80db fan and 90C temp. 🙁

I'm so tired of the loud, hot GPUs ATI's released the last several generations, particularly given their smaller die sizes. I'd hate to have to declock it to get it to run at what I feel is an acceptable temperature. Oh well. Whoever makes a quieter, more effective hsf to attach to ATI cards should make a fortune.

/fail

You don't realize this uses a 40nm manufacturing process. This will certainly play a role in keeping everything cool.

Why the fuck are you complaining?

What follows is an explanation, but know up front that I'm not actually complaining. I'm just observing their history of creating cards with high clockspeeds instead of better technology onboard that doesn't need to be clocked so high to achieve an equivalent level of performance, and lamenting that their new cards appear to continue that trend.



Ever since ATI started die-shrinking (80-whatever nm, 65nm, etc.) they've used it to make their cards run hotter and louder than previous generations, and that annoys me. This started with the X1xxx series and continued for several generations, possibly through to the present, though I've stopped paying as much attention. I just see the temps they put out and the continued focus on high clockspeeds, and assume they haven't changed their gameplan.

Instead of taking advantage of the die shrinks to produce more complex components that allow for lower clockspeeds, and in turn cooler, quieter cards, they've gone the opposite direction. Instead of building cards with a better hardware configuration (wider buses, more processing units, etc., which requires more investment but delivers a lower-clocked yet just as powerful result), they chose to ramp the clockspeeds up into the stratosphere to compensate. That decision was obnoxious at the time, and has been since, as NVidia in some ways started to ape it, though still not to the extent that ATI had.

When we first heard about the X1xxx series and the smaller manufacturing process it would use, many of us got excited, expecting cooler-running, quieter GPUs that would work great in smaller form-factor cases, or even just run quieter in a regular mid-tower case. Instead we got that famous F16-jet-hairdryer blower slapped onto a card that ran so hot it was a champion of instability and sometimes cooked itself in as little as six months in people's rigs. That was such a disappointment it has stuck in my mind ever since - especially since they kept going in that direction of high-clocked, hot, loud cards for at least a couple of generations thereafter, and possibly through to today.

Regardless, in my original post in this thread I was lamenting the fact that their new card will probably be hot and loud too, because the reported specs show a narrow bus and a similar architectural path to their previous generations, which requires a high clockspeed to compensate, and that in turn means hot and loud. Now granted, this iteration DOES appear to have more power in the hardware, which may allow them to not clock it as high, but given their history I would not be surprised if they do anyway.

And for the record, since someone brought up the 8800GT: mine was in the 60s Celsius on the stock cooler, and with the quieter Thermaltake hsf it's running now, it's in the 40s-50s Celsius. So even though NVidia did follow suit with some of their cards running higher temps, the 8800GT was not one of them for me. I do remember how the initial batch had an issue where the fan controller didn't work properly, but that was a bug; it wasn't supposed to run at 80+ Celsius.
 
Originally posted by: yacoub
I'm so tired of the loud, hot GPUs ATI's released the last several generations, particularly given their smaller die sizes. I'd hate to have to declock it to get it to run at what I feel is an acceptable temperature. Oh well. Whoever makes a quieter, more effective hsf to attach to ATI cards should make a fortune.

Just buy one of the many cards with non-stock coolers to begin with. My Gigabyte 4850 with a Zalman cooler runs at 43C idle.
 
I have personally used the stock-design HD 3870, HD 4870, and GTX 260, and they ran at about 60C idle to 90C under load, but all had fans/cooling that were very well designed, barely audible over the PSU/CPU fans.

Occasionally a bad design slips through for one reason or another, but for the most part they are quiet and effective. In recent memory there have been more problems with the custom fans/cooling put out by some card manufacturers than with the stock designs.

I also see your concern about temps, but you can understand that these companies make money by running the hardware fast and hot - giving the best value for money. In a minority of cases that can cause problems where ambient temps are high, but it's hardly bad design.

Don't be put off from the newer generations of cards because of a bad experience in the past.
 
Originally posted by: Minas
I have personally used the stock-design HD 3870, HD 4870, and GTX 260, and they ran at about 60C idle to 90C under load, but all had fans/cooling that were very well designed, barely audible over the PSU/CPU fans.

And this is coming from a new member.

ATI has been reducing temps in every single market level besides the very top end, where it matters.

Just take a look at the 46x0 series. They are the SAME GPU as the 3850/3870, yet can now be run passively if you want to. That is a clear indication that their cooling solutions are very efficient in the mid range. Don't even get me started on the changes from the 3450 to the 4350: the 4350 has double the SPs (16 vs 8) and runs at comparable clocks at, get this, the exact same TDP.

It's all about price vs performance, not heat. If anything, it's been all of my nVidia cards that run hot, but I'm not one to complain. It's a NON-ISSUE. That hasn't once dissuaded me from purchasing an nVidia or ATI GPU.
 
Originally posted by: toyota
I said it would be a waste "anyway"; that means IF you were to do it. And the last time I looked, this was a forum, not some place where you just make a comment and nobody replies to it. I was just trying to give you suggestions, since I run benchmarks and test games more than I play them. Of course a better GPU will bring improvements, and I didn't say buy a CPU instead of a new GPU, now did I? I said IF you are going to get something really fast then you would need a better CPU ALSO.

Well, I play my games. You're focusing too much on pure numbers. In the more graphically demanding games, like Crysis, upgrading my CPU is not going to help me out AT ALL. In the other games I play, my CPU is fast enough to provide smooth framerates, so I'm not going to care if I were getting 70 fps when I could be getting 80 fps (with a new CPU). As such, I do not need a better CPU like you keep insisting. I would be lying if I said I didn't want one, simply because I'm a fan of computer hardware and seeing all this new stuff is quite exciting, but in no way do I fucking need one.
 
@cusideabelincoln: unless you're playing some pretty old games, you'll see an increase in performance with a Phenom II-level CPU (X2, X3 or X4). You have the right mobo, at which point a GPU upgrade would see benefits, too. Unless of course you only play World of Goo and StarCraft...

edit: not saying you NEED a new CPU, but you WILL see improvements, even with a dual-core Phenom II-based CPU.

edit2: now where the hell are my HD5000 benches?!
 
Originally posted by: cusideabelincoln
I would be lying if I said I didn't want one, simply because I'm a fan of computer hardware and seeing all this new stuff is quite exciting, but in no way do I fucking need one.

Benchmarks don't show the whole story. Yes, going from a 70 to 80 fps max frame rate is imperceptible, especially on a 60 Hz LCD. But the same 10 fps at the low end, going from a 20 to 30 fps minimum frame rate, is a night and day difference.

Personal experience: I went from a 3.2 GHz E2180 (a faster CPU than yours) to a 2.8 GHz i7, keeping a wussy 8800GT -- and quite a few games got "fixed" to not lurch around in busy spots.
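The frame-time arithmetic backs that up (a small Python sketch; frame time in milliseconds is just 1000/fps):

    def frame_time_ms(fps):
        return 1000.0 / fps

    # top end: ~1.8 ms per frame saved, invisible on a 60 Hz LCD
    print(frame_time_ms(70) - frame_time_ms(80))
    # bottom end: ~16.7 ms per frame saved, the difference between lurching and smooth
    print(frame_time_ms(20) - frame_time_ms(30))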

 
Originally posted by: v8envy
Benchmarks don't show the whole story. Yes, going from a 70 to 80 fps max frame rate is imperceptible, especially on a 60 Hz LCD. But the same 10 fps at the low end, going from a 20 to 30 fps minimum frame rate, is a night and day difference.

Personal experience: I went from a 3.2 GHz E2180 (a faster CPU than yours) to a 2.8 GHz i7, keeping a wussy 8800GT -- and quite a few games got "fixed" to not lurch around in busy spots.

Your 2180 isn't much faster than mine.

I'm fully aware of the bottom end. I'm getting sick of hearing about things I already know, though. My GPU is still the biggest "bottleneck" for me, and it will be the first thing I upgrade when and if I do.

Saying I have to or need to get a new CPU at the same time is completely fucking ridiculous, IMO. If I were to upgrade, I'd know exactly what I was doing; I don't need people to hold my hand. If I were to stick a faster GPU in this system, it would indeed be to see exactly how much benefit I would gain with the same CPU. In other words, I would want to do some real-world testing on my own, and provide actual numbers and specifics instead of saying "quite a few games got fixed".
 
Originally posted by: Cookie Monster
And why do you think it's a bunch of lies?

It's a well-known practice from AMD, and now from ATI too... it's like the 3870... claiming 320 SPs... but all the programs that read the card's internal info show 64... after a while AMD started building up the story that their SPs can take 5 instructions at once... making their SPs 5 times more effective... so virtually processing like 320: 64x5=320.

The 4870 claims to have 800 but only has 160... same deal as before... virtually the 5870 is going to have 1600, but that is more like 320 in physical form...

UNFORTUNATELY!!!! benchmark and game performance tests don't scale up 5 times like AMD or ATI claims... that is why I take their 1600 SPs... I mean any SP label... as a lie...
 
Originally posted by: Keysplayr
Originally posted by: Pelu
The 1600 ROPs or whatever the name is are a bunch of lies... it only has 320...

Pelu, the ATI 3800 series had 320 SPs, and they will now be two gens old. The current 4850/70/90 all have 800 SPs. Do you think for some reason they will remove 480 SPs from their current, soon-to-be-last-gen arch? If they changed their shader architecture, then I could see 320 being a reality. But I don't think that is happening.

Ohhhhhhhhh!!!! I get it. You're going by the complex-shader count only... like the current 4870 has only 160 five-part shaders: 160x5 = 800.

The new part will have 320 five-part shaders: 320x5 = 1600. Gotcha. Six of one, half a dozen of the other. This is what you must have meant, right?
Ya something like that... with that thinking nvidia cards probably are around 4000 SPs lol
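To spell out the counting being argued over (a Python sketch of Keysplayr's description; the 5870 figure is still rumored):

    LANES_PER_UNIT = 5  # each "complex" ATI shader is a 5-wide unit
    cards = {"HD 3870": 64, "HD 4870": 160, "HD 5870 (rumored)": 320}
    for card, units in cards.items():
        print(f"{card}: {units} units x {LANES_PER_UNIT} = {units * LANES_PER_UNIT} SPs")
    # Whether all 5 lanes do useful work every clock depends on the instruction mix,
    # which is why games don't scale with the raw SP count alone.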
 
Originally posted by: Janooo
HD5870 Crysis Benchmark from Chiphell.

Crysis 1920x1200 4AA+16AF DX10 Very High
min:30.**
avg:43.**
max:54.**

HOLY SHIT! :shocked: Crysis Very High 4xAA on a single card with a 30 FPS min? This gen's lower cards only manage that as an AVERAGE, lol

This will be a beast 😀 Unless those results are fake, which I unfortunately think they are...
 