CrystalBay
Platinum Member
Originally posted by: Matt2
Originally posted by: apoppin
Originally posted by: Matt2
I'm patiently waiting for someone to come in here and say it's because the drivers are immature.
What happened to ATI's drivers being the bestest?
Didn't Vijay Sharma tell us that the drivers were in "great condition" or something along those lines?
Then there is the whole "G80 is not optimized for DX10!" thing. LOL. Gstanfor, where is that pic you posted the other night?
HardOCP didn't test with the latest ATI drivers
and Vijay Sharma told a lot of other lies in that interview!
in case you still don't get it
the SW guys have to "tune" the drivers for the LATEST spin of the FINAL silicon
and they will continue to do so - as nvidia is still doing - for the next few months
LOL.
Tune for the latest spin. Hehehe. Final silicon.
I can't believe Hector Ruiz still has a job.
I also love how we've been waiting for this GPU forever, but had to wait so AMD could "launch the entire family" and yet we still have to wait another month at least for the rest of the family.
Originally posted by: Laminator
Originally posted by: apoppin
what makes it *all* suspect is the older drivers
and that the reviewer is "irritated" that r600 is late ... and has zero clue about MSRP vs market prices
I think it was appropriate to factor lateness into his final conclusion about the card. This thing should have been out a long time ago if it wanted the same recognition as the 8800 series.
apoppin's post
Originally posted by: apoppin
Originally posted by: Golgatha
Originally posted by: coldpower27
Originally posted by: MadBoris
Originally posted by: coldpower27
It seems the X2900 XT is great at being a "big numbers" card in specifications.
320 Stream Shaders vs 96!
742MHZ vs 500MHZ!
11K vs 9K 3D Mark 2006!
24xAA vs 16xAA!
512Bit vs 320Bit!
105GB Bandwidth vs 64!
This sure didn't help them...
The ATI Radeon HD 2900 XT has 16 texture units and can perform 16 bilinear filtered FP16 pixels per clock. In comparison the GeForce 8800 GTX has twice as many texture units, 32, and does 32 FP16 pixels per clock, and the GTS has 50% more with 24 FP16 pixels per clock.
...
There are also 16 ROPs in the ATI Radeon HD 2000 series. The GeForce 8800 GTS has 20 ROPs and the GTX has 24. The Radeon HD 2900 XT can perform 32 pixels per clock for Z, the GeForce 8800 GTS can do 40 and the GTX does 48.
...
All of this sounds great on paper, but the facts are we never really saw any major specific examples of this new memory subsystem making a specific impact in games with the previous generation. We may be looking at a rather unbalanced GPU. The memory subsystem potential is incredible, but if the GPU cannot keep the memory fed the point is lost.
Yeah, it's more like a marketing piece than an actually amazing performer. It's a lot like the Pentium 4 to Athlon 64 philosophy. 320 Shader Units sounds great, but if they only have 1/4 of the performance of each G80 Shader Unit, then well, blah.
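To put the "big numbers" in perspective, here is a rough back-of-the-envelope sketch built only from the launch specs quoted above (the helper functions and figures are illustrative approximations, counting one MAD per ALU per clock; treat them as ballpark numbers, not vendor data):

# Back-of-the-envelope peak figures from the launch specs quoted above (Python).
# These are theoretical maximums, not measured results.

def peak_gflops(alus, clock_ghz):
    # One MAD (2 FLOPs) per ALU per clock.
    return alus * clock_ghz * 2

def texel_rate(tmus, clock_ghz):
    # Bilinear-filtered FP16 texels per second, in Gtexels/s.
    return tmus * clock_ghz

# HD 2900 XT: 320 ALUs (64 VLIW-5 units) and 16 texture units at a 742 MHz core clock.
xt_shader = peak_gflops(320, 0.742)   # ~475 GFLOPS
xt_texel = texel_rate(16, 0.742)      # ~11.9 Gtexels/s

# GeForce 8800 GTS 640MB: 96 scalar SPs at ~1.2 GHz shader clock, 24 TMUs at a 500 MHz core clock.
gts_shader = peak_gflops(96, 1.2)     # ~230 GFLOPS
gts_texel = texel_rate(24, 0.5)       # ~12.0 Gtexels/s

print(f"HD 2900 XT: ~{xt_shader:.0f} GFLOPS, ~{xt_texel:.1f} Gtexels/s FP16")
print(f"8800 GTS:   ~{gts_shader:.0f} GFLOPS, ~{gts_texel:.1f} Gtexels/s FP16")

On paper the XT has roughly twice the shader throughput, yet its FP16 texturing rate is no better than the GTS's, and R600 only reaches that FLOPS peak if the shader compiler keeps all five slots of every VLIW unit busy each clock. Which is pretty much the "unbalanced GPU" point HardOCP is making above.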
That's a good analogy. It's like a Prescott vs an Athlon 64.
Prescott = Hot, hard to cool without a lot of noise, higher clock speeds, lower performance, and more expensive.
Prescott was simply a *bad decision* on intel's part ...
the older NW was cooler and much more efficient
after the Prescott, intel *dumped* P4 architecture ... back to the M ... and C2D is kicking A64 butt.
this is *different* ... r600 is a NEW architecture ... it is the basis for the next gen of AMD GPU ... this time AMD was blindsided by G80 much the same way intel's P4 was smacked by Athlon
i expect the refresh to be a major improvement
this is AMD's turn at a "x1800" kind of launch
but they are not 'giving up' on the r600 architecture - it is *sound* ... unlike intel and the P4
and thanks for the *clarification*, coldpower27
... so HardOCP used the *very latest* nvidia drivers and the older Cats ...
sounds 'fair'!
Originally posted by: AmdInside
"We asked ATI why there is no higher-end version and they pretty much told us that no one would buy it when you take the CrossFire into consideration. Well, we know that is simply false because there are people that buy $499 and $599 single video cards, case in point the GeForce 8800 GTX. ATI?s answers were a painful copout to not being able to get the job done and it was obvious. "
Man, what happened with ATI facing the truth? They really need to fire this AMD person or all that talk about facing the truth and making changes within the company was just a lie.
Originally posted by: Zenoth
I made my decision, and it is very painful for me ... honestly it is, I feel very sad right now, but I will not buy the HD 2900XT, nor XL, nor R600 at all. I will have to switch the coin upside down and stare at the new side I'll be looking at for the months and perhaps years to come. I don't see anything positive about the 2900XT at all so far.
Inferior A-F: Major disappointment for me, because it'd have been the second most important point to consider in my book. It looks like ATi did not even try to noticeably improve their texture-filtering algorithm over the X1K's HQ one, WHAT the reasoning behind that is I don't know, WHY they decided it was the right path to take I don't know, but GeForce 8's algorithm is near perfect, and I doubt anything could be better than it, unless more samples are used, which clearly didn't happen for this generation, and for some reason I don't see a 24x or 32x A-F mode appearing with R700 and/or G90 either.
Inferior efficiency for power requirements: The single most important point for me, and it fails at it, and it fails hard. GeForce 8's GTS 640MB offers equal or better performance compared to the HD 2900XT and consumes less power.
Inferior performance: Don't mention driver issues, I won't believe it this time around. It's disappointing, and it has issues. I doubt that mere driver updates will suddenly increase performance across the board, in any game, by 30% or 40% or more. Just look at the S.T.A.L.K.E.R. charts, and what's up with the issue concerning grass in Oblivion at high resolutions ... and all that while the GeForce 8 still sleeps on both ears, laughing about it in its dreams.
Competitive price ?: I don't think it is, since the GTS 640MB can already be found at the same price.
SLi or CrossFire ?: Give me the choice and I'd go with 2x 8800GTS 320MB in SLi over 2x HD 2900XT in CrossFire. Not only will it perform similarly or better in SLi with such a setup, but it will consume less power and even generate less heat in return, not to mention that it will cost me less money to realize just that.
A-A: It seems rather equal to my eyes, nothing very disappointing but nothing jaw-dropping either. I wouldn't feel "inferior" by using any of the GeForce 8's A-A methods over any of the HD 2900XT's methods. This point is perhaps the only neutral point I have. No significant benefits nor disadvantages on either side. It's a "no loss" situation, so if I go with GeForce 8 instead it shouldn't be disappointing in regards to A-A, but I myself almost never use A-A in my games (with the sole exception of Source-based games).
I can't believe what I'm typing here.
I always supported ATi, I always wanted them to show how to do things right to nVidia, but they have nothing to prove this time around, and they even need to learn a couple of things from the green team. I am certainly not going to have a smile on my face when I will give the money to the seller for my GeForce 8, and I will surely walk out of the store still wondering what in the holy heck is happening, but I know when I have to be objective and see the differences in two products operating in the same domain.
NVIDIA clearly wins this round, good job guys.
ATI, please, I beg you to improve on ALL fronts for R700, except for A-A, maybe. The rest is all yours to improve, you have lessons to learn, so please do your homework. Is this pretentious of me ? Probably, but you guys have a job to do, IMPROVE against the competition, compete in terms of prices, give us as-good-as or better methods and technologies, and then the consumers will do their own job, to choose if they want to support you or not. I have been a loyal consumer (though not blindly so) with you guys for FOUR years of my life, and now, for the very first time ever, I am disappointed ... disappointed enough to change to that extent, to a new company that has LEARNED from its past mistakes.
I'll give myself maybe a month or so before doing it, before buying the GeForce 8 GTS/X, or maybe I will wait for the 8900 series to be there, but my system is crying for an upgrade and I ain't a patient consumer nor a patient person at all in real life. But I know that the decisions I take are based on facts and not just blind misconceptions or false hopes that have been shaped as "reality" in my head just to give myself a clear conscience over a product purchase because I "believed" in the brand name. No thanks, that ain't me.
Nvidia ... here I come.
Originally posted by: Zenoth
I made my decision, and it is very painful for me ... honestly it is, I feel very sad right now, but I will not buy the HD 2900XT, nor XL, nor R600 at all. I will have to switch the coin upside down and stare at the new side I'll be looking at for the months and perhaps years to come. I don't see anything positive about the 2900XT at all so far.
Originally posted by: apoppin
Originally posted by: Matt2
Originally posted by: apoppin
Originally posted by: Matt2
LOL.
Tune for the latest spin. Hehehe. Final silicon.
I can't believe Hector Ruiz still has a job.
I also love how we've been waiting for this GPU forever, but had to wait so AMD could "launch the entire family" and yet we still have to wait another month at least for the rest of the family.
that's what it IS
nvidia has had their 8800GTX "out" in users' hands and couldn't put out a worthwhile driver for over TWO MONTHS
after 4 months it was becoming tolerable according to the reports here
SIX months later there are still big issues - with some users
AMD has JUST released their card today
if their drivers are still a *clusterF&@k* - after two months like nvidia's drivers were
--THEN you can talk.
go for it!
DailyTech's roundup of reviews from around the web for Monday
ATI Radeon HD 2900 XT
@ Hardware Upgrade
@ t-break
@ TweakTown
@ Phoronix
@ VR Zone
@ IT-review
@ Hardware Secrets
@ Guru3D
@ techPowerUp
@ Chile Hardware
that's pretty silly, imo .. making a decision on 'which GPU' in the first few hours of half-finished reviews
i hope you don't buy a car like that ... or a home!
It is what it is, and performance-wise the HD 2900 XT ended up in the lower to mid part of the high-end segment. Sometimes it has a hard time keeping up with a 320MB 8800 GTS, and in other scenarios we see performance close or equal to the GeForce 8800 GTX. Now that would be weird if we all had to pay 600 USD/EUR for it. AMD knows this, and knows it very well. This is why, and honestly this is fantastic, the product is launched at a 399 USD MSRP. Now I have been talking with a lot of board partners already and here in Europe the product will launch at 350 EUR; and that's just golden.
So we need to leave the uber-power-frag-meister-performance idea behind us and mentally position the product where it is and compare it with the money you have to pay for it. For your mental picture, performance-wise I'd say the GeForce 8800 GTS 640 MB is a good comparative product. Then the opinion will change, as you'll receive an awful lot of bang for your buck here. At 350 EUR you'll have a good-performing DirectX 10 compatible product with a new programmable tessellation unit. It comes with 512 megs of memory. It comes with a state-of-the-art memory controller, offers HDCP straight out of the box, all cards have HDMI connectivity with support for sound, and if that alone is not enough, you receive a Valve game bundle with some very hip titles in there for free. So yeah, you really can't go wrong there.
The negatives then; only two really. First off is power consumption. I know that power is a cheap commodity, yet the card is rated at a peak of give or take 215 Watts, and that's just a lot. In a high-end system our power draw was 400 Watts for the entire PC. So chances are that with the HD 2900 XT also comes a PSU upgrade. To make it slightly worse, if you want to overclock you'll need a PSU with the new 8-pin power connector as well. And at this time these are still rare (review here).
The second negative just has to be the fan. It's not dramatic in 2D desktop mode, really. But once you start gaming, you'll definitely hear it. And God forbid, if your card overheats to 100 degrees C (which is not that uncommon) you'll have a local hair-dryer available to you. So my recommendation here is, in overclocked situations, to be on the lookout for water-cooling or just wait for board partners like HIS to come up with another solution. For generic usage however, please don't worry.
The positives: Image quality of the HD 2900 XT is just excellent. I could have gone in-depth but at one point this article has to stop, right? The new HQ anisotropic filtering is just looking great. Then again, it already was exceptionally good on previous generation products. The new CSAA modes are looking to be promising as well. Surely we did not get all modes to work properly but hey, we had to deal with an alpha driver. ATI has a good reputation in fixing this stuff in a timely manner, so we'll look at these modes closer once they work properly. Beta drivers or not, pretty much any game we tested, no re-phrase, all games as tested worked without any issues. The one thing that initially had an issue was 3DMark. Yet Futuremark will release a patch for this soon, as it has to do with Futuremark's system diagnostic scan and thus is not an ATI issue.
What I also have to mention is that the new Avivo HD, or more precisely the UVD decoder, is looking to be awesome. Once you decode high-definition movies over the GPU you'll be surprised by the excellent video quality, it's really that good. The huge positive here is obviously that the entire decoding process is managed on the graphics card. Your CPU cycles are left alone; watching a 1080P 20 Mbit/s HD movie full screen on a 2560x1600 monitor with the CPU still only 10-15% utilized is really very interesting to observe.
So there you have it, guys and gals. The positives and negatives all sum up to an opinion, one that you have to make your decisions on. You will be charmed by the new Radeon HD 2900 XT if you can live with the knowledge of that power consumption and the fact that it's a noisy product. Performance-wise, at 350 EUR this Radeon HD 2900 XT offers an incredible amount of bang for your buck, heaps of functionality and extremely good image quality. For whatever reason, what caught my attention is that whenever a game makes heavy use of pixel shaders the overall performance starts to rise massively. And considering the future of games, that's a very good thing.
AMD's new series of R600 DX10 graphics cards are finally here and we have had plenty of time to examine them, but we are left with a tough decision as to whether we should recommend the HD 2900 XT or not. It impressed us considering the price tag and the extras such as sound output through DVI to HDMI adapters and the Valve software bundle, but it leaves us wondering if AMD could have done a little better...
AMD's Radeon HD 2900 XT is going to cost between US$350 - US$399, closer to the latter in most places. That makes it at least US$150 cheaper than most GeForce 8800 GTX 768MB cards. Sure, Nvidia's 8800 GTX green champ won most of our tests in the 1280 x 1024 to 1920 x 1200 range, but not by margins as big as we expected. What is interesting are our results at the 2560 x 1600 resolution - both products have their share of driver issues (which need to be sorted out quickly!) but from our testing at this point in time, the HD 2900 XT makes a really good showing at the 2560 x 1600 resolution. In games like Prey, Company of Heroes and Far Cry, the more expensive 8800 GTX really only has a small lead over the XT. In real-world terms, both offer basically about the same level of game play experience in terms of smoothness and we put that down to the massive 106GB/s of memory bandwidth from the R600, which is able to be properly utilized at this ultra-high resolution. With some overclocking of the core and memory, the HD 2900 XT could probably match or better the more expensive 8800 GTX and that is with 256MB less memory. This in turn tells us that the engineers of the R600, while slow and indecisive, were smart and produced a GPU which is very efficient - but at the right settings.
We expect factory overclocked HD 2900 XT cards to start selling in less than one month from now. AIB partners currently have the option of ordering 1GB GDDR-4 models with faster clock speeds, but it is unclear whether this product will be called HD 2900 XT 1GB GDDR-4 or HD 2900 XTX - you may end up seeing these types of cards appear in early June (around Computex Taipei show time). If we saw a product like this with a slightly faster core clock and obviously a much faster memory clock (2000 - 2100MHz DDR vs. 1656MHz DDR), we think it would compete very nicely against the GeForce 8800 GTX as far as price vs. performance goes. Sadly we did not have a GeForce 8800 GTS 640MB handy for testing, but matching up with our previous testing on similar test beds, the HD 2900 XT will beat it quite considerably, by around the 20% mark in 3DMark06, for example. This is rather interesting since the HD 2900 XT is in the same price range as the GeForce 8800 GTS 640MB - we will test against this card shortly!
Summing it up, we are happy to see a new high-end Radeon graphics card from AMD - it literally is a red hot flaming monster, but it manages to offer a good amount of performance and an impressive feature set with full DX10 and Shader Model 4.0, Crossfire and Windows Vista support and a host of others which we did not even have enough time to cover in full today, such as improved anti-aliasing and UVD. It was a long time coming but it is able to offer very good bang for buck against the equivalent from Nvidia - the GeForce 8800 GTS.
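For what it's worth, the bandwidth numbers the reviews keep quoting fall straight out of bus width times effective memory clock. A quick sketch of that arithmetic (same Python style as above; the clocks are the commonly cited launch specs and the GDDR4 figure is the rumored one from the excerpt, so treat it all as approximate):

def bandwidth_gbs(bus_bits, effective_gtps):
    # Memory bandwidth in GB/s = bus width in bytes * effective data rate (GT/s).
    return bus_bits / 8 * effective_gtps

print(f"HD 2900 XT (512-bit, 1656 MT/s):             ~{bandwidth_gbs(512, 1.656):.0f} GB/s")
print(f"Rumored 1GB GDDR4 card (512-bit, 2000 MT/s): ~{bandwidth_gbs(512, 2.000):.0f} GB/s")
print(f"8800 GTX (384-bit, 1800 MT/s):               ~{bandwidth_gbs(384, 1.800):.0f} GB/s")
print(f"8800 GTS 640MB (320-bit, 1600 MT/s):         ~{bandwidth_gbs(320, 1.600):.0f} GB/s")

So even the rumored GDDR4 refresh would mostly widen a bandwidth lead the XT already holds over the GTX; whether that turns into frames depends, as the excerpts keep noting, on the rest of the chip keeping that memory fed.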
ATI's Radeon HD 2900 XT is priced at a mere $399, which is an extremely competitive price for a highest-class performance product.
* Breathtaking $399 price point
* Support for DirectX 10, Shader Model 4.0
* HDMI + Audio output included
* New video acceleration features
* New AA Modes
* Not fastest GPU in the world as speculated by many
* Fan can be noisy
* High power consumption
8.9
I have to admit I was a bit disappointed after seeing the benchmark results of the Radeon HD 2900 XT. Even though the card is fast, it cannot beat the NVIDIA GeForce 8800 GTX; it should have no problems beating the GeForce 8800 GTS 320 MB though. Having a product like this, AMD did the only right thing - make it affordable. With a price point of $399 USD this card is taking the price/performance crown without breaking a sweat.
Even though ATI has employed a number of power-saving techniques and heat-reducing features, the heat output is still fairly high, especially when you are overclocking at a higher voltage. This also means that the fan has to run faster and louder than competing products. On the other hand, watercooling will provide seriously improved overclocking potential; companies like Danger Den and Thermaltake already have the first coolers ready.
The new features like the integrated tessellator and CFAA offer new ways to improve image quality while not hurting performance too much. Full HDCP with audio support and H.264 decode acceleration make this card an excellent choice for media PC systems where image quality matters. I really hope that AMD can improve their drivers to offer increased performance, because at this price point the card has the potential to be one that everybody can afford.
