HARDOCP 2900xt Review! (a proper review)


OneOfTheseDays

Diamond Member
Originally posted by: CaiNaM
Originally posted by: apoppin
in case you still don't get it

the SW guys have to "tune" the drivers for the LATEST spin of the FINAL silicon

and they will continue to do so - as nvidia is still doing - for the next few months

but.. they HAVE been working on the drivers.

the hd2900xt has been ready to ship for a couple months now. they just held it back so they could do a "full" lineup release (where ARE those mid-range cards, lol), so you've gotta assume they've been working on drivers for the 'release' cards for months...

agreed they've had plenty of time to work on the drivers. i think they may have overdesigned the R600 this time around and it's going to take quite a while before they can fully tap into its potential.
 

CaiNaM

Diamond Member
Originally posted by: apoppin
Originally posted by: CaiNaM
Originally posted by: apoppin
in case you still don't get it

the SW guys have to "tune" the drivers for the LATEST spin of the FINAL silicon

and they will continue to do so - as nvidia is still doing - for the next few months

but.. they HAVE been working on the drivers.

the hd2900xt has been ready to ship for a couple months now. they just held it back so they could do a "full" lineup release (where ARE those mid-range cards, lol), so you've gotta assume they've been working on drivers for the 'release' cards for months...

do you really believe that nonsense?

i don't

i think they were totally lying ... they were trying to cover the "HW problems" with r600

it just came out ... the silicon has not been finalized for "months" :p

i didn't "buy" their reasoning back when they first stated it.. i was being somewhat sarcastic, but at the same time, the architecture doesn't change in a couple months, so there's really no good reason for them to have poor drivers, at least on xp.

vista is another story, and i am more than willing to cut them some slack. still, it's not like the hd2900xt is a major departure from their previous hardware (unlike g70/71 to g80)....
 

apoppin

Lifer
Originally posted by: Matt2
Originally posted by: apoppin
*everything* you say is true

all the reasons i had to dislike it also

- except the inferior IQ that is unproven .. so far

and we will see the prices of the GTS fall ...

since it is AMD's first foray into the graphics market, i will give them a C-

but a B+ on the card itself ... with the stock GTS earning a C+

I think you're being a little too generous with a grade of B+ on that scale, but that is just MY opinion.

If the GTX is an A, I would give HD2900XT a B, while the GTS 640mb would get a B- and the 320mb GTS a C.

That is just my opinion because I am basing it on pure performance. I don't give a rat's you-know-what about HD playback, HDMI, audio pass-through, etc. As a gamer who watches HD movies and all that junk on a 1080p DLP TV, it just doesn't matter to me.

Again, my opinion. Also, I'd have to buy a new PSU to run this thing comfortably, so a GTX is actually going to be cheaper. Go figure.

i left out the GTX :eek:
--it gets an A+ in my book ...even the "price" is "worth it" ... it is KING!

... "except" for the initial driver support ... but i am not counting 'that' - NOW - as it is admittedly improved ... not close to "perfect" ... but greatly 'improved' :p
[someone PMed me to point out that i neglected to say that the 8800 drivers HAVE improved ... oops ... so, let me "correct" that omission --right now ^]

OK?

i *also* expect the HD XT's drivers to improve just as dramatically

so .. now ... we're pretty close in our grades .. and that is SHOCKING as i am [now labeled] a "fanboy" for liking the HD-XT with a B+ and you gave it a B
[AND ... i started with a higher score for the GTX ... A+ vs. A ... i think we actually have the *same* score!]
:Q

and ... daamit ... i want one

the *only* other card i wanted at LAUNCH was the 6800u :p
-that never worked out although i bought a 450w PS especially for it

we'll see

more reviews to peruse and sales to watch for ...
i'm in no hurry
 

Gstanfor

Banned
I don't think GTS 640 prices will drop much, if at all, in response to this.

One product is flawed, one is as close to perfect as it gets. Your basic "survival of the fittest" law applies here, and R600 ain't the fittest.

B3D review - CFAA
Custom Filter AA

Custom Filter AA, or CFAA for short, is AMD implementing non-box filters that look outside the pixel being processed in order to calculate final colour and antialias the image. The sample resolve for that is performed on the shader core, data passed in so the programmable hardware can do the math, with the filter function defined by the driver. That means AMD can implement a pluggable user filter system if they want, but even if they don't they can update, add and remove filters from the driver at will whenever they see fit.

The big advantage is the ability to perform better filtering than the standard hardware resolve. However the disadvantages include possible implementation of bad filters, and speed issues because the driver now has to issue, and the hardware run, resolve calculation threads which chew up available shader core cycles. Ah, but there's still the regular hardware resolve if you want maximum speed and the regular quality just from sampling a single pixel, right? Well no, that's not always the case.

Even for the basic box filter resolves, where the hardware weights samples based on their proximity to the pixel centroid, R600 will perform the calculations to resolve the samples on the shader core, that is unless compression for the tile is at maximum, so you know the resolve would just return the same colour anyway, so there's no math involved to filter. Currently that points to the hardware resolve either being broken, at least under some conditions (when compression is less than maximum), or it being easier to maintain in code if you're doing other filters anyway, so you only have one path for that in the driver. We lean towards the former, rather than the latter, since the performance deficit for shader core resolve seems to be significant, even for the basic box filter cases. That can be improved in the driver, however, and the hardware helps here by likely being able to decompress sample locations for a pixel into the shader core at a high rate.
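To make the resolve idea concrete, here's a rough sketch (Python, purely illustrative; not AMD's actual filter code) of the difference between a plain box resolve over a pixel's own samples and a CFAA-style wide tent filter that also weights samples from neighbouring pixels:

# Illustrative sketch of two MSAA resolve filters (not AMD's actual code).
# Each pixel stores several colour samples; 'resolve' reduces them to one colour.

def box_resolve(samples):
    """Standard resolve: plain average of the pixel's own samples."""
    n = len(samples)
    return tuple(sum(s[c] for s in samples) / n for c in range(3))

def tent_resolve(grid, x, y, radius=1):
    """CFAA-style wide tent: also weight samples from neighbouring pixels,
    with weight falling off with distance from the pixel being resolved."""
    acc, wsum = [0.0, 0.0, 0.0], 0.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            px, py = x + dx, y + dy
            if not (0 <= py < len(grid) and 0 <= px < len(grid[0])):
                continue  # neighbour off the edge of the framebuffer
            w = max(0.0, 1.0 - (dx * dx + dy * dy) ** 0.5 / (radius + 1))
            for s in grid[py][px]:           # every sample in that pixel
                for c in range(3):
                    acc[c] += w * s[c]
                wsum += w
    return tuple(a / wsum for a in acc)

# 2x2 framebuffer, 2 colour samples per pixel:
grid = [[[(1, 0, 0), (0, 0, 0)], [(1, 1, 1), (1, 1, 1)]],
        [[(0, 0, 0), (0, 0, 0)], [(0, 1, 0), (0, 1, 0)]]]
print(box_resolve(grid[0][0]))      # (0.5, 0.0, 0.0) - own samples only
print(tent_resolve(grid, 0, 0))     # blurrier: neighbours pulled in

Since the filter weights come from the driver and the math runs on the shader core, swapping in an edge-detect or a narrower tent is "just" a driver update, which is the pluggable-filter point above; the price is that every resolved pixel now costs shader cycles.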

Beyond 3d review - summary
Architecture Summary

Well well, graphics fans, it's finally here! Years in the making for AMD, via the hands of 300 or so engineers, hundreds of millions of dollars in expenditure, and unfathomable engineering experience from the contributing design teams at AMD, R600 finally officially breaks cover. We've been thinking about the architecture and GPU implementations for nearly a year now in a serious fashion, piecing together the first batches of information sieved from yon GPU information stream. As graphics enthusiasts, it's been a great experience to finally get our hands on it and put it through the mill of an arch analysis after all those brain cycles spent thinking about it before samples were plugged in and drivers installed.

So what do we think, after our initial fumblings with the shader core, texture filter hardware and ROPs? Well, arguably the most interesting bits and pieces the GPU and the boards that hold it provide, we've not been able to look at, either for time reasons, resource reasons, or because they simply fall outside this article's remit! That's not to say things like the UVD, HDMI implementation and the tessellator overshadow the rest of the chip and architecture, but they're significant possible selling points that'll have to await our judgement a little while longer.

What remains is a pretty slick engineering effort from the guys and gals at AMD's Graphics Products Group, via its birth at the former ATI. What you have is evolution rather than revolution in the shader core, AMD taking the last steps to fully superscalar with independent 5-way ALU blocks and a register file with seemingly no real-world penalty for scalar access. That's backed up by sampler hardware with new abilities and formats supported to chew on, with good throughput for common multi-channel formats, and both the threaded sampler and shader blocks are fed and watered by an evolution of their ring-bus memory controller. We've sadly not been able to go into too much detail on the MC, but mad props to AMD for building a 1024-bit bi-directional bus internally, fed by a 16-piece DRAM setup on the 512-bit external bus.
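That 5-way design is also the root of the "harder to compile for" caveat later in this summary: the compiler has to find up to five independent scalar ops to fill each ALU block every cycle. A toy greedy packer (illustrative only; nothing like AMD's real shader compiler) shows why dependent scalar chains waste slots:

# Toy VLIW5 bundle packer (illustrative only; not AMD's shader compiler).
# Each instruction is (name, set of names it depends on). The packer greedily
# fills bundles of up to 5 mutually independent, ready ops per 'cycle'.

def pack_vliw5(instrs, width=5):
    done, bundles, pending = set(), [], list(instrs)
    while pending:
        bundle, waiting = [], []
        for name, deps in pending:
            if len(bundle) < width and deps <= done:
                bundle.append(name)          # slot filled this cycle
            else:
                waiting.append((name, deps)) # deps not ready, or bundle full
        done |= set(bundle)
        bundles.append(bundle)
        pending = waiting
    return bundles

# A dependent chain fills 1 of 5 slots per cycle (poor VLIW utilisation)...
chain = [("a", set()), ("b", {"a"}), ("c", {"b"}), ("d", {"c"})]
print(pack_vliw5(chain))   # [['a'], ['b'], ['c'], ['d']]
# ...while independent scalar ops pack a full bundle in one cycle.
par = [("a", set()), ("b", set()), ("c", set()), ("d", set()), ("e", set())]
print(pack_vliw5(par))     # [['a', 'b', 'c', 'd', 'e']]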

Who said the main IHVs would never go to 512? AMD have built that controller in the same area as the old one (whoa, although that's helped by the process change). Using stacked pads and an increase in wire density, affording them the use of slower memory (which is more efficient, given the clock delays incurred when running at higher speeds), R600 in HD 2900 XT form gets to sucking over 100GB/sec peak theoretical bandwidth from the memories. That's worth a tip of an engineer's hat any day of the week.
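That bandwidth figure is easy to sanity-check: peak bandwidth is just bus width times effective data rate. Assuming the launch HD 2900 XT's 828MHz GDDR3 (DDR, so 1656MT/s) on the 512-bit bus:

# Peak theoretical memory bandwidth = (bus width in bytes) x (effective rate).
# Clock assumed from the launch HD 2900 XT spec: 828MHz GDDR3, DDR -> 1656MT/s.
bus_bits = 512
effective_mtps = 2 * 828                 # two transfers per clock (DDR)

bytes_per_transfer = bus_bits // 8       # 64 bytes per transfer
gb_per_sec = bytes_per_transfer * effective_mtps / 1000
print(f"{gb_per_sec:.1f} GB/s")          # 106.0 GB/s -> 'over 100GB/sec'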

Then we come to the ROP hardware, designed for high performance AA with high precision surface formats, at high resolution, with an increase in the basic MSAA ability to 8x. It's here that we see the lustre start to peel away slightly in terms of IQ and performance, with no fast hardware resolve for tiles that aren't fully compressed, and a first line of custom filters that can have a propensity to blur more than not. Edge detect is honestly sweet, but the CFAA package feels like something tacked on recently to paper over the cracks, rather than something forward-looking (we'll end up at the point of fully-programmable MSAA one day) to pair with speedy hardware resolve and the usual base filters. AMD didn't move the game on in terms of absolute image quality when texture filtering, either. They're no longer the leaders in the field of IQ, overtaken by NVIDIA's GeForce 8-series hardware.

Coming back to the front of the chip, the setup stage is where we find the tessellator. Not part of a formal DirectX spec until next time with DX11, it exists outside of the main 3D graphics API of the time, and we hope the ability to program it reliably comes sooner rather than later, since it's a key part of the architecture and didn't cost AMD much area. We'll have a good look at the tessellator pretty soon, working with AMD to delve deep into what the unit's capable of.

With a harder-to-compile-for shader core (although one with monstrous floating point peak figures), less per-clock sampler ability for almost all formats and channel widths, and a potential performance bottleneck with the current ROP setup, R600 has heavy competition in HD 2900 XT form. AMD pitch the SKU not at (or higher than) the GeForce 8800 GTX as many would have hoped, but at the $399 (and that's being generous at the time of writing) GeForce 8800 GTS 640MiB. And that wasn't on purpose, we reckon. If you asked ATI a year ago what they were aiming for with R600, the answer was a simple domination over NVIDIA at the high end, as always.

While we take it slow with our analysis -- and it's one where we've yet to heavily visit real world game scenarios, DX10 and GPGPU performance, video acceleration performance and quality, and the cooler side facets like the HDMI solution -- the Beyond3D crystal ball doesn't predict the domination ATI would have hoped for a year or more ago. Early word from colleagues at HEXUS, The Tech Report and Hardware.fr in that respect is one of mixed early performance that's 8800 GTS-esque or thereabouts overall, but also sometimes less than Radeon X1950 XTX in places. Our own early figures there show promise for AMD's new graphics baby, but not everywhere.

It's been a long time since that's been something anyone's been able to say about a leading ATI, now AMD, graphics part. We'll know a fuller story as we move on to looking at IQ and performance a bit closer, with satellite pieces to take in the UVD and HDMI solution and the tessellator to come as well. However after our look at the base architecture, we know that R600 has to work hard for its high-quality, high-resolution frames per second, but we also know AMD are going to work hard to make sure it gets there. We really look forward to the continued analysis of a sweet and sour graphics architecture in the face of stiff competition, and we'll have image quality for you in a day or two to keep things rolling. RV610 and RV630 details will follow later today.

I'd say the architecture was more sour, with a faint hint of sweet.

I'd score it an "F" for two reasons - (1) Failure, (2) FanATIcs only.
 
otispunkmeyer
oh my,

it doesn't seem all that bad imo, stalker aside (and maybe oblivion). it's not a salivating old hound sitting in a puddle of its own urine, is it?

but it is hotter, noisier, draws lots of power and is currently more expensive than the GTS640, though i imagine the price will drop pretty quick if they hope to shift these off the shelves. for me, noise, heat and power draw are quite big factors... i'd even go for a slower/more expensive card if it was going to be quiet and cool.

it's also a shame to see the once "top dog" of IQ not really advancing on that front; it appears they just cut-n-pasted the logic over from the R580. obviously that's not exactly a bad thing, but it's not progress, something which nvidia did make (well, really they needed to, didn't they). g80 has a near-perfect AF implementation, and performance to boot.

it's also a shame to read about the CFAA being a post-process... that's a letdown, but the customisable/programmable aspect of it sounds great; we could get AA patterns tailored specifically to individual games.

crossfire is a no-no for me anyway.... i tried multicard gaming and found single card gaming to be a more fluid experience.

i guess if i'm going to go the way of an SFF i might as well pick up a GTS640 and be done with it.... i need a cool-running card for that cramped space, and one that won't melt the tiny 400W psu in those shuttles.
 

Pugnate

Senior member
I really don't trust HARDOCP. Their reviews are often ******, and they do things differently just for the sake of it, to have a claim to fame. Their review of the 8600s was laughable, and who can forget their stupid comments against buying a C2D processor? Anyway, I've read this review, and is it really that bad? If it performs on the level of an 8800GTS, and is priced as such, what's the problem?

I am worried that AMD will throw in the towel here, and just stop working on high end video cards. That would leave Nvidia without any motivation to innovate.

The fact that this card overclocks like mad, and the drivers are far from finished, gives me hope.

However, over a year ago AnandTech had a scoop on how next-gen cards were going to require excessive amounts of power and would need water cooling etc. I am sure they were talking about the R600. I am sure ATi realized it was a problem then and worked on it really hard, which is what gives us the R600 today. I just don't think they can get the R600 any cooler or make it use any less power.

I just hope AMD doesn't give up on the GPU department.

 
otispunkmeyer
Originally posted by: fierydemise
Originally posted by: swtethan
all the benches used different AA settings??? :( what!
Its HardOCP's "benchmarking" method, they determine what is the maximum quality settings at a "playable" frame rate then compare what they could enable on each card.

it's an opinionated way to bench.... after all it's just Kyle's or whoever's opinion as to what ran best. but i'd argue that for the majority his opinion will be pretty close to home, and you can look at the numbers to confirm; i'd be happy with those FPS at those settings.

i think benching this way is more revealing imo. it's not entirely scientific (opinion on what's best, and it's not 100% repeatable) but i still think tests like these are better than canned/scripted benchmarks, which don't always show the whole story
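the gist of it is easy enough to sketch out (rough illustration in Python; [H]'s actual process is a human playing the game and judging smoothness, not a script): walk down from max settings until the average FPS clears whatever you call "playable".

# Rough sketch of the [H]-style 'highest playable settings' search.
# Illustrative only: in the real reviews a human plays and judges, rather
# than a script comparing average FPS against a fixed threshold.

PLAYABLE_FPS = 30.0   # assumed cutoff; 'playable' is ultimately a judgement call

def highest_playable(configs_best_first, benchmark):
    """configs_best_first: candidate settings, highest quality first.
    benchmark: returns average FPS for one config on one card."""
    for config in configs_best_first:
        if benchmark(config) >= PLAYABLE_FPS:
            return config                 # first playable = highest playable
    return configs_best_first[-1]         # nothing playable: settle for lowest

candidates = [
    {"res": "1920x1200", "aa": "8x", "af": "16x"},
    {"res": "1920x1200", "aa": "4x", "af": "16x"},
    {"res": "1600x1200", "aa": "2x", "af": "8x"},
]

def fake_benchmark(config):               # stand-in for a real play session
    return {"8x": 24.0, "4x": 41.0, "2x": 55.0}[config["aa"]]

print(highest_playable(candidates, fake_benchmark))
# -> {'res': '1920x1200', 'aa': '4x', 'af': '16x'}; run per card, so two
# cards can legitimately end up compared at different settings.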
 

bobthemongoloid

Junior Member
i'd bet money drivers make this whole weird situation look a LOT better in a month or so.. higher FPS at 1920 than at 1600x1200?... At the least it'll probably level out with the gtx, and at around 100 to 150 bucks less, or more, they MIGHT be competitive, but this late, does it matter? I'm looking for a $100 (1280x1024) gaming card, so with price drops it matters to me =)
 

Matt2

Diamond Member
Originally posted by: Pugnate
I really don't trust HARDOCP. Their reviews are often ******, and they do things differently just for the sake of it, to have a claim to fame. Their review of the 8600s was laughable, and who can forget their stupid comments against buying a C2D processor? Anyway, I've read this review, and is it really that bad? If it performs on the level of an 8800GTS, and is priced as such, what's the problem?

I am worried that AMD will throw in the towel here, and just stop working on high end video cards. That would leave Nvidia without any motivation to innovate.

The fact that this card overclocks like mad, and the drivers are far from finished, gives me hope.

However, over a year ago AnandTech had a scoop on how next-gen cards were going to require excessive amounts of power and would need water cooling etc. I am sure they were talking about the R600. I am sure ATi realized it was a problem then and worked on it really hard, which is what gives us the R600 today. I just don't think they can get the R600 any cooler or make it use any less power.

I just hope AMD doesn't give up on the GPU department.

The problem is that it is six months late and the fastest GPU ATI/AMD could come up with is only the 3rd or 4th fastest SKU on the market. That means Nvidia has no reason to lower the price of the GTX or the Ultra and is going to operate completely uncontested at the high end.

They also failed to reach the level of IQ obtained by Nvidia.

Nvidia managed to deliver a GPU that is faster, cooler, quieter and consumes less power, before AMD knew what hit them.

Don't get your hopes up about overclocking R600 either. Read the Driverheaven review. They hit 865/1026 and R600 only gained 6 fps in FEAR and 1 fps in Prey.
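For perspective: stock HD 2900 XT clocks are 743MHz core / 828MHz memory, so 865/1026 is roughly +16% core and +24% memory. A quick back-of-envelope calc (the baseline FPS here is hypothetical, purely to show the scaling math) makes clear how small a 6 fps gain is against that:

# Back-of-envelope overclock scaling check. Stock HD 2900 XT reference
# clocks are 743MHz core / 828MHz memory; the baseline FPS is hypothetical.
stock_core, stock_mem = 743, 828
oc_core, oc_mem = 865, 1026              # Driverheaven's reported overclock

core_gain = oc_core / stock_core - 1     # ~+16.4%
mem_gain = oc_mem / stock_mem - 1        # ~+23.9%
print(f"core +{core_gain:.1%}, mem +{mem_gain:.1%}")

baseline = 60.0                          # hypothetical stock FPS in some game
perfect = baseline * (1 + core_gain)     # ~70 fps if FPS tracked core clock
actual = baseline + 6                    # the kind of gain Driverheaven saw
print(f"perfect scaling {perfect:.0f} fps vs actual-style {actual:.0f} fps")
# FPS clearly isn't tracking clocks, so the bottleneck is elsewhere.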
 
otispunkmeyer
Originally posted by: apoppin
Originally posted by: AmdInside
"We asked ATI why there is no higher-end version and they pretty much told us that no one would buy it when you take the CrossFire into consideration. Well, we know that is simply false because there are people that buy $499 and $599 single video cards, case in point the GeForce 8800 GTX. ATI?s answers were a painful copout to not being able to get the job done and it was obvious. "

Man, what happened with ATI facing the truth? They really need to fire this AMD person or all that talk about facing the truth and making changes within the company was just a lie.

that is not what AMD is telling anyone else ... who knows which marketing clown talked to him

--or find a link :p

let's look for supporting reviews ... before i accept this one's conclusion - based on OLD drivers - anyway :p

again coldpower27 .. who cares what the reason is ... out of date is out of date

invalid

move along to a newer set of more orthodox benches, please


didn't HardOCP say they were given some alpha drivers from ATi, i.e. drivers so fresh you can't get 'em yet, and that the performance increase wasn't worth re-doing all the benches?

i'm sorry mate, but you're banging on about "out of date drivers" (which for ati they are every single month, as they bring out new ones), but do you honestly think that using a driver one month younger is going to paint a much rosier picture? i don't.

they've had copious amounts of time to work on them too. R600 has been in the works for some time.

it's like a mars bar that's 1 month out of date. yeah, it's out of date.... still edible though, and you won't die 2 hours later.

it's each to his own on review style though. i prefer the HOCP and Bit-tech style... playing the games, recording the FPS, suggesting best playable settings. yes it's more subjective and less repeatable (read: unscientific), but it gives me a much better idea of what i can and can't get away with.

3dmark is a total pile; the only use for it is the overclocking crowd.

canned benches can be just as bad at times unless they are chosen carefully.
 

Gstanfor

Banned
Don't get your hopes up about overclocking R600 either. Read the Driverheaven review. They hit 865/1026 and R600 only gained 6 fps in FEAR and 1 fps in Prey.
I posted Brent Justice's comments that overclocking and 3dmark06 scores derived from said overclock "don't mean jack in the real world" on R600, yet people ***STILL*** persist in thinking that overclocking is going to magically save R600.

3dmark07 is getting close to completion too - watch the fanATIcs crawl all over that too...
 
otispunkmeyer
Originally posted by: Gstanfor
Don't get your hopes up about overclocking R600 either. Read the Driverheaven review. They hit 865/1026 and R600 only gained 6 fps in FEAR and 1 fps in Prey.
I posted Brent Justice's comments that overclocking and 3dmark06 scores derived from said overclock "don't mean jack in the real world" on R600, yet people ***STILL*** persist in thinking that overclocking is going to magically save R600.

3dmark07 is getting close to completion too - watch the fanATIcs crawl all over that too...

there's conflicting info too.

i've seen claims that 1GHz on air is achievable, and that overclocking unlocks crazy performance reserves.

i've seen a few previews, reviews and leaks where 800MHz+ was achieved

then we hear things like: it draws so much power as it is that it's unlikely the 1GB GDDR4 version will come with appreciably higher clocks, because it'll just consume too much power.

and other dubious r600 OCs that reveal paltry gains, not worth the effort invested to overclock it.

i mean, if it'll go from 800MHz to 1GHz that easily on air, why doesn't it come like that out of the box? they need all they can get.... they must have known that, so if it is capable of banging on the door of 1GHz, why not do it?
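part of the answer is that dynamic power scales roughly with frequency times voltage squared, and a clock bump that big usually needs a voltage bump too. a rough sketch with made-up but plausible numbers (none of these are AMD's actual figures):

# dynamic power scales roughly as P ~ C * V^2 * f, and hitting 1GHz from
# 743MHz would likely need extra voltage. every number here is made up,
# purely to show the scaling, not an actual AMD figure.
stock_f, stock_v, stock_w = 743, 1.20, 160.0   # MHz, volts, watts (illustrative)
oc_f, oc_v = 1000, 1.35                        # hypothetical 1GHz + volt bump

scale = (oc_f / stock_f) * (oc_v / stock_v) ** 2
print(f"x{scale:.2f} -> ~{stock_w * scale:.0f}W")
# ~x1.70 -> ~273W: past what the stock cooler and 6+8-pin power delivery
# were specced for, which is one plausible reason for the conservative clock.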
 

MooMooCow

Senior member
Originally posted by: otispunkmeyer
Originally posted by: Gstanfor
Don't get your hopes up about overclocking R600 either. Read the Driverheaven review. They hit 865/1026 and R600 only gained 6 fps in FEAR and 1 fps in Prey.
I posted Brent Justice's comments that overclocking and 3dmark06 scores derived from said overclock "don't mean jack in the real world" on R600, yet people ***STILL*** persist in thinking that overclocking is going to magically save R600.

3dmark07 is getting close to completion too - watch the fanATIcs crawl all over that too...

there's conflicting info too.

i've seen claims that 1GHz on air is achievable, and that overclocking unlocks crazy performance reserves.

i've seen a few previews, reviews and leaks where 800MHz+ was achieved

then we hear things like: it draws so much power as it is that it's unlikely the 1GB GDDR4 version will come with appreciably higher clocks, because it'll just consume too much power.

and other dubious r600 OCs that reveal paltry gains, not worth the effort invested to overclock it.

i mean, if it'll go from 800MHz to 1GHz that easily on air, why doesn't it come like that out of the box? they need all they can get.... they must have known that, so if it is capable of banging on the door of 1GHz, why not do it?


Pretty much every review has brought up the fact that this card generates a lot of heat, eats up a lot of power, and is very loud while doing both of those things. I'm sure if AMD could have shipped the card at a higher clock speed without it sounding like a typhoon while causing blackouts, AMD would have.
 

tuteja1986

Diamond Member
WHY are we even talking about the Hardocp review... I thought we made it clear they can't be trusted... Just look back at the C2D.
 
otispunkmeyer
Originally posted by: Pugnate
I really don't trust HARDOCP. Their reviews are often ******, and they do things differently just for the sake of it, to have a claim to fame. Their review of the 8600s was laughable, and who can forget their stupid comments against buying a C2D processor? Anyway, I've read this review, and is it really that bad? If it performs on the level of an 8800GTS, and is priced as such, what's the problem?

I am worried that AMD will throw in the towel here, and just stop working on high end video cards. That would leave Nvidia without any motivation to innovate.

The fact that this card overclocks like mad, and the drivers are far from finished, gives me hope.

However, over a year ago AnandTech had a scoop on how next-gen cards were going to require excessive amounts of power and would need water cooling etc. I am sure they were talking about the R600. I am sure ATi realized it was a problem then and worked on it really hard, which is what gives us the R600 today. I just don't think they can get the R600 any cooler or make it use any less power.

I just hope AMD doesn't give up on the GPU department.

i simply cannot agree with this, not from an engineering perspective anyway.

even when it's just you: as an engineer (if you're an enthusiastic one, and most are) you want more, you want to push the boundaries, try something new, explore new paths and methods. that's what engineers do... i know, i am one.

they're not gonna sit there and say "well lads, looks like it's wrapped up for a while... let's spend the next month in the pub collecting money"

if they're at all passionate about engineering at nvidia, which i'm sure they are, they'll have been chomping at the bit to design improvements and advancements... engineers don't just sit there.... they always think "how can i make that better, or more efficient, or more powerful"

however..... the accountants run the show, and i fear they simply couldn't care less about being able to do 32x SSAA with a negligible performance hit, or being able to encode DivX at 100x real time. they'll just turn up for work, make sure the books stay in the black, and then go home again in their generic boring cars.

if engineers ran the show.... man, would life be different. we'd have some pretty funky stuff.
 

sisq0kidd

Lifer
Originally posted by: otispunkmeyer
Originally posted by: Pugnate
I really don't trust HARDOCP. Their reviews are often ******, and they do things differently just for the sake of it, to have a claim to fame. Their review of the 8600s was laughable, and who can forget their stupid comments against buying a C2D processor? Anyway, I've read this review, and is it really that bad? If it performs on the level of an 8800GTS, and is priced as such, what's the problem?

I am worried that AMD will throw in the towel here, and just stop working on high end video cards. That would leave Nvidia without any motivation to innovate.

The fact that this card overclocks like mad, and the drivers are far from finished, gives me hope.

However, over a year ago AnandTech had a scoop on how next-gen cards were going to require excessive amounts of power and would need water cooling etc. I am sure they were talking about the R600. I am sure ATi realized it was a problem then and worked on it really hard, which is what gives us the R600 today. I just don't think they can get the R600 any cooler or make it use any less power.

I just hope AMD doesn't give up on the GPU department.

i simply cannot agree with this, not from an engineering perspective anyway.

even when it's just you: as an engineer (if you're an enthusiastic one, and most are) you want more, you want to push the boundaries, try something new, explore new paths and methods. that's what engineers do... i know, i am one.

they're not gonna sit there and say "well lads, looks like it's wrapped up for a while... let's spend the next month in the pub collecting money"

if they're at all passionate about engineering at nvidia, which i'm sure they are, they'll have been chomping at the bit to design improvements and advancements... engineers don't just sit there.... they always think "how can i make that better, or more efficient, or more powerful"

however..... the accountants run the show, and i fear they simply couldn't care less about being able to do 32x SSAA with a negligible performance hit, or being able to encode DivX at 100x real time. they'll just turn up for work, make sure the books stay in the black, and then go home again in their generic boring cars.

if engineers ran the show.... man, would life be different. we'd have some pretty funky stuff.

I don't know why people keep making this argument either. Innovation might take a different path, but I doubt any large-scale manufacturer is going to stop innovating.

Companies will innovate for the sake of innovating, if not because of the possibility of being overtaken by an emerging company.
 

Pugnate

Senior member
I am just taking Intel as an example. Without competition they were happy churning out products that were marginally better. If you look at the software side of things, EA has bought just about every sporting license you can think of, and comfortably releases the same garbage each year, and people eat it up.

I know video hardware is different, but will Nvidia really fund as much research and innovation if their only competition at the high end is themselves? Just a thought.
 

Mem

Lifer
I'm waiting for Anandtech's review before I draw my own conclusions; however, GTS prices have been falling lately and this will become a critical factor for a lot of people.

Drivers are one thing, but the right pricing is a big thing for many.
 

Gstanfor

Banned
if engineers ran the show.... man, would life be different. we'd have some pretty funky stuff.
That's just it - at ATi, engineers DO run the show, and you end up with over-engineered chips that don't focus on what their customers in the marketplace really need and want - chips engineered for the sake of engineering. Witness the ring bus and 512-bit memory bus.
 

yacoub

Golden Member
Originally posted by: Matt2
HD2900XT consistently loses across the board to both the 8800GTS 640mb and the 8800GTX.

From their conclusion:

The ATI Radeon HD 2900 XT however is more akin to NVIDIA's GeForce FX 5800. It does not seem like this will have a very long life span in comparison. NVIDIA quickly answered the GeForce FX 5800 by introducing the GeForce FX 5900 (NV35). ATI really needs to do something similar in this situation, or they may lose some loyal fans in the enthusiast community and you can bet they are going to continue to lose sales to NVIDIA's 8000 series products.

EDIT: Another tidbit from their conclusion:

Despite what the numbers in 3DMark are showing, our evaluation has proven that the ATI Radeon HD 2900 XT is slower than a GeForce 8800 GTS when it comes to actually gaming. Even our apples-to-apples real gaming tests confirmed that the 8800 GTS is faster than the HD 2900 XT and nowhere close to the GeForce 8800 GTX, yet here sits 3DMark showing us the opposite!

So the saying is true - you can code for 3DMark or you can code for games.
 

yacoub

Golden Member
Is the G80 refresh really as far away as Nov/Dec? The G80 has been out since last Nov/Dec, right? I thought refreshes came out 6-9 months after the main product. I would expect a refresh around September.