techPowerUP! goofs and posts HD2900XT review early?

Page 3 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Matt2
To this very day, Nvidia's CEO swears that even though NV30 was unsuccessful, it pushed GPU design in the right direction.

Well, wasn't the hugely successful 6 series based on this architecture?
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Matt2
To this very day, Nvidia's CEO swears that even though NV30 was unsuccessful, it pushed GPU design in the right direction.
What's the nVidia CEO supposed to say?

Hey, we took millions of dollars from consumers to produce a flop, and then when we found that out, we hired a marketing firm and tried to pawn it off as the best stuff evar!

:D
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: nullpointerus
Originally posted by: Matt2
To this very day, Nvidia's CEO swears that even though NV30 was unsuccessful, it pushed GPU design in the right direction.
What's the nVidia CEO supposed to say?

Hey, we took millions of dollars from consumers to produce a flop, and then when we found that out, we hired a marketing firm and tried to pawn it off as the best stuff evar!

:D

What else, advertise their product as "worse than the competitors." Yeah, I sure see the company staying together with that kind of ad campaign... :disgust:
 

defiantsf

Member
Oct 23, 2005
132
0
0
For the faithful, there is still a chance that the HD2900XT will blow away the GTX in DX10 games. AMD/ATI must have optimized it for DX10. We haven't seen the whole story yet.

Yeah...

/fingers-crossed

 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Nightmare225
Originally posted by: nullpointerus
Originally posted by: Matt2
To this very day, Nvidia's CEO swears that even though NV30 was unsuccessful, it pushed GPU design in the right direction.
What's the nVidia CEO supposed to say?

Hey, we took millions of dollars from consumers to produce a flop, and then when we found that out, we hired a marketing firm and tried to pawn it off as the best stuff evar!

:D

What else, advertise their product as "worse than the competitors." Yeah, I sure see the company staying together with that kind of ad campaign... :disgust:

:roll:

Humor and business advice are two different things, Nightmare225.

In the future I would appreciate it if you wouldn't take my jokes as serious business recommendations--and then try to belittle me for them.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Nightmare225
Originally posted by: nullpointerus
Originally posted by: Matt2
To this very day, Nvidia's CEO swears that even though NV30 was unsuccessful, it pushed GPU design in the right direction.
What's the nVidia CEO supposed to say?

Hey, we took millions of dollars from consumers to produce a flop, and then when we found that out, we hired a marketing firm and tried to pawn it off as the best stuff evar!

:D

What else, advertise their product as "worse than the competitors." Yeah, I sure see the company staying together with that kind of ad campaign... :disgust:

ok guys, don't shoot the messenger, I'm just regurgitating what I read.

My point was that although Nvidia thought they had a winner with NV30, things just don't play out the way they're supposed to sometimes. R300 probably made Nvidia poo their pantalones just like G80 probably made ATI poo theirs.

Just because a GPU didn't live up to its expectations doesn't mean that all these R600 benches we are seeing are fake. I know I'm bordering on heresy here on this board by saying this, but why can't some people just accept the fact that ATI *might* have pulled an Nvidia and developed a bad GPU?

It's too premature to say that R600 is going to be a flop, but the possibility is there. I have a hard time believing that 5 leaked reviews are all painting the same picture by accident, but some just can't bring themselves to think that ATI can be beaten by Nvidia. I haven't seen one leaked bench that disagrees with DT's initial assessment.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
What a sad state of affairs when the only thing people care about is fps. While it's the most important thing to me, there are various other things that make a difference, such as IQ, price, drivers, bundle, etc. Things none of these "reviews" touched on. With 2900XTs being around $100 cheaper (or more) than 8800GTXs, it's far too early to claim that one card is better than the other. Unless of course, you're a mindless drone who believes all rumors posted on the intarweb.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Hey! I can shoot the messenger if I want 'cause I'm using a clown gun with a fold-out BANG! hanky. :)

Actually, I don't recall expressing an opinion either way WRT your main point (though I was leaning toward agreeing with you).
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Ackmed
What a sad state of affairs when the only thing people care about is fps. While it's the most important thing to me, there are various other things that make a difference, such as IQ, price, drivers, bundle, etc. Things none of these "reviews" touched on. With 2900XTs being around $100 cheaper (or more) than 8800GTXs, it's far too early to claim that one card is better than the other. Unless of course, you're a mindless drone who believes all rumors posted on the intarweb.

Hmm... then it's always "a sad state of affairs." Most people are mindless drones on any political subject, and pre-launch from either company is *always* a very political subject in this forum. So this kind of behavior is completely normal. Ridiculous, but normal. It'll quiet down a bit when we get some hard data.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Ackmed
What a sad state of affairs when the only thing people care about is fps. While it's the most important thing to me, there are various other things that make a difference, such as IQ, price, drivers, bundle, etc. Things none of these "reviews" touched on. With 2900XTs being around $100 cheaper (or more) than 8800GTXs, it's far too early to claim that one card is better than the other. Unless of course, you're a mindless drone who believes all rumors posted on the intarweb.

Speaking of IQ, have we heard anything on R600's IQ?

Earlier in the week I remember reading a tech slide someone posted that showed R600's new AA modes. I can't for the life of me remember what it's called, CFSAA or something like that. It's comparable to CSAA, but better, according to the slide. It supposedly offers much better edge detection than CSAA.

Other than that, I don't think there is much more ATI can improve on in terms of plain IQ. G80 is no NV40/G71 in terms of IQ, so I don't think there's any way ATI can be a runaway winner of the IQ battle this time around.
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
Originally posted by: MadBoris
Gosh, that was a surprise, huh? Worse than last week's leaks. 7 months late and a billion dollars short.

I love how the 2900XT rules a GTX in 3DMark; after all, we know that is more important than the games.
Before, they used to gear drivers toward 3DMark; now they actually engineer silicon for it. :p ;)

Anyway, I hope some of you support AMD/ATI, they need the support.

I for one will always try to support them, but not blindly and not at any price. If that new generation isn't at least "as good as" its competition, then it'll need to provide some unique features and/or better (lower) prices overall for me to buy any of the HD 2K series.

I never "hated" nVidia, I have no reason to. But I never sought to support them either. If G80 proves to *still* be the best overall (price, performance, features), then I will surely consider one of the G80s for my next upgrade, which is starting to be needed more than ever. My gaming experience hasn't improved since December 2005 (when I bought the system in my sig); it's slowly but surely getting worse, and it isn't negligible anymore like it was a year ago.
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
Originally posted by: theprodigalrebel
1600x1200, 4xAA/16xAF

Far Cry
FEAR
Prey
Quake 4
X3

Pretty lame review. No Oblivion, Rainbow Six Vegas, Supreme Commander, STALKER etc.

Edit: BTW, the review said 2900XT but the graphs show 2900XTX.

How f'ing stupid are they not to post the numbers for the GTS? They left out the most important card, the 640MB 8800GTS (and the 320MB for that matter), even though that's the card the XT is supposedly going to be most price-competitive with. I'd love to know how the GTS scores in those games at those resolutions in an identical rig. I'm betting the GTS wasn't tested because it does just as well but will be around $50 cheaper than the XT, and that's for the 640MB version.
 

coolpurplefan

Golden Member
Mar 2, 2006
1,243
0
0
Originally posted by: ShadowOfMyself
... So they spent millions in R&D to make a card that gets owned by their own last gen part? Yes, that must be it :roll:

Maybe it's optimized for DX10. I can't see why they'd come out with an inferior card. Unless all of the other features of their HD cards make it really worthwhile like encoding and whatever.

 

yacoub

Golden Member
May 24, 2005
1,991
14
81
Originally posted by: Nightmare225
Originally posted by: ShadowOfMyself
Originally posted by: Pugnate
I can't believe some of you guys believe this to be true. You think the card is so bad that it is outperformed by a 1950XT? Please.

Maybe this is intentional from ATi so that by the time the card comes everyone is shocked by its awesomeness. :p

That's what bugs me too... If the card sits somewhere between the 8800GTS and GTX in DX9, and takes the crown in DX10, it's perfectly acceptable... but getting kicked by the last-gen part? Not even the 5800FX was that bad

Wasn't it? I vaguely remember it getting beat by the 4 series in some scenarios...

Heck, the 8600GTS does that to the 7900GT :D And I think the 8600GT does it to the 7900GS.
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
Originally posted by: golem
Originally posted by: Pugnate
I can't believe some of you guys believe this to be true. You think the card is so bad that it is outperformed by a 1950XT? Please.

Maybe this is intentional from ATi so that by the time the card comes everyone is shocked by its awesomeness. :p

I think you're joking, but if it is true that ATI did this intentionally, then the reviews are legitimate at THAT time given that both the software/hardware was provided by ATI.

We'll see when more reviews are made available, but it's possible that all these reviews with these weird numbers could be correct. ATI is trying out a new architecture and maybe it just doesn't translate well to the way some games run.

Yes I was joking.

Hence the :p

:p

I think the FX 5800 Ultra will go down in history as one of those mythical video cards that couldn't even render 2d and ate children. The more time elapses, the worse the story gets about the FX 5800 Ultra.

What a lot of people are forgetting, though, is that its competition was simply a beast. The FX 5800 Ultra wasn't necessarily a bad card, especially in DX8; it was just a horrible card compared to its competitor in DX9.

I think by the time my post ends, the 5800 story will have morphed into a story about how the card murdered kittens.

hahahahahaha that's the funniest thing I've read in a while.

Just because a GPU didn't live up to its expectations doesn't mean that all these R600 benches we are seeing are fake. I know I'm bordering on heresy here on this board by saying this, but why can't some people just accept the fact that ATI *might* have pulled an Nvidia and developed a bad GPU?

It's too premature to say that R600 is going to be a flop, but the possibility is there. I have a hard time believing that 5 leaked reviews are all painting the same picture by accident, but some just can't bring themselves to think that ATI can be beaten by Nvidia. I haven't seen one leaked bench that disagrees with DT's initial assessment.

Well because the R600 is a more advanced version of what powers the 360. It can't be that bad!

What a sad state of affairs when the only thing people care about is fps. While it's the most important thing to me, there are various other things that make a difference, such as IQ, price, drivers, bundle, etc. Things none of these "reviews" touched on. With 2900XTs being around $100 cheaper (or more) than 8800GTXs, it's far too early to claim that one card is better than the other. Unless of course, you're a mindless drone who believes all rumors posted on the intarweb.

Yes, we are paying $500 for a shiny box when we could have paid $150 for the same performance. ;)

But I agree with you, I can't believe any of this crap to be true.

Look I think there was a reason these 'benchmarks' were 'posted' and then removed... It was probably a publicity stunt or something.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
DX10? By the time it's out, we might be seeing the 8900GTX series, or G90. (G90 is planned for release sometime this year, sort of like the NV40-->G70 refresh.)

The most important thing is now, not the future, since by then better hardware will be out. One thing that might hold the R600 back is that current games still rely on texturing power. It's pretty much confirmed that R600 has 16 TMUs (beefier ones compared to those in R5x0). This could be one reason why R600 falls so short against previous-gen hardware.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Cookie Monster
The most important thing is now, not the future, since by then better hardware will be out. One thing that might hold the R600 back is that current games still rely on texturing power. It's pretty much confirmed that R600 has 16 TMUs (beefier ones compared to those in R5x0). This could be one reason why R600 falls so short against previous-gen hardware.

;)

X2900XT
16 TMUs x 742 MHz = 11.9 GTexels/s

G80
32 TMUs x 575 MHz = 18.4 GTexels/s

Not to mention that if the X2900XT has the same fillrate with bi/AF, it stays miles behind G80, which in that case reaches 36.8 GTexels/s.

We have to bear in mind though that this applies only to D3D9, since texturing in D3D10 is really abstract atm... ATI/AMD may have a new way of texturing which could prove much more powerful under the D3D10 API...
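The fillrate arithmetic above is just peak texel throughput, TMUs times core clock. A quick back-of-the-envelope sketch (using the unit counts and clocks cited in this thread; the 64-unit bi/AF figure for G80 is the commonly cited spec):

```python
# Peak texel fillrate: texture units * core clock (MHz) -> GTexels/s.
def gtexels_per_s(tmus: int, mhz: int) -> float:
    """Theoretical peak texel fillrate in GTexels/s."""
    return tmus * mhz / 1000.0

r600_xt = gtexels_per_s(16, 742)      # X2900XT: 11.9
g80_gtx = gtexels_per_s(32, 575)      # 8800GTX: 18.4
g80_biaf = gtexels_per_s(64, 575)     # G80 with bi/AF filtering units: 36.8
print(round(r600_xt, 1), round(g80_gtx, 1), round(g80_biaf, 1))
```

These are theoretical peaks, of course; real-game texturing throughput also depends on caches, memory bandwidth, and filtering modes.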
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Originally posted by: ShadowOfMyself
Not even the 5800FX was that bad
I was hoping that I wouldn't be the first one to bring the dreaded FX up. As the real launch gets closer, leaks will become more accurate. And Kristopher Kubicki @DT, while now running a news site, used to write reviews here @AT back in the day.

The numbers are hard to believe, but I think it's time to be out of denial. R600 isn't what we expected. (Which naturally explains NV's arrogant pricing for the 8800 Ultra.) What makes this R600 fiasco harder to swallow is that ATI had a baby R600 some 2~3 years ago in the form of Xenos (R500). That such talented engineers failed to build on R500 over such a long period, with a rumoured (ridiculous) total of 13 respins, totally gets me lost.

Even with aggressive pricing this card will have a hard time in the market. Whether it's because of the media or not, people DO consider the thermal/power characteristics of components these days.

And why is ATI (AMD) teaming up with Valve again? My apologies to folks enjoying the Half-Life series, but I have a real dislike for that company and their marketing strategy.
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
I don't mind STEAM too much, I just think the HL games are vastly overrated. PCG magazine seems to have a massive hardon for them for no reason at all. The gameplay is very shoddy, and the weapons suck.
 

PingSpike

Lifer
Feb 25, 2004
21,758
603
126
Originally posted by: Matt2
Originally posted by: Nightmare225
Originally posted by: nullpointerus
Originally posted by: Matt2
To this very day, Nvidia's CEO swears that even though NV30 was unsuccessful, it pushed GPU design in the right direction.
What's the nVidia CEO supposed to say?

Hey, we took millions of dollars from consumers to produce a flop, and then when we found that out, we hired a marketing firm and tried to pawn it off as the best stuff evar!

:D

What else, advertise their product as "worse than the competitors." Yeah, I sure see the company staying together with that kind of ad campaign... :disgust:

ok guys, don't shoot the messenger, I'm just regurgitating what I read.

My point was that although Nvidia thought they had a winner with NV30, things just don't play out the way they're supposed to sometimes. R300 probably made Nvidia poo their pantalones just like G80 probably made ATI poo theirs.

Just because a GPU didn't live up to its expectations doesn't mean that all these R600 benches we are seeing are fake. I know I'm bordering on heresy here on this board by saying this, but why can't some people just accept the fact that ATI *might* have pulled an Nvidia and developed a bad GPU?

It's too premature to say that R600 is going to be a flop, but the possibility is there. I have a hard time believing that 5 leaked reviews are all painting the same picture by accident, but some just can't bring themselves to think that ATI can be beaten by Nvidia. I haven't seen one leaked bench that disagrees with DT's initial assessment.

Agreed. I believe these reviews are going to turn out to be generally correct. You know why? If they were total bullsh|t...any marketing department worth a sh|t would be shouting about how much bullsh|t they were. And I don't hear much shouting from AMD's marketing department.
 

PingSpike

Lifer
Feb 25, 2004
21,758
603
126
Originally posted by: Pugnate
I don't mind STEAM too much, I just think the HL games are vastly overrated. PCG magazine seems to have a massive hardon for them for no reason at all. The gameplay is very shoddy, and the weapons suck.

HL2 is a mod platform to me; it's an alright game with good storytelling, but it's nothing technically spectacular. I'll agree, all the weapons suck. Mostly because they were just nerfed versions of HL1 weapons.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: jim1976
Originally posted by: Cookie Monster
The most important thing is now, not the future, since by then better hardware will be out. One thing that might hold the R600 back is that current games still rely on texturing power. It's pretty much confirmed that R600 has 16 TMUs (beefier ones compared to those in R5x0). This could be one reason why R600 falls so short against previous-gen hardware.

;)

X2900XT
16 TMUs x 742 MHz = 11.9 GTexels/s

G80
32 TMUs x 575 MHz = 18.4 GTexels/s

Not to mention that if the X2900XT has the same fillrate with bi/AF, it stays miles behind G80, which in that case reaches 36.8 GTexels/s.

We have to bear in mind though that this applies only to D3D9, since texturing in D3D10 is really abstract atm... ATI/AMD may have a new way of texturing which could prove much more powerful under the D3D10 API...

And I read somewhere that the way R600's shaders are designed, a lot of operations take more passes (up to 4:1 R600:G80). I will try to find the article and post it.
This might explain why R600 doesn't even seem to best R580 in some cases. Maybe R580 was a much more efficient arch.
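If that rumoured pass ratio is real, the effect on throughput is easy to sketch. A trivial, purely illustrative calculation (the 4:1 ratio is just the rumoured worst case, and the 320/128 unit counts are the commonly cited stream-processor specs for R600/G80, not measurements):

```python
# Effective throughput when an operation needs multiple passes:
# units that could retire work each clock, divided by passes per op.
def effective_ops_per_clock(units: int, passes: int) -> float:
    """Operations retired per clock if each op takes `passes` passes."""
    return units / passes

r600_worst = effective_ops_per_clock(320, 4)  # rumoured 4-pass worst case
g80_single = effective_ops_per_clock(128, 1)  # single-pass baseline
print(r600_worst, g80_single)  # raw unit-count advantage can invert
```

Again, this ignores clocks, scheduling, and which operations actually hit the worst case; it only shows how a pass-ratio disadvantage could erase a raw unit-count lead.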
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: keysplayr2003
Originally posted by: jim1976
Originally posted by: Cookie Monster
The most important thing is now, not the future, since by then better hardware will be out. One thing that might hold the R600 back is that current games still rely on texturing power. It's pretty much confirmed that R600 has 16 TMUs (beefier ones compared to those in R5x0). This could be one reason why R600 falls so short against previous-gen hardware.

;)

X2900XT
16 TMUs x 742 MHz = 11.9 GTexels/s

G80
32 TMUs x 575 MHz = 18.4 GTexels/s

Not to mention that if the X2900XT has the same fillrate with bi/AF, it stays miles behind G80, which in that case reaches 36.8 GTexels/s.

We have to bear in mind though that this applies only to D3D9, since texturing in D3D10 is really abstract atm... ATI/AMD may have a new way of texturing which could prove much more powerful under the D3D10 API...

And I read somewhere that the way R600's shaders are designed, a lot of operations take more passes (up to 4:1 R600:G80). I will try to find the article and post it.
This might explain why R600 doesn't even seem to best R580 in some cases. Maybe R580 was a much more efficient arch.

For DX9, probably... Look, do you remember what Kombatant said? Something along the lines of "AMD made certain decisions about this card."

Now it makes sense: he was probably talking about how they had to "nerf" the card in DX9 for it to shine in DX10, since it's a completely different system...

I just think AMD should have waited for the next-gen part before doing that, because there are no DX10 titles out and that's gonna hurt
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I can't believe people still think this thing is somehow going to miraculously perform better in DX10.

Oh well, at least we now know the fanatic warcry.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: PingSpike

HL2 is a mod platform to me; it's an alright game with good storytelling, but it's nothing technically spectacular.

:| HL2 is one of the best single player FPS's ever :|