techPowerUP! goofs and posts HD2900XT review early?


Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: ShadowOfMyself
Originally posted by: Ackmed

Windows Vista DirectX 10 SDK

PIPEGS

2900XT 159 fps
88GTS 34 fps
88GTX 63 fps


CubemapGS, Car, Instancing

2900XT 23 fps
88GTS 9 fps
88GTX 11 fps

Cubemap, Car, Instancing

2900XT 18 fps
88GTS 10 fps
88GTX 11 fps

I have no idea what this is, but this is the kind of thing I am expecting with DX10 games

Hope I am right :D

I don't know. A demo of Lost Planet is supposed to be out on the 15th too. No idea how much DX10 it actually uses; maybe it will give us a glimpse of what's to come.

Someone already has their HD 2900XT...
http://www.rage3d.com/board/showthread.php?t=33890816
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ShadowOfMyself
Originally posted by: n7
LOL i wish i could agree with you.

We'll see tomorrow i guess?

But Extelleron is right... Everyone who says the reviews are BS is considered a fanATIc... WHY? How can you ignore the inconsistencies? Do you seriously believe the same card can kick the GTX in one game and be stomped in another? It's ridiculous!

ok ... lets say i was going to "falsify" a review ... just "dry lab" it --without HW ;)

i *would NOT* have inconsistencies like the ones in the above mentioned reviews ...
--they would look *plausible* ...

obviously they took a lot of time

with new architecture, who can say for sure?
:confused:

now ... i am going to "guess" .... from what i read ...

that the HD2900xt will mostly lose to the GTX, mostly win against the GTS ... in DX9 and XP

in Vista i am going further out on a limb to *predict* that AMD will have better Vista drivers AND ...

wait for it ...

HD2900xt will generally be the "equal" of the GTX in DX10/Vista games ... for a lot less money and with more features


--now lets see how "off" i am

... about the 2900xt :p

 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: apoppin

ok ... lets say i was going to "falsify" a review ... just "dry lab" it --without HW ;)

i *would NOT* have inconsistencies like the ones in the above mentioned reviews ...
--they would look *plausible* ...

obviously they took a lot of time

with new architecture, who can say for sure?
:confused:

now ... i am going to "guess" .... from what i read ...

that the HD2900xt will mostly lose to the GTX, mostly win against the GTS ... in DX9 and XP

in Vista i am going further out on a limb to *predict* that AMD will have better Vista drivers AND ...

wait for it ...

HD2900xt will generally be the "equal" of the GTX in DX10/Vista games ... for a lot less money and with more features


--now lets see how "off" i am

... about the 2900xt :p

If that review is with real hardware, then fine, but it's still not trustworthy. These people obviously have no idea how to review. A "driver problem" does not cause a card to perform better when it has to render twice the number of pixels. A really bad reviewer might be able to cause that.

 

inveterate

Golden Member
Mar 1, 2005
1,504
0
0
DUDE, I know ATI's official plan. They're gonna jump out of the showers and yell "yargggg, I'm a wizard,"

Then the HD2900 is gonna be awesome...

Or it could be a political thing, as in... SOMEONE is shorting AMD/ATI stock right now.

Where's 007 when you need him?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Extelleron

If that review is with real hardware, then fine, but it's still not trustworthy. These people obviously have no idea how to review. A "driver problem" does not cause a card to perform better when it has to render twice the number of pixels. A really bad reviewer might be able to cause that.

OK ... i am just giving my "guess" and my "take" as you are ... and making a prediction:

here's what i BASE it on

1) all of the reviews show major variability ... in the case of theInq's 'former colleague', they *admitted* some of the scores were way off and tossed some out

2) theInq 'backs' them as AT 'backs' DT
[not very much ... but a link to a little credibility]

3) they tested for 100 straight hours
--i know by the 3rd day in a row, some things are gonna get screwed up and figures transposed and simple errors exacerbated by rushing things and [apparently] losing sleep

4) their 2nd set of numbers are a little better

so ... i think they have the cards ... but are not good reviewers and/or have rushed the hell out of it with little time as they try to do several platforms



 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: apoppin

OK ... i am just giving my "guess" and my "take" as you are ... and making a prediction:

here's what i BASE it on

1) all of the reviews show major variability ... in the case of theInq's 'former colleague', they *admitted* some of the scores were way off and tossed some out

2) theInq 'backs' them as AT 'backs' DT
[not very much ... but a link to a little credibility]

3) they tested for 100 straight hours
--i know by the 3rd day in a row, some things are gonna get screwed up and figures transposed and simple errors exacerbated by rushing things and [apparently] losing sleep

4) their 2nd set of numbers are a little better

so ... i think they have the cards ... but are not good reviewers and/or have rushed the hell out of it with little time as they try to do several platforms

I'm pretty much done with the speculation and rumors... I want to know anything I can about the 2900XT but at this point I might as well wait till Monday, the real day of reckoning.

But I agree with you for the most part; I think the 2900XT will be slightly slower than a GTX in most DX9 games (perhaps surpassing the GTX at 2560x1600 with AA/AF) but slightly faster in DX10 games. The fact that it is $400, hopefully below that, AND comes with some Valve games makes it a very good deal.

The only thing I'm really worried about is power consumption. From those who have the card already, ATI doesn't allow overclocking via Overdrive unless you have an 8-pin connector, and that must mean this baby draws quite a bit of power. Hopefully, if it does draw a lot of power, the XT is the peak of GPU power draw and things should go down from here; if power consumption keeps rising, it is going to get out of hand.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
i think i will sleep till Tuesday

... 'till Tuesday?
:Q

anyway i *love* to speculate and make predictions about half-revealed future HW
- it costs nothing and i like the brain exercise ;)

and i *planned ahead*

note the PSU in my rig

:D

for xfire
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: swtethan
how many 8-pins do you have, apoppin?

i *never* - ever - OC my GPU ... never have, never will
[except to 'test' and get 3DMarkXX scores]

... and i would never - ever - even think of OC'ng SLI or Xfire :p

what was that question about again?
:confused:
 

swtethan

Diamond Member
Aug 5, 2005
9,071
0
0
Originally posted by: apoppin
Originally posted by: swtethan
how many 8-pins do you have, apoppin?

i *never* - ever - OC my GPU ... never have, never will
[except to 'test' and get 3DMarkXX scores]

... and i would never - ever - even think of OC'ng SLI or Xfire :p

what was that question about again?
:confused:

how many 8-pins do you have? easy question.

my PSU has 1x 8-pin, but it's used for my motherboard
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
oh ... i got *distracted*

sorry
:confused:

two 8-pin ... 12V dual-CPU support ... just one is used for my CPU ...

. . . and 2-channel PCIe connector

think i thought ahead far enough :p
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Ackmed

I don't know. A demo of Lost Planet is supposed to be out on the 15th too. No idea how much DX10 it actually uses; maybe it will give us a glimpse of what's to come.

Someone already has their HD 2900XT...
http://www.rage3d.com/board/showthread.php?t=33890816

Yes, there will be a Lost Planet demo, and you'll see it first at www.nzone.com

World's first DX10 playable demo, available May 15th from NZONE.com

 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Originally posted by: apoppin
ok ... lets say i was going to "falsify" a review ... just "dry lab" it --without HW ;)

i *would NOT* have inconsistencies like the ones in the above mentioned reviews ...
--they would look *plausible* ...

obviously they took a lot of time

with new architecture, who can say for sure?
:confused:

now ... i am going to "guess" .... from what i read ...

that the HD2900xt will mostly lose to the GTX, mostly win against the GTS ... in DX9 and XP

in Vista i am going further out on a limb to *predict* that AMD will have better Vista drivers AND ...

wait for it ...

HD2900xt will generally be the "equal" of the GTX in DX10/Vista games ... for a lot less money and with more features


--now lets see how "off" i am

... about the 2900xt :p

I think you are pretty much bang on.

But we'll see Monday, & we'll see more DX10 results sometime later this year.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Gstanfor

Yes, there will be a Lost Planet demo, and you'll see it first at www.nzone.com

World's first DX10 playable demo, available May 15th from NZONE.com


Wahoo! Finally get to play around with DX10!! :D
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Ignore the top line of results, unless, of course, you like your $400 graphics cards running with no AA/AF.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
Originally posted by: Gstanfor
Originally posted by: Ackmed
Originally posted by: ShadowOfMyself
Originally posted by: Ackmed

Windows Vista Direct X10 SDK

PIPEGS

2900XT 159 fps
88GTS 34 fps
88GTX 63 fps


CubemapGS, Car, Instancing

2900XT 23 fps
88GTS 9 fps
88GTX 11 fps

Cubemap, Car, Instancing

2900XT 18 fps
88GTS 10 fps
88GTX 11 fps

I have no idea what this is, but this is the kind of thing I am expecting with DX10 games

Hope I am right :D

I dont know. A demo of Lost Planet is supposed to be out the 15th too. No idea how much DX10 it actually uses, maybe it will give us a glimpse of whats to come.

Someone already has their HD 2900XT...
http://www.rage3d.com/board/showthread.php?t=33890816

Yes, there will be a Lost Planet demo, and you'll see it first at www.nzone.com

World's first DX10 playable demo, available May 15th from NZONE.com


It will perform better on Nvidia hardware. Think about it: it comes out a day after the R600 NDA lifts, on Nzone. It looks like Nvidia is going to try to show us that their cards perform better in DX10. However, Lost Planet may or may not be indicative of other DX10 titles.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Here are some 3DMark06 comparison pics that show ATi's "Blur-O-Vision" AA hard at work:

R600 - Deep Freeze
G80 - Deep Freeze

R600 - Canyon Flight
G80 - Canyon Flight

comments following the images being posted:
Is it my imagination or is the GTX image sharper?

You can see it on the dragon and the Airship.

Also the cables (under the sun) seem sharper on the GTX in the snow benchie.

Hmm. The dragon is blurry, yet the hangar roof in the bottom right is "noisy".

It seems to be blurring the image more than anything.

Yeah... the HD2900XT image is blurry and the GTX is sharper.

That's what I'm seeing too. On the airship, you can see the lines in the GTX shot, while they get blurred out in the XT shot.

I don't think it's your imagination: the skin of the dragon loses a lot of detail. Ouch.
For cables and for edges, the blurring obviously removes the harsh staircasing. Not bad. But in my opinion, it doesn't weigh against the loss of sharpness.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Wow.

The HD2900XT pics are way blurrier than the G80 pics. I know tuteja is going to come in here and yell at me for being a fanboy, but you'd have to be a fanboy not to see that the 8800GTX is MUCH, MUCH sharper.

Just look at the Deep Freeze pics. The crates at the bottom and on the right lose a ridiculous amount of detail on the R600 pic.

EDIT: What AA modes are being used? Is the G80 using CSAA or regular MSAA? What about R600? Is that the new CFAA or whatever it's called?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Matt2
Wow.

The HD2900XT pics are way blurrier than the G80 pics. I know tuteja is going to come in here and yell at me for being a fanboy, but you'd have to be a fanboy not to see that the 8800GTX is MUCH, MUCH sharper.

Just look at the Deep Freeze pics. The crates at the bottom and on the right lose a ridiculous amount of detail on the R600 pic.

EDIT: What AA modes are being used? Is the G80 using CSAA or regular MSAA? What about R600? Is that the new CFAA or whatever it's called?

... he is probably asleep in Australia ...
dreaming of r600 in shining armour slaying the nvidia GTX dragon in DX10-land

--or out partying :p

[you were up pretty late Fri, weren't you ? :p]
:D

anyway,

i'll "stand in" for him, nvidia-fanboy-with-ATi-card

the new ATi card uses heretofore unheard-of super-multi-sampling that completely blurs ALL straight lines

:Q


which pics are you referring to ?

edit:
seriously ... which *review* does this come from?
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
After stepping up the quality so much on the x1800/x1900 series, I don't see ATI doing such a low trick (which is Nvidia-like) to get more performance... also because it's very noticeable

Good thing there is only one day left
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: ShadowOfMyself
After stepping up the quality so much on the x1800/x1900 series, I don't see ATI doing such a low trick (which is Nvidia-like) to get more performance... also because it's very noticeable

Good thing there is only one day left

You're in for a rude shock, i'm afraid.