This whole R600/G80 benchmarks thing is nonsense.


Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Ackmed
I wouldn't make any bets, or claims, on DX10 performance either way. If the past has told us anything, it's that both cards will be too slow to run it very well. But to make any claim that one card will run it faster at this juncture is pretty foolish.

Was the 9700 Pro too slow to run DX9? For the most part, the 9700 can still run games TODAY, in 2007, and it was released in 2002. Can't say the same about the FX5800 or FX5900, however.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Extelleron
Originally posted by: Ackmed
I wouldn't make any bets, or claims, on DX10 performance either way. If the past has told us anything, it's that both cards will be too slow to run it very well. But to make any claim that one card will run it faster at this juncture is pretty foolish.

Was the 9700 Pro too slow to run DX9? For the most part, the 9700 can still run games TODAY, in 2007, and it was released in 2002. Can't say the same about the FX5800 or FX5900, however.

Run DX9 games today? Sure, at 800x600. Sorry, I don't like slide shows, or low-res games. Far Cry made my 9700 Pro crawl. A 5200 can run "ANY" DX9 game as well. Does it make it playable? Not even close, to me.

Crysis will likely not run well enough on a 2900 or 8800 for me. As I said, it's pretty silly to even make a claim that one card will run them faster, when there will be only a handful of them out this year. And there will probably be a refresh of cards by then anyway.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nightmare225
Originally posted by: apoppin
maybe you missed it ... HD2900xt is gonna DESTROY the GTX in DX10

Link, please?
Contrary to what you may think, I really need to know and am not just trying to disprove you because I'm an NVIDIA fanboy, like some people on here are. :confused:

I really need to know, because if it's true, i need to sell my GTX ASAP to make enough money on an R600.

Although, I am curious as to why you went from being completely disappointed in and dumping on AMD/ATI just a week ago to saying that an XT will destroy a GTX today. Were you just so sour that your old computer couldn't even physically take either an R600 or a G80 that you looked for scapegoats that weren't promoting competition in any way? Probably an ATI fanboy all along, as I suspected... :disgust:

it's from what i am "hearing" ... 'fluff' ... but 'fluff' that has been correct in years past
--just "different" reports than Gstanfor is "hearing" ;)
--Chinese webpages that appear for an hour and that are taken down ... comments by bloggers i correspond with .... dev side-comments about nextgen DX10 HW

if you can't wait 7 days i feel really sorry for your situation ...
it appears that AMD/ATi went the 'future' route ... to excel in DX10 ...
i really *want* to see the benches ... especially the DX10 ones

as to *why* i was dumping on AMD .. they *lied* and BSed us about the r600 delay
-- and their stock IS tanking ... they are finally "owning up" to it ... so i give them back some "hope" i had completely lost for them

ATi is *gone* ... swallowed up .. i feel no loyalty to their corpse whatsoever ... :p
... R.I.P.


i have ZERO problem with "graphics by AMD" or "graphics by nvidia" ... WHICHEVER one is better bang-for-buck at the moment i click *buy*
--as to CPUs, it's *always* been intel ... coincidentally ... just 'timing' like with my GPU purchases ... so getting me to "root" for AMD is gonna take some major accomplishment
[barcelona, maybe ... i HOPE so]
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Ackmed
Holy cow, he thinks NV is creating driver problems on purpose? That's just, well, dumb.

No, I just think they aren't showing what the chip is really capable of yet (no need to either until a Dx10 game actually lands).
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Well, apoppin, if you regularly "hear voices" that come and go, then you either need to visit a clairvoyant/exorcist or become one yourself!
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Well, apoppin, if you regularly "hear voices" that come and go, then you either need to visit a clairvoyant/exorcist or become one yourself!

no "voices" that approach what you are hearing inside your head

i guess we'll know in just over a week how HD2900XT really does against the GTX ;)
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Gstanfor
Originally posted by: Ackmed
Holy cow, he thinks NV is creating driver problems on purpose? That's just, well, dumb.

No, I just think they aren't showing what the chip is really capable of yet (no need to either until a Dx10 game actually lands).

Do you think they are holding back performance? I can't believe they would do this, but who knows.

Another reason why I think it's moot to try to decide right now which will be better at DX10 games is that they will all be on Vista. And Vista's gaming performance is really, really bad. I don't think they're holding back; I think it's just a bad OS or drivers. Or both.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I just think they are letting ATi believe that they are somehow weak in Dx10. It's a pretty common nvidia tactic; they deliberately gave 8-pipeline NV40s with cut-down features to developers before the launch to fox ATi that generation. It wouldn't surprise me if even the developers aren't seeing full Dx10 performance yet.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
I just think they are letting ATi believe that they are somehow weak in Dx10. It's a pretty common nvidia tactic; they deliberately gave 8-pipeline NV40s with cut-down features to developers before the launch to fox ATi that generation. It wouldn't surprise me if even the developers aren't seeing full Dx10 performance yet.

yes, nvidia is "famous" for "timing" their *improved* driver releases with ATi launches
--letting all their customers suffer until that particular date.

but let's *see* if their $800 Ultra can best the $400 HD-XT in DX10

:p

i am certain some people will definitely be surprised ;)
 

palindrome

Senior member
Jan 11, 2006
942
1
81
Originally posted by: Gstanfor
I just think they are letting ATi believe that they are somehow weak in Dx10. It's a pretty common nvidia tactic; they deliberately gave 8-pipeline NV40s with cut-down features to developers before the launch to fox ATi that generation. It wouldn't surprise me if even the developers aren't seeing full Dx10 performance yet.

So, you are saying that nVidia is PURPOSELY hindering their own DX10 performance? Explain how this helps developers, please, I'd love to know. Somehow I doubt there is much more performance to be gained through drivers on G80 cards. Get real, these things have been out since November, the 8800U is flopping (by everyone's standards, seriously, why not just get an EVGA...KO series...), and NV drivers STILL have tons of problems on Vista. I think you need to see your doctor, because you are oozing stupid all over the place...
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Despite BFG10K's strident squealings, I don't think current G80 owners are suffering too much somehow....

And yes, I do think nvidia is hiding or masking their true Dx10 performance. At the moment it's irrelevant to consumers, and the need to compete effectively with the competition overrides consumer concerns at this point.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
3dmark isn't a game,
But these are.

No More Shader Replacement

This is quite a big deal in light of the fact that, just over a year ago, thousands of shaders were stored in the driver ready for replacement on demand in NV3x and even NV4x.
Shader replacement even has its own section in the article.

I'm not disputing that nvidia replaced shaders there - they made several public statements about their intention to do so at the time.
The public statements that first came in the form of denials and then claims that they were "bugs"?

Despite BFG10K's strident squealings, I don't think current G80 owners are suffering too much somehow....
Except your thoughts seldom match reality, like your delusional claims that nVidia never engaged in shader substitution.
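
For readers unfamiliar with the term being argued over: "shader replacement" here refers to a driver recognizing a game's or benchmark's shader by a fingerprint of its bytecode and silently swapping in a hand-tuned version. The following is only a minimal illustrative sketch of that idea in C++, with entirely hypothetical names and data; it is not NVIDIA's actual driver code.

#include <cstdint>
#include <iostream>
#include <string>
#include <unordered_map>

// Hypothetical table of hand-tuned shaders pre-stored in the driver,
// keyed by a fingerprint (hash) of the shader bytecode an application submits.
static const std::unordered_map<std::uint64_t, std::string> kReplacementShaders = {
    {0x3D03A1B2C4D5E6F7ULL, "ps_2_0 ... hand-tuned equivalent ..."},  // placeholder entry
};

// Called when the application creates a shader: if the fingerprint matches a
// known entry, the driver silently substitutes its own tuned version;
// otherwise the application's original shader is used unchanged.
std::string MaybeReplaceShader(std::uint64_t fingerprint, const std::string& original) {
    auto it = kReplacementShaders.find(fingerprint);
    return (it != kReplacementShaders.end()) ? it->second : original;
}

int main() {
    // A matching fingerprint returns the driver's stored replacement.
    std::cout << MaybeReplaceShader(0x3D03A1B2C4D5E6F7ULL, "ps_2_0 ... app's shader ...") << '\n';
}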
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
All they did was state their opinion. I see no evidence backing said opinion up.

No, the statements where they said that 3dmark could be easily optimized for extra performance. Try looking again.

Just as a matter of interest, have you ever tried extracting the 3dmark03 shaders? There are a grand total of two (T - W - O) SM2.0 shaders in the entire benchmark....
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
All they did was state their opinion.
:roll:

How is that opinion? They didn't state "we think they are doing this"; they stated "they are doing this".

Again, if you aren't being reimbursed by the AEG for your antics, then I truly pity you.

I see no evidence backing said opinion up.
Show me evidence that nVidia didn't perform any shader substitution.

No, the statements where they said that 3dmark could be easily optimized for extra performance.
Show me evidence.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Despite BFG10K's strident squealings, I don't think current G80 owners are suffering too much somehow....

And yes, I do think nvidia is hiding or masking their true Dx10 performance. At the moment it's irrelevant to consumers, and the need to compete effectively with the competition overrides consumer concerns at this point.

"irrelevant to consumers" ?

"the need to compete effectively with the competition overrides consumer concerns"

you ARE kidding, aren't you ?
:confused:

screw your customer to compete effectively

that IS what you are saying
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Not at all, apoppin. There are no Dx10 games yet, so the customer can hardly be getting screwed....
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: apoppin
Originally posted by: Gstanfor
Despite BFG10K's strident squealings, I don't think current G80 owners are suffering too much somehow....

And yes, I do think nvidia is hiding or masking their true Dx10 performance. At the moment it's irrelevant to consumers, and the need to compete effectively with the competition overrides consumer concerns at this point.

"irrelevant to consumers" ?

"the need to compete effectively with the competition overrides consumer concerns"

you ARE kidding, aren't you ?
:confused:

screw your customer to compete effectively

that IS what you are saying
How is it screwing the customer if there are no DX10 games?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: apoppin
we're talking g80 DX9 perf ;)

that IS what BFG10K is complaining about
That's nuts. My DX9 performance on my G80 is *great*. If they're holding back, then the G80 was a miracle chip.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SickBeast
Originally posted by: apoppin
we're talking g80 DX9 perf ;)

that IS what BFG10K is complaining about
That's nuts. My DX9 performance on my G80 is *great*. If they're holding back, then the G80 was a miracle chip.

again ... IF we are to believe that what Gstanfor is saying is true
[take a breath ... not what *i* am saying]

and ... IF we are to believe that what BFG10K is saying is true
[take another breath ... not what *i* am saying]

THEN you get my conclusion

;)
 
Oct 4, 2004
10,515
6
81
Originally posted by: Arkaign
The only possible explanation for this thread is being picked on as a child and an adolescence of being clubbed over the head mercilessly by penguins with giant freeze-dried doggy dongs :|

fixed.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
It's not my claim, it's DailyTech's.
They claimed that the R600 will pack 320 stream processors, not a driver that's 320 MB in size. Or at least that's what I could find throughout their R600 articles.

So if you have the proper link from a Dailytech article, please post it.
You are very welcome to post an x1950 comparison picture if you so desire (I'll bet you don't, though).
Da..duh...drrr..drrrrrrrrrrrrrrr

What would it prove anyway?
Show me the TWIMTBP game that received $5 million in funding and had the developer slagging off the opposing IHV's GPUs? (Let's not forget a certain other action RPG based upon the earlier Source build that oh so mysteriously got leaked around the time this deal took place - funnily enough it demonstrably had no issues coping with things like _PP etc... This title was forced to wait over a year after being completed because Valve's ego would not permit any Source engine title other than their own to be released first. Those who lament the demise of Troika should bear that in mind.)
Such disdain for a company that you ultimately gave money to.

:laugh:

I'm sure they're glad you showed your support. ;)
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Arkaign
The only possible explanation for this thread is drugs :|

Well, ATi employees do appear to be mighty partial to those hallucinogens Jen-Hsun Huang mentioned back in 2002... May even have gotten AMD employees into the habit too...