The R600 isn't great in DX9

Soccerman06

Diamond Member
Jul 29, 2004
5,830
5
81
We have yet to see how it performs in DX10. Maybe I'm just ignorant and haven't spent time looking for DX10 reviews, but don't completely bash the card until you all see its DX10 performance (it may win yet!).

just a thought :)
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
DX10 performance is just the icing on the cake; the meat of it is DX9. Most games released between now and two years out will be DX9. By the time there are actual DX10 games to play, you will have far better cards to choose from. So sorry.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Soccerman06
We have yet to see how it performs in DX10. Maybe I'm just ignorant and haven't spent time looking for DX10 reviews, but don't completely bash the card until you all see its DX10 performance (it may win yet!).

just a thought :)


What DX10 benchmarks are you talking about? Enlighten me: how can you argue the R600 might be better in DX10 when you don't even know how the G80 performs in it?
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Doesn't matter how it performs in DX10, IMO.

By the time we get DX10 games en masse, the card will be well beyond its useful life.

People are buying the G80 for its DX9 performance, not its DX10 performance.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
He doesn't know. He's just trying to look on the bright side, even if there isn't any bright side 'cept maybe price cuts.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I'll wait for more reliable benches from other sources before passing judgment on how badly the R600 performs. But even if it's good at DX10, that doesn't mean the G80 will be bad at DX10, unless the G80 has some design flaw that makes it much slower (like the FX series and DX9).
 

sandorski

No Lifer
Oct 10, 1999
70,696
6,257
126
It might pwn in DX10, but that's not going to sell cards right now, when DX9 games are all that's available. It's possible that ATI made a conscious decision to sacrifice DX9 performance in the short term in order to have a superior DX10 product in the long term, but I dunno, just kinda trying to make sense of the situation.
 

terentenet

Senior member
Nov 8, 2005
387
0
0
There was some Nvidia demo for the G80 and Vista, the one with the cascade stuff. That's the only DX10 I've seen so far. Yes, 70% of the market buys the G80 for its current DX9 performance.
By the time DX10 arrives in force with nice titles (Crysis, Alan Wake, etc.), new boards with more performance will be out, and our current 8800 GTXs and HD 2900 XT/XTXs will be considered old tech.
 

mazeroth

Golden Member
Jan 31, 2006
1,821
2
81
Originally posted by: terentenet
There was some Nvidia demo for the G80 and Vista, the one with the cascade stuff. That's the only DX10 I've seen so far. Yes, 70% of the market buys the G80 for its current DX9 performance.
By the time DX10 arrives in force with nice titles (Crysis, Alan Wake, etc.), new boards with more performance will be out, and our current 8800 GTXs and HD 2900 XT/XTXs will be considered old tech.


Can you confirm your 70%? Because I would have said something along the lines of, oh, 99%. :)
 

Chadder007

Diamond Member
Oct 10, 1999
7,560
0
0
Originally posted by: sandorski
It might pwn in DX10, but that's not going to sell cards right now, when DX9 games are all that's available. It's possible that ATI made a conscious decision to sacrifice DX9 performance in the short term in order to have a superior DX10 product in the long term, but I dunno, just kinda trying to make sense of the situation.

Or it could be just a bad driver issue.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Matt2
Doesn't matter how it performs in DX10, IMO.

By the time we get DX10 games en masse, the card will be well beyond its useful life.

People are buying the G80 for its DX9 performance, not its DX10 performance.

bingo
 

sandorski

No Lifer
Oct 10, 1999
70,696
6,257
126
Originally posted by: Chadder007
Originally posted by: sandorski
It might pwn in DX10, but that's not going to sell cards right now, when DX9 games are all that's available. It's possible that ATI made a conscious decision to sacrifice DX9 performance in the short term in order to have a superior DX10 product in the long term, but I dunno, just kinda trying to make sense of the situation.

Or it could be just a bad driver issue.

Possibly, but I'd think AMD/ATI would have emphasized the immaturity of the drivers if that were the case.
 

jpeyton

Moderator in SFF, Notebooks, Pre-Built/Barebones
Moderator
Aug 23, 2003
25,375
142
116
Someone on XS is claiming the XTX holds a few FPS advantage in Crysis using DX10, citing development sources.

Crysis in DX9 is in favor of the GTX though.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I'll wait for official reviews before passing final judgment on the R600.
 

BladeVenom

Lifer
Jun 2, 2005
13,365
16
0
Originally posted by: jpeyton
Someone on XS is claiming the XTX holds a few FPS advantage in Crysis using DX10, citing development sources.

Crysis in DX9 is in favor of the GTX though.

Of course, by the time Crysis comes out, Nvidia will have its next-generation card out; maybe the next two generations will be out by then.
 

jpeyton

Moderator in SFF, Notebooks, Pre-Built/Barebones
Moderator
Aug 23, 2003
25,375
142
116
Originally posted by: BladeVenom
Originally posted by: jpeyton
Someone on XS is claiming the XTX holds a few FPS advantage in Crysis using DX10, citing development sources.

Crysis in DX9 is in favor of the GTX though.

Of course, by the time Crysis comes out, Nvidia will have its next-generation card out; maybe the next two generations will be out by then.
nVidia is going to release two new generations of product by Q3 2007?
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: BFG10K
I'll wait for official reviews before passing final judgment on the R600.

my thoughts exactly, we still know next to nothing about IQ and actual street prices
 

coolpurplefan

Golden Member
Mar 2, 2006
1,243
0
0
Originally posted by: bunnyfubbles
Originally posted by: BFG10K
I'll wait for official reviews before passing final judgment on the R600.

my thoughts exactly, we still know next to nothing about IQ and actual street prices

I'm seeing some X1950 Pros on sale "while quantities last," so I suspect an R600-series part is going to replace it. Perhaps it will be the R630. I kind of hope they won't raise the price, though. Who knows, maybe I should buy an X1950 Pro right now, since I'm not upgrading to Vista just yet anyway. Mmmmmmmmmmmmmmmm, the long two-week wait to see what will happen. Will it all drive us mad? :Q
 

BladeVenom

Lifer
Jun 2, 2005
13,365
16
0
Originally posted by: jpeyton
Originally posted by: BladeVenom
Originally posted by: jpeyton
Someone on XS is claiming the XTX holds a few FPS advantage in Crysis using DX10, citing development sources.

Crysis in DX9 is in favor of the GTX though.

Of course, by the time Crysis comes out, Nvidia will have its next-generation card out; maybe the next two generations will be out by then.
nVidia is going to release two new generations of product by Q3 2007?

I said maybe, and I also said by the time Crysis comes out. In case you haven't noticed by now, games don't always come out on time.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: sandorski
Originally posted by: Chadder007
Originally posted by: sandorski
It might pwn in DX10, but that's not going to sell cards at this time when DX9 games are all that's available. It's possible that ATI made a conscious decision to sacrifice DX9 performance in the short term in order to have a superior DX10 product in the longterm, but I dunno, just kinda trying to make sense out of the situation.

Or it could be just a bad driver issue.

Possibly, but I'd think AMD/ATI would have emphasized immaturity of the Drivers if that was the case.

Given their recent trumpeting about the superiority of their drivers, especially under Vista, I think an excuse of driver immaturity is highly unlikely to be forthcoming (at least until well after all this has blown over and AMD PR thinks we've forgotten the previous claims).

On the subject of drivers: 350 MB for a driver?! I could fit Nvidia's largest ForceWare release into that, plus an "optimized" version of every game binary on my system, plus all benchmark binaries, and still have space to spare...
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Gstanfor
Originally posted by: sandorski
Originally posted by: Chadder007
Originally posted by: sandorski
It might pwn in DX10, but that's not going to sell cards at this time when DX9 games are all that's available. It's possible that ATI made a conscious decision to sacrifice DX9 performance in the short term in order to have a superior DX10 product in the longterm, but I dunno, just kinda trying to make sense out of the situation.

Or it could be just a bad driver issue.

Possibly, but I'd think AMD/ATI would have emphasized immaturity of the Drivers if that was the case.

Given their recent trumpeting about the superiority of their drivers, especially under Vista, I think an excuse of driver immaturity is highly unlikely to be forthcoming (at least until well after all this has blown over and AMD PR thinks we've forgotten the previous claims).

On the subject of drivers: 350 MB for a driver?! I could fit Nvidia's largest ForceWare release into that, plus an "optimized" version of every game binary on my system, plus all benchmark binaries, and still have space to spare...

I agree. If this is a driver issue, then AMD has been blowing smoke up our rear ends for a while now. I just don't think they would be so dumb as to point the finger and laugh at Nvidia over their drivers, and then deliver their own broken drivers six months late.