All AMD R6xx chips are now 65 nanometre chips


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ronnn
Originally posted by: apoppin

nope ... the 8800 is expensive ... the 8900 should be a lot cheaper ... and faster ... nvidia has the luxury of refining its process while "waiting" for AMD ... and hopefully, drivers. :p

r600 is going up against an 8800 at 1.20x[+] .... if how g80 o/c's now is any indication

right now r600 is a total loss ... *something* would be nice ... even if it is slower and a LOT cheaper. ;)

Not sure if any of that made sense? Am very glad to know the 8900 will be cheaper, as that implies that the r600 will rock. :beer:
it was in reply to your post - which definitely made no sense.

... and cheaper for nvidia to make ... not for me :p

Originally posted by: ronnn
Of course, one always compares the top cards of each company. A phantom ultra 8800 gtx is more likely though. Unless the g81 turns out to be 2 cards glued together. 20% at low resolutions is huge with current games. Would likely translate into an 80% advantage at high resolutions or newer games. Personally I think the r600 will be lucky to be 10% at low resolutions. Still, a win is a win, and if nv can match - well, nice for us guys who ain't bought yet.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: apoppin
it was in reply to your post - which definitely made no sense.

... and cheaper for nvidia to make ... not for me :p

Making no sense must have been what attracted your confidence in and support of future nvidia products. Que sera, sera.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Matt2


What I am trying to say is that AMD, as far as I'm concerned, has missed out on an entire generation of graphics cards.

Not sure if a whole generation, but they certainly have left the field empty for a long time now. If this 65 nanometre rumour is true, then maybe AMD will hold the lead for a while. Should have plenty of legs for future speed bumps. Already said I doubt it, but it seems like we will know soon.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ronnn
Originally posted by: apoppin
it was in reply to your post - which definitely made no sense.

... and cheaper for nvidia to make ... not for me :p

Making no sense must have been what attracted your confidence in and support of future nvidia products. Que sera, sera.

you have confused me with someone else :p

you support products that don't even exist
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
What makes you think the 8800 Ultra would be on 90nm though, apoppin?

nvidia's midrange and budget line will be made on the 80nm process, and I'm pretty certain nvidia will take the opportunity to save costs, lower power consumption and heat, and possibly improve speeds on G80 as well. Given their history over the years, I'd be downright shocked if they didn't.

Even if clock speed does not rise, there should be enough free die space relative to the 90nm G80 to add another couple of ROP and processing blocks (see the G80 block diagram). Couple that with a 512-bit memory interface and GDDR4 instead of GDDR3 on the board, and it should be more than a match for anything AMD has cooked up.
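For rough context on that last point, peak memory bandwidth is just bus width times effective data rate, so a quick back-of-the-envelope sketch in Python shows what such a board could gain. The 8800 GTX numbers are the shipping spec; the 512-bit GDDR4 data rate is purely an assumed figure, not anything confirmed.

# Peak theoretical memory bandwidth: bus width in bytes * effective per-pin data rate.
def bandwidth_gb_s(bus_width_bits, effective_rate_gbps):
    return (bus_width_bits / 8) * effective_rate_gbps

# Shipping 8800 GTX (90nm G80): 384-bit bus, 900 MHz GDDR3 = 1.8 Gbps per pin -> 86.4 GB/s
print(bandwidth_gb_s(384, 1.8))

# Hypothetical refresh: 512-bit bus with GDDR4 at an assumed 2.2 Gbps per pin -> 140.8 GB/s
print(bandwidth_gb_s(512, 2.2))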
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: apoppin
why?
:confused:

they already allow their partners to o/c the 8800 to "ultra" speeds

Ya, but those OCs are just taking advantage of G80's healthy headroom. G81 on an 80nm process could allow for 700-750MHz stock speeds with the potential for 800MHz to 900MHz headroom, which would be a worthwhile upgrade for even GTX owners. It's no wonder there's talk of scrapping G81 and going straight to G90..... G81 with a full 128 shaders OC'd might come too close to G90's target performance numbers.
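For perspective, here is a quick sketch of what those clocks would mean relative to the stock 8800 GTX core clock of 575 MHz; the G81 figures are only the rumored numbers from the post above, not confirmed specs.

# Percentage uplift of the rumored 80nm G81 clocks over the stock 8800 GTX core (575 MHz).
GTX_STOCK_MHZ = 575

for label, mhz in [("rumored stock, low end", 700),
                   ("rumored stock, high end", 750),
                   ("rumored OC headroom", 900)]:
    uplift = (mhz / GTX_STOCK_MHZ - 1) * 100
    print(f"{label}: {mhz} MHz, about {uplift:.0f}% over a stock GTX")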
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
According to the latest rumors, part of ATi's delay is due to waiting for DX10 software to arrive.

It's rumored ATi will bundle 2 games with R600, both DX10-enabled - Call Of Juarez and Just Cause.

The link below is supposedly a comparison of Call of Juarez in DX9 vs DX10 mode.

DX9 vs DX10
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: chizow
Originally posted by: apoppin
why?
:confused:

they already allow their partners to o/c the 8800 to "ultra" speeds

Ya, but those OCs are just taking advantage of G80's healthy headroom. G81 on an 80nm process could allow for 700-750MHz stock speeds with the potential for 800MHz to 900MHz headroom, which would be a worthwhile upgrade for even GTX owners. It's no wonder there's talk of scrapping G81 and going straight to G90..... G81 with a full 128 shaders OC'd might come too close to G90's target performance numbers.

AGREED ... i am just showing a possible *potential* of g81 ... better than current g80 "ultra" ;)

Originally posted by: Gstanfor
What makes you think the 8800 Ultra would be on 90nm though, apoppin?
i *don't*

i was replying to CookieMonster who thought there would just be another g80 "ultra" ... i said we already have that .... g81 will be smaller, meaner, cooler and faster than g80 'ultra' ;)

r600 will have to go against g81 ... *improved* over g80's best
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Gstanfor
According to the latest rumors, part of ATi's delay is due to waiting for DX10 software to arrive.

It's rumored ATi will bundle 2 games with R600, both DX10-enabled - Call Of Juarez and Just Cause.

The link below is supposedly a comparison of Call of Juarez in DX9 vs DX10 mode.

DX9 vs DX10

More BS from AMD.

If they're waiting for DX10 software then we might be in for a longer wait than expected.

What about the millions of users who don't have Vista? They're alienating the entire DX9 market that is going to be their bread and butter.

Waiting for Call of Juarez and Just Cause (two extremely mediocre games) is just plain stupid. No matter how you cut it.
 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
Originally posted by: Matt2
Waiting for Call of Juarez and Just Cause (two extremely mediocre games) is just plain stupid. No matter how you cut it.

Understatement of the year. :brokenheart:

 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: ayabe
Originally posted by: Matt2
Waiting for Call of Juarez and Just Cause (two extremely mediocre games) is just plain stupid. No matter how you cut it.

Understatement of the year. :brokenheart:

If this is true, I don't know if I could ever believe in AMD's ability to survive in the discrete graphics card market.

Why not just stick a couple of coupons in the box?
 

imported_thefonz

Senior member
Dec 7, 2005
244
0
0
Originally posted by: Gstanfor
According to the latest rumors, part of ATi's delay is due to waiting for DX10 software to arrive.

It's rumored ATi will bundle 2 games with R600, both DX10-enabled - Call Of Juarez and Just Cause.

The link below is supposedly a comparison of Call of Juarez in DX9 vs DX10 mode.

DX9 vs DX10

DAAMMMMMNNN DX10 IS LOOKING GOOD.

That was my issue with G80 the entire time: any 7800/7900 or X1800/X1900 can play dx9 games with decent settings, so why the hell would you spend 500 bucks to get 160 frames instead of 100? You won't even know the difference (with the exception of those with 24 or 30 inch widescreens ;) )

Now that we're seeing what dx10 is all about, I think I'll be buying an R600/G80 with my tax refund. It all depends on those first DX10 game benchmarks.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: thefonz
Originally posted by: Gstanfor
According to the latest rumors, part of ATi's delay is due to waiting for DX10 software to arrive.

It's rumored ATi will bundle 2 games with R600, both DX10-enabled - Call Of Juarez and Just Cause.

The link below is supposedly a comparison of Call of Juarez in DX9 vs DX10 mode.

DX9 vs DX10

DAAMMMMMNNN DX10 IS LOOKING GOOD.

That was my issue with G80 the entire time: any 7800/7900 or X1800/X1900 can play dx9 games with decent settings, so why the hell would you spend 500 bucks to get 160 frames instead of 100? You won't even know the difference (with the exception of those with 24 or 30 inch widescreens ;) )

Now that we're seeing what dx10 is all about, I think I'll be buying an R600/G80 with my tax refund. It all depends on those first DX10 game benchmarks.

If you think an X1900/7900 is enough, then you are obviously playing at a low resolution or not using the highest levels of IQ.

My X1900XTX choked @ 1680x1050 with settings maxed and 4x/16x AA/AF in newer games.

The argument, "160fps is no different than 100fps" is so freakin over played it makes me wanna vomit (BTW, if you're getting 100fps with a 7900 or 1900 series card these days, then I suggest you upgrade your 15" monitor or play newer games. It's more like 50 fps vs 100 fps).

If you find yourself with extra fps, bump up the IQ for the love of God. Play with 8xQ or 8xS AA. Make use of the extra horsepower by making your game look better instead of complaining that Nvidia is making hardware that is too fast.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: thefonz
Originally posted by: Gstanfor
According to the latest rumors, part of ATi's delay is due to waiting for DX10 software to arrive.

It's rumored ATi will bundle 2 games with R600, both DX10-enabled - Call Of Juarez and Just Cause.

The link below is supposedly a comparison of Call of Juarez in DX9 vs DX10 mode.

DX9 vs DX10

DAAMMMMMNNN DX10 IS LOOKING GOOD.

That was my issue with G80 the entire time: any 7800/7900 or X1800/X1900 can play dx9 games with decent settings, so why the hell would you spend 500 bucks to get 160 frames instead of 100? You won't even know the difference (with the exception of those with 24 or 30 inch widescreens ;) )

Now that we're seeing what dx10 is all about, I think I'll be buying an R600/G80 with my tax refund. It all depends on those first DX10 game benchmarks.

what a load of crap

there is *nothing* that can be shown in a JPEG that DX10 can *do* that DX9c cannot
:thumbsdown:
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Well, there is always the performance aspect, and it should be far easier to use AA with MRTs in DX10 (hopefully no more AA-less games!).
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
but you shouldn't be able to *see* a noticeable difference
-this is 'night and day' ... maybe ... :p
:confused:

other than possibly higher-resolution textures and being able to run at very high resolutions with less of a performance hit, there is no real difference in the visuals ... yet.

and these shots are not detailed enough to show higher-resolution textures

i call BS on the pics

 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: apoppin
but you shouldn't be able to *see* a noticeable difference
-this is 'night and day' ... maybe ... :p
:confused:

other than possibly higher-resolution textures and being able to run at very high resolutions with less of a performance hit, there is no real difference in the visuals ... yet.

and these shots are not detailed enough to show higher-resolution textures

i call BS on the pics

I agree. I also like how HDR is nicely implemented on the DX10 side while there is no HDR at all in the DX9 shot.

Last time I checked, we've been enjoying HDR in DX9 for some time now.

Those pics look more like DX8 vs DX9.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: thefonz
Originally posted by: Gstanfor
According to the latest rumors, part of ATi's delay is due to waiting for DX10 software to arrive.

It's rumored ATi will bundle 2 games with R600, both DX10-enabled - Call Of Juarez and Just Cause.

The link below is supposedly a comparison of Call of Juarez in DX9 vs DX10 mode.

DX9 vs DX10

DAAMMMMMNNN DX10 IS LOOKING GOOD.

That was my issue with G80 the entire time: any 7800/7900 or X1800/X1900 can play dx9 games with decent settings, so why the hell would you spend 500 bucks to get 160 frames instead of 100? You won't even know the difference (with the exception of those with 24 or 30 inch widescreens ;) )

Now that we're seeing what dx10 is all about, I think I'll be buying an R600/G80 with my tax refund. It all depends on those first DX10 game benchmarks.

What surprised me the most is not how good DX10 looked but how bad DX9 looked in those screens. I'm sure I've seen better-looking DX9 games, and if devs intentionally sacrifice DX9 IQ to make DX10 look good, then I'm not buying those games.
 

imported_thefonz

Senior member
Dec 7, 2005
244
0
0
Originally posted by: munky
Originally posted by: thefonz
Originally posted by: Gstanfor
According to the latest rumors, part of ATi's delay is due to waiting for DX10 software to arrive.

It's rumored ATi will bundle 2 games with R600, both DX10-enabled - Call Of Juarez and Just Cause.

The link below is supposedly a comparison of Call of Juarez in DX9 vs DX10 mode.

DX9 vs DX10

DAAMMMMMNNN DX10 IS LOOKING GOOD.

That was my issue with G80 the entire time: any 7800/7900 or X1800/X1900 can play dx9 games with decent settings, so why the hell would you spend 500 bucks to get 160 frames instead of 100? You won't even know the difference (with the exception of those with 24 or 30 inch widescreens ;) )

Now that we're seeing what dx10 is all about, I think I'll be buying an R600/G80 with my tax refund. It all depends on those first DX10 game benchmarks.

What surprised me the most is not how good DX10 looked but how bad DX9 looked in those screens. I'm sure I've seen better-looking DX9 games, and if devs intentionally sacrifice DX9 IQ to make DX10 look good, then I'm not buying those games.

What I'm really curious about is how multiplayer is going to be handled. If you have DX10, can you play on DX9 servers? And what if there's a distinct advantage to not having the newest hardware, like shadows in DX10 covering certain areas that you can see through in DX9?

So many questions. It's MADNESS!!!!