All AMD R6xx chips are 65 nanometre chips, now


imported_thefonz

Senior member
Dec 7, 2005
244
0
0
Originally posted by: Matt2
Originally posted by: thefonz
Originally posted by: Gstanfor
According to the latest rumors, part of ATi's delay is due to waiting for DX10 software to arrive.

It's rumored ATi will bundle 2 games with R600, both DX10 enabled: Call of Juarez and Just Cause.

The link below is supposedly a comparison of Call of Juarez in DX9 vs DX10 mode.

DX9 vs DX10

DAAMMMMMNNN DX10 IS LOOKING GOOD.

That was my issue with G80 the entire time. Any 7800/7900 or 1800/1900 can play DX9 games with decent settings, so why the hell would you spend 500 bucks to get 160 frames instead of 100? You won't even know the difference (with the exception of those with 24 or 30 inch widescreens ;) )

Now that we're seeing what DX10 is all about, I think I'll be buying a R600/G80 with my tax refund. It all depends on those first DX10 game benchmarks.

If you think a 1900/7900 is enough, then you are obviously playing at a low resolution or not using the highest levels of IQ.

My X1900XTX choked @ 1680x1050, settings maxed, 4x/16x AA/AF, in newer games.

The argument "160fps is no different than 100fps" is so freakin' overplayed it makes me wanna vomit. (BTW, if you're getting 100fps with a 7900 or 1900 series card these days, then I suggest you upgrade your 15" monitor or play newer games. It's more like 50 fps vs 100 fps.)

If you find yourself with extra fps, bump up the IQ for the love of God. Play with 8xQ or 8xS AA. Make use of the extra horsepower by making your game look better instead of complaining that Nvidia is making hardware that is too fast.

Well, I guess some people are never satisfied. I'm happy with my DX9 experience and I have an even shittier card than you. The games that have come out recently have not really interested me enough to get more "horsepower" or to blow money on a G80 when there is no competition to bring prices down.
I'm no sucker and I don't wanna pay $600 (CAD) for a G80 that will be obsolete in 2 months when Nvidia releases G81; they're just milking it for all it's worth now. Maybe when it first came out I would expect to pay that, but not 9 months later!
R600 will probably bring prices down, and when the price drop on the Q6600 comes, that's when I'll get a new rig.
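
For context on the fps debate quoted above, framerate gaps are easier to judge as frame times, since t = 1000/fps. A quick back-of-the-envelope check, plain arithmetic only:

    // Convert the framerates being argued about into per-frame times.
    // t(ms) = 1000 / fps; nothing here is hardware-specific.
    #include <cstdio>

    int main() {
        const double fps[] = {50.0, 100.0, 160.0};
        for (double f : fps)
            std::printf("%6.1f fps -> %5.2f ms per frame\n", f, 1000.0 / f);
        // Going from 100 to 160 fps only shaves 3.75 ms off each frame,
        // while going from 50 to 100 fps saves a full 10 ms, which is why
        // the low end of the range is where a faster card is actually felt.
        return 0;
    }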
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: thefonz
Well, I guess some people are never satisfied. I'm happy with my DX9 experience and I have an even shittier card than you. The games that have come out recently have not really interested me enough to get more "horsepower" or to blow money on a G80 when there is no competition to bring prices down.
I'm no sucker and I don't wanna pay $600 (CAD) for a G80 that will be obsolete in 2 months when Nvidia releases G81; they're just milking it for all it's worth now. Maybe when it first came out I would expect to pay that, but not 9 months later!
R600 will probably bring prices down, and when the price drop on the Q6600 comes, that's when I'll get a new rig.

Well, you did say that you had "an issue with G80 the entire time".

Does EVGA Step-Up apply to Canada as well?

At this point I would just get an EVGA 8800GTX and then step-up to 8900GTX if it is ever released.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: thefonz

What I'm really curious about is how multiplayer is going to be implemented. If you have DX10, can you play on DX9 servers? And what if there is a distinct advantage to not having the newest hardware, like if there are shadows in DX10 in certain areas that you can see through on DX9!

So many questions, It's MADNESS!!!!

does it make a difference if you have a DX8 card playing CS on a server where everyone else is running DX9 cards?

:p

those pics are BOGUS
 

imported_thefonz

Senior member
Dec 7, 2005
244
0
0
Originally posted by: apoppin
Originally posted by: thefonz

What I'm really curious about is how multiplayer is going to be implemented. If you have DX10, can you play on DX9 servers? And what if there is a distinct advantage to not having the newest hardware, like if there are shadows in DX10 in certain areas that you can see through on DX9!

So many questions, It's MADNESS!!!!

does it make a difference if you have a DX8 card playing CS on a server where everyone else is running DX9 cards?

:p

those pics are BOGUS

Honestly I don't know, and I can't tell if you're being sarcastic or not. :)
I never played the original CS, but I do play Source. I was more into 1942 around that time.

Have a look at some Crysis vids, definitely not bogus ;)

Hopefully when someone here gets a DX10 game and card, he/she can post some screens of it maxed in DX9 and then whatever they can get in DX10, so we can see the difference. Only then can we tell, as it won't be a company with "interests" posting it.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: thefonz
Originally posted by: apoppin
Originally posted by: thefonz

What I'm really curious about is how multiplayer is going to be implemented. If you have DX10, can you play on DX9 servers? And what if there is a distinct advantage to not having the newest hardware, like if there are shadows in DX10 in certain areas that you can see through on DX9!

So many questions, It's MADNESS!!!!

does it make a difference if you have a DX8 card playing CS on a server where everyone else is running DX9 cards?

:p

those pics are BOGUS

Honestly I don't know, and I can't tell if you're being sarcastic or not. :)
I never played the original CS, but I do play Source. I was more into 1942 around that time.

Have a look at some Crysis vids, definitely not bogus ;)

Hopefully when someone here gets a DX10 game and card, he/she can post some screens of it maxed in DX9 and then whatever they can get in DX10, so we can see the difference. Only then can we tell, as it won't be a company with "interests" posting it.

It's obvious the pics are bogus.

Where's the HDR in the DX9 pic?

Why is the terrain so different in the DX9 pic vs the DX10 pic? They don't even look like the same scenes.

I don't see anything in the DX10 pic that can't be done under DX9. In fact, I think the SM3 tests in 3DMock06 look better than the supposed DX10 pics.

Add in the fact that the screenies have an ATI logo on them and I am even more suspicious.
 

imported_thefonz

Senior member
Dec 7, 2005
244
0
0
Originally posted by: Matt2
Originally posted by: thefonz
Originally posted by: apoppin
Originally posted by: thefonz

What I'm really curious about is how multiplayer is going to be implemented. If you have DX10, can you play on DX9 servers? And what if there is a distinct advantage to not having the newest hardware, like if there are shadows in DX10 in certain areas that you can see through on DX9!

So many questions, It's MADNESS!!!!

does it make a difference if you have a DX8 card playing CS on a server where everyone else is running DX9 cards?

:p

those pics are BOGUS

Honestly I don't know, and I can't tell if you're being sarcastic or not. :)
I never played the original CS, but I do play Source. I was more into 1942 around that time.

Have a look at some Crysis vids, definitely not bogus ;)

Hopefully when someone here gets a DX10 game and card, he/she can post some screens of it maxed in DX9 and then whatever they can get in DX10, so we can see the difference. Only then can we tell, as it won't be a company with "interests" posting it.

It's obvious the pics are bogus.

Where's the HDR in the DX9 pic?

Why is the terrain so different in the DX9 pic vs the DX10 pic? They don't even look like the same scenes.

I don't see anything in the DX10 pic that can't be done under DX9. In fact, I think the SM3 tests in 3DMock06 look better than the supposed DX10 pics.

Add in the fact that the screenies have an ATI logo on them and I am even more suspicious.

I'll just repost what I said in my last post.

*ahem*

"Hopefully when someone here gets a DX10 game and card, he/she can post some screens of it maxed in DX9 and then whatever they can get in DX10, so we can see the difference. Only then can we tell, as it won't be a company with "interests" posting it."
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: thefonz
Originally posted by: apoppin
Originally posted by: thefonz

What I'm really curious about is how multiplayer is going to be implemented. If you have DX10, can you play on DX9 servers? And what if there is a distinct advantage to not having the newest hardware, like if there are shadows in DX10 in certain areas that you can see through on DX9!

So many questions, It's MADNESS!!!!

does it make a difference if you have a DX8 card playing CS on a server where everyone else is running DX9 cards?

:p

those pics are BOGUS

Honestly I don't know, and I can't tell if you're being sarcastic or not. :)
I never played the original CS, but I do play Source. I was more into 1942 around that time.

Have a look at some Crysis vids, definitely not bogus ;)

Hopefully when someone here gets a DX10 game and card, he/she can post some screens of it maxed in DX9 and then whatever they can get in DX10, so we can see the difference. Only then can we tell, as it won't be a company with "interests" posting it.

you have to remember that Crysis is still a DX9 game ... it has "added" DX10 features ... nothing you see in *those* screen shots can't also be done in DX9.0c

later ... in a year or so, when full DX10 games are released, we should *see* more than just performance gains and increased textures/resolutions

DX10 is all about "efficiency"

what I was saying is ... there should be no problem playing with a DX9 or DX10 card in MP
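
To make apoppin's point concrete: a "DX9 game with added DX10 features" typically just probes for a Direct3D 10 device at startup and falls back to the Direct3D 9 path when the probe fails. A minimal, Windows-only sketch of that probe (illustrative only, not Crytek's actual code; link against d3d10.lib and d3d9.lib):

    // Probe for a D3D10 device; fall back to D3D9 if the OS/hardware
    // can't provide one (pre-Vista, or a DX9-class card).
    #include <d3d10.h>
    #include <d3d9.h>
    #include <cstdio>

    int main() {
        ID3D10Device* dev10 = nullptr;
        HRESULT hr = D3D10CreateDevice(nullptr, D3D10_DRIVER_TYPE_HARDWARE,
                                       nullptr, 0, D3D10_SDK_VERSION, &dev10);
        if (SUCCEEDED(hr)) {
            std::printf("DX10 path available: enable the extra effects\n");
            dev10->Release();
            return 0;
        }
        // No DX10 device: run the same game on the DX9 renderer.
        if (IDirect3D9* d3d9 = Direct3DCreate9(D3D_SDK_VERSION)) {
            std::printf("Falling back to the DX9 renderer\n");
            d3d9->Release();
        }
        return 0;
    }

Both paths draw the same game, which is also why DX9 and DX10 clients should have no problem sharing a multiplayer server: the simulation is identical, only the eye candy differs.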
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Bad news

Over at CeBIT, we overheard some production problems with R600. The yield at 65nm is very low at this point in time. It is projected that by the middle of April, AMD will only be able to churn out some 20,000 Radeon X2900 XTX graphics cards, and the XT SKU might not even make it for the launch in May.

:Q

20,000?!?!? Last I was told, more than 400,000 G80s had been sold. I'm beginning to believe AMD may have made a mistake with the R600 project started off by ATi. Why not stick with 80nm?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I am gonna put VR-Zone on my do-not-click list ... soon ... along with theInq

every report contradicts the previous one

I have NEVER seen such confusion ...

--as though AMD intended it this way ;)

to AMD: show us the product or StFU!
:thumbsdown:

:|

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Bad news 2

AMD-ATI then...

Oy! Even worse! I had a couple of meetings with board partners like TUL (PowerColor) and HiS Technology. These were supposed to be closed NDA meetings with a presentation of new R600 based product.

AMD decided to pull the plug on these presentations as they really do not have anything to show. The way it is right now is that the press briefings have been delayed towards, likely, the end of April. I still believe we'll see four models based on the R600 chip. I think one of the biggest issues right now is the low yield levels (the share of successfully working products) due to leakage. It was rumored at CeBIT that at the initial launch only 30,000 products will be available "world-wide"! That's both retail and OEM (which easily takes half of the products).

I so hope that AMD-ATI can get past the problems as we need that high-level of competition in the desktop graphics cards market. This is good for everyone, including NVIDIA.

NDA or not, I quickly had a chance to observe two R600 based PCs at a selection of manufacturers who were kind enough to demo them despite AMD being against it.

The rumors are true: the card photos you have seen are correct, and so is the power consumption. I can't get you any numbers on performance as they were not available. The new Ruby demo is looking very good for sure, though. But for now I honestly do not have anything new to add other than that the rumors on the web are pretty much on par.

Right, let's move on to the actual pictorial. Loads to show, lots to see!

Well, I'm not liking this AT ALL. Sounds more and more like the NV30 every day.

*runs*
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: apoppin
I am gonna put VR-Zone on my do-not-click list ... soon ... along with theInq

every report contradicts the previous one

I have NEVER seen such confusion ...

--as though AMD intended it this way ;)

to AMD: show us the product or StFU!
:thumbsdown:

:|

Agreed.

I don't believe anything I hear anymore.

I'm gonna start calling R600 vaporware pretty soon.

If VR-Zone is right, R600 is doomed. It's going to be super expensive. I'm gonna say at least $700.

How could they not know that the yields were going to be terrible at 65nm?

Do they even care?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I'm wondering at what point AMD decided to make the r600 a 65nm product. If it was a last-minute decision sometime in February, then I don't expect to see mass availability anytime before May, and that's if the yields are good. If the yields are bad, who knows, AMD might have the r600 available just in time for the 8900 refresh, which would be really, really late. The bad news just keeps coming. Seeing how the g80 is doing well on a 90nm process, I don't see how an 80nm r600 could have been so bad, unless it was a much bigger and more complex gpu than the g80.
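
Munky's "much bigger and more complex gpu" guess is exactly what a standard back-of-the-envelope yield model predicts: with Poisson yield, Y = exp(-A x D) for die area A and defect density D, a large die on an immature process gets hit on both factors at once. A sketch with assumed numbers (the defect densities and the R600 die area are illustrative guesses, not fab data):

    // Poisson yield model: yield = exp(-area * defect_density).
    // The defect densities and the R600 area below are assumptions
    // for illustration only, not real fab numbers.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double d_mature = 0.3;   // defects/cm^2, mature process (assumed)
        const double d_new    = 0.8;   // defects/cm^2, brand-new process (assumed)
        const double g80_area  = 4.8;  // cm^2, ~480 mm^2 (G80's reported size)
        const double r600_area = 4.2;  // cm^2, assumed for a large R600 die

        std::printf("large die, mature process: %.0f%% yield\n",
                    100.0 * std::exp(-g80_area * d_mature));   // ~24%
        std::printf("large die, new process:    %.0f%% yield\n",
                    100.0 * std::exp(-r600_area * d_new));     // ~3%
        return 0;
    }

Even with a slightly smaller die, the immature process wipes out the yield, which would line up with the rumors of only ~20,000 salable chips.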
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Or it was because the performance wasn't up to par with G80.
So they resorted to pushing the R600's clocks, but they couldn't OC the R600 any higher on 80nm due to the ever-increasing power consumption/heat that was getting out of hand.

65nm isn't even a mature process, and the R600 architecture is something the world has never seen before. New process + new architecture = trouble.


 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
nope ... it's an expensive paperweight
Description

ATI RADEON VIDEO CARD, PCI EXPRESS, DUAL DVI OUTPUT



CONDITION: UNKNOWN MODEL. CARD HAS JUMBO SIZE (12 INCH/30 cm LONG AND 3 LB/1.5 kg WEIGHT), ALL COVERED BY HUGE COOLING SINK AND FAN. USED, NOT TESTED. WE DID NOT HAVE CONDITIONS TO TEST THIS TYPE OF CARD. SEE ACTUAL PICTURES. MISSING METAL COVER ON THE CONNECTOR SIDE OF THE CARD. NO CABLES. MEMORY SIZE UNKNOWN BUT LOOKS LIKE HI-END CARD WITH 256 OR 512 MB OF MEMORY.

SELLING "AS-IS" DUE TO THE UNKNOWN ITEM FUNCTIONALITY.

probably the "lucky" 13th respin :p

:roll:

WE DID NOT HAVE CONDITIONS TO TEST THIS TYPE OF CARD.
they didn't have a 2,000W PSU
:Q

:D
 

hardwareking

Senior member
May 19, 2006
618
0
0
well, you still can't beat it for the price.
Just $50. And if it works, you've got a killer card (with no drivers, but those should come soon).
 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
Originally posted by: hardwareking
well, you still can't beat it for the price.
Just $50. And if it works, you've got a killer card (with no drivers, but those should come soon).

I would make a medallion out of it, that's some serious bling.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
R600 benches

[CeBIT : Retail R600XTX Pictured & Benchmarked]

Posted: March 20, 2007
Source: VR-Zone

We saw some R600 cards at CeBIT, but any form of photography was restricted, as AMD is keeping a close watch to make sure nothing leaks out. Nevertheless, we still managed to get some shots of the retail R600XTX (Radeon X2900 XTX) card. The frontal shot contains sensitive information, so we can't show it here.

Just to recap what we have told you before: the R600XTX retail card is codenamed Dragonhead 2. It is a 12-layer card, 9.5" long, with a max TDP of 240W. It is clocked at 800MHz on 80nm process technology, has a 512-bit memory interface, and carries 1GB of GDDR4 memory onboard. It has 6-pin and 8-pin PCIe connectors, but two 2x3 PCIe power connectors can be used instead. Rumors surfaced at CeBIT that the final product may be on 65nm if it can be produced in time with reasonably good yields. So far we have yet to get confirmation that the first shipping R600 cards will be on 65nm, but we can't rule out that there are experimental 65nm R600 chips now. If our source is right, the yield at 65nm is poor at this point in time; expect a limited quantity of 20,000 pieces of the R600XTX by the middle of April.

Next, we got hold of some preliminary benchmark figures for the R600XTX card, with its core clocked at 800MHz, versus a GeForce 8800 GTX. Using a Core 2 Extreme 2.93GHz processor on an Intel 975X board, the 3DMark06 score at 1600x1200 is 97xx on the R600XTX compared to 95xx on the 8800GTX. It seems the R600XTX is running slightly faster than the 8800GTX on an early release of the R600 drivers. AMD is still working hard on the drivers, and there is some more performance left to unlock. However, the DX10 benchmarking war between ATi and NVIDIA has not started yet. The targeted display driver version for the launch in May is 8.361 or later (Catalyst 7.4 or 7.5).

more pics as well
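
Two quick sanity checks on VR-Zone's spec recap (the GDDR4 per-pin data rate below is an assumption, since the article quotes no memory clock): the 8-pin plus 6-pin connectors together with the PCIe slot do cover the claimed 240W TDP, and the 512-bit bus is where the headline memory bandwidth would come from.

    // Sanity-check the quoted R600XTX specs with basic arithmetic.
    #include <cstdio>

    int main() {
        // Power budget: PCIe slot 75W + 6-pin 75W + 8-pin 150W = 300W,
        // comfortably above the quoted 240W max TDP.
        const int budget_w = 75 + 75 + 150;
        std::printf("connector budget: %dW vs 240W TDP\n", budget_w);

        // Bandwidth: 512-bit bus / 8 bits-per-byte * per-pin data rate.
        // 2.0 Gbps/pin effective is an assumed GDDR4 figure, not from the article.
        const double gbps_per_pin = 2.0;
        std::printf("memory bandwidth: %.0f GB/s\n", 512 / 8.0 * gbps_per_pin);  // 128 GB/s
        return 0;
    }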
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: hardwareking
well, you still can't beat it for the price.
Just $50. And if it works, you've got a killer card (with no drivers, but those should come soon).

Seller ended the item early. I might have put a bid in if he hadn't... even if I couldn't use it, it would still be amazing to have.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Cookie Monster
R600 benches

[CeBIT : Retail R600XTX Pictured & Benchmarked]

Posted: March 20, 2007
Source: VR-Zone

We saw some R600 cards at CeBIT, but any form of photography was restricted, as AMD is keeping a close watch to make sure nothing leaks out. Nevertheless, we still managed to get some shots of the retail R600XTX (Radeon X2900 XTX) card. The frontal shot contains sensitive information, so we can't show it here.

Just to recap what we have told you before: the R600XTX retail card is codenamed Dragonhead 2. It is a 12-layer card, 9.5" long, with a max TDP of 240W. It is clocked at 800MHz on 80nm process technology, has a 512-bit memory interface, and carries 1GB of GDDR4 memory onboard. It has 6-pin and 8-pin PCIe connectors, but two 2x3 PCIe power connectors can be used instead. Rumors surfaced at CeBIT that the final product may be on 65nm if it can be produced in time with reasonably good yields. So far we have yet to get confirmation that the first shipping R600 cards will be on 65nm, but we can't rule out that there are experimental 65nm R600 chips now. If our source is right, the yield at 65nm is poor at this point in time; expect a limited quantity of 20,000 pieces of the R600XTX by the middle of April.

Next, we got hold of some preliminary benchmark figures for the R600XTX card, with its core clocked at 800MHz, versus a GeForce 8800 GTX. Using a Core 2 Extreme 2.93GHz processor on an Intel 975X board, the 3DMark06 score at 1600x1200 is 97xx on the R600XTX compared to 95xx on the 8800GTX. It seems the R600XTX is running slightly faster than the 8800GTX on an early release of the R600 drivers. AMD is still working hard on the drivers, and there is some more performance left to unlock. However, the DX10 benchmarking war between ATi and NVIDIA has not started yet. The targeted display driver version for the launch in May is 8.361 or later (Catalyst 7.4 or 7.5).

more pics as well

OMG.

If this is true, AMD is in trouble. If all they can muster is +200 points in 3DMock06, then I am thoroughly underwhelmed.

Now I know this is with pre-release drivers. But damn, I was expecting so much more. Plus, if Nvidia has any performance yet to unlock via their drivers, then R600 might not end up being any faster at all.

No wonder they want to go 65nm. I can't believe they can only get a 2% performance advantage out of a 225MHz clock advantage.

They're going to need to get close to a 1GHz core clock to even make it worth being 7 months late.

Nvidia's response is going to be interesting. 8800GTX Ultra or take AMD to the cleaners with 8900GTX?

I wonder where AMD went wrong. With a 225MHz core clock advantage, a 512-bit memory interface and GDDR4, this should not have even been close.

:thumbsdown:
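
Matt2's percentages check out if you take the midpoints of VR-Zone's 97xx/95xx scores and the 8800 GTX's stock 575MHz core clock:

    // Verify the "2% faster" and 225MHz / "39% higher clock" figures.
    #include <cstdio>

    int main() {
        const double r600_score = 9750.0, g80_score = 9550.0;  // midpoints of 97xx / 95xx
        const double r600_clk = 800.0, g80_clk = 575.0;        // MHz; 575 is the stock 8800 GTX core

        std::printf("3DMark06 lead: %+.1f%%\n",
                    100.0 * (r600_score - g80_score) / g80_score);   // ~ +2.1%
        std::printf("core clock lead: %.0f MHz (%+.0f%%)\n",
                    r600_clk - g80_clk,
                    100.0 * (r600_clk - g80_clk) / g80_clk);         // 225 MHz, ~ +39%
        return 0;
    }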
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
with this garbage performance, you can be *sure* we WON'T see r600 in April ... maybe July

maybe :p

if they only have 20K pieces, they might as well toss 'em ... or rename them RV630XTX ... that was supposed to get 12,000 in '05
:clock:

when the r660 ... still probably named r600, in 65nm 'clothing' ... goes up against G81

AMD is *lost* ... adrift ... floundering

:(

who cares where they went wrong?

looks like buying ATi has turned out to be a *disaster* for both companies

too bad


 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
I don't think we'll even see the true 80nm R600.

AMD will probably release the mid-range parts and super-paper-launch the R600 Phantom Edition.

Like apoppin said, AMD will probably scrap this 80nm debacle and we'll see the 65nm version in June/July, and they'll say it's R600, when in reality we all know this was supposed to be the refresh.

I don't even think shrinking it to 65nm will save that much power since, like I said before, they'll have to pump the clocks to the 1GHz mark to even have a chance.

What I am interested in is how the architecture functions. Since (according to this report) R600 has very similar performance to G80, it must have some kind of architectural deficiency compared to G80, given its 39% higher core clock.

After everyone said ATI had such a huge head start on Nvidia because of R500, Nvidia must have known something ATI didn't. Maybe those vec4 shaders are harder to keep fed. After all, Nvidia did claim near-100% efficiency with simple scalar shaders.
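
That last point is easy to illustrate with a toy utilization model: a vec4 ALU only reaches full throughput when every instruction fills all four lanes, while a scalar design stays busy by construction. The instruction mix below is invented for illustration and is not taken from any real shader:

    // Toy ALU-utilization model: vec4 units waste lanes on scalar-heavy
    // shader code, scalar units don't. The op mix is purely illustrative.
    #include <cstdio>

    int main() {
        // Number of shader ops by component width (1 = scalar ... 4 = full vec4).
        const int ops_by_width[5] = {0, 40, 20, 10, 30};

        double vec4_issues = 0, scalar_issues = 0, useful_work = 0;
        for (int w = 1; w <= 4; ++w) {
            vec4_issues   += ops_by_width[w];      // one 4-lane issue per op, w lanes used
            scalar_issues += ops_by_width[w] * w;  // w single-lane issues, none wasted
            useful_work   += ops_by_width[w] * w;  // component-ops of real work
        }
        std::printf("vec4 lane utilization:   %.0f%%\n",
                    100.0 * useful_work / (vec4_issues * 4.0));  // ~58% on this mix
        std::printf("scalar lane utilization: %.0f%%\n",
                    100.0 * useful_work / scalar_issues);        // 100% by construction
        return 0;
    }

On a mix like this, the scalar design needs more issue slots but wastes none of them, which matches Nvidia's efficiency claim for G80, while the vec4-style design leaves lanes idle unless the compiler can pack components together.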