All AMD R6xx chips are 65 nanometre chips, now

Page 5

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: munky
Originally posted by: allies
Originally posted by: munky
If the mainstream rv630 cards are ready, it would be stupid for Ati to not release them by now. Also, the performance target for r600 must be about 2x the performance of r580, so in that regard I don't care if they use 96 shaders or 48, as long as the performance is there.

Isn't the 8800GTX 2x the power of an R580...?

Actually it's as fast as them Crossfired, right? And the overhead of Xfire is like... 10%? So the R600 will be ~5-15% faster than the 8800GTX?

I've looked at the shadermark results between r580, r580 CF and the g80. In a few cases the g80 is as fast as r580 CF, but in most cases it's about 1.7x as fast as the r580, and r580 CF is 1.95-2x the performance of r580. These are synthetic shader tests, and in actual games the results may vary, but it gives a good idea of how a gpu performs under heavy shader load.

Yes, while synthetics give us some good academic information, Crossfire R580 is typically not 2x as fast as a single R580 card in actual games. From the looks of things, as long as you're not CPU-bound, you typically get a 70-80% increase going from an X1950 XTX to Crossfire.

Compared to Crossfire X1950 XTX, an 8800 GTX is usually at worst 5-10% slower, and sometimes even faster, such as in Oblivion.

R600 being 10-20% faster than a Geforce 8800 GTX is expected given the time frame of release.
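The relative-performance claims in this thread reduce to simple multiplication. A quick sketch with the rough figures quoted above (all illustrative forum estimates, not measurements):

```python
# Illustrative arithmetic using the rough figures quoted in this thread.
# Baseline: a single R580 (X1950 XTX) = 1.0x.
r580 = 1.0
g80 = 1.7 * r580                    # 8800 GTX ~1.7x R580 in ShaderMark
cf_scaling = 0.75                   # ~70-80% real-game gain from CrossFire
r580_cf = r580 * (1 + cf_scaling)   # CrossFire R580 ~1.75x

# If R600 lands 10-20% above the 8800 GTX, as speculated:
r600_low, r600_high = g80 * 1.10, g80 * 1.20

print(f"R580 CF: {r580_cf:.2f}x, 8800 GTX: {g80:.2f}x")
print(f"R600 estimate: {r600_low:.2f}x to {r600_high:.2f}x of a single R580")
```

On these assumed numbers, even the low end of the R600 estimate would edge out CrossFire R580, which is roughly what the posts above are arguing over.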
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: coldpower27

R600 being 10-20% faster than a Geforce 8800 GTX is expected given the time frame of release.

10-20% faster than a Geforce 8800 GTX :p
:Q

that's ALL?
:shocked:

if so, AMD better pack it in ... and plan to severely discount it

unless their *features* are so 'compelling', even +20% won't cut it ... 8900 will easily pass it and leave it in exhaust

who wants a power-hungry GPU that only gets +20 when you *know* nvidia can beat it ... shortly ... and probably r660 too?
:confused:

uh-uh
:thumbsdown:

if we're talking +33-50% ... maybe ;)
a *single* 2900xtx clearly beating xfired 1950xtxes ... leaving GTX far behind

that would be *compelling*
 

imported_thefonz

Senior member
Dec 7, 2005
244
0
0
Originally posted by: apoppin
Originally posted by: coldpower27

R600 being 10-20% faster than a Geforce 8800 GTX is expected given the time frame of release.

10-20% faster than a Geforce 8800 GTX :p
:Q

that's ALL?
:shocked:

if so, AMD better pack it in ... and plan to severely discount it

unless their *features* are so 'compelling', even +20% won't cut it ... 8900 will easily pass it and leave it in exhaust

who wants a power-hungry GPU that only gets +20 when you *know* nvidia can beat it ... shortly ... and probably r660 too?
:confused:

uh-uh
:thumbsdown:

if we're talking +33-50% ... maybe ;)
a *single* 2900xtx clearly beating xfired 1950xtxes ... leaving GTX far behind

that would be *compelling*

Speak for yourself

I think I'd buy the 2800 over the 8800 for a 10-20% performance gain, if it comes in at around the same price or even 30-40 bucks more. The extra 60 watts or so is not that much of a problem, considering most people by now have at least a 500W PSU; it really depends on how many amps it pulls on the 12V rails, and how much load the rest of your system pulls.
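The power question above is just Ohm's-law arithmetic: the PSU's total wattage matters less than the current available on the +12V rails. A minimal sketch, assuming the ~60 W delta mentioned above (an illustrative figure, not a measured one):

```python
# Current on the +12V rail for a given power draw: I = P / V.
def amps_at_12v(watts):
    """Amps drawn on a 12 V rail for the given wattage."""
    return watts / 12.0

extra_gpu_watts = 60  # assumed extra draw of the new card vs the 8800
print(f"Extra draw: {amps_at_12v(extra_gpu_watts):.1f} A on +12V")  # 5.0 A
```

So an extra 60 W is about 5 A on the 12V side; whether a given 500W unit can spare that depends on its rail layout and the rest of the system's load, which is the poster's point.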

And with the 65nm parts coming out, I'm guessing around the same time as the 8900, it's going to be an interesting next few months.

All good for us, more competition= cheaper cards!
 

40sTheme

Golden Member
Sep 24, 2006
1,607
0
0
Originally posted by: flounder
I'm a little confused; wouldn't current leakage be more pronounced in a smaller process?

No, because less current will have to be used, if I'm not mistaken.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
I am selling my 320mb 8800 GTS and will purchase the midrange ATI card. I think you will see more of that soon.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Originally posted by: chizow
A paper launch would've been better than what they're doing now. Hell, anything, even a 3DMark score would be nice. Instead, they're running TeraFLOP demos with Barcelona in CrossFire. :Q

They have working silicon but they're not willing to give any glimpse of what the card is capable of. I doubt anyone at AMD/ATI knows or wants to commit because it's such a moving target. Every time they're expected to show us some hardware and numbers, they throw out new paper specs and ambiguous comments instead. All they offer is "you'll be surprised", but at this point, would anyone be surprised if it was a steaming pile or if it absolutely mauled G80?

LMAO - I agree! It's the old bait and switch. The marketing team wants to deliver, but the engineering team isn't delivering. All I've seen from AMD is marketing contingencies.

If AMD came out and displayed an example of the new GPU or CPU that crashed in a general desktop application every 20 seconds or so, let's say, the hardware sites would have a field day and people would cast doubt on it. They would obviously be better off not saying anything about it until it works.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: Zstream
I am selling my 320mb 8800 GTS and will purchase the midrange ATI card. I think you will see more of that soon.

I doubt any midrange ATI card will come close to rivaling the sheer power of the 8800 GTS 320. It's hampered by memory, sure, but the core is still the second fastest available today, and at low res it's just as fast as the 640 version.

Remember, ATI's midrange ALWAYS sucks when it first comes out. The X1600 XT sucked for the price, and so did the X700.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Zstream
Originally posted by: Matt2
So was that Ruby movie rendered in real time on an R600 using DX10?

Yes it is and it beats Nvidia's Demo by far!

I find it very hard to believe that the Ruby Movie was rendered in real time using DX10.

What engine?

Do you have any proof besides your extreme fanboyism to back up any of that?

That looked pre-rendered to me.

ATI can not only code and create an engine that looks that good, but has the hardware to run it at excellent speed?

Damn the midrange and give us this card for christ's sake.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Seems like BS to me, but what do I know. Nice vid of Ruby, thanks for the heads up. If the r600 is 20% faster than current stuff at lower resolutions - then it's a huge win, and only a fanboy could call it a loss. If only at very high resolutions - then we will have to wait for demanding DX10 games to know.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: tanishalfelven
Originally posted by: Zstream
I am selling my 320mb 8800 GTS and will purchase the midrange ATI card. I think you will see more of that soon.

I doubt any midrange ATI card will come close to rivaling the sheer power of the 8800 GTS 320. It's hampered by memory, sure, but the core is still the second fastest available today, and at low res it's just as fast as the 640 version.

Remember, ATI's midrange ALWAYS sucks when it first comes out. The X1600 XT sucked for the price, and so did the X700.

Lower High End ATI is not too bad from what I remember, but it is very unlikely that a X2600 will rival an 8800 card of any caliber. They are pretty much the same generation, so you need to compare high end vs high end, or ATI is likely to get trumped easily.

ATI pretty much miscalculated for the last 2 generations, or had issues with their midrange. They never calculate it correctly, or some issue seems to pop up each and every time.

Radeon 9500 Series: great cards, but high production costs.
Radeon 9600 Series: fixed the production cost issues at a slight expense of performance.

Radeon X700 Series: not able to clock high enough to compete with Nvidia's 6600 GT.
Radeon X800 GT: competitive with the 6600 GT at the expense of production cost, high die size, and a 256Bit PCB.

Radeon X1600 Series: not a large enough performance increase for the midrange.
Radeon X1800 GTO: competitive in performance with the 7600 GT at the expense of production cost - a high end core with a 256Bit PCB.
Radeon X1650 XT: finally got it right, but still more expensive to produce than the competition, and fairly late in the generation.

Radeon X2600's: might be ATI/AMD's first time to get it right from the get-go; hopefully it should be roughly 1/2 the X2800 Series and utilize the cost effective 128Bit Bus like Nvidia. But considering ATI's track record I am not holding my breath.



 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: ronnn
Seems like BS to me, but what do I know. Nice vid of Ruby, thanks for the heads up. If the r600 is 20% faster than current stuff at lower resolutions - then it's a huge win, and only a fanboy could call it a loss. If only at very high resolutions - then we will have to wait for demanding DX10 games to know.

I tend to disagree. Well, at least about calling it a "huge" win.

If R600 and G80 had released simultaneously and R600 was 20% faster then yeah, huge win.

But when you're 8 months late, Nvidia has sold enough cards to pretty much not even care if they get beat by R600. This is especially true if they do in fact have a G81 core waiting in the wings.

Even if people shell out another $600 to go from G80 to R600, why would Nvidia care? They already got their money.

Fact is, G80 is already approaching what I would historically consider end of life for the G80 product line.

That being said, if Nvidia releases a G81, then that is what I will be comparing R600 to.
 

Paratus

Lifer
Jun 4, 2004
17,745
16,062
146
G80 EOL LOL :p

And folks wonder why the prices are so high.

posted via Palm Life Drive
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Paratus
G80 EOL LOL :p

And folks wonder why the prices are so high.

posted via Palm Life Drive

If you remember back to when G80 launched I said that G80's biggest advantage was time to market. It has served its purpose brilliantly, just like NV10 (another product whose biggest advantage was time to market) while nvidia prepares the g8x version of NV15 (NV20 to NV25 is another example, but NV10 & NV15 stick in my memory for the atomic bomb like effect they had on the competition).
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Paratus
G80 EOL LOL :p

And folks wonder why the prices are so high.

posted via Palm Life Drive

Yes, EOL.

When G81 hits the market, there will be no more G80's produced. Hence the reason I said "approaching" end of life.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
There is a big difference between being EOL (end-of-life) and EOTD (end-of-top-dawg).

While I've got little doubt that the g80's expected EOTD isn't far in the future, I sincerely doubt that the part will be EOL anytime soon. There are too many niches in the high-end and mid-high-end range to fill.

The space between the flagship and midrange performance will just increase when G81 ships, until nVidia can refresh the 8600 series. Until that time, they'll need the 8800 series to help fill the gaps, methinks.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Matt2

That being said, if Nvidia releases a G81, then that is what I will be comparing R600 to.


Of course, one always compares the top cards of each company. A phantom ultra 8800 gtx is more likely though. Unless the g81 turns out to be 2 cards glued together. 20% at low resolutions is huge with current games. Would likely translate into an 80% advantage at high resolutions or newer games. Personally I think the r600 will be lucky to be 10% at low resolutions. Still a win is a win and if nv can match - well nice for us guys who ain't bought yet.


 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ronnn
Originally posted by: Matt2

That being said, if Nvidia releases a G81, then that is what I will be comparing R600 to.


Of course, one always compares the top cards of each company. A phantom ultra 8800 gtx is more likely though. Unless the g81 turns out to be 2 cards glued together. 20% at low resolutions is huge with current games. Would likely translate into an 80% advantage at high resolutions or newer games. Personally I think the r600 will be lucky to be 10% at low resolutions. Still a win is a win and if nv can match - well nice for us guys who ain't bought yet.

nope ... the 8800 is expensive ... the 8900 should be a lot cheaper ... and faster ... nvidia has the luxury of refining its process while "waiting" for AMD ... and hopefully, drivers. :p

r600 is going up against the 8800 x1.20[+] ... if we get any indication from how the g80 o/c's now

right now r600 is a total loss ... *something* would be nice ... even if it is slower and a LOT cheaper. ;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: Wreckage
Originally posted by: Zstream
I am selling my 320mb 8800 GTS and will purchase the midrange ATI card. I think you will see more of that soon.

Based on what?

He doesn't have any idea what it's based on. Just blabber.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: apoppin

nope ... the 8800 is expensive ... the 8900 should be a lot cheaper ... and faster ... nvidia has the luxury of refining its process while "waiting" for AMD ... and hopefully, drivers. :p

r600 is going up against the 8800 x1.20[+] ... if we get any indication from how the g80 o/c's now

right now r600 is a total loss ... *something* would be nice ... even if it is slower and a LOT cheaper. ;)

Not sure if any of that made sense? Am very glad to know the 8900 will be cheaper, as that implies that the r600 will rock. :beer:

 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: ronnn
Originally posted by: Matt2

That being said, if Nvidia releases a G81, then that is what I will be comparing R600 to.


Of course, one always compares the top cards of each company. A phantom ultra 8800 gtx is more likely though. Unless the g81 turns out to be 2 cards glued together. 20% at low resolutions is huge with current games. Would likely translate into an 80% advantage at high resolutions or newer games. Personally I think the r600 will be lucky to be 10% at low resolutions. Still a win is a win and if nv can match - well nice for us guys who ain't bought yet.

What I am trying to say is that AMD, as far as I'm concerned, has missed out on an entire generation of graphics cards.

The above statement only applies if there is an actual G81, which I think is more likely than the "8800GX2" that you're implying will be Nvidia's answer to R600. I think that Nvidia will indeed launch a G81 on an 80nm process. Less heat, less power and higher clocks for an "8900GTX" will absolutely spoil anything R600 is going to offer. I'll bet we won't see a "GX2" breed of G8x until G80 is refreshed. An "8900GX2" is more likely than an "8800GX2".

The ball game is in AMD's hands. If R600 can beat the 8800GTX by 15% or more in the majority of gaming situations, then I think that Nvidia will launch G81 within a month or two of R600's release. If R600 is only 10% faster or less, I think we'll see an overclocked G80 until AMD refreshes R600 (who knows when that will be).