R600 to be 80nm


olmer

Senior member
Dec 28, 2006
324
0
0
[Image: r600cardfn9.jpg]


http://www.fudzilla.com/index.php?option=com_content&task=view&id=483&Itemid=1
 

SexyK

Golden Member
Jul 30, 2001
1,343
4
76
Originally posted by: kobymu
Originally posted by: SexyK
Obviously if it is needed, it will be developed eventually. However, there are people on this board claiming that Fusion is arriving and will wipe out the discrete GPU add-in card on arrival within the next year or two. My only point in this whole discussion is that there are major hurdles to overcome, and high-end graphics cores integrated into CPUs (a la Fusion) will need 5+ years of platform development before we even begin to discuss the demise of the add-in card.
If I were in an optimistic mood I would give it ~3 years; 5 years is a very "safe" gamble. ;)

The problem, IMHO, is that no one knows which way Intel will take it. Not many decision-makers will be willing/comfortable to go for an AMD-only solution; once Intel's CPU/GPU hybrid specs/docs are out (assuming that AMD reaches that point first), then we can start seeing some serious development taking place.

Agreed - before you posted this response I was reconsidering my last post and decided that 3-5 years is probably a realistic estimate. 3 years may be possible if the stars completely align industry-wide.

edit - I should be more explicit: I mean 3 years minimum until a competitive high-end Fusion-type CPU/GPU hybrid is possible, not 3 years until discrete add-in cards are extinct.
 

SexyK

Golden Member
Jul 30, 2001
1,343
4
76

Looks pretty massive for an 80nm die. Eye-balling it, the die doesn't seem significantly larger than G80, but of course G80 is 90nm. Photo of G80 core for reference. Not gonna pass judgment until we get firm heat/power-usage numbers, but gotta say the HSF doesn't inspire much confidence that it'll be a cool runner... still waiting to see this supposedly smaller "retail" HSF....
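For what it's worth, a rough back-of-the-envelope on the die-size comparison, assuming an ideal optical shrink (real designs rarely scale this cleanly): area goes with the square of the feature size, so an 80nm die about the same size as a 90nm G80 would imply roughly a quarter more transistor budget. Just scaling arithmetic, not a claim about the actual R600 transistor count.

```python
# Rough scaling arithmetic only -- assumes ideal (feature size)^2 area scaling,
# which real 90nm -> 80nm half-node shrinks don't fully achieve.
g80_nm, r600_nm = 90, 80

area_ratio = (r600_nm / g80_nm) ** 2   # area of the same logic at 80nm vs 90nm
budget_ratio = 1 / area_ratio          # transistors that fit in an equal die area

print(f"same logic at 80nm occupies ~{area_ratio:.2f}x the 90nm area")       # ~0.79x
print(f"an equal die area at 80nm holds ~{budget_ratio:.2f}x the transistors")  # ~1.27x
```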
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SexyK
Originally posted by: kobymu
Originally posted by: SexyK
Obviously if it is needed, it will be developed eventually. However, there are people on this board claiming that Fusion is arriving and will wipe out the discrete GPU add-in card on arrival within the next year or two. My only point in this whole discussion is that there are major hurdles to overcome, and high-end graphics cores integrated into CPUs (a la Fusion) will need 5+ years of platform development before we even begin to discuss the demise of the add-in card.
If I were in an optimistic mood I would give it ~3 years; 5 years is a very "safe" gamble. ;)

The problem, IMHO, is that no one knows which way Intel will take it. Not many decision-makers will be willing/comfortable to go for an AMD-only solution; once Intel's CPU/GPU hybrid specs/docs are out (assuming that AMD reaches that point first), then we can start seeing some serious development taking place.

Agreed - before you posted this response I was reconsidering my last post and decided that 3-5 years is probably a realistic estimate. 3 years may be possible if the stars completely align industry-wide.

edit - I should be more explicit: I mean 3 years minimum until a competitive high-end Fusion-type CPU/GPU hybrid is possible, not 3 years until discrete add-in cards are extinct.

that is *exactly* what i claimed ... way back when ... at the *beginning* of this latest discussion ... 5 years till the phase-out of discrete gfx starts. :)

BTW ... it's B-O-R-I-N-G without controversy

.... how about a 'little' name calling? :p

:D

BTW, when is the latest launch of r600 scheduled ... next month?
:confused:

 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
It's boring without controversy? I actually was really fascinated by the discussion between kobymu and SexyK. I learned a lot.

In my opinion it's better without controversy and name-calling...

;)
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
apoppin - no disrespect intended, but it's hard for me to understand your use of emoticons at times. That's one reason why I eschew using them for the most part. Nuances of context are notoriously hard to replicate on a message board. You kind of have to say exactly what you mean or it might be misinterpreted. Sarcasm, etc., is usually conveyed through body language and tone, and sometimes we aren't aware that 'tone' is much harder to produce in written language.

At any rate, I didn't mean to seriously take you to task or anything.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: dreddfunk
apoppin - no disrespect intended, but it's hard for me to understand your use of emoticons at times. That's one reason why I eschew using them for the most part. Nuances of context are notoriously hard to replicate on a message board. You kind of have to say exactly what you mean or it might be misinterpreted. Sarcasm, etc., is usually conveyed through body language and tone, and sometimes we aren't aware that 'tone' is much harder to produce in written language.

At any rate, I didn't mean to seriously take you to task or anything.

again ... no prob

notice i didn't say you did ... i just wanted to 'clarify' ... for the record
... the suggestion for "a little name calling" should have also given the reader a clue. :p

... and 'emoticons' DO tend to roughly convey in written form what we do with 'body language'
-- i say we need more and better ones for this forum.

... and of course, i talk with my hands, anyway
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
The only way memory bandwidth issues can be alleviated to a large degree for GPUs would be if the rendering process implemented some sort of tiling. If the screen were chopped up into tiles, the working set for each tile could fit into an on-chip cache instead of constantly going out to external memory. I know tile-based rendering has been done before, but I don't know what happened to that approach.
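To make the tiling idea concrete, here is a minimal sketch in Python of binning work into screen tiles and then rendering one tile at a time. It is purely illustrative, not any vendor's actual pipeline; the names (TILE_SIZE, bin_triangles, etc.) and the 32x32 tile size are assumptions. The point is just that each tile's color/depth data is small enough to live in on-chip storage, so external memory is written roughly once per finished tile instead of being hammered per triangle.

```python
# Illustrative sketch of tile binning (not ATI/NVIDIA's actual implementation).
# Triangles are binned by screen tile; each tile is then processed with a small,
# cache-resident working set, which is the bandwidth saving described above.

from collections import defaultdict

SCREEN_W, SCREEN_H = 1600, 1200
TILE_SIZE = 32  # one 32x32 tile of 4-byte pixels = 4 KB, small enough for on-chip storage

def tiles_overlapped(bbox):
    """Yield (tx, ty) coordinates of every screen tile a bounding box touches."""
    x0, y0, x1, y1 = bbox
    x1, y1 = min(x1, SCREEN_W - 1), min(y1, SCREEN_H - 1)
    for ty in range(max(y0, 0) // TILE_SIZE, y1 // TILE_SIZE + 1):
        for tx in range(max(x0, 0) // TILE_SIZE, x1 // TILE_SIZE + 1):
            yield tx, ty

def bin_triangles(triangles):
    """Pass 1: sort (bin) triangle IDs into the screen tiles they touch."""
    bins = defaultdict(list)
    for tri_id, bbox in enumerate(triangles):
        for tile in tiles_overlapped(bbox):
            bins[tile].append(tri_id)
    return bins

def render(triangles):
    """Pass 2: process one tile at a time; only that tile's data needs to be cached."""
    bins = bin_triangles(triangles)
    for tile, tri_ids in sorted(bins.items()):
        # Shade only the triangles touching this tile, then write the finished
        # tile out to external memory once.
        print(f"tile {tile}: {len(tri_ids)} triangles")

if __name__ == "__main__":
    # Triangles given as (x0, y0, x1, y1) screen-space bounding boxes.
    render([(0, 0, 100, 100), (50, 50, 700, 400), (1500, 1100, 1599, 1199)])
```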