Originally posted by: Lonyo
Originally posted by: lopri
I am most curious whether NV will adopt GDDR5 this time. (Heck, even GDDR4!)
This has been puzzling me ever since G92 was introduced. Why did NV improve the core (G80->G92) and then cripple it with the memory configuration? I know there was a cost saving in the G80->G92 transition, but I can't help imagining how G92 might have performed had it been mated with fast GDDR5. And considering all the drama NV has been playing with its product updates and naming since then, I wonder if G92 was originally meant to be paired with GDDR5. It didn't happen, though, as we know.
Now, will NV stick to GDDR3 even for its next gen (GT300)? If so, is it because:
1. ATI holds patents and/or royalty rights on GDDR5, and Mr. Huang would rather die than give a dime to ATI.
2. ATI has existing contracts with GDDR5 manufacturers, and there isn't enough supply left over for NV's demand, which is likely quite a bit bigger than ATI's.
3. No conspiracy or market theory. Sticking to GDDR3 is by design.
Is there any rumor regarding this? What do you guys think about the imaginary G92+GDDR5?
IMO there is no way NV can stick with GDDR3 for the next gen; they will have to move to GDDR5 no matter what.
GDDR4 is pretty much out of the market since GDDR5 is much better; GDDR4 never really went anywhere.
If NV want to stay competitive with ATI they will need to keep increasing memory bandwidth (that's what always happens). ATI currently use 256-bit memory buses but have gone 512-bit in the past (the HD 2900 XT).
If GDDR5 speeds don't increase enough for ATI, I would expect they could widen the bus fairly easily to gain bandwidth, since they've been there before.
NV are already at 512-bit on their high-end cards, and (arguably) there's no realistic way to go much above 512-bit without a horribly expensive card, so going GDDR5 is the easy way to gain bandwidth (rough numbers below).
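To put some back-of-envelope numbers on that, here's a quick sketch of how the launch-era cards compare on paper. The clocks are the commonly reported launch figures, so treat them as approximate; the formula itself (bus width in bytes times effective transfer rate) is standard:

```python
def bandwidth_gbps(bus_width_bits: int, effective_gtps: float) -> float:
    """Peak memory bandwidth in GB/s.

    effective_gtps: effective transfer rate in gigatransfers/s.
    GDDR3 is double-pumped (2x memory clock); GDDR5 moves
    4 bits per pin per clock (4x memory clock).
    """
    return bus_width_bits / 8 * effective_gtps

# Launch-era figures, as commonly reported:
cards = {
    "GTX 280 (512-bit GDDR3 @ 1107 MHz, 2.214 GT/s)": bandwidth_gbps(512, 2.214),
    "HD 4870 (256-bit GDDR5 @ 900 MHz, 3.600 GT/s)":  bandwidth_gbps(256, 3.600),
}
for name, bw in cards.items():
    print(f"{name}: {bw:.1f} GB/s")
# GTX 280: ~141.7 GB/s; HD 4870: ~115.2 GB/s
```

So GDDR5 puts a cheap 256-bit bus in the same ballpark as an expensive 512-bit GDDR3 bus, which is exactly why it's the easy path to more bandwidth.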
I can't really see NV NOT going GDDR5 with their next gen cards, especially given that prices will be a lot lower and availability a lot higher.
Since NV released the GTX 2x0s before ATI released the HD 4870 and wanted good availability, using GDDR5 pretty much wasn't an option back then; it simply wasn't available in sufficient quantity. A year on the market has probably changed that, so it'll be easy to make use of GDDR5 now. NV couldn't really gamble on limited supplies a year ago when they were launching their new cards.
GDDR5 was barely marketable when the HD 4870 came out.
http://www.anandtech.com/video/showdoc.aspx?i=3469&p=8 A bit of background on GDDR5 from around the GTX 200/HD 4800 launch window: ATI needed it, while NV's design choice meant they didn't.
There's also no way G92 could have used GDDR5 given its launch date. Maybe they could have added support with a respin, but I doubt that was on NV's mind, since GDDR5 wasn't widely available when they were planning and doing the die shrink.
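For fun, here's the paper math on lopri's "imaginary G92+GDDR5". This is purely hypothetical: it assumes the 8800 GTS 512's reported 970 MHz GDDR3 and HD 4870-class 3.6 GT/s GDDR5 on the same 256-bit bus, and ignores the fact that G92's memory controller would have needed a redesign to drive GDDR5 at all:

```python
# Same formula as above: bus_bits / 8 * effective GT/s = GB/s.
GTS512_ACTUAL  = 256 / 8 * 1.940  # 8800 GTS 512, GDDR3 @ 970 MHz -> ~62.1 GB/s
G92_GDDR5_HYPO = 256 / 8 * 3.600  # hypothetical HD 4870-class GDDR5 -> ~115.2 GB/s

print(f"8800 GTS 512 (actual GDDR3):  {GTS512_ACTUAL:.1f} GB/s")
print(f"G92 + GDDR5 (hypothetical):   {G92_GDDR5_HYPO:.1f} GB/s")
```

On paper that's nearly double the bandwidth, which is why the what-if is so tempting, but given the controller redesign it would have required, it was never just a matter of swapping chips.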