
GDDR4?

josh6079

Diamond Member
Just wondering, from GDDR2 to GDDR3, how much of a performance increase was there? Would that be comparable to GDDR3 to GDDR4?

I've heard that ATI's new card will have it, but I don't know if it will really do much. I'm also wondering whether the G80 will have it; I'd hope it would.
 
To be honest, I don't think I've seen very many cards with DDR2. And I don't think GDDR4 is gonna be that great. I know it's not the same, but look at DDR2 vs. DDR1: in AM2 you get OMG! 1%! /sarcasm
 
No one can really say right now; not much has been released on GDDR4. DDR2 is significantly better than DDR, it just depends on how you use it. The reason AM2 CPUs have only seen minor performance gains so far is that AMD stuck with the same K8 architecture used in s939, which proved not to benefit much from DDR2. Now look at Intel, who stuck with the Netburst architecture: when they migrated to DDR2 they saw pretty major improvements.
 
Well, ATI's upcoming DX10 card with unified shaders has GDDR4 memory, so I guess it must be good. Maybe it'll raise the memory speeds above 1000MHz (as in 2000MHz effective).
 
Things just get a little more efficient in terms of performance at a given frequency. Pretty sure GDDR2 @ 800MHz would be closer to GDDR3 @ 700MHz in average performance, for example.
 
Originally posted by: GundamSonicZeroX
To be honest, I don't think I've seen very many cards with DDR2. And I don't think GDDR4 is gonna be that great. I know it's not the same, but look at DDR2 vs. DDR1: in AM2 you get OMG! 1%! /sarcasm

Overclocking the crap out of regular DDR doesn't get you that much with the A64s. They're just nowhere near as memory-dependent as the P4s are.
 
It's almost twice as fast (clockspeed-wise) as GDDR3: 1.4GHz for an effective 2.8GHz speed, and Samsung has already pushed the stuff to 3.2GHz. Not sure how much real-world performance gain will come from that, though I think it will be fairly sizeable for the G80 and R600 chips running at high resolutions. It also uses less power. Depending on when G80 comes, the ATI 580+ chip on the 80nm process will probably be the first card to use it, sometime in Q3 2006.
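The peak-bandwidth arithmetic behind those effective clocks can be sketched quickly. The 256-bit bus width below is an assumption for illustration (typical of high-end cards of that era), not something stated in the thread:

```python
# Peak memory bandwidth = effective data rate x bus width in bytes.
# The 256-bit bus is an assumed figure for illustration.

def bandwidth_gbs(effective_mhz: float, bus_bits: int = 256) -> float:
    """Peak memory bandwidth in GB/s for a given effective clock and bus width."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

# GDDR3 at 1.4GHz effective vs GDDR4 at 2.8GHz effective, both on a 256-bit bus:
print(bandwidth_gbs(1400))  # 44.8 GB/s
print(bandwidth_gbs(2800))  # 89.6 GB/s
```

So doubling the effective clock doubles peak bandwidth, whether or not a given game can actually use it.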
 
Originally posted by: ElFenix
Originally posted by: GundamSonicZeroX
To be honest, I don't think I've seen very many cards with DDR2. And I don't think GDDR4 is gonna be that great. I know it's not the same, but look at DDR2 vs. DDR1: in AM2 you get OMG! 1%! /sarcasm

Overclocking the crap out of regular DDR doesn't get you that much with the A64s. They're just nowhere near as memory-dependent as the P4s are.

I'm not sure I'd fully agree with that. Have tests been performed in areas where heavy use of memory is likely (databases, emulating old computer systems, caching of data, etc.) and not just on ordinary apps that are fairly easy-going on memory?
 
Originally posted by: GundamSonicZeroX
To be honest, I don't think I've seen very many cards with DDR2. And I don't think GDDR4 is gonna be that great. I know it's not the same, but look at DDR2 vs. DDR1: in AM2 you get OMG! 1%! /sarcasm

That's because the A64 was nowhere close to being bandwidth-limited by DDR1. The A64 prefers low latency over high bandwidth, which is just the opposite of what DDR2 brings. But graphics cards are more sensitive to memory bandwidth, and although GPU throughput is still more important, the memory makes a significant difference too.
 
Originally posted by: josh6079
Just wondering, from GDDR2 to GDDR3, how much of a performance increase was there? Would that be comparable to GDDR3 to GDDR4?

I've heard that ATI's new card will have it, but I don't know if it will really do much. I'm also wondering whether the G80 will have it; I'd hope it would.

Probably not much improvement clock for clock, but GDDR4 can run a lot faster, so it all depends on how fast the memory is clocked.
 
Considering today's cards really aren't memory-bandwidth limited, I don't think it'll be that much of a performance impact. On today's cards, you don't really max out memory bandwidth until you get up to really high resolutions.

I have my 7900GT at 900MHz (1800MHz effective), and it used to be near 2000MHz effective, and I can't tell the difference; the 3DMark score is the same. GDDR4 will most likely be for the future, as resolution increases and the need for memory bandwidth (AA, AF, and HDR all use a lot of memory bandwidth) grows beyond GDDR3's capabilities.
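A rough sketch of why that overclock difference is hard to notice, in bandwidth terms. The 7900GT's 256-bit memory bus is the one assumption here; the clocks are the ones quoted above:

```python
# How much peak bandwidth separates 1800MHz and 2000MHz effective
# on an assumed 256-bit bus (32 bytes per transfer).
BUS_BYTES = 256 / 8

bw_1800 = 1800e6 * BUS_BYTES / 1e9  # GB/s at 1800MHz effective
bw_2000 = 2000e6 * BUS_BYTES / 1e9  # GB/s at 2000MHz effective

print(bw_1800, bw_2000)  # 57.6 GB/s vs 64.0 GB/s
print(bw_2000 / bw_1800)  # only about an 11% gap
```

If the GPU isn't saturating 57.6 GB/s in the first place, an extra ~11% of headroom won't show up in benchmark scores.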
 
Thanks for the info, guys. I know that if the G80 comes with GDDR4, they'll have to have a more efficient memory draw, like ATI's ring-bus system. I just can't see why, if all of the new cards coming out are going to be more efficient, they're going to demand more power. I guess it's the GPUs that are doing most of the draining.
 