Originally posted by: Gamingphreek
Originally posted by: humey
I do know how DDR1 and DDR2 work, and it's the way I said above; I've known this since long ago, when I looked at sites like the one I posted.
The diagrams on that webpage show exactly how it works, and it's crap for top-range GPUs, hence we had the POS 5800/5900/5950 running hot, and now GDDR3 is used, which is better.
BTW, Intel has dropped DDR2 from its new chipset motherboard range, probably because it's not that great yet due to high CAS latencies (CL), and AMD claimed long ago in an interview that they won't use it until the price is similar to DDR1. If you're on AMD, I for one think you may even skip DDR2 for some form of DDR3. I'm not sure if it can be like GDDR, since that is ultra-fast GPU RAM, and GPU memory is always different from and superior to system RAM.
You are way off on everything.
100x2x2 is not DDR3 (the G stands for Graphics). That is QDR, Quad Data Rate, which is not used yet.
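The "100x2x2" arithmetic can be sketched as follows; this is a minimal illustration (function name is made up, not from any spec) of how a base clock multiplied by transfers per clock gives the effective rate: DDR doubles the base clock, QDR quadruples it.

```python
def effective_rate_mt_s(base_clock_mhz: float, transfers_per_clock: int) -> float:
    """Effective data rate in megatransfers per second (MT/s)."""
    return base_clock_mhz * transfers_per_clock

# SDR moves 1 transfer per clock, DDR 2 (both edges), QDR 4.
sdr = effective_rate_mt_s(100, 1)  # 100 MT/s
ddr = effective_rate_mt_s(100, 2)  # 200 MT/s
qdr = effective_rate_mt_s(100, 4)  # 400 MT/s -- the "100x2x2" scheme
```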
The only card to use GDDR2 was the 5800U, and later the 5700U tried it. There are no other cards that used GDDR2. In that generation (5800/5900/5950; 9xxx series), EVERYTHING used DDR except the two cards I listed earlier. Also, it was not the fact that it was running hot; rather, it was VERY expensive to produce. It was excellent, however yields were not good. I don't know where you got it in your head that GDDR2 was horrible for high-end GPUs, or for any GPU for that matter.
Additionally, where in the world did you hear that Intel dropped support for DDR2? That is completely false.
Also, DDR3 for the desktop is not ready yet, or don't you think we would be using it? We are just starting to use DDR2. Once again, there are only minor architectural differences between GDDR(x) and DDR(x); other than that, they are the exact same thing. One is not "superior" to the other.
-Kevin