
Does DDR3 = GDDR3?

I've been seeing lots of video card vendors' websites say a card has 128MB of DDR3 RAM, while places like Newegg say it's GDDR3 SDRAM for the same card. So is DDR3 really just GDDR3, and if not, how much performance is lost between the two?
 
Aren't 2.0ns DDR1 and 2.0ns GDDR3 also pretty much the same? They both have the same theoretical frequency limit (1000MHz). But I think GDDR3 produces more heat and uses more power... or something like that. Somebody set me straight on this.
 
DDR1 is like 200MHz x 2 = 400MHz effective DDR.

DDR2 is like 100 x 2 x 2 = 400MHz effective DDR2. This created massive heat on Nvidia's crap 5800/5900/5950 cards, which were high-end with fast RAM speeds.

ATI talked up GDDR3, but Nvidia got in first with it. It has the same DDR2 specs as above but runs at lower voltage, so it runs cooler.
 
Originally posted by: humey
DDR1 is like 200MHz x 2 = 400MHz effective DDR.

DDR2 is like 100 x 2 x 2 = 400MHz effective DDR2. This created massive heat on Nvidia's crap 5800/5900/5950 cards, which were high-end with fast RAM speeds.

ATI talked up GDDR3, but Nvidia got in first with it. It has the same DDR2 specs as above but runs at lower voltage, so it runs cooler.

Nope, DDR2 isn't QDR; it's only x2. It just runs on a smaller process and uses less voltage, so it's more efficient and can clock higher.
 
So DDR isn't much worse than DDR3? Because all the vanilla PCI-e 6800s seem to use DDR RAM. I'm probably nitpicking, but I just want to learn more, I guess, especially since I might get one and just hope for a successful pipe unlock. Cookies to all of you regardless. 🙂
 
Originally posted by: humey
virtualgames0, ahh, QDR? Is that a typo?

DDR2 is 100 x 2 x 2 = 400, for example, and that's a fact.

http://www.pcstats.com/articleview.cfm?articleid=1573&page=3

Look at the pics, please.

*Technically*, DDR2-400 is 200MHz DDR that is implemented internally as dual-channel 100MHz DDR, along with some other improvements to lower power usage and allow higher clock rates.

As far as your memory controller is concerned, though, it runs at 200MHz (400MHz effective). It has no way to tell whether it's being double-pumped internally or whether some other mechanism is in use.
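The "200MHz external vs 100MHz internal" arithmetic being argued about here can be sketched as a quick calculation. This is a toy illustration of the prefetch model (DDR1 fetches 2 words per core cycle, DDR2 fetches 4); the function name is made up, not from any spec:

```python
# Toy sketch of the clock arithmetic discussed above: the effective
# transfer rate is the core array clock times the prefetch depth.
# DDR1 uses a 2n prefetch, DDR2 a 4n prefetch.

def effective_rate(core_clock_mhz, prefetch):
    """Effective transfer rate in MT/s: core clock x prefetch depth."""
    return core_clock_mhz * prefetch

# DDR1-400: 200 MHz core array, 2n prefetch
print(effective_rate(200, 2))   # 400

# DDR2-400: 100 MHz core array, 4n prefetch -- the "100 x 2 x 2"
# from earlier in the thread: same 400 MT/s, but the internal array
# runs at half the clock, which is part of why DDR2 can scale higher.
print(effective_rate(100, 4))   # 400
```

Both generations land on the same 400 MT/s here; the difference is only where the multiplication happens internally, which is the point made above about the memory controller not being able to tell.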
 
Do you all remember the differences between DRAM, VRAM, and WRAM, and then SDRAM and SGRAM? I vaguely do, am I right in assuming that SDRAM -> SGRAM is basically the same as DRAM -> VRAM, that they made the data interface dual-ported and added a secondary serial readout for the RAMDACs to use? (And WRAM was just VRAM with an optimized rectangle/buffer-clear function, AFAIK.) Would that likely imply that GDDR3 is like DDR3, but with a secondary read port for the RAMDACs as well?
 
I do know how DDR1 and DDR2 work, and it's the way I said above; I learned this long ago from sites like the one I posted.

The diagrams on that webpage show exactly how it works, and it was crap for top-range GPUs, hence we had the POS 5800/5900/5950 running hot; now GDDR3 is used instead, which is better.

BTW, Intel has dropped DDR2 from its new range of chipset mobos, probably because it's not that great yet due to high CL latencies, and AMD said long ago in an interview that they won't use it until the price is similar to DDR1. I for one think that if you're on AMD you may even skip DDR2 for some form of DDR3. Not sure if it can be like GDDR, as that is ultra-fast GPU RAM; GPU memory is always different from, and superior to, system RAM.
 
Originally posted by: humey
I do know how DDR1 and DDR2 work, and it's the way I said above; I learned this long ago from sites like the one I posted.

The diagrams on that webpage show exactly how it works, and it was crap for top-range GPUs, hence we had the POS 5800/5900/5950 running hot; now GDDR3 is used instead, which is better.

BTW, Intel has dropped DDR2 from its new range of chipset mobos, probably because it's not that great yet due to high CL latencies, and AMD said long ago in an interview that they won't use it until the price is similar to DDR1. I for one think that if you're on AMD you may even skip DDR2 for some form of DDR3. Not sure if it can be like GDDR, as that is ultra-fast GPU RAM; GPU memory is always different from, and superior to, system RAM.

You are way off, on everything.

100 x 2 x 2 is not DDR3 (the G stands for Graphics). That is QDR, Quad Data Rate, which is not used yet.

The only card to use GDDR2 was the 5800 Ultra, and later the 5700 Ultra tried it. No other cards used GDDR2. In that generation (the 5800/5900/5950 and 9xxx series), EVERYTHING used DDR except the two cards I listed. Also, it was not that it ran hot; it was VERY expensive to produce. It was excellent, but yields were not good. I don't know where you got it into your head that GDDR2 was horrible for high-end GPUs, or for any GPU for that matter.

Additionally, where in the world did you hear that Intel dropped support for DDR2? That is completely false.

Also, DDR3 for the desktop is not ready yet, or don't you think we would be using it? We are just starting to use DDR2. Once again, there are only minor architectural differences between GDDR(x) and DDR(x); other than that, they are the same thing. One is not "superior" to the other.

-Kevin
 
Originally posted by: Gamingphreek
Originally posted by: humey
I do know how DDR1 and DDR2 work, and it's the way I said above; I learned this long ago from sites like the one I posted.

The diagrams on that webpage show exactly how it works, and it was crap for top-range GPUs, hence we had the POS 5800/5900/5950 running hot; now GDDR3 is used instead, which is better.

BTW, Intel has dropped DDR2 from its new range of chipset mobos, probably because it's not that great yet due to high CL latencies, and AMD said long ago in an interview that they won't use it until the price is similar to DDR1. I for one think that if you're on AMD you may even skip DDR2 for some form of DDR3. Not sure if it can be like GDDR, as that is ultra-fast GPU RAM; GPU memory is always different from, and superior to, system RAM.

You are way off, on everything.

100 x 2 x 2 is not DDR3 (the G stands for Graphics). That is QDR, Quad Data Rate, which is not used yet.

The only card to use GDDR2 was the 5800 Ultra, and later the 5700 Ultra tried it. No other cards used GDDR2. In that generation (the 5800/5900/5950 and 9xxx series), EVERYTHING used DDR except the two cards I listed. Also, it was not that it ran hot; it was VERY expensive to produce. It was excellent, but yields were not good. I don't know where you got it into your head that GDDR2 was horrible for high-end GPUs, or for any GPU for that matter.

Additionally, where in the world did you hear that Intel dropped support for DDR2? That is completely false.

Also, DDR3 for the desktop is not ready yet, or don't you think we would be using it? We are just starting to use DDR2. Once again, there are only minor architectural differences between GDDR(x) and DDR(x); other than that, they are the same thing. One is not "superior" to the other.

-Kevin

Basically, GDDR3 memory is optimised for the kind of data flow a graphics card requires, which means it may not be as suitable as RAM for a normal system board, and vice versa: DDR3 would be better suited to a normal system board but not as useful on a graphics card. Although I imagine the differences are not that dramatic; it possibly has even more to do with the lifespan of the chips.

From what I remember, the cards with normal DDR memory only hit just under 1,000MHz effective, while the GDDR3 cards can happily go a good couple of hundred MHz over that...

If I'm wrong, please don't get abusive; I'm just going on what I remember seeing across various sites and magazines.
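To put those effective clocks in perspective, peak bandwidth is just the effective transfer rate times the bus width. A minimal sketch, assuming a 256-bit bus and round illustrative clocks (not measured figures for any particular card):

```python
# Peak memory bandwidth = effective transfer rate x bus width.
# The clocks and bus width below are illustrative round numbers,
# not specs for any real card.

def bandwidth_gb_s(effective_mhz, bus_bits):
    """Peak bandwidth in GB/s for a given effective rate and bus width."""
    bytes_per_transfer = bus_bits // 8
    return effective_mhz * 1_000_000 * bytes_per_transfer / 1e9

print(bandwidth_gb_s(1000, 256))  # 32.0 -- ~1000 MHz effective DDR
print(bandwidth_gb_s(1200, 256))  # 38.4 -- GDDR3 a couple hundred MHz higher
```

So a couple hundred MHz of extra effective clock on the same bus width buys a noticeable chunk of bandwidth, which is the practical reason GDDR3's higher clock headroom mattered.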
 
Gamingphreek is a user who misreads others' posts and then likes to troll them, so ignore him, mate; I'm already sick of him today.

I never said GDDR3 was the same as DDR1/2/3, and you had better get off my case, Gamingphreek, because you are doing nothing constructive, just going against any info others post by claiming we said stuff that we didn't.

The 5800 used DDR2, nothing else, and it ran hot and was slagged off as sounding like a hoover from trying to cool it.

Again, I never compared GDDR3 to system RAM, so butt out, mate.
 
Originally posted by: Gamingphreek
Originally posted by: humey
I do know how DDR1 and DDR2 work, and it's the way I said above; I learned this long ago from sites like the one I posted.

The diagrams on that webpage show exactly how it works, and it was crap for top-range GPUs, hence we had the POS 5800/5900/5950 running hot; now GDDR3 is used instead, which is better.

BTW, Intel has dropped DDR2 from its new range of chipset mobos, probably because it's not that great yet due to high CL latencies, and AMD said long ago in an interview that they won't use it until the price is similar to DDR1. I for one think that if you're on AMD you may even skip DDR2 for some form of DDR3. Not sure if it can be like GDDR, as that is ultra-fast GPU RAM; GPU memory is always different from, and superior to, system RAM.

You are way off, on everything.

100 x 2 x 2 is not DDR3 (the G stands for Graphics). That is QDR, Quad Data Rate, which is not used yet.

The only card to use GDDR2 was the 5800 Ultra, and later the 5700 Ultra tried it. No other cards used GDDR2. In that generation (the 5800/5900/5950 and 9xxx series), EVERYTHING used DDR except the two cards I listed. Also, it was not that it ran hot; it was VERY expensive to produce. It was excellent, but yields were not good. I don't know where you got it into your head that GDDR2 was horrible for high-end GPUs, or for any GPU for that matter.

Additionally, where in the world did you hear that Intel dropped support for DDR2? That is completely false.

Also, DDR3 for the desktop is not ready yet, or don't you think we would be using it? We are just starting to use DDR2. Once again, there are only minor architectural differences between GDDR(x) and DDR(x); other than that, they are the same thing. One is not "superior" to the other.

-Kevin

There were some 9800 Pros which used DDR2 instead of DDR1.
 
Originally posted by: Sc4freak
Originally posted by: Gamingphreek
Originally posted by: humey
I do know how DDR1 and DDR2 work, and it's the way I said above; I learned this long ago from sites like the one I posted.

The diagrams on that webpage show exactly how it works, and it was crap for top-range GPUs, hence we had the POS 5800/5900/5950 running hot; now GDDR3 is used instead, which is better.

BTW, Intel has dropped DDR2 from its new range of chipset mobos, probably because it's not that great yet due to high CL latencies, and AMD said long ago in an interview that they won't use it until the price is similar to DDR1. I for one think that if you're on AMD you may even skip DDR2 for some form of DDR3. Not sure if it can be like GDDR, as that is ultra-fast GPU RAM; GPU memory is always different from, and superior to, system RAM.

You are way off, on everything.

100 x 2 x 2 is not DDR3 (the G stands for Graphics). That is QDR, Quad Data Rate, which is not used yet.

The only card to use GDDR2 was the 5800 Ultra, and later the 5700 Ultra tried it. No other cards used GDDR2. In that generation (the 5800/5900/5950 and 9xxx series), EVERYTHING used DDR except the two cards I listed. Also, it was not that it ran hot; it was VERY expensive to produce. It was excellent, but yields were not good. I don't know where you got it into your head that GDDR2 was horrible for high-end GPUs, or for any GPU for that matter.

Additionally, where in the world did you hear that Intel dropped support for DDR2? That is completely false.

Also, DDR3 for the desktop is not ready yet, or don't you think we would be using it? We are just starting to use DDR2. Once again, there are only minor architectural differences between GDDR(x) and DDR(x); other than that, they are the same thing. One is not "superior" to the other.

-Kevin

There were some 9800 Pros which used DDR2 instead of DDR1.

Yep, the 256MB versions.
 
Originally posted by: humey
BTW, Intel has dropped DDR2 from its new range of chipset mobos

If you are trying to say that Intel is not putting DDR2 memory controllers in their current and future high-end chipsets, you are just full of it.
 
Originally posted by: humey
Gamingphreek is a user who misreads others' posts and then likes to troll them, so ignore him, mate; I'm already sick of him today.

I never said GDDR3 was the same as DDR1/2/3, and you had better get off my case, Gamingphreek, because you are doing nothing constructive, just going against any info others post by claiming we said stuff that we didn't.

The 5800 used DDR2, nothing else, and it ran hot and was slagged off as sounding like a hoover from trying to cool it.

Again, I never compared GDDR3 to system RAM, so butt out, mate.

How old are you? Nothing constructive was said in your post either, except personally attacking another AT'er for disagreeing with you. While you have some facts right, it is not likely that you have them all right; another person may remember things differently than you, and not even you can remember everything.

All the arguing over how DDR1/2/3 work could be saved by merely reading this one article: http://www.lostcircuits.com/memory/ddr3/
 
rgreen83, are you some kind of asshole as well? I'm 33, BTW. The info I put in was from a site, and Intel has dropped DDR2 from its new chipset mobos. I advise you to go and read up and STFU until you get the facts.

www.google.com

And to get something straight: I only have an issue with that one prick who has been on my case across three posts, though after reading I seem to see others too.

I know the 5800 was DDR2, I know my 4600 was DDR1, I know my 6800 is GDDR3. I never said mobos will use GDDR3, so why the hell do other pricks who can't read constantly change what I and others say?

Out of all the forums I know, and on IRC, this is the most argumentative one; it must be the age group, I guess. I only answer to try to help the person asking the question, not to get a load of bollocks back. I can easily back up what I say.
 