- Feb 16, 2018
- 41
- 4
- 11
Hey there,
I'm noticing more and more, that when it comes to RAM speeds, people are still spending HUNDREDS and HUNDREDS of dollars per build for a board that supports the latest frequency of ram, sometimes an extra 100 for a DDR x board if the new ram stepping is out..
Can i ask... why? Why do people still seem to class RAM in their System build as JUST as important to have the newest and fastest and most overclockable like it's just as big of a benchmark as the CPU's ability to boost, keep cool and overclock? Or the graphics card? I mean, they've gone as fat as COVERING the little things with massive heatsinks and even fans on them, sometimes spending like 500 dollars on the laest 128gb set of DDR4 10k overclocked to it's maximum stable of 10240MHz etc...
Again, why? I've been following the speeds, ram timings, ram types on and off for years, and i think everyone has been had!
Again to reiterate, people seem to be upgrading their ram, and overclocking it and treating it like it's just as important to keep this stuff up to date and overclocked just as much as the CPU and GPU, which is absolute nonsense.
The biggest difference to real usage and gaming was the jump from SDR to DDR. Double data rate. That's it.
DDR literally doubled the throughput between the memory and the CPU.
And because of this, a lot of people naturally assume DDR2 triples the rate, DDR3 quadruples it, and so on, when that simply isn't the case.
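To put rough numbers on that: each generation roughly doubled the *common* data rate over the previous one; it didn't multiply the transfer rate again per clock edge. A quick back-of-envelope in Python (the speeds listed are just typical JEDEC grades for each generation, assuming a single 64-bit channel):

```python
# Theoretical peak bandwidth per 64-bit channel:
# bandwidth (GB/s) = transfers per second (MT/s) * 8 bytes / 1000
common_speeds = {
    "SDR-133":   133,   # single data rate: one transfer per clock
    "DDR-400":   400,   # double data rate: two transfers per clock
    "DDR2-800":  800,
    "DDR3-1600": 1600,
    "DDR4-3200": 3200,
}

for name, mts in common_speeds.items():
    gbps = mts * 8 / 1000
    print(f"{name}: {gbps:.1f} GB/s peak")
```

So on paper each generation doubles the previous one's peak, but that's the theoretical ceiling for sequential transfers, not what a game actually sees.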
I did a lot of testing back when the switch from DDR2 to DDR3 happened. I tested a couple of games on my 4GB of DDR2 @ 800MHz versus my new DDR3 @ 1600MHz, so naturally you'd think "well, this is going to give me double the data rate". But it didn't. It barely made any difference at all. In some cases I got a couple more frames per second from the DDR3, but then in the same benchmark in the same game I'd get more FPS on the DDR2!
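If anyone wants to sanity-check their own machine rather than trust the spec sheet, here's a crude Python sketch (not the benchmarks I ran back then, just an easy probe anyone can run) that times a large buffer copy to estimate sustained memory bandwidth. The buffer has to be much bigger than the CPU caches so the copy actually goes through DRAM:

```python
import time

# Crude main-memory bandwidth probe: time a large buffer copy.
# 256 MB is far bigger than typical CPU caches, so this hits DRAM.
N = 256 * 1024 * 1024
src = bytearray(N)
dst = bytearray(N)

best = float("inf")
for _ in range(5):
    t0 = time.perf_counter()
    dst[:] = src  # effectively a memcpy through main memory
    best = min(best, time.perf_counter() - t0)

# The copy reads N bytes and writes N bytes -> 2*N bytes moved.
gbps = 2 * N / best / 1e9
print(f"~{gbps:.1f} GB/s sustained copy bandwidth")
```

The interesting part is comparing that sustained number against the theoretical peak for your kit: the gap, plus the fact that games rarely saturate even the sustained figure, is exactly why frequency bumps don't show up in FPS.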
And the same thing is happening again today. There's no noticeable difference to the end user, in speed or FPS, between DDR3 at 1600MHz and DDR4 at 4800MHz overclocked within an inch of its life.
You might see a 2-3% difference in some small parts of benchmarks, but to the player actually playing the game, or rendering the model or scene, it's effectively identical.
There's no real-world difference between DDR3 1600MHz and DDR4 4800MHz overclocked, and there certainly isn't any difference between RAM on the same stepping with different timings (e.g. two kits of the same DDR4, one running at an overclocked rate). All you're doing is putting stress on your components, paying more for electricity, and wasting hundreds of dollars a year, thousands over your whole career as a system builder.
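On the timings point, there's actually a simple reason different kits end up feeling the same: convert the CAS latency from clock cycles into nanoseconds, and typical kits of every generation land within a few ns of each other. Higher frequency, looser timings, same absolute wait. Rough sketch (the kits listed are just common examples, not anything I tested myself):

```python
# First-word latency in ns = CAS cycles * clock period.
# Clock period (ns) = 2000 / data rate (MT/s), since DDR
# transfers twice per clock cycle.
kits = [
    ("DDR2-800 CL5",    800,  5),
    ("DDR3-1600 CL9",  1600,  9),
    ("DDR4-3200 CL16", 3200, 16),
]

for name, mts, cl in kits:
    ns = cl * 2000 / mts
    print(f"{name}: {ns:.2f} ns first-word latency")
```

Every one of those lands in the 10-12.5 ns range, which is why chasing tighter timings or higher clocks within a generation barely moves anything you can feel.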
I know it sounds like I'm shouting, lol, I'm not. I love overclocking and all that; I started back in the early 2000s with an AMD Athlon Thunderbird AXIA 1.3GHz overclocked on air to 1.71GHz (that was massive back then), 256MB of SDRAM, and a GeForce 2 GTS with the core and memory overclocked up to GeForce 2 Ultra-type speeds.
We're being ripped off on the RAM.
The idea that everyone switching to a new motherboard and new CPU, in a lot of cases just to accommodate this magical DDR4 4800MHz, is going to get some extra magic boost in their gaming is just nonsense.
Has anyone else been following this? There are barely any tests anywhere of the real-world speeds of DDR vs DDR2 vs DDR3 vs DDR4, and certainly none covering all types at all common and overclocked speeds with the full data published. All we're seeing is SiSoft or whatever telling us what should happen theoretically, when in reality there's hardly any difference between the generations, and I'd say no difference at all between RAM timings in ns or frequency.
If anyone would like to add to this, or to correct me, I'd be delighted to chat.
Thanks!