-----
Latency is a big deal for RAM. Because the Rambus channel is serial, the more RAM you add, the higher the overall latency gets. That's why it's NOT a good fit for server applications that demand a huge amount of RAM.
However, Rambus latency can actually be *lower* than DDR SDRAM latency. Once your throughput gets anywhere near the maximum, SDRAM latency increases. A long time ago Intel did a comparison of the i820 and the i440BX: when memory bandwidth consumption was relatively low, the i440BX outperformed the i820 by a large margin, latency-wise. Around the halfway mark they were about even, and anywhere past that the i820 outperformed the i440BX on latency. When memory bandwidth was near max capacity, the RDRAM platform significantly outperformed the i440BX's SDRAM platform. The same situation would apply to DDR SDRAM, since it's a derivative of the original SDRAM. The very ironic thing is that the i820 was a VERY poor implementation of RDRAM (only single channel), while the i440BX was probably the best implementation of SDRAM.
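(For what it's worth, the crossover I'm describing is just standard queueing behaviour: the closer your offered load gets to a bus's peak bandwidth, the longer requests sit in the memory controller's queue. Here's a minimal sketch under a simple M/M/1 queueing assumption; every number in it is invented purely for illustration, not a real i820/i440BX measurement.)

```python
# Toy model: total memory latency = device latency + queueing delay.
# Under an M/M/1 assumption, mean time in system = service_time / (1 - utilization),
# so waiting time blows up as offered load approaches peak bandwidth.
# All figures below are invented for illustration only.

def total_latency_ns(device_latency_ns, peak_gbs, offered_gbs):
    """Device latency plus M/M/1 queueing delay at the given offered load."""
    if offered_gbs >= peak_gbs:
        return float("inf")  # past saturation the queue grows without bound
    utilization = offered_gbs / peak_gbs
    return device_latency_ns / (1 - utilization)

# Hypothetical platforms: "SDRAM" with lower idle latency but less peak
# bandwidth, "RDRAM" with higher idle latency but more headroom.
sdram = dict(device_latency_ns=45, peak_gbs=0.8)   # invented numbers
rdram = dict(device_latency_ns=70, peak_gbs=1.6)   # invented numbers

for load in (0.1, 0.4, 0.7, 0.79):
    s = total_latency_ns(sdram["device_latency_ns"], sdram["peak_gbs"], load)
    r = total_latency_ns(rdram["device_latency_ns"], rdram["peak_gbs"], load)
    print(f"load {load:.2f} GB/s: SDRAM ~{s:.0f} ns, RDRAM ~{r:.0f} ns")
```

The point isn't the exact numbers, just the shape: the lower-latency platform wins at light load, and the platform with more bandwidth headroom wins once you push the load high enough.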
-----
This FUD is still floating around, my god.
Come on, people, latency does not magically increase because you ask your RAM for more data per second. This is completely false, and anyone who believes otherwise has fallen victim to some very shady PR by Intel, Rambus, and possibly Anand (though I suspect Anand just doesn't know any better and isn't really in on it).
The information you're getting this from is based on something a little dishonest on Intel's part. What they did was basically ask the RAM to deliver more data per second than its theoretical maximum, resulting in a sustained efficiency of infinity (or 0%, depending on how you view it). What they omit, of course, is that RDRAM will do exactly the same thing if you ask it for more than *its* theoretical max. In the graph they used, Intel never gets anywhere close to RDRAM's maximum bandwidth potential.
If you stare at this Intel PR long enough, you'll realize not only that Intel is really stretching the truth, but that they're even dumb enough to break the laws of mathematics and the universe to tell the lie. Hint: Intel believes that if you reach infinity and keep going, you suddenly start going backwards. Last I heard that sits in a very grey area of theoretical mathematics/physics with no known real-life application. Except at Intel, of course!
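(A back-of-the-envelope version of the trick I'm describing. All the bandwidth figures here are invented, and "sustained efficiency" just means delivered/requested: request more than one platform's theoretical max while staying well under the other's, and the first one "loses" automatically.)

```python
# Toy reconstruction of the benchmark trick: request more bandwidth than
# one platform's theoretical max while staying well under the other's.
# Every number below is made up for illustration.

def sustained_efficiency(requested_gbs, peak_gbs):
    """Delivered / requested, where delivery is capped at the theoretical peak."""
    delivered = min(requested_gbs, peak_gbs)
    return delivered / requested_gbs

sdram_peak, rdram_peak = 0.8, 1.6   # hypothetical peak bandwidths, GB/s
requested = 1.0                     # over SDRAM's max, well under RDRAM's

print(f"SDRAM efficiency: {sustained_efficiency(requested, sdram_peak):.0%}")  # 80%
print(f"RDRAM efficiency: {sustained_efficiency(requested, rdram_peak):.0%}")  # 100%

# Run the same trick past RDRAM's own max and it "loses" just as badly:
print(f"RDRAM at 2.0 GB/s requested: {sustained_efficiency(2.0, rdram_peak):.0%}")  # 80%
```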