<<
A FLOP is a FLoating point Operation Per Second. (Everyone seems to be leaving the "per second" out of their definition, although correctly saying that 78 gigaflops is 78 billion of these suckers in a second.)
Floating point operations take different amounts of time depending on the numbers and the operation. I would read this as 78 billion very trivial operations that we are highly optimised to do. (Not sure if loads/stores count or not?) >>
Well, just to be pedantic, a FLOPS is a floating point operation per second.
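To put actual numbers on it, here's a quick Python sketch of how a FLOPS figure gets worked out: count the floating point operations and divide by the wall-clock time. The dot-product workload and its size are made up, and since it's pure Python the figure will be tiny compared to 78 gigaflops.

    import random
    import time

    n = 1_000_000
    a = [random.random() for _ in range(n)]
    b = [random.random() for _ in range(n)]

    start = time.perf_counter()
    total = 0.0
    for x, y in zip(a, b):
        total += x * y          # one multiply + one add per element
    elapsed = time.perf_counter() - start

    flops = 2 * n / elapsed     # floating point operations per second
    print(f"{flops / 1e9:.4f} GFLOPS")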
Also, according to Dell anyway, a gigabyte is 1,000,000,000 bytes, which would make a megabyte 1,000,000 bytes and not 1,048,576 bytes, which means you're getting short-changed even more. I'm pretty sure a gigabyte of RAM, however, is 1024^3 bytes, as the memory count thingie on boot counts in KBytes, and 65,536 KBytes is 64 MB.
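If you want to see how badly you get short-changed, here's a little Python sketch comparing marketing gigabytes with what the machine actually reports (the 500 GB drive is just an example size):

    DISK_GB = 500                       # "500 GB" on the box, at 10^9 bytes each
    bytes_on_disk = DISK_GB * 10**9
    print(f"{DISK_GB} GB = {bytes_on_disk:,} bytes = {bytes_on_disk / 2**30:.1f} binary GB")

    # RAM counts the other way: 64 MB really is 64 * 1024 KB
    print(f"64 MB of RAM = {64 * 1024:,} KB = {64 * 2**20:,} bytes")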
Superdoopercooper: when working with things in base 2, each successive prefix denotes a factor of 2^10. When working with stuff that isn't base 2, each successive prefix is a factor of 10^3, the obvious exception being HD sizes. With the lower prefixes (2^10)^n ~ (10^3)^n, but the difference increases with size. Only things that work inherently in base 2 get the 2^10 treatment, i.e. not crack cocaine.
The reason being, quite often you have objects whose sizes are exact integer multiples of 2^10 bytes, which means it's a lot easier saying 64 MB than 67.108864 MB.
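Just to show how that (2^10)^n vs (10^3)^n gap grows with each prefix step, a throwaway Python loop (the prefix list is mine):

    for n, name in enumerate(["kilo", "mega", "giga", "tera", "peta"], start=1):
        ratio = 2 ** (10 * n) / 10 ** (3 * n)
        print(f"{name:>4}: (2^10)^{n} / (10^3)^{n} = {ratio:.4f}")
    # kilo 1.0240, mega 1.0486, giga 1.0737, tera 1.0995, peta 1.1259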
The IEC (later adopted by the IEEE) came up with a new convention of kibi, mebi, gibi, etc. (KiB, MiB, GiB), but I guess it didn't catch on.
BTW, on to another controversy: who the hell invented the whole MB/s vs Mb/s thing?
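For anyone it catches out: big B is bytes, little b is bits, so divide by 8. A one-liner (the 100 Mb/s link speed is just an example):

    link_mbit = 100                                      # a "100 Mb/s" link
    print(f"{link_mbit} Mb/s = {link_mbit / 8} MB/s")    # 12.5 MB/s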