So where are the 128-bit CPUs? Sony had a 128-bit CPU in the PlayStation 2 called the Emotion Engine.
Hopefully in museums. Using the measurement method of console PR people, IBM's System/370, released in 1970, was 128-bit.
It would actually likely make computers slower due to 128-bit pointers. Programs tend to grow in size when they are recompiled from 32-bit to 64-bit because every pointer doubles from 4 to 8 bytes, so pieces of the program are less likely to fit in cache and thus more likely to run slower.

That said, if your program has and uses enough 64-bit pointers that those extra 4 bytes each make a difference, you will already be facing major memory-related performance issues, and pointer size will be the least of your worries (assuming 64-bit does not otherwise run faster for you, e.g. because you use 64-bit data types or because your code can use more than the few GPRs 32-bit x86 offers).
128-bit processing compared to 32-bit and 64-bit processing
He is talking about instructions, like how the N64 was marketed as a "64-bit" processor and the Dreamcast (and PS2) as "128-bit" processors.
As far as I know, the "bits" of a system or its CPU were purely marketing gibberish. Like geebees.
There are many things in a computer that are measured in 'bits'.
Addressable memory - a 4 GB limit for 32-bit; the 64-bit limit is 2^64 bytes (16 exabytes), and we likely won't need more than that for a very long time.
Memory bus - this is mostly a bandwidth thing. Current CPUs have 128-bit or wider memory controllers, but many low-end CPUs and graphics cards use narrower buses, as low as 32-bit.
The width of math the CPU can do - the x87 extension to the x86 instruction set made CPUs capable of 80-bit precision a long time ago. SSE can do 128-bit; AVX can do 256-bit.
And many more things...
There's no one measure of the bitness of a processor. Back in the day, the definition used to be the maximum precision math the cpu could do (so bulldozer and sandy bridge would be 256-bit cpus), but now it's generally the amount of addressable memory (so bulldozer and sandy bridge are 64-bit).
I think the naming has more to do with how capable the registers on the CPU are.
http://en.wikipedia.org/wiki/X86-64
Isn't AMD64 really only 40 bits of memory addressing anyway?
Which is still a freaking lot of memory.
