Actually, I looked into this further. I played with the MATLAB memory settings and started factoring 2^55. I set its affinity to run only on CPU0 of my hyperthreaded machine, so I can still use Windows without a problem. Memory usage spikes to around 400MB as it handles the vector operations, but MATLAB still reports 'busy' even though it is using <10% of the CPU. That, to me, confirms that factoring a large number on an x86 box is limited by memory bandwidth, not functional units. If functional units were the limiting factor, MATLAB would be eating up the CPU like mad; since it's hovering at low values, it's just waiting on Windows to page in the right chunks of memory.

This is on a machine with 1 GB of RAM, and a vector of 1:2^55 is, umm, 2^55*8 = 2^58 bytes long, since MATLAB defaults everything to 'double' (8 bytes per element). That's a *LOT* of virtual memory to be moving about. And x86 has sh!tty memory bandwidth anyways.
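For anyone who wants to sanity-check that size claim, here's the back-of-envelope math (a quick sketch, not anything MATLAB reports itself; the 8-bytes-per-element figure is just MATLAB's 64-bit 'double' default):

```python
# Rough size of a MATLAB vector 1:2^55 stored as doubles.
elements = 2 ** 55
bytes_per_double = 8                   # MATLAB defaults to 64-bit 'double'
total_bytes = elements * bytes_per_double   # = 2^58 bytes

print(total_bytes)            # 288230376151711744 bytes
print(total_bytes // 2 ** 30) # in GiB -- compare against 1 GB of RAM
print(total_bytes // 2 ** 50) # in PiB
```

That works out to 2^58 bytes, i.e. 256 PiB, versus 1 GB of physical RAM, which is why the box spends its time paging instead of computing.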