I'm curious to hear people's opinions on going 64-bit for the CPU on a desktop.
I can see the advantage for servers, as well as 64-bit PCI, etc. But it all has to do with memory usage. I don't see how going 64-bit on a CPU is going to give you a benefit on the desktop for Doom3 or WindowsXP or FarCry, etc., when you aren't running 12 gigs of RAM with a 4 gig database table loaded into memory, or managing a multi-terabyte RAID5 array on 64-bit PCI slots.
Why waste the money on a 64-bit CPU now, when there's almost zero advantage to using it on the desktop? If I'm wrong, point out the advantages. I see a lot of posts where people think 64-bit = twice as fast as a 32-bit processor. That's just plain wrong.
I'm running a P4 3.0GHz Prescott, 512MB DDR400 CL2, and an NVIDIA GeForce 6800 GT 256MB, pushing out a solid 60fps in Doom3 at 1280x1024 running through the levels, and averaging around 40fps in heavy battles. What, 95% of that is the graphics card. So why go 64-bit on the processor?