- Aug 26, 2008
- 1,774
- 14
- 81
With GPUs now commonly shipping with framebuffers of 1GB or more (2GB, 3GB, even 4GB), there are bound to be people with older systems (possibly still on XP) buying even low-end graphics cards with 2GB of GDDR3 memory while having only 2GB of system RAM or less.
Does anyone know if an OS can handle a GPU with 2GB of VRAM on only 1GB of system RAM (like the PC in my sig), or does there need to be at least as much system memory (and preferably more) as the GPU has?
I haven't heard of anyone running a 2GB GPU in a PC with only 1GB of system memory, and I know that XP, even though it is 32-bit, can only really address about 3GB of system memory.
Since RAM is so cheap nowadays and most new PCs come with at least 8GB, maybe this just isn't an issue. For the life of me, I can't find any GPU system requirements stating that you need at least as much system memory as the card has VRAM in order to address all of it correctly. I'm guessing the OS can somehow accommodate these large VRAM buffers.
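To illustrate the ~3GB figure I mentioned: on a 32-bit OS the CPU's 4GB physical address space has to hold both RAM and memory-mapped device regions (the GPU's aperture, other PCI devices, firmware), so some installed RAM gets crowded out. A rough back-of-envelope, where the ~1GB MMIO reservation is just an assumed figure for illustration, not a spec:

```python
# Back-of-envelope: why 32-bit XP typically "sees" only ~3GB of RAM.
# All sizes in MiB. The MMIO figure is an assumption for illustration;
# the actual reservation varies by motherboard and installed devices.
ADDRESS_SPACE = 4 * 1024   # total 32-bit physical address space (4 GiB)
MMIO_RESERVED = 1 * 1024   # assumed: GPU aperture + PCI devices + firmware
INSTALLED_RAM = 4 * 1024   # RAM physically installed

# RAM above the MMIO region can't be addressed without PAE support,
# so usable RAM is capped by what fits under the reservation.
usable_ram = min(INSTALLED_RAM, ADDRESS_SPACE - MMIO_RESERVED)
print(usable_ram)  # 3072 MiB, i.e. about 3GB
```

Note this doesn't mean the whole 2GB of VRAM must be mapped at once; the point is only that device mappings and RAM compete for the same 4GB on a 32-bit system.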