Originally posted by: Kiwi
At the start, in the PCI-e versions, the 6200 and 6600 used the same core. Anyway, you need to study up on video. Google will help you. Your comment "just the 128-Bit" shows you aren't putting enough emphasis on that value
Thanks. I'm aware that a 128-bit memory bus speeds things up quite a bit, but I was wondering more about the rest of the cards' specs.
The reason I ask is that I may be getting a flashed 6200 for my ancient backup Mac, and I'm wondering why the OS supports it in the first place, since no Mac has ever shipped with a 6200. I believe the card has been modded (in addition to the flash), and I'm guessing the mods were targeted so that the OS treats it like a 6600 LE (which shipped in some Macs).
I have since found this
GeForce 6xxx wiki:
6600 LE (NV43)
  Core: 300 MHz
  Memory: 500 MHz, 128-bit
  Pipes: 4
  Vertex: 3

6200 AGP (NV44a, DDR)
  Core: 350 MHz
  Memory: 500/400 MHz, 64-bit DDR
  Pipes: 4
  Vertex: 3

or

6200 AGP (NV44a, DDR2)
  Core: 350 MHz
  Memory: 532 MHz, 64-bit DDR2
  Pipes: 4
  Vertex: 3
Will there be a significant difference between the 500 MHz 64-bit DDR version and the 532 MHz 64-bit DDR2 version of the 6200 NV44a?
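To put rough numbers on that question: theoretical peak memory bandwidth is just bus width (in bytes) times effective transfer rate. Here's a minimal Python sketch, assuming the wiki's MHz figures are effective data rates, as GPU spec sheets usually quote them (the helper name `peak_bandwidth_gbs` is mine, not from any source):

```python
def peak_bandwidth_gbs(effective_mhz, bus_width_bits):
    """Theoretical peak bandwidth in GB/s:
    bytes per transfer * transfers per second."""
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

# The two 6200 NV44a variants from the specs above:
ddr = peak_bandwidth_gbs(500, 64)    # 4.0 GB/s
ddr2 = peak_bandwidth_gbs(532, 64)   # ~4.26 GB/s

# 6600 LE with its 128-bit bus, for comparison:
le = peak_bandwidth_gbs(500, 128)    # 8.0 GB/s

print(ddr, ddr2, le)
```

By this estimate the DDR2 version has only about 6% more bandwidth than the DDR version (4.26 vs. 4.0 GB/s), while the 6600 LE's 128-bit bus doubles it to 8 GB/s, which is presumably the point of the earlier "128-Bit" comment.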
Originally posted by: Kiwi
That ought to put things in perspective, and this will put the cap on it:
http://www.gpureview.com/show_cards.php?card1=352&card2=194
This is the one I'm looking at, except with DDR2.
In case you're wondering why such a low-end video card: it's a low-end machine (released in 2000) with AGP only, and this is one of the few cards that will actually fit in it. Plus it's passively cooled.