Originally posted by: Auric
Still recommending that overpriced junk? Shader Model 3.0 is irrelevant on a GPU that is already severely underpowered for games from the last half-decade. By the time games actually require SM 3.0, that card will be virtually useless for them.
To put it in perspective, the 6200 has the equivalent performance of a GF4 MX460 or R9000 Pro. Just because it is a "6 series" does not make it competent (similarly, the GF4 MX itself was derided as a glorified GF2).
For example, Pixel Shader 1.4 hardware from four or five years ago (the Radeon 9100 below) is enough to run a recent game such as BF2, though video settings would probably still have to be low to medium (depending on the rest of the system). I would not even think about trying it with the likes of a 6200, regardless of the other components.
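For what it's worth, the shader-version gate games apply is just a caps check; here is a minimal sketch against the Direct3D 9 caps query (the three fallback tiers are my own illustration, not any particular game's logic):

```c
/* Minimal sketch: how a game can gate render paths on pixel shader
   version via Direct3D 9 caps. Windows-only; link against d3d9.lib. */
#include <d3d9.h>
#include <stdio.h>

int main(void)
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DCAPS9 caps;

    if (!d3d)
        return 1;
    if (FAILED(IDirect3D9_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                                        D3DDEVTYPE_HAL, &caps))) {
        IDirect3D9_Release(d3d);
        return 1;
    }

    /* PixelShaderVersion packs major/minor; D3DPS_VERSION builds the
       same encoding for comparison. */
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        printf("SM 3.0 path available\n");           /* GF6 series  */
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        printf("PS 1.4 path (enough for BF2)\n");    /* R200 cards  */
    else
        printf("PS 1.1 / fixed-function fallback\n");

    IDirect3D9_Release(d3d);
    return 0;
}
```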
Wikipedia: games & shaders
Copied from a recent AT thread:
Fill rate (MT/s) / memory bandwidth (GB/s):
GeForce 6200: 1200 / 8.83
Radeon 9100: 2000 / 8.00
Radeon 8500: 2200 / 8.80
The 9100 (aka 8500LE) is an underclocked 8500; default core/mem clocks are 250/250 MHz (275/275 for the 8500). I had an AGP model which could run at 290/300. The 6200 is presumably capable of some OC'ing too, and of course it conforms to current DX shader specs, but its relatively anemic performance keeps it from really being a contender for "best" PCI. A new price of $80 vs. $40 pretty much knocks it out.
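And since the table's figures are just theoretical peaks, the arithmetic is easy to check from those clocks; a minimal sketch, assuming the commonly cited layouts (4 pipes x 2 TMUs and a 128-bit bus for the R200 cards, 4 x 1 at a 300 MHz core for the 6200; none of that is stated in the thread):

```c
/* Sketch: deriving the theoretical peaks in the table above.
   Pipe/TMU counts and bus widths are assumptions, not from the thread. */
#include <stdio.h>

/* texel fill rate (MT/s) = core MHz * pipelines * TMUs per pipeline */
static int fill_mt_s(int core_mhz, int pipes, int tmus_per_pipe)
{
    return core_mhz * pipes * tmus_per_pipe;
}

/* bandwidth (GB/s) = mem MHz * 2 (DDR) * bus width in bytes / 1000 */
static double bw_gb_s(int mem_mhz, int bus_bits)
{
    return mem_mhz * 2.0 * (bus_bits / 8) / 1000.0;
}

int main(void)
{
    /* Radeon 9100 at stock 250/250: 2000 MT/s, 8.00 GB/s */
    printf("9100:    %d MT/s, %.2f GB/s\n",
           fill_mt_s(250, 4, 2), bw_gb_s(250, 128));
    /* Radeon 8500 at 275/275: 2200 MT/s, 8.80 GB/s */
    printf("8500:    %d MT/s, %.2f GB/s\n",
           fill_mt_s(275, 4, 2), bw_gb_s(275, 128));
    /* GeForce 6200, assuming a 300 MHz core: 1200 MT/s */
    printf("6200:    %d MT/s\n", fill_mt_s(300, 4, 1));
    /* the 9100 OC'd to 290/300 mentioned above: 2320 MT/s, 9.60 GB/s */
    printf("9100 OC: %d MT/s, %.2f GB/s\n",
           fill_mt_s(290, 4, 2), bw_gb_s(300, 128));
    return 0;
}
```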