PCIe lane, that is. I wish he had tested more games, but it's neat to see it can still provide playable experiences:
"on the lowest settings. Playable? Yes. Enjoyable? No."
That's entirely subjective, of course. Graphics fidelity <> enjoyment. I would argue that playing Dota 2 @ 100fps and Rocket League @ 60fps at 1080p is plenty of fun, even if it is on low presets. Still better than Intel IGP in many instances. Point being, if you still have this GPU kicking around and know someone with no money who wants to play F2P games, this could be a nice upgrade for them.
this card (same specs) was quickly renamed to GTS 250...
basically G92 with this configuration (fully enabled) started as 8800GTS 512MB, went to 9800GTX, GTX+ and GTS 250...
But it was a pretty good card. Also, I think it's important to consider that anything at 1080p is pushing it; when it was new, 1280x1024 was still the mainstream resolution, and 1680x1050 was the high-end gaming resolution.
Also, the 4850 aged better: the Nvidia cards at the time (even the GTX 285) only supported DX10, while the ATI cards supported DX10.1. For that reason a 4850 can run Overwatch fairly well, while the GTS 250/9800 GTX+ can't run it at all.
I wouldn't say the HD 4850 definitely aged better. AMD ended regular driver support as early as 2012 and legacy support in 2013. Battlefield Hardline, for example, can run on a GTX 260, but refuses to run on the HD 4800 series because it says the driver is too old.
PCgameshardware.de actually had the 9800 GTX 512MB and HD 4870 1GB in their test for GTA V. Click on the Phenom II X4 940 tab. 9800 GTX is very close to HD 4870.
"Niedrig" means low and "mittel" is medium.
http://www.pcgameshardware.de/GTA-5...95/Specials/Systemanforderungen-Test-1156263/
G92 was amazing.
My old 9800 GT EE ("Energy Efficient") 2GB was only ~$47 on clearance at Best Buy. It required no power connector. Modern Warfare 2 with max settings at 1920x1200 looked liquid smooth.
It was just a die-shrunk G92.
Yeah. As far as I can remember, the G92/G94 cards without the connector also had reduced clocks (like from 650 to 600 MHz... not a huge deal); I think they were called "Eco".
I think almost all, but not all, of the 9800 GTs used the 55nm version of G92.
but I had no idea they made 2GB versions, kind of funny that they had 256MB and 2GB models with the same GPU.
That's a good point: AMD indeed dropped support too soon, and it caused lots of problems. Even a 6970 suffers from that in newer Battlefield titles and other games (with bugs that don't exist on old GeForces). Still, Overwatch (one of the main games of 2016) is definitely a win for ATI longevity thanks to that DX10.1 support.
https://www.youtube.com/watch?v=UmQpavpvjiY
Not even the GTX 285 can run it (the 40nm pre-Fermi Nvidia cards can, like the GT 240, but they perform very poorly).
Whoa, I never knew of a 2GB 9800 GT either. I find that fascinating. Four memory amounts: 256MB (8800 GT), 512MB, 1GB, and 2GB is unprecedented for consumer cards. There are professional Hawaii cards with 32GB and 16GB, so throw in 8GB and 4GB and that's the only other chip I know of with as many sizes. The Radeon 4870, Radeon 7850/265/370, and GTX 750 Ti are also recent rare examples of three different memory sizes being available, but four is crazy.
