Originally Posted by VirtualLarry
I thought GTX460 1GB (original) was GF104, and GTX560 was GF114.
Sure, but how different is GF114 really from GF104?
Is performance at the same clock any different (if you only use 336 cores, like they did with the GTX 460 and 560)? Even some newer GTX 460 cards (some of those 192-bit cards with 1GB sold as "GTX 460 V2" and the like) actually used GF114, I think. The changes from GF104 to GF114 didn't look all that significant (far less than GF100 to GF110); as far as I know they were basically tuning the manufacturing process for lower leakage and things like that.
That's why I said "in a way" and not something like "exactly"; it's not like with the 8800GT.
Originally Posted by Zap
Don't know about CS, but I think the problem with UT has to do with drivers. I remember back in the day, with a 3dfx card and then the GeForce cards of the time, the game played nice and smooth. However, somewhere along the way Nvidia stopped caring about older games, so they don't play right anymore.
At least that's my theory.
I remember going back to play Crimson Skies around 5 years ago, and it was completely unplayable on Nvidia cards with GPU acceleration turned on. Radeons worked fine, and software rendering worked fine.
From the 6 series onwards Nvidia dropped support for paletted 8-bit textures. I know because back when I had an 8600GT I couldn't play some games like MGS without mods, while my Radeon would run them without any problem.
But a 6600 being faster than a GT 620 doesn't make much sense... in newer games I'm sure the 520/620 is MUCH faster, but even in older games it should deliver something at least close (if you look at the specs)...