Originally posted by: munky
The funny thing is that even at 10x7 w/o AA or AF, none of the cards even break 50fps, and the minimum fps is in the single digits. Also notice that the Ati cards have slightly higher min fps. This tells me not only that the Ati cards have more brute force when the scene gets really complex with a lot of geometry and effects, but also that all those who bragged about future-proofing themselves with SM3 will not be running the game with all the SM3 eye candy at decent fps.
I know this game is only in a beta stage and probably hasn't been optimized yet, but if this is any indication of future games, it only confirms my theory that by the time many games start utilizing SM3 eye candy, the GF6 series will be too slow to run them with the eye candy enabled.
Originally posted by: Ackmed
If throwing crap is stating fact, then ok.
EVERYONE KNOWS Fear runs LIKE CRAP on ANY CARD.
If you wish to read more about it, check his 7-page thread: http://forums.anandtech.com/messageview...atid=31&threadid=1626929&enterthread=y
He edited the title and the first post after several days. There is no point to this thread at all. It has been discussed.
Originally posted by: Genx87
Originally posted by: munky
The funny thing is that even at 10x7 w/o AA or AF, none of the cards even break 50fps, and the minimum fps is in the single digits. Also notice that the Ati cards have slightly higher min fps. This tells me not only that the Ati cards have more brute force when the scene gets really complex with a lot of geometry and effects, but also that all those who bragged about future-proofing themselves with SM3 will not be running the game with all the SM3 eye candy at decent fps.
I know this game is only in a beta stage and probably hasn't been optimized yet, but if this is any indication of future games, it only confirms my theory that by the time many games start utilizing SM3 eye candy, the GF6 series will be too slow to run them with the eye candy enabled.
This is standard for first-generation hardware. Try running a 9800 Pro or NV3.x in these games.
It is interesting to see that SM3 does give a higher frame rate, which is good for future development.
Originally posted by: munky
Originally posted by: Genx87
Originally posted by: munky
The funny thing is that even at 10x7 w/o AA or AF, none of the cards even break 50fps, and the minimum fps is in the single digits. Also notice that the Ati cards have slightly higher min fps. This tells me not only that the Ati cards have more brute force when the scene gets really complex with a lot of geometry and effects, but also that all those who bragged about future-proofing themselves with SM3 will not be running the game with all the SM3 eye candy at decent fps.
I know this game is only in a beta stage and probably hasn't been optimized yet, but if this is any indication of future games, it only confirms my theory that by the time many games start utilizing SM3 eye candy, the GF6 series will be too slow to run them with the eye candy enabled.
This is standard for first-generation hardware. Try running a 9800 Pro or NV3.x in these games.
It is interesting to see that SM3 does give a higher frame rate, which is good for future development.
The 9800p is a 3-year-old card, but there were plenty of DX9 games a few years back that ran well on it. HL2 runs well, FarCry runs well too, and even before that there were games like Max Payne 2 that the card just breezed through, no problem. The NV30, even though it looked better on paper, with features like DX9+, longer-than-required shader length support and 32-bit precision, still sucked at DX9 games, but that's a whole other topic.
What is standard, however, is Ati or Nvidia pimping features that the card will not run at acceptable fps. Ati, for example, had the TruForm feature, and I only know of a handful of games that use it, but when you enable it in HL2 the performance gets a lot worse. The same thing goes for Nvidia with the HDR and soft shadows and all that crap: the performance takes a huge hit when the features are enabled, and sometimes other features like AA get disabled on top of that.
I know there was a bunch of people in this forum who earlier this year touted SM3 as the deciding factor of Nv superiority with their GF6 cards, and what happened now? With this game, you now actually have to turn off features like AA and AF, with or without SM3, just to get playable frame rates on a single GF6 card, and that's at 10x7 resolution. So much for future-proofing...
Originally posted by: McArra
Of course, but look at what the poor X850XT PE does: it doesn't beat the 6800GT, and not only that, at 1280x1024 no AA/AF the 6800GT is around 33% faster... now tell me the Series 6 is not more future-proof.
Originally posted by: Ackmed
It looks to me as if the game runs poorly on any setup, and they need to optimize the heck out of it. It's not what I would consider playable with any of those cards.
It's pretty well known Fear runs like crap, even before rollo posted his trolling thread.