SM3.0 finally having a nice performance gain over older tech?

jazzboy

Senior member
May 2, 2005
232
0
0
Hard to tell really. Correct me if I'm wrong but isn't this supposed to be a TWIMTBP game?
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
Yup. Anyway, TWIMTBP games were generally faster on ATi cards a year or so ago. I don't think they coded it poorly just to put ATi in a bad light.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
It looks to me like the game runs poorly on any setup, and they need to optimize the heck out of it. It's not what I would consider playable on any of those cards.

It's pretty well known FEAR runs like crap, even before rollo posted his trolling thread.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
The funny thing is that even at 10x7 without AA or AF, none of the cards even break 50fps, and the minimum fps is in the single digits. Also notice that the ATi cards have slightly higher minimum fps. This tells me not only that the ATi cards have more brute force when the scene gets really complex with a lot of geometry and effects, but also that all those who bragged about future-proofing themselves with SM3 won't be running the game with all the SM3 eye candy at decent fps.

I know this game is only in a beta stage and probably hasn't been optimized yet, but if this is any indication of future games, it only confirms my theory that by the time many games start utilizing SM3 eye candy, the GF6 series will be too slow to run them with the eye candy enabled.
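To illustrate why the minimum matters more than the average here, a quick Python sketch with made-up fps samples (hypothetical numbers, not from the actual benchmark): a card with the higher average can still be the one that dips into single digits during heavy scenes.

```python
# Toy example: two cards with similar average fps can feel very different
# if one dips into single digits when the scene gets complex.
# All numbers are invented for illustration.

card_a = [55, 52, 48, 9, 50, 46]   # higher peaks, but dips hard
card_b = [44, 42, 45, 28, 43, 41]  # lower peaks, steadier minimums

for name, fps in (("card_a", card_a), ("card_b", card_b)):
    avg = sum(fps) / len(fps)
    # An average benchmark bar would favor card_a, but card_b never
    # drops below 28fps while card_a stutters at 9fps.
    print(f"{name}: avg={avg:.1f} fps, min={min(fps)} fps")
```

Here card_a "wins" the average (43.3 vs 40.5) yet has the unplayable minimum, which is exactly the pattern being described in the benchmarks.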
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
I'm not. Don't get mad at me for stating facts.

A thread about FEAR being faster on NV cards has been posted before, by rollo, and it turned into a 7-page crap thread.

How about you get this silly thread locked? It's old news, and the game is several months off. And we all know the game runs very poorly on any card.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
Ackmed, you should really try to stay away from these threads; you only throw crap. Now someone please stick to the topic and post opinions and/or technical explanations...
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: munky
The funny thing is that even at 10x7 without AA or AF, none of the cards even break 50fps, and the minimum fps is in the single digits. Also notice that the ATi cards have slightly higher minimum fps. This tells me not only that the ATi cards have more brute force when the scene gets really complex with a lot of geometry and effects, but also that all those who bragged about future-proofing themselves with SM3 won't be running the game with all the SM3 eye candy at decent fps.

I know this game is only in a beta stage and probably hasn't been optimized yet, but if this is any indication of future games, it only confirms my theory that by the time many games start utilizing SM3 eye candy, the GF6 series will be too slow to run them with the eye candy enabled.

This is standard for first-generation hardware. Try running a 9800 Pro or NV3.x in these games.

It is interesting to see SM3 does give a higher frame rate, which is good for future development.


 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Well, easy enough, you just have to turn down some settings.

Some games see very little change from Medium to High, but a big performance hit. Probably like the crappy settings Doom 3 had: they uncompress the effects or something. Yeah, like that's an increase in image quality.

When is this game coming out anyway?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Ackmed
If throwing crap is stating facts, then OK.

EVERYONE KNOWS FEAR runs LIKE CRAP on ANY CARD.

If you wish to read more about it, check his 7-page thread; http://forums.anandtech.com/messageview...atid=31&threadid=1626929&enterthread=y

He edited the title and the first post after several days. There is no point to this thread at all. It has already been discussed.

Reading the other thread right now, LOL @ rollo getting owned when his Ati-bashing backfires on him. :laugh:
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Genx87
Originally posted by: munky
The funny thing is that even at 10x7 without AA or AF, none of the cards even break 50fps, and the minimum fps is in the single digits. Also notice that the ATi cards have slightly higher minimum fps. This tells me not only that the ATi cards have more brute force when the scene gets really complex with a lot of geometry and effects, but also that all those who bragged about future-proofing themselves with SM3 won't be running the game with all the SM3 eye candy at decent fps.

I know this game is only in a beta stage and probably hasn't been optimized yet, but if this is any indication of future games, it only confirms my theory that by the time many games start utilizing SM3 eye candy, the GF6 series will be too slow to run them with the eye candy enabled.

This is standard for first-generation hardware. Try running a 9800 Pro or NV3.x in these games.

It is interesting to see SM3 does give a higher frame rate, which is good for future development.

The 9800 Pro is a 3-year-old card, but there were plenty of DX9 games a few years back that ran well on it. HL2 runs well, Far Cry runs well too, and even earlier there were games like Max Payne 2 that the card just breezed through, no problem. The NV30, even though it looked better on paper with features like DX9+, longer-than-required shader lengths and 32-bit precision, still sucked at DX9 games, but that's a whole other topic.

What is standard, however, is when ATi or Nvidia try to pimp features that the card can't run at acceptable fps. ATi, for example, had the TruForm feature, and I only know a handful of games that use it, but when you enable it in HL2 the performance gets a lot worse. The same thing goes for Nvidia with HDR and soft shadows and all that crap: the performance takes a huge hit when the features are enabled, and sometimes other features like AA get disabled on top of that.

I know there was a bunch of people in this forum who earlier this year touted SM3 as the deciding factor of NV superiority with their GF6 cards, and what happened now? With this game you actually have to turn off features like AA and AF, with or without SM3, just to have playable frame rates on a single GF6 card, and that's at 10x7 resolution. So much for future-proofing...
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
Originally posted by: munky
Originally posted by: Genx87
Originally posted by: munky
The funny thing is that even at 10x7 without AA or AF, none of the cards even break 50fps, and the minimum fps is in the single digits. Also notice that the ATi cards have slightly higher minimum fps. This tells me not only that the ATi cards have more brute force when the scene gets really complex with a lot of geometry and effects, but also that all those who bragged about future-proofing themselves with SM3 won't be running the game with all the SM3 eye candy at decent fps.

I know this game is only in a beta stage and probably hasn't been optimized yet, but if this is any indication of future games, it only confirms my theory that by the time many games start utilizing SM3 eye candy, the GF6 series will be too slow to run them with the eye candy enabled.

This is standard for first-generation hardware. Try running a 9800 Pro or NV3.x in these games.

It is interesting to see SM3 does give a higher frame rate, which is good for future development.

The 9800 Pro is a 3-year-old card, but there were plenty of DX9 games a few years back that ran well on it. HL2 runs well, Far Cry runs well too, and even earlier there were games like Max Payne 2 that the card just breezed through, no problem. The NV30, even though it looked better on paper with features like DX9+, longer-than-required shader lengths and 32-bit precision, still sucked at DX9 games, but that's a whole other topic.

What is standard, however, is when ATi or Nvidia try to pimp features that the card can't run at acceptable fps. ATi, for example, had the TruForm feature, and I only know a handful of games that use it, but when you enable it in HL2 the performance gets a lot worse. The same thing goes for Nvidia with HDR and soft shadows and all that crap: the performance takes a huge hit when the features are enabled, and sometimes other features like AA get disabled on top of that.

I know there was a bunch of people in this forum who earlier this year touted SM3 as the deciding factor of NV superiority with their GF6 cards, and what happened now? With this game you actually have to turn off features like AA and AF, with or without SM3, just to have playable frame rates on a single GF6 card, and that's at 10x7 resolution. So much for future-proofing...

Of course, but look at what the poor X850XT PE does: it doesn't beat the 6800GT, and not only that, at 1280x1024 with no AA/AF the 6800GT is around 33% faster... now tell me the Series 6 is not more future-proof.
 

zendari

Banned
May 27, 2005
6,558
0
0
Not sure if this has to do with SM3. Doom 3 was faster on a 6800GT than an X850XT PE. Conversely, the X800 XL is often comparable to the 6800 Ultra in HL2 and Far Cry.

Plus, FEAR is still in beta; with this level of performance it's nowhere near release.
 

Todd33

Diamond Member
Oct 16, 2003
7,842
2
81
Benchmarking an unoptimized beta game that you cannot buy is dumb. Reviewers need to use their heads and stick to finished retail products.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: McArra
Of course, but look at what the poor X850XT PE does: it doesn't beat the 6800GT, and not only that, at 1280x1024 with no AA/AF the 6800GT is around 33% faster... now tell me the Series 6 is not more future-proof.

Yet the minimum framerates are generally better on the ATi cards, and all the minimum framerates are far too low to be considered playable anyway. If you have to turn less stuff down on the ATi cards to hit a minimum of 30fps, then I'd say the X850XT PE proves more "future-proof" in this game.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
I guess those low fps are due to not having enough RAM, as happens in Battlefield 2.
 

Megatomic

Lifer
Nov 9, 2000
20,127
6
81
I hope that's true, McArra. I can upgrade to 2GB, but I can't get a 7800 card (no PCI-e slot).
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: McArra
Of course, but look at what the poor X850XT PE does: it doesn't beat the 6800GT, and not only that, at 1280x1024 with no AA/AF the 6800GT is around 33% faster... now tell me the Series 6 is not more future-proof.

Since minimum frame rates at even 1024x768 dropped down to single digits on every card tested, I'd say it's more a case of the game being "sales proof".
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Originally posted by: Ackmed
It looks to me like the game runs poorly on any setup, and they need to optimize the heck out of it. It's not what I would consider playable on any of those cards.

It's pretty well known FEAR runs like crap, even before rollo posted his trolling thread.

Rollo isn't a troll!