Different Far Cry for nV40 owners?


CaiNaM

Diamond Member
Oct 26, 2000
Originally posted by: Shad0hawK
Originally posted by: CaiNaM
sounds like some folks might be in for a letdown.... straight from the horse's mouth (crytek):

In current engine there are no visible difference between PS2.0 and PS3.0. PS3.0 is used automatically for per-pixel lighting depending on some conditions to improve speed of rendering.

http://www.pcper.com/article.php?ai...pe=expert&pid=2

6800 owners will not benefit from "better image IQ" compared to x800 users in Far Cry. sounds like there could be some performance benefits, however.


i "lmao" every time i read this comment. while true in a vastly oversimplified sense...many are often misled because they take it out of context, not realizing (or simply ignoring) the fact that speed has a great impact on visual quality in the following 2 ways.

1. the ability to do more shaders means the ability to do more shader effects...more shader effects means better IQ because a given scene can have more (or more complex) lighting effects through shaders, thus the IQ is "better"...because of the speed of shader performance.

2. a card that can "just run faster" can also run at a higher resolution...and we all know higher resolutions look better than lower resolutions. if you do not believe me, fire up your favorite game at 640x480, then run it at 1600x1200 at the same quality settings...it will be obvious which "looks better" even with the same LOD settings...

so snipping that quote (or a few quotes) and performing a gross act of oversimplification by taking it to mean "SM3 will not look any better than SM2" is nothing more than deluding yourself...anyone who has increased a resolution to make a game look better has the verifiable evidence right in front of them on their monitor.

you should read the article which i linked before accusing anyone of anything; it's not like anything was taken out of context. here's their conclusion:

Taking what we have been shown of Shader 3.0 by NVIDIA and utilizing the information gathered within this interview, we begin to have a clearer picture of how Shader 3.0 will affect your gaming experience. Whereas many had taken the position that Shader 3.0 would make significant improvements in image quality over Shader 2.0, we are beginning to see that is hardly the case. Although the foundation is present in Shader 3.0 to give developers even more freedom to maximize image quality, the immediate gains seem to be based entirely upon performance. Mr. Yerli's answer to question eight is an excellent summation of where we currently stand regarding Shader 3.0.

this was primarily stated for those who make comments such as, "I would love to see the SM3.0 difference, especially lighting and water", etc., as there will be no differences to see.

again, it may be possible that some things can be done at less "cost"; however, with comments from the developers such as the following, it's uncertain whether there will be any tangible performance benefits:

Several new features in 3.0 shader model aren't for free. Texture access in vertex shader is expensive, dynamic branching is not for free. So if developer has to utilize some features of PS3.0 shader model he/she should design shader in way which will remove other much important bottlenecks of application (CPU limitations, states/shaders changing, make shorter shader execution path, decrease streams bandwidth…).
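
to make yerli's "texture access in vertex shader" point concrete, here's roughly what it looks like in vs_3_0 hlsl. just a sketch i put together - the HeightMap / DisplaceScale names and the displacement-mapping use case are my own illustration, nothing out of cryengine:

// displacement mapping via vertex texture fetch (vs_3_0 only);
// HeightMap and DisplaceScale are hypothetical names
sampler2D HeightMap;
float4x4  WorldViewProj;
float     DisplaceScale;

struct VS_IN  { float4 pos : POSITION; float3 nrm : NORMAL; float2 uv : TEXCOORD0; };
struct VS_OUT { float4 pos : POSITION; float2 uv  : TEXCOORD0; };

VS_OUT main(VS_IN i)
{
    VS_OUT o;
    // vertex shaders must give an explicit mip level, hence tex2Dlod;
    // this one fetch is the "expensive" part yerli means
    float h = tex2Dlod(HeightMap, float4(i.uv, 0, 0)).r;
    float4 displaced = i.pos + float4(i.nrm * h * DisplaceScale, 0);
    o.pos = mul(displaced, WorldViewProj);
    o.uv  = i.uv;
    return o;
}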

there's also other misinformation, such as claims that dynamic branching requires sm3, which is certainly not the case, as shown in the demo you will find (along with some background) in this thread.

there are of course tangible benefits to sm3, but marketing has blown it much out of proportion. while it's certainly not as insignificant as ati tries to make it seem, neither is it as significant as nvidia makes it out to be. the truth is, as we often find, somewhere in the middle.
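
to illustrate the branching point, here's the same lighting done both ways in hlsl. a rough sketch - DiffuseColor, SpecColor and the shade functions are all names i made up, not anyone's actual shader code:

float3 DiffuseColor;   // hypothetical material constants
float3 SpecColor;

// ps_3_0 style: a real dynamic branch skips the expensive math entirely
// whenever the light is behind the surface
float3 ShadeSM3(float3 N, float3 L, float3 V)
{
    float3 c = 0;
    float ndl = dot(N, L);
    if (ndl > 0)   // true flow control
        c = ndl * DiffuseColor
          + pow(saturate(dot(reflect(-L, N), V)), 32) * SpecColor;
    return c;
}

// ps_2_0 style: no flow control, so compute the lit result regardless and
// select with math; the compiler emits a cmp (select), not a jump
float3 ShadeSM2(float3 N, float3 L, float3 V)
{
    float ndl = dot(N, L);
    float3 lit = ndl * DiffuseColor
               + pow(saturate(dot(reflect(-L, N), V)), 32) * SpecColor;
    return (ndl > 0) ? lit : float3(0, 0, 0);
}

same pixels on screen either way; the sm3 version just gets to skip the work when the test fails, which is exactly why branching is a speed feature and not an iq feature.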
 

Shad0hawK

Banned
May 26, 2003
Originally posted by: CaiNaM
you should read the article which i linked before accusing anyone of anything; it's not like anything was taken out of context. here's their conclusion:

i am not accusing anyone of anything, except ignoring some facts, which you are still doing. i am not disagreeing with the article (which i read in full over a month ago, BTW); in fact, if you go over it again you will see the issues i raised addressed as well by Cevat Yerli.

Originally posted by: CaiNaM
Taking what we have been shown of Shader 3.0 by NVIDIA and utilizing the information gathered within this interview, we begin to have a clearer picture of how Shader 3.0 will affect your gaming experience. Whereas many had taken the position that Shader 3.0 would make significant improvements in image quality over Shader 2.0, we are beginning to see that is hardly the case. Although the foundation is present in Shader 3.0 to give developers even more freedom to maximize image quality, the immediate gains seem to be based entirely upon performance. Mr. Yerli's answer to question eight is an excellent summation of where we currently stand regarding Shader 3.0.

this was primarily stated for those who make comments such as, "I would love to see the SM3.0 difference, especially lighting and water", etc., as there will be no differences to see.

again, it may be possible that some things can be done at less "cost"; however, with comments from the developers such as the following, it's uncertain whether there will be any tangible performance benefits:


Several new features in 3.0 shader model aren't for free. Texture access in vertex shader is expensive, dynamic branching is not for free. So if developer has to utilize some features of PS3.0 shader model he/she should design shader in way which will remove other much important bottlenecks of application (CPU limitations, states/shaders changing, make shorter shader execution path, decrease streams bandwidth…).

there's also other misinformation, such as claims that dynamic branching requires sm3, which is certainly not the case, as shown in the demo you will find (along with some background) in this thread.


again you are missing the points i raised in order to make it sound as if PS3 will not be any better than PS2. the difference will not be in effects, but in the ability to use more of the same effects, like replacing some vertex lighting with shaders. this is not theory: compare the hair of nalu to the hair of ruby. the visual impact of long free-flowing hair is much more appealing than the hair ruby has, which is merely typical. nothing was done in ps3 that could not have been done in ps2, except that in ps2 the nalu demo gets about 3-4 fps compared to the 30-40 fps an SM3 card could give, which is another example of performance directly impacting the visual quality of the scene as it plays out.

if making PS2 perform just as well as PS3 is as easy as some make it sound, why then did ATI not do it? ruby does not look any more complex than dusk, and there is a reason for that...performance issues limited how long ruby's hair could be and still get a decent framerate :) also those scenes are rather simple. let's see some of those demos of ps2 being "just as good as" ps3 in something more complex, such as a game.

oh! we will in a couple of weeks! ;)


Originally posted by: CaiNaM
there are of course tangible benefits to sm3, but marketing has blown it much out of proportion. while it's certainly not as insignificant as ati tries to make it seem, neither is it as significant as nvidia makes it out to be. the truth is, as we often find, somewhere in the middle.


i agree with you there, but being able to up my res because of faster performance will still make my game look better :)

goodnight! :D
 

BFG10K

Lifer
Aug 14, 2000
I'm most interested to see whether the enhanced IQ of this patch is provided through offset or displacement mapping as well. (not to mention just seeing the difference in IQ!)
There are no IQ differences between the two paths. SM 3.0 just makes things faster, which basically now makes the NV40 the preferred card for Far Cry.

it was the first game, AFAIK, to have any PS2 effects in it.
Actually no, it wasn't. I've pointed this out to you many times but you seem to feel that continually repeating the same rubbish makes it true.

I hadn't seen screenshots of the PS2 differences that were as good, and was not that impressed with the shinier pipes and water, as you say.
How could you be impressed given you never even played Far Cry then? Or are you admitting that you were talking rubbish back then?

The patched Far Cry screenshots I've seen lately look a lot better than the Far Cry I'm playing, so I'm more impressed.
That's astonishing considering Anand made it clear there was no IQ difference between the two. If you're going to be nVidia biased I suggest you actually find out what it is you're biased about.

As for my "love" of SM3, it's just part of the big picture for me:
But no such "big picture" on ATi's 2x performance gain and IQ gain in SM 2.0, right?

Potential improvements of displacement mapping and geometric instancing, with huge developer support for this hardware/feature set.
I thought "potential" doesn't count? Or is that only when ATi has potential?

2. Potential SLI power heretofore unimagined
Which has absolutely nothing to do with SM 3.0.

4. Not the same feature set I've been playing with for the better part of the last two years
I thought features don't matter as long as performance is similar? Or was that the tune that you sang only when you had your NV30 fetish?
 

vshah

Lifer
Sep 20, 2003
dudes your posts are too long

in any case, from that article cainam posted, there look to be some nice speed bumps using sm3 on the 6800s.

their conclusion was:

What we saw here at PC Perspective was a moderate to impressive increase in performance in FarCry with the new 1.2 patch enabling SM3.0 support on NVIDIA's 6800 series of cards. The performance enhancements varied across the different levels of AA and AF as well as resolution, but the best results for NVIDIA came when the resolution was high and all bells and whistles were turned on in the control panel. If you have a 6800 series card, this new patch and graphics driver will give you a nice increase in performance for one of the most popular shooter games for free -- and that is something you can't beat. Is this performance increase enough to get people to take notice of SM3.0 and buy a card that utilizes it? I don't know if a single game is enough to convince anyone what card to buy, especially with the likes of Doom3 and Half Life 2 coming out within 2 months. In either event, the success of Pixel Shader 3.0 here is a feather in NVIDIA's cap.

good times for all!

-Vivan
 

CaiNaM

Diamond Member
Oct 26, 2000
Originally posted by: Shad0hawK
i agree with you there, but being able to up my res because of faster performance will still make my game look better :)

goodnight! :D

i certainly haven't read anything that would make me expect a performance increase to that degree. from anand's fc patch 1.2 preview:

The main point that the performance numbers make is not that SM3.0 has a speed advantage over SM2.0 (as even the opposite may be true), but that single pass per-pixel lighting models can significantly reduce the impact of adding an ever increasing number of lights to a scene.

It remains to be seen whether or not SM3.0 offers a significant reduction in complexity for developers attempting to implement this advanced functionality in their engines, as that will be where the battle surrounding SM3.0 will be won or lost.

to me that seems to say the "jury is still out" on whether it will decrease the "cost" of coding to the point where IQ could be improved by using more specialized rendering effects enabled by the potential performance gains...
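
to make anand's "single pass per-pixel lighting" point concrete, here's a rough hlsl sketch of the two approaches. every name in it (MAX_LIGHTS, LightPos, ShadeOneLight, etc.) is made up for illustration, not taken from the far cry shaders:

// every name here is hypothetical
#define MAX_LIGHTS 4

float3 LightPos[MAX_LIGHTS];
float3 LightColor[MAX_LIGHTS];
int    NumLights;          // how many of the slots the app actually filled
float3 CurLightPos;        // single-light constants for the multi-pass path
float3 CurLightColor;

struct PSIn { float3 wpos : TEXCOORD0; float3 nrm : TEXCOORD1; };

float3 ShadeOneLight(PSIn i, float3 lpos, float3 lcol)
{
    float3 L = normalize(lpos - i.wpos);
    return saturate(dot(normalize(i.nrm), L)) * lcol;
}

// sm2-era multi-pass: the app draws the geometry once per light and the
// framebuffer additively blends the passes (ps_2_0 handles this fine)
float4 OneLightPass(PSIn i) : COLOR
{
    return float4(ShadeOneLight(i, CurLightPos, CurLightColor), 1);
}

// sm3-style single pass: all lights in one draw call; the dynamic branch
// that skips the unused slots is what wants ps_3_0
float4 AllLightsPass(PSIn i) : COLOR
{
    float3 c = 0;
    [unroll]
    for (int k = 0; k < MAX_LIGHTS; k++)
        if (k < NumLights)
            c += ShadeOneLight(i, LightPos[k], LightColor[k]);
    return float4(c, 1);
}

in the multi-pass model every extra light is another full pass over the geometry, which is why the numbers move so much with light count; the looped version pays the geometry cost once.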

and goodnight :)
 

Shad0hawK

Banned
May 26, 2003
Originally posted by: CaiNaM
Originally posted by: Shad0hawK
i agree with you there, but being able to up my res because of faster performance will still make my game look better :)

goodnight! :D

i certainly haven't read anything that would make me expect a performance increase to that degree. from anand's fc patch 1.2 preview:

The main point that the performance numbers make is not that SM3.0 has a speed advantage over SM2.0 (as even the opposite may be true), but that single pass per-pixel lighting models can significantly reduce the impact of adding an ever increasing number of lights to a scene.

It remains to be seen whether or not SM3.0 offers a significant reduction in complexity for developers attempting to implement this advanced functionality in their engines, as that will be where the battle surrounding SM3.0 will be won or lost.

to me that seems to say the "jury is still out" on whether it will decrease the "cost" of coding to the point where IQ could be improved by using more specialized rendering effects enabled by the potential performance gains...

and goodnight :)

looks like i will be able to increase my res on my 6800gt from 1280x960 (LOD set at highest, 4xAF, no AA) to 1600x1200 (4xAA/8xAF)...thus improving my game's visual appeal due to the increased performance...and still get about the same FPS

goodni..err.. MORNING!:D

i really am going to bed now, i promise! LOL
 

Regs

Lifer
Aug 9, 2002
Well, as it looks right now, Far Cry 1.2 doesn't offer much new to the user, as expected. It's more helpful to the game developer right now.

The performance gain is negligible to the point where there are too many other variables to consider, while IQ doesn't improve one bit.

Site Link


Still I'm buying a 6800GT though. ;)
 

Acanthus

Lifer
Aug 28, 2001
Originally posted by: Regs
Well, as it looks right now, Far Cry 1.2 doesn't offer much new to the user, as expected. It's more helpful to the game developer right now.

The performance gain is negligible to the point where there are too many other variables to consider, while IQ doesn't improve one bit.

Site Link


Still I'm buying a 6800GT though. ;)

It does appear to improve slightly in some of the AT screenshots; they even mention the difference, although it's slight. They are waiting to hear from the Far Cry devs as to which shots are closer to the intended source art. One example is the darker and more even-looking reflection on the tile flooring in the volcano map.
 

Safeway

Lifer
Jun 22, 2004
Wait for X800 optimized drivers. Maybe ATI can actually match the nVidia cards. XT PE to Ultra Extreme, XT to Ultra, Pro to GT.
 

ZobarStyl

Senior member
Mar 3, 2004
Man that's the first AT review with the newest drivers and it's really showing...look at the benches where the GT is beating the XT PE...crazy jumps...but yeah, looks like SM3.0 will be a decent feature, worth some minor but useful perf gains, but nothing groundbreaking. Now that that's over, let's argue about 3dc; yay! :laugh:
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: Safeway
Wait for X800 optimized drivers. Maybe ATI can actually match the nVidia cards. XT PE to Ultra Extreme, XT to Ultra, Pro to GT.

Match the Nvidia cards how? By handling SM3.0 via software drivers? Not likely. The only way for ATI to increase its performance at this point is to sacrifice its IQ, and IQ has been ATI's bread and butter for the last 2 years. I don't know how badly they want to stay ahead of NV, but if they start to trade IQ for performance, it will be all over the review sites. Just like Nvidia sacrificed major IQ in the NV3x series to compete with the performance of ATI's R360. We shall see what unfolds. I guess it all comes full circle eventually.
 

reever

Senior member
Oct 4, 2003
The only way for ATI to increase its performance at this point is to sacrifice its IQ

Apparently you know everything about the ATI driver team and what they're doing :roll:
 

Acanthus

Lifer
Aug 28, 2001
They definitely pulled a rabbit out of their hat with the Adaptive/Trylinear AF; after seeing it in action, i think it's safe to say the IQ difference is completely unnoticeable.
 

Safeway

Lifer
Jun 22, 2004
Originally posted by: keysplayr2003
Originally posted by: Safeway
Wait for X800 optimized drivers. Maybe ATI can actually match the nVidia cards. XT PE to Ultra Extreme, XT to Ultra, Pro to GT.

Match the Nvidia cards how? By handling SM3.0 via software drivers? Not likely. The only way for ATI to increase its performance at this point is to sacrifice its IQ, and IQ has been ATI's bread and butter for the last 2 years. I don't know how badly they want to stay ahead of NV, but if they start to trade IQ for performance, it will be all over the review sites. Just like Nvidia sacrificed major IQ in the NV3x series to compete with the performance of ATI's R360. We shall see what unfolds. I guess it all comes full circle eventually.

Someone said that both nVidia and ATI cards run at 60% of peak. This is no longer true for the nVidia cards. With all the latest driver revisions, and the introduction of SM3.0 in games (this may not affect anything)...nVidia may have been at 60% when the 6800 was announced, but now that driver revisions have led to 20-30 fps increases, I don't hold that to be true.

ATI on the other hand is whack when it comes down to it. They need to release official Cat 4.7s, even though the 4.7 beta, which is already out, improves performance without sacrificing IQ. It doesn't increase it drastically or anything; we have to wait until 4.8, 4.9 ... 17.5
 

ZobarStyl

Senior member
Mar 3, 2004
The Inq (for whatever that's worth) said that the 6800s were running at 30% below top speed, but I've never heard anyone say the x800s were that way as well...link? From a basic logic standpoint it makes sense that a new architecture could be pushed further than one that's been refined for 2 years already. Does that mean ATi won't get any driver gains out of the x800s? Hell no. But it does mean that the 6800s may have some power left to be tapped that won't show for a couple more driver revisions...but to the point at hand, they have already gone from totally losing to ATi in FC perf to the GT sometimes beating the XT PE...it's safe to say that there's definitely been some improvement on the NV end.
 

CaiNaM

Diamond Member
Oct 26, 2000
Originally posted by: ZobarStyl
The Inq (for whatever that's worth) said that the 6800s were running at 30% below top speed, but I've never heard anyone say the x800s were that way as well...link? From a basic logic standpoint it makes sense that a new architecture could be pushed further than one that's been refined for 2 years already. Does that mean ATi won't get any driver gains out of the x800s? Hell no. But it does mean that the 6800s may have some power left to be tapped that won't show for a couple more driver revisions...but to the point at hand, they have already gone from totally losing to ATi in FC perf to the GT sometimes beating the XT PE...it's safe to say that there's definitely been some improvement on the NV end.

ati's engineers stated that during the ati/adaptive af chat on ati's website some time back. the mem controller was only running around 30% efficiency...when all was said and done they were talking overall performance gains in the 10-15% area when drivers were optimized.

edit: err.. should have stated "the mem controller was only running around 70% efficiency"...
 

TStep

Platinum Member
Feb 16, 2003
I asked in a couple of other similar threads, but how does this figure into the equation? I really don't know much about how SM3.0 works, but will this allow non-SM3.0 DX9 cards to reap the performance benefits also?
 

Safeway

Lifer
Jun 22, 2004
Originally posted by: CaiNaM
Originally posted by: ZobarStyl
The Inq (for whatever that's worth) said that the 6800s were running at 30% below top speed, but I've never heard anyone say the x800s were that way as well...link? From a basic logic standpoint it makes sense that a new architecture could be pushed further than one that's been refined for 2 years already. Does that mean ATi won't get any driver gains out of the x800s? Hell no. But it does mean that the 6800s may have some power left to be tapped that won't show for a couple more driver revisions...but to the point at hand, they have already gone from totally losing to ATi in FC perf to the GT sometimes beating the XT PE...it's safe to say that there's definitely been some improvement on the NV end.

ati's engineers stated that during the ati/adaptive af chat on ati's website some time back. the mem controller was only running around 30% efficiency...when all was said and done they were talking overall performance gains in the 10-15% area when drivers were optimized.

As CaiNaM stated, this came from ATi. That alone can prove, or discount, the information. To be honest, I heard about the lackluster X800 efficiency before I ever got wind of nVidia's. And as I stated previously, nVidia quickly jumped on numerous -prelaunch- driver revisions that partially filled in the gap. If you guys remember, benchmarking sites had to constantly update their engineering models and reports. These revisions were aimed almost exclusively at the 6800 series. ATI, on the other hand, has not implemented a single performance or efficiency optimization or revision. Utilizing the memory more efficiently adds performance, and in -no way- detracts from the ATI IQ standard.

As reviewers have said from the beginning, the nVidia card is a real engineering and technological feat compared to the X800. ATI was able to toss out a half-baked card with no over-the-top changes and still managed to kick major graphics ass. It seems almost lazy... I am sure nVidia spends a lot more on R&D, although I could be deathly mistaken. Either way, nVidia has already started to improve memory performance...in multiple, obviously incomplete, driver revisions. All X800 owners hope that ATI jumps on it and gets out a Cat 4.7 (or 4.8) that effectively compensates.

By the way, nVidia has better box/card art.
 

reever

Senior member
Oct 4, 2003
I am sure nVidia spends a lot more on R&D, although I could be deathly mistaken.

Nvidia stated they spent around $400 million on R&D for the NV40, but they also said they spent around the same amount on the NV30, so I think most cards cost about that much to develop
 

reever

Senior member
Oct 4, 2003
Originally posted by: Safeway
Can you get an R&D quote from ATI?


Unless they specifically state how much they spent, there is no way of knowing from simple annual reports, as both companies make more than just graphics cards
 

BFG10K

Lifer
Aug 14, 2000
FYI Anand's results were wrong (no AA on nVidia cards). Newer results have been published.
 

Acanthus

Lifer
Aug 28, 2001
Originally posted by: reever
Originally posted by: Safeway
Can you get an R&D quote from ATI?


Unless they specifically state how much they spent, there is no way of knowing from simple annual reports, as both companies make more than just graphics cards

Judging from the technology and specs of the new card, i think it's safe to say they spent far, far less than nvidia.