
SM3.0 effects in FarCry


OfficerDoofey

Member
May 26, 2004
112
0
0
OK, I'm getting a little personal here. Sorry, guys.

I guess I just get a little upset when we're all in these forums trying to learn a little about stuff that basically hasn't even been released, and we get people telling us "That's crap" or "That's just this" or whatever...
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: SickBeast
How can you guys complain so much about those graphics? They are vastly enhanced! Look at some of the lighting effects!

Sure, they are overexposed a bit at times, but I'm sure that will be resolved. You should be focussing on the added visual quality brought on by PS3.0 instead of complaining about the overexposure making your eyes hurt.

Please don't tell me what I should or should not be focusing on. I'm not going to care about added visual quality if I'm uncomfortable just looking at it. That hurts my eyes. If they can work on it so the lighting isn't overdone in places, I'm sure it would look great. A few of the pictures that don't glare as badly, mainly the last of the indoor catacomb pictures, I think look very nice. The rest don't have to be that bright though.
 

ponyo

Lifer
Feb 14, 2002
19,688
2,811
126
Those picture shots are demonstrating HDR (High Dynamic Range) lighting. The extremely bright light sources and the sheer intensity of the glare are caused by HDR. You don't need PS 3.0 to do this. Any card that supports Vertex Shader 2.0 and Pixel Shader 2.0 can do this effect. So that means the Radeon 9700 and FX5900 can do this, and we know these cards don't have PS 3.0. You don't need PS 3.0. It's irrelevant.

You can see more examples here: pic1
pic2
pic3
pic4
pic5

Go download rthdribl demo if you want to see for yourself what HDR looks like.

OfficerDork, I guess the Radeon 9700/9800 and FX5900 all have SM3.0 then, since HDR and SM3 are the same thing? PS 2.0 and 3.0 are supposed to look the same. PS 3.0 is just supposed to be faster.
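
For a concrete picture of what HDR is doing in those shots, here is a minimal C++ sketch of a Reinhard-style tone-mapping step. It only illustrates the general idea; it is not FarCry's or rthdribl's actual code, and the exposure value and sample luminances are made up:

// Scene luminance in an HDR renderer can go far above 1.0, but the display
// can't show that, so a tone-mapping curve compresses it back into [0,1].
// Areas that still land near 1.0 after compression are what read as the
// "blinding" glare in the screenshots.
#include <cstdio>

// Hypothetical helper: map an HDR luminance to a displayable [0,1] value.
float toneMap(float hdrLuminance, float exposure)
{
    float scaled = hdrLuminance * exposure;  // the "overexposed" look comes from a high exposure
    return scaled / (1.0f + scaled);         // Reinhard operator: compresses highlights smoothly
}

int main()
{
    const float exposure = 0.5f;
    // A dim wall, a sunlit wall, and the sun itself, in arbitrary HDR units.
    const float samples[] = { 0.2f, 4.0f, 100.0f };
    for (float lum : samples)
        std::printf("HDR %6.1f -> display %.3f\n", lum, toneMap(lum, exposure));
    return 0;
}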
 

OfficerDoofey

Member
May 26, 2004
112
0
0
Originally posted by: Naustica
Those picture shots are demonstrating HDR (High Dynamic Range) lighting. The extremely bright light sources and the sheer intensity of the glare are caused by HDR. You don't need PS 3.0 to do this. Any card that supports Vertex Shader 2.0 and Pixel Shader 2.0 can do this effect. So that means the Radeon 9700 and FX5900 can do this, and we know these cards don't have PS 3.0. You don't need PS 3.0. It's irrelevant.

You can see more examples here: pic1
pic2
pic3
pic4
pic5

Go download rthdribl demo if you want to see for yourself what HDR looks like.

OfficerDork, I guess the Radeon 9700/9800 and FX5900 all have SM3.0 then, since HDR and SM3 are the same thing? PS 2.0 and 3.0 are supposed to look the same. PS 3.0 is just supposed to be faster.

Hey, no need to go name-calling... how old are you? 12?

So you're saying there is no difference between PS 2.0 and PS 3.0 apart from speed...? So PS 3.0 makes no enhancements whatsoever? IMO that's not correct... but I may be wrong... it's happened before.
And even if you are right and the only difference is speed, how can you then still say PS 3.0 is irrelevant?

Any performance gain, whether it's speed or IQ, is relevant.
 

ponyo

Lifer
Feb 14, 2002
19,688
2,811
126
Hey, you're the one who rolled eyes, thumbed down, and was an ass towards me. I even tried to let it go, but you wouldn't.

I never said there was no difference between PS2.0 and 3.0. I said both are supposed to look the same.

As for the "PS 3.0 is irrelevant" comment, you took that out of context. What I meant was that PS 3.0 is not required to do HDR lighting effects. PS 2.0 is sufficient.

In all my previous posts I was simply pointing out that the pictures were showcasing HDR lighting, not necessarily visual quality brought on by PS 3.0.
 

OfficerDoofey

Member
May 26, 2004
112
0
0
I was just very surprised that you could tell it was definitely HDR and that it flat out wasn't PS 3.0.

You said there was only a speed difference between the two, that PS 3.0 is only supposed to be faster... I was saying I thought that statement was wrong.

I did take the "irrelevant" statement out of context... I re-read it and what you were saying makes sense. Sorry about that one, my bad.

Weren't the original pictures meant to be showcasing PS 3.0? I don't understand how, if that was the case, you're telling me that it's not... it's HDR.
 

ponyo

Lifer
Feb 14, 2002
19,688
2,811
126
Those shots are using PS 3.0 and HDR. But in the current FarCry engine there are no visible differences between PS 2.0 and 3.0, so you can't see it in the screenshots shown in these pictures. HDR, however, is very evident and extremely easy to spot, as seen by the blinding lights in these same pics. PS 3.0 is just used to improve rendering speed; it's there for performance optimization only, not for visual purposes.
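
To make the "speed only" claim concrete, here is a rough C++ sketch of how that difference is usually described: an SM2.0-style path redraws the scene once per light, while a longer SM3.0-style shader can loop over several lights in a single pass, producing the same image in fewer passes. The functions and light data are hypothetical stand-ins, not Crytek's engine code:

#include <cstdio>
#include <vector>

struct Light { float r, g, b; };

// Hypothetical stand-ins for "draw the scene with shader X".
void drawSceneOneLight(const Light&) { /* one full geometry pass, one light */ }
void drawSceneLightLoop(const std::vector<Light>&) { /* one pass, shader loops over all lights */ }

int main()
{
    std::vector<Light> lights = { {1.0f, 1.0f, 1.0f}, {1.0f, 0.5f, 0.2f}, {0.2f, 0.4f, 1.0f} };

    // SM2.0-style path: one pass per light, results blended together.
    for (const Light& l : lights)
        drawSceneOneLight(l);
    std::printf("SM2.0-style path: %zu passes\n", lights.size());

    // SM3.0-style path: the (longer) shader handles every light in one pass.
    drawSceneLightLoop(lights);
    std::printf("SM3.0-style path: 1 pass\n");
    return 0;
}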
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: BFG10K
nv40 faster when using fp16 blending
Given the entire pipeline and framebuffer is full speed FP32 why even bother with FP16 on the NV40?

Because you've confused it with something else... NV40 supports 16-bit floating point internal filtering, not 32-bit (ATI supports neither, which is why HL2's HDR is done differently), with FP16 output. The output can then be converted to FX12 or FP32.


"Now, GeForce 6800 Ultra introduces an optional 64 bit floating point framebuffer. That is, RGBA FP16 FP16 FP16 FP16. We already covered its higher range compared to FX8. While nVidia calls it "high dynamic range" (HDR), it is in fact medium dynamic range (MDR). HDR needs at least 32 bit for each single value, NV40 has 16 bit. Anyway, FP16 is a great leap forward. The main advantage is: no longer do we have to render to an FP texture for MDR rendering. Just activate the FP16 framebuffer and you'll get an aceptable MDR render target. NV40 also supports all the alphablending stuff with its 64 bit framebuffer"

~ 3dcenter
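
As a rough illustration of the FX8-versus-FP16 point in that quote, here is a small C++ sketch comparing how an 8-bit fixed-point channel and a 16-bit float channel store a bright value. The storeFP16 function is only a crude approximation of a half float (finite range, roughly 11 significant bits), not NV40 hardware behavior:

#include <algorithm>
#include <cmath>
#include <cstdio>

// FX8: values clamp to [0,1] and are stored in 256 steps, so anything brighter
// than "white" is lost before tone mapping can use it.
float storeFX8(float v)
{
    float clamped = std::min(std::max(v, 0.0f), 1.0f);
    return std::round(clamped * 255.0f) / 255.0f;
}

// Crude model of an FP16 channel: clamp to the half-float range and keep
// about 11 significant bits, so bright values survive with reduced precision.
float storeFP16(float v)
{
    v = std::min(std::max(v, -65504.0f), 65504.0f);
    int exp;
    float m = std::frexp(v, &exp);              // v = m * 2^exp, with m in [0.5, 1)
    m = std::round(m * 2048.0f) / 2048.0f;      // ~11 significant bits of mantissa
    return std::ldexp(m, exp);
}

int main()
{
    const float values[] = { 0.5f, 1.0f, 4.0f, 100.0f };
    for (float v : values)
        std::printf("value %7.2f  ->  FX8 %5.3f   FP16-ish %8.2f\n",
                    v, storeFX8(v), storeFP16(v));
    return 0;
}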
 

Alkali

Senior member
Aug 14, 2002
483
0
0
Originally posted by: nemesismk2
Originally posted by: Cerb
That looks like crap.
If I want to see something like that, I can raise the brightness and contrast and lower gamma on my own.
It could have actually looked good, I imagine, but yecch.

Are you an ati owner by any chance? I am sorry to say the SM3.0 patch isn't for you!

Please get your facts straight. ATi have been able to do HDR since the Radeon 9700 series. nVidia have managed to do a brilliant job of making people think SM3.0 is god's gift to mankind - but in fact, in situations like this (HDR), ATi has been able to do it for years.

So, before you start proclaiming to everyone that it can't run on ATi hardware, consider the fact that Crytek have to satisfy not only their nVidia customers but their ATi ones too.

As for the technical side of things - HDR can be created in different ways (ATi's implementation is unchanged from the 9700 series), whereas the nVidia implementation uses (at best quality) floating point 16 (FP16) blending. Crytek will probably end up creating separate paths for SM2.0/SM2.0b/SM3.0 so that every card capable of HDR can render the effects.
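
As a sketch of what separate paths could look like in practice, here is an illustrative Direct3D 9 capability check that picks an HDR path at startup. The enum, the function, and the selection policy are made up for illustration and are not Crytek's actual code:

#include <d3d9.h>

enum HdrPath { HDR_NONE, HDR_SHADER_SM20, HDR_FP16_BLEND_SM30 };

// Decide which HDR path the hardware can run, using only the D3D9 caps.
HdrPath chooseHdrPath(IDirect3D9* d3d, UINT adapter, D3DFORMAT displayFormat)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps)))
        return HDR_NONE;

    // Can we render to an FP16 target and alpha-blend into it? (NV40-style HDR)
    bool fp16Target = SUCCEEDED(d3d->CheckDeviceFormat(
        adapter, D3DDEVTYPE_HAL, displayFormat,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));
    bool fp16Blend = SUCCEEDED(d3d->CheckDeviceFormat(
        adapter, D3DDEVTYPE_HAL, displayFormat,
        D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));

    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0) && fp16Target && fp16Blend)
        return HDR_FP16_BLEND_SM30;   // FP16 blending path (6800-class hardware)
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return HDR_SHADER_SM20;       // shader/integer-buffer path (9700-class and up)
    return HDR_NONE;
}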
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
So, before you start proclaiming to everyone that it can't run on ATi hardware, consider the fact that Crytek have to satisfy not only their nVidia customers but their ATi ones too

This just isn't true. There have always been vendor-specific versions of programs, and this isn't really even vendor-specific. (e.g. 3dfx Glide, S3 MeTal)

I don't see why you guys think programmers of TWIMTBP games are going to work to code down to ATI's 2002 level of tech when (a) nVidia backs them and (b) you have the freedom of choice to buy a card that has offset and displacement mapping.

I can see it now: "Our backers at nVidia want us to make 100% sure the game looks every bit as good on ATI cards, no matter how much development time it takes! Oh yeah, and they want it to run faster on ATI too!"

Uh huh.
 
Jun 14, 2003
10,442
0
0
Originally posted by: reever
Originally posted by: nemesismk2
Originally posted by: Cerb
That looks like crap.
If I want to see something like that, I can raise the brightness and contrast and lower gamma on my own.
It could have actually looked good, I imagine, but yecch.

Are you an ati owner by any chance? I am sorry to say the SM3.0 patch isn't for you!

Are you a person who has never seen
this demo? I am sorry to say that you are uninformed

That's how HDR should be done... alas, it's only a tech demo/benchmark.
 
Jun 14, 2003
10,442
0
0
Originally posted by: Naustica
Originally posted by: SickBeast
How can you guys complain so much about those graphics? They are vastly enhanced! Look at some of the lighting effects!

Sure, they are overexposed a bit at times, but I'm sure that will be resolved. You should be focussing on the added visual quality brought on by PS3.0 instead of complaining about the overexposure making your eyes hurt.


What PS3.0 stuff are you seeing? All I see is HDR.

This is true... OK, there might be displacement mapping in there, if you can pick it out from the supernova of light going on in some of the pics, but the lighting effects are only High Dynamic Range lighting. ATI demoed this with HL2... hell, they even have a tech demo for the 9700 that shows how HDR is used to make a scene look more real. You know that demo with the crystal orb things? There's probably PS 3.0 in those shots, but the guy with the HDR pen needs to calm down a bit first.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: Rollo
So, before you start proclaiming to everyone that it can't run on ATi hardware, consider the fact that Crytek have to satisfy not only their nVidia customers but their ATi ones too

This just isn't true. There have always been vendor-specific versions of programs, and this isn't really even vendor-specific. (e.g. 3dfx Glide, S3 MeTal)

I don't see why you guys think programmers of TWIMTBP games are going to work to code down to ATI's 2002 level of tech when (a) nVidia backs them and (b) you have the freedom of choice to buy a card that has offset and displacement mapping.

I can see it now: "Our backers at nVidia want us to make 100% sure the game looks every bit as good on ATI cards, no matter how much development time it takes! Oh yeah, and they want it to run faster on ATI too!"

Uh huh.
Rollo, you seem to have done a 180 on your opinion of video card tech and the importance of shader visuals.

Last year, when you downgraded from a 256 bit PS 2.0 9800P to a 128 bit PS 1.4 5800U, you posted how it didn't matter what the technology was as long as the cards had about the same performance.

This year, you are all about having a newer core, and are against the Ati "2 year old core", even though the cards perform ~ the same.

Last year, when the visual differences between PS 2.0 and 1.4 were shown, you were not impressed. "Ooooooh shiny water.....oooohh shiny pipes....spank spank" or something like that. Your opinion was the PS 2.0 Vs 1.4 visuals were not important anyway.

This year, SM 3.0 is something that can't be done without. Are you now impressed with the PS 2.0/SM 3.0 visuals? No silly comments about the lighting effects shown here?

Last year, you didn't have any gripe with nVidia's Brilinear, or other optimizations.

This year, you are outraged because ATi has a similar optimization.

Why the sudden change of heart on this stuff? It seems very inconsistent.
 

Alkali

Senior member
Aug 14, 2002
483
0
0
Originally posted by: Rollo
So, before you start proclaiming to everyone that it can't run on ATi hardware, consider the fact that Crytek have to satisfy not only their nVidia customers but their ATi ones too

This just isn't true. There have always been vendor-specific versions of programs, and this isn't really even vendor-specific. (e.g. 3dfx Glide, S3 MeTal)

I don't see why you guys think programmers of TWIMTBP games are going to work to code down to ATI's 2002 level of tech when (a) nVidia backs them and (b) you have the freedom of choice to buy a card that has offset and displacement mapping.

I can see it now: "Our backers at nVidia want us to make 100% sure the game looks every bit as good on ATI cards, no matter how much development time it takes! Oh yeah, and they want it to run faster on ATI too!"

Uh huh.

So what?

You want games to start becoming nVidia versus ATi tech FIGHTS?

That, my friend, would lead to a future where, WHICHEVER card you own, you can only see the full details in 50% of the games you own. How pathetic a future for gaming that would be.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Rollo
So, before you start proclaiming to everyone that it can't run on ATi hardware, consider the fact that Crytek have to satisfy not only their nVidia customers but their ATi ones too

I don't see why you guys think programmers of TWIMTBP games are going to work to code down to ATI's 2002 level of tech when (a) nVidia backs them and (b) you have the freedom of choice to buy a card that has offset and displacement mapping.

Well, I think Crytek would *want* to if they could, since "ATIs 2002 level of tech" is what every R350/360, R420, AND GeForceFX card uses. It's TWIMTBP, not "TWIMTBP if you spend $300+ to buy our next-gen technology". Unless NVIDIA is paying them a bundle to only do this for the 6800-series cards, it doesn't make financial sense for Crytek to spend a *lot* of time and effort producing effects that only a handful of users will ever see.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Rollo
So, before you start proclaiming to everyone that it can't run on ATi hardware, consider the fact that Crytek have to satisfy not only their nVidia customers but their ATi ones too

This just isn't true. There have always been vendor-specific versions of programs, and this isn't really even vendor-specific. (e.g. 3dfx Glide, S3 MeTal)

I don't see why you guys think programmers of TWIMTBP games are going to work to code down to ATI's 2002 level of tech when (a) nVidia backs them and (b) you have the freedom of choice to buy a card that has offset and displacement mapping.

I can see it now: "Our backers at nVidia want us to make 100% sure the game looks every bit as good on ATI cards, no matter how much development time it takes! Oh yeah, and they want it to run faster on ATI too!"

Uh huh.

Well, first they need to actually make it run on nVidia hardware :p
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: CaiNaM
Originally posted by: VisableAssassin
That isn't PS3.0, it's HDR, which the Radeons can do as well :)
The NV40 just does it faster

that's a deceptive statement.

R420s will render HDR faster using the shader method, NV40 faster when using FP16 blending (the R420 can't do FP16 (float) filtering and blending).

HL2 uses A16R16G16B16 buffers (not float buffers) for HDR lighting, which runs faster on the R420.

That was in my first quick post while I was at work. I explained a little more in the post below it, if you'd seen it, my friend:

NV40 has a floating point framebuffer and supports FP blending and filtering. This allows it to do HDR more efficiently.
Also, HDR is an SM2 feature if I remember correctly, but ATI cards can do it... it's just that the NV40 does it faster.

There, I mentioned the floats, which is what I was specifically referring to.
As for how HL2 does it, I wasn't speaking of that. I could have already told you HL2 would use whatever benefits the R420 the best, since Valve and ATi are in bed so to speak, so naturally they'd use the method that's faster for an ATi card. But we weren't speaking of HL2 right off, so I didn't think it would have mattered much.
 

Ages120

Senior member
May 28, 2004
218
0
0
Soon enough there will be shaders for 3.0 that are done in 1 pass while 2.0 will take multiple passes. Just gotta wait till then. 3d mark 04 will shed some light on what the true improvements are.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Ages120
Soon enough there will be shaders for 3.0 that are done in 1 pass while 2.0 will take multiple passes. Just gotta wait till then. 3d mark 04 will shed some light on what the true improvements are.

The problem with this argument is that any shader too long to execute in a single pass in SM2.0 (that is, >128 instructions) is going to run very, very slowly even on a 6800U. SM3.0 doesn't increase the clock speed of the pixel shader; it just lets you run longer (and thereby slower) programs in a single pass.

If I knew more about writing pixel shaders (and I had a 6800-series GPU), it would be easy to show how much of an impact this has on speed. Just write, say, a 100-instruction PS2.0 shader, then a 400-instruction PS3.0 shader that just does the same thing four times. Then compare the speed of running the 400-instruction shader once against the speed of running the 100-instruction shader four times. I suspect the difference in time is VERY marginal - although, if it's not, then this would be a noticeable advantage for SM3.0, since you could also combine small shaders using branching and/or looping to increase performance similarly.
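
As a back-of-the-envelope version of that comparison, here is a small C++ model using made-up per-pass and per-instruction costs (not measurements) to show why collapsing four short passes into one long pass mostly just saves the per-pass overhead:

#include <cstdio>

// total time ~= passes * (per-pass overhead + instructions * cost per instruction)
double frameTimeUs(int passes, int instructionsPerPass,
                   double passOverheadUs, double usPerInstruction)
{
    return passes * (passOverheadUs + instructionsPerPass * usPerInstruction);
}

int main()
{
    const double passOverheadUs   = 50.0;  // assumed cost of setting up and blending a pass
    const double usPerInstruction = 2.0;   // assumed cost per shader instruction

    double sm20 = frameTimeUs(4, 100, passOverheadUs, usPerInstruction);  // four short passes
    double sm30 = frameTimeUs(1, 400, passOverheadUs, usPerInstruction);  // one long pass

    std::printf("SM2.0-style (4 x 100 instr): %.0f us\n", sm20);
    std::printf("SM3.0-style (1 x 400 instr): %.0f us\n", sm30);
    std::printf("Saved by the single pass: %.0f us (%.1f%%)\n",
                sm20 - sm30, 100.0 * (sm20 - sm30) / sm20);
    return 0;
}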
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Old Fart, I'm impressed; you've not only read what I've said but remembered it.

My current stance would seem hypocritical on the face of it, I imagine. Here's why it's not:

Last year, when you downgraded from a 256 bit PS 2.0 9800P to a 128 bit PS 1.4 5800U, you posted how it didn't matter what the technology was as long as the cards had about the same performance
You've mixed a few things together here, but I'll address each:
a. I went from a 9700Pro to a 5800NU that I clocked at Ultra level without a problem. There is much less difference going from a 9700Pro to a 5800U; they perform approximately equally. One has a little higher memory bandwidth, the other has a higher fill rate. I found that I wasn't hitting the memory bandwidth limitation at the settings I ran. Upon going from a 9800Pro to an actual 5800U this year, I noticed AA/AF didn't seem as good, but overall I'm still loving the 5800U. (It's no slouch; it still performs at ~9700P level.)
b. PS 2 wasn't an issue last year. The only game that used it at all last year was Wallet Raider: Angel of Sloppy Code, and no one really played that.

This year, you are all about having a newer core, and are against the Ati "2 year old core", even though the cards perform ~ the same.
Personal preference. I've bought that core feature set twice for $400; there's a new core/feature set to buy now. (I also bought two usable 5800s, which seems fair.) If the situation were reversed and ATI had the new tech and no overriding reason not to buy it, I'd buy ATI.

Last year, when the visual differences between PS 2.0 and 1.4 were shown, you were not impressed. "Ooooooh shiny water.....oooohh shiny pipes....spank spank" or something like that. Your opinion was the PS 2.0 Vs 1.4 visuals were not important anyway.
That was my opinion; I didn't think the shinier water of the famous HL2 comparison screen was that big a deal. (The pipes did look better, but not enough to make me buy a video card based on that.) Remember, there were no games with PS2 effects last year, so they were easy to dismiss. I think we'll see much more of SM3 in the next 12 months than we saw of PS 2 in the last 12.

This year, SM 3.0 is something that cant be done without. Are you now impressed with the PS 2.0/SM 3.0 visuals? No silly comments about the lighting effects shown here?
I haven't seen enough comparison to say. For me, my hunch that nVidia will be anxious for developers to use their new card's now exclusive functionality, coupled with my lack of desire to buy a faster version of a card I've already bought twice, points me toward nVidia this round. (at least when I don't have to monitor inventory and pay $100 over MSRP :roll: )

Last year, you didn't have any gripe with nVidia's Brilinear, or other optimizations.
This year, you are outraged because ATi has a similar optimization.

Two reasons:
1. nVidia gives you the option to turn off their optimizations and run true trilinear.
2. I'm giving those who said they would never buy from a company that built image-degrading optimizations into their drivers and hid the fact a chance to not be hypocrites and say the same about ATI. Not only did ATI do the exact same thing, but they made it worse by insisting reviewers turn off nVidia's optimizations for an accurate comparison to their "true trilinear". It's funny how a lot of those same people that stuffed that down my throat like Archie Bunker choking on a brat are now saying, "It doesn't matter, ATI's brilinear has a better algorithm, so who cares?" even though IQ is degraded.
 
Apr 14, 2004
1,599
0
0
My question is why developers would bother with SM3.0 currently, as only a minute percentage of their customers will have it. SM2.0, at least, is rather widespread by now, considering it's in so many cards in many different price ranges.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: GeneralGrievous
My question is why developers would bother with SM3.0 currently, as only a minute percentage of their customers will have it. SM2.0, at least, is rather widespread by now, considering it's in so many cards in many different price ranges.

I think it's because nVidia pays them to do so. The software firm I work for will customize a product for money. Isn't TWIMTBP an agreement nVidia has with developers to give them funds and technology? Remember Giants?