SM3.0 effects in FarCry


Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Rollo
Originally posted by: GeneralGrievous
My question is why developers would bother with SM3.0 currently as only a minute percentage of their customers will have it. SM2.0, at least, is rather widespread by now considering it's in so many cards in many different price ranges.

I think because nVidia pays them to do so. The software firm I work for will customize a product for money. Isn't TWIMTBP an agreement nVidia has with developers to give them funds and technology? Remember Giants?

That's exactly what it is.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Rollo
Originally posted by: GeneralGrievous
My question is why developers would bother with SM3.0 currently as only a minute percentage of their customers will have it. SM2.0, at least, is rather widespread by now considering it's in so many cards in many different price ranges.

I think because nVidia pays them to do so. The software firm I work for will customize a product for money. Isn't TWIMTBP an agreement nVidia has with developers to give them funds and technology? Remember Giants?

I *thought* their TWIMTBP program was being used to provide testing/QA assistance, and support for NVIDIA cards' features. That's certainly the implication I've gotten from descriptions of it. It would be a little disconcerting to me if NVIDIA was directly paying off game developers.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Originally posted by: GeneralGrievous
My question is why developers would bother with SM3.0 currently as only a minute percentage of their customers will have it. SM2.0, at least, is rather widespread by now considering it's in so many cards in many different price ranges.
NVidia offers you money. Would you say no?
IMO, since there is nothing that can't be done in 2.0 (just with more pipeline passes, IIRC), SM 3.0 effects should be written so they can fall back to 2.0. Now, if nVidia were feeding you, you wouldn't think that way, though, would you...?
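As a rough sketch of what Cerb means by "more pipeline passes": the same lighting result can be produced either by one SM3.0-style shader that loops over lights with dynamic branching, or by an SM2.0-style fallback that renders one pass per light and additively blends into the framebuffer. This is illustrative Python, not real shader code, and the numbers are made up.

```python
# Illustrative sketch: the same lighting result computed two ways.

def shade_sm3(pixel_albedo, lights):
    # SM3.0-style: a single shader invocation loops over all lights,
    # branching per-light (dynamic flow control).
    total = 0.0
    for intensity in lights:
        if intensity > 0.0:          # dynamic branch inside the "shader"
            total += pixel_albedo * intensity
    return total

def shade_sm2_multipass(pixel_albedo, lights):
    # SM2.0-style fallback: each light gets its own full render pass,
    # and the framebuffer additively blends the partial results.
    framebuffer = 0.0
    for intensity in lights:         # each iteration = one render pass
        if intensity > 0.0:
            framebuffer += pixel_albedo * intensity
    return framebuffer

lights = [0.5, 0.0, 0.25]
assert shade_sm3(0.8, lights) == shade_sm2_multipass(0.8, lights)
```

Same pixels either way; the SM2.0 path just costs extra passes of geometry and framebuffer traffic, which is why developers prefer the single-pass route when the hardware supports it.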
 

Bovinicus

Diamond Member
Aug 8, 2001
3,145
0
0
It looks good in some of those shots, but it is overdone in others. If they toned it down a little bit, I think it would look amazing. These are actually the first screenshots of Far Cry I have ever seen. It has some seriously nice bump mapping effects.
 

OfficerDoofey

Member
May 26, 2004
112
0
0
Originally posted by: Matthias99
Originally posted by: Rollo
Originally posted by: GeneralGrievous
My question is why developers would bother with SM3.0 currently as only a minute percentage of their customers will have it. SM2.0, at least, is rather widespread by now considering it's in so many cards in many different price ranges.

I think because nVidia pays them to do so. The software firm I work for will customize a product for money. Isn't TWIMTBP an agreement nVidia has with developers to give them funds and technology? Remember Giants?

I *thought* their TWIMTBP program was being used to provide testing/QA assistance, and support for NVIDIA cards' features. That's certainly the implication I've gotten from descriptions of it. It would be a little disconcerting to me if NVIDIA was directly paying off game developers.

Why do companies make games/software? $$$
 

fsstrike

Senior member
Feb 5, 2004
523
0
0
Originally posted by: oldfart
Originally posted by: Rollo
So, before you start proclaiming to everyone that it can't run on ATi hardware, consider the fact that Crytek has to satisfy not only their nVidia customers but their ATi ones too

This just isn't true. There have always been vendor specific versions of programs, and this isn't really even vendor specific. (e.g. 3dfx Glide, S3 MeTal)

I don't see why you guys think programmers of TWIMTBP games are going to work to code down to ATIs 2002 level of tech when a. nVidia backs them b. you have the freedom of choice to buy a card that has offset and displacement mapping.

I can see it now: "Our backers at nVidia want us to make 100% sure the game looks every bit as good on ATI cards, no matter how much development time it takes! Oh yeah, and they want it to run faster on ATI too!"

Uh huh.
Rollo, you seem to have done a 180 on your opinion of video card tech and the importance of shader visuals.

Last year, when you downgraded from a 256 bit PS 2.0 9800P to a 128 bit PS 1.4 5800U, you posted how it didn't matter what the technology was as long as the cards had about the same performance.

This year, you are all about having a newer core, and are against the Ati "2 year old core", even though the cards perform ~ the same.

Last year, when the visual differences between PS 2.0 and 1.4 were shown, you were not impressed. "Ooooooh shiny water.....oooohh shiny pipes....spank spank" or something like that. Your opinion was the PS 2.0 Vs 1.4 visuals were not important anyway.

This year, SM 3.0 is something that can't be done without. Are you now impressed with the PS 2.0/SM 3.0 visuals? No silly comments about the lighting effects shown here?

Last year, you didn't have any gripe with nVidia's Brilinear, or other optimizations.

This year, you are outraged because ATi has a similar optimization.

Why the sudden change of heart on this stuff? It seems very inconsistent.

Hey, it's Rollo, what can you possibly expect? He still thinks his 5800 U is better than a 9800 Pro. LOL.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Rollo
Old Fart, I'm impressed, you've not only read what I've said but remembered.
Thanks. Senility hasn't totally set in yet. :p
Upon going from 9800Pro to actual 5800U this year
That was the downgrade I posted about. You had no issue then about an older core with similar performance. Now you won't use an older core with similar performance.
That was my opinion, I didn't think the shinier water of the famous HL2 comparison screen was that big a deal. (The pipes did look better, but not enough to make me buy a video card based on that.) Remember there were no games with PS2 effects last year, so they were easy to dismiss. I think we'll see much more of SM3 in the next 12 months than we saw of PS 2 in the last 12.
Again, do you think SM3.0 visuals are worth it? You seem to be very unimpressed about any new visual features up until now. You have also reversed your position on the value of video card features for future games.
Two reasons:
1. nVidia gives you the option to turn off their optimizations and run true trilinear.
Now they do. They didn't when they first implemented it. I don't recall you being upset at all with nVidia, but you are very upset with ATi for doing the same.
2. I'm giving those who said they would never buy from a company that built image degrading optimizations into their drivers and hid the fact to not be hypocrites and say the same about ATI.
Agree about being a hypocrite about one co Vs the other. That goes for nVidia fan boys as well as ATi. Right?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: fsstrike
Hey, it's Rollo, what can you possibly expect? He still thinks his 5800 U is better than a 9800 Pro. LOL.

Hey, it's Fsstrike, what can you expect? He talks about good hardware as if he's ever owned any (hasn't) and talks smack about guys who buy it on a whim and use it till they're tired of it.

That P4 1.5 and Ti4200 crankin' some Quake 2 for you there, little buddy?

;)

BTW Big Chief- I never said my 5800U was "better" than the 9800Pro I traded for it. I said it was a neat card I'm glad I got a chance to try, that has performance comparable to the 9700 Pro I used to have back in the day.

I'll put it this way Chief- the only game I have I can't play 10X7 4X8X with no slowdowns is Painkiller, and that pumps a lot of polys. Why don't you try turning on the 4X8X and count the frames on your fingers?
:roll:
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Matthias99
I *thought* their TWIMTBP program was being used to provide testing/QA assistance, and support for NVIDIA cards' features. That's certainly the implication I've gotten from descriptions of it. It would be a little disconcerting to me if NVIDIA was directly paying off game developers.

nV and EA have an agreement whereby--AFAIK--EA develops mainly on GeForce cards provided by nVidia. Obviously they must test on other cards, but Tiger Woods 2003 and 2004 had some nV-only features that were enabled for ATi owners only after a patch or two. That seems like at least a little special preference.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: fsstrike
Hey, it's Rollo, what can you possibly expect? He still thinks his 5800 U is better than a 9800 Pro. LOL.


I'd rather have a 5800 Ultra over a 9800 Pro any day. It's rarer - I prefer collector's item appeal far more than raw power.

I mean, compare a modern corvette to a 70s era Mako 'Vette. The modern Vette would dog the classic, but which would I rather have? The Mako. There are very few of those on the road. If I had a 5800U, I'd probably keep it even after I built a new system, just to have arguably the most infamous card in the history of graphics hardware...
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
That was the downgrade I posted about. You had no issue then about an older core with similar performance. Now you wont use an older core with similar performance.
I may end up with a X800 something some day, but what I'm looking for now is a GT or an Ultra. You're right that I like to play with video cards, unfortunately, my son and his hobbies, his daycare/preschool, and my other hobbies have turned into a black hole of money. I just can't justify buying a new card every month and eating the losses like I used to, the money is better spent on my son or invested.

Again, do you think SM3.0 visuals are worth it? You seem to be very unimpressed about any new visual features up until now. You have also reversed your position on the value of video card features for future games.
Like I said, I haven't seen enough to comment. I do think there will be far more SM3-patched and native games a year from now than there were PS2 games 1.5 years after the R300's launch. I think the "future" is here for SM3 because it's not as big a leap forward as the DX9b feature set was.

Now they do. They didn't when they first implemented it. I don't recall you being upset at all with nVidia, but you are very upset with ATi for doing the same.
I wasn't upset, and I'm not really upset with ATI for doing the same thing. Am I going to note ATI did the same thing, actually much worse when you couple it with their recommendation to turn off nVidia optimizations for equal benching?
You bet. I had that "cheater drivers", "reduced IQ", "lied to customers" BS shoved down my throat so many times there's no way I'm going to just say "Oh well. Everyone is doing it. What a crazy world we live in." The people who yammered that claptrap day in, day out are now going to hear about it whenever appropriate.
Those who live in glass houses shouldn't throw stones.

Agree about being a hypocrite about one co Vs the other. That goes for nVidia fan boys as well as ATi. Right?
Agreed.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Insomniak
Originally posted by: fsstrike
Hey, it's Rollo, what can you possibly expect? He still thinks his 5800 U is better than a 9800 Pro. LOL.


I'd rather have a 5800 Ultra over a 9800 Pro any day. It's rarer - I prefer collector's item appeal far more than raw power.

I mean, compare a modern corvette to a 70s era Mako 'Vette. The modern Vette would dog the classic, but which would I rather have? The Mako. There are very few of those on the road. If I had a 5800U, I'd probably keep it even after I built a new system, just to have arguably the most infamous card in the history of graphics hardware...

We think alike Insomniak. The 5800U is going in my son's machine, with pride, when I upgrade.

For me it would be a new corvette vs. a 70-74 Challenger or Barracuda. The only muscle car I ever had was a 72 Monte Carlo with a 350 with Edelbrock Tarantula/Holley double pumper/Holley racing fuel pump/hot cam. That thing was mucho macho grande'- Cragar SSs, 60/70s, pearl white over gold lacquer, hood tach- cars have gotten too high tech and plastic. Had a 74 Camaro 350/4 speed in HS too, loved that also.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
I may end up with a X800 something some day, but what I'm looking for now is a GT or an Ultra. You're right that I like to play with video cards, unfortunately, my son and his hobbies, his daycare/preschool, and my other hobbies have turned into a black hole of money. I just can't justify buying a new card every month and eating the losses like I used to, the money is better spent on my son or invested.
I know the feeling. I just had a new deck built, new mower, new fridge and dishwasher (gotta keep Mrs Oldfart happy). Beach and Disney vaca coming up soon. Got two little ones myself that are doing all kinds of stuff. I travel a lot these days and honestly, have little to no time to play any games. The 9800P I have now is pretty much a waste. I can easily afford to buy any card I want, but don't see the point of a new card at this time. Maybe when D3 and HL2 come out I may think about it. And I agree about the current price gouging. I won't pay the prices they are getting at the moment anyway.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
because you've confused it with something else... nv40 supports 16 bit floating point internal filtering, not 32
Yeah you're right it is FP16. It seems strange for nVidia to claim full speed FP32 in light of this.
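For context on why FP16 filtering is a weaker claim than an FP32 shader pipeline: IEEE 754 half precision carries only 11 effective significand bits, so it can't even represent every integer above 2048, and small differences near 1.0 vanish. A quick stdlib-only Python sketch (Python's `struct` format `'e'` round-trips through half precision; the specific values are just illustrative):

```python
import struct

def to_fp16(x: float) -> float:
    # Round-trip a Python float through IEEE 754 half precision.
    return struct.unpack('e', struct.pack('e', x))[0]

# Integers are exact in FP16 only up to 2048.
assert to_fp16(2048.0) == 2048.0
assert to_fp16(2049.0) == 2048.0   # not representable; rounds to nearest even

# Near 1.0 the FP16 step is ~0.001, so a smaller difference disappears.
assert to_fp16(1.0003) == 1.0

# An FP32 round-trip (format 'f') keeps both values intact.
assert struct.unpack('f', struct.pack('f', 2049.0))[0] == 2049.0
```

So "full speed FP32" in the shader core and "FP16 internal filtering" in the texture units can both be true at once; they describe different stages of the pipeline.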

I'd rather have a 5800 Ultra over a 9800 Pro any day.
How about three of them?
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: BFG10K

How about three of them?


You mean 3 9800 Pros? I'd take the Ultra. Like I said, collectability. You could probably get $600 for a 5800U on eBay right now.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
No, three 5800 Ultras.

Or more specifically would you follow Rollo's upgrade path of 9700P -> 5800U -> 9800P -> 5800U -> 9800P?

That's not collecting, that's lunacy.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: BFG10K
No, three 5800 Ultras.

Or more specifically would you follow Rollo's upgrade path of 9700P -> 5800U -> 9800P -> 5800U -> 9800P?

That's not collecting, that's lunacy.

I thought his 1st 5800 was a vanilla 5800 @ ultra speeds.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Acanthus
Originally posted by: BFG10K
No, three 5800 Ultras.

Or more specifically would you follow Rollo's upgrade path of 9700P -> 5800U -> 9800P -> 5800U -> 9800P?

That's not collecting, that's lunacy.

I thought his 1st 5800 was a vanilla 5800 @ ultra speeds.

yea, it was.. was that the one he drove over a few times, or did he actually have 3 nv30s? ;)
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: BFG10K
because you've confused it with something else... nv40 supports 16 bit floating point internal filtering, not 32
Yeah you're right it is FP16. It seems strange for nVidia to claim full speed FP32 in light of this.

marketing ;)
 

gunblade

Golden Member
Nov 18, 2002
1,470
0
71
Originally posted by: BFG10K
because you've confused it with something else... nv40 supports 16 bit floating point internal filtering, not 32
Yeah you're right it is FP16. It seems strange for nVidia to claim full speed FP32 in light of this.

I'd rather have a 5800 Ultra over a 9800 Pro any day.
How about three of them?

The FP blending is not done in the shader pipeline, so the FP32 pipeline stuff doesn't matter here. ATi, as confirmed by an ATi employee, doesn't support any float or integer blending and filtering. As a result, they cannot do real HDR rendering. HL2's HDR is a simulated effect using the shader pipeline; it is not real HDR and has limitations, but it looks damn close.
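A toy sketch of why the blending format matters for HDR: with a fixed-point (e.g. 8-bit-per-channel) framebuffer, every blend result is clamped to [0, 1], so bright contributions saturate and the true intensity is lost before tone mapping ever runs; a float framebuffer keeps values above 1.0. Illustrative Python with made-up radiance values:

```python
def blend_fixed(dst, src):
    # Fixed-point framebuffer: each additive blend clamps to [0, 1].
    return min(dst + src, 1.0)

def blend_float(dst, src):
    # FP16/FP32 framebuffer: additive blend preserves values above 1.0.
    return dst + src

passes = [0.5, 0.75, 0.75]   # made-up radiance from three additive passes

fixed = fp = 0.0
for p in passes:
    fixed = blend_fixed(fixed, p)
    fp = blend_float(fp, p)

print(fixed)   # 1.0 -- saturated; the true brightness is gone
print(fp)      # 2.0 -- real HDR value, still available for tone mapping
```

Shader-based "simulated" HDR works around the clamp by encoding intensity into the representable range inside the shader, which is why it can look close but carries limitations.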
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
Originally posted by: gunblade
Originally posted by: BFG10K
because you've confused it with something else... nv40 supports 16 bit floating point internal filtering, not 32
Yeah you're right it is FP16. It seems strange for nVidia to claim full speed FP32 in light of this.

I'd rather have a 5800 Ultra over a 9800 Pro any day.
How about three of them?

The FP blending is not done in the shader pipeline, so the FP32 pipeline stuff doesn't matter here. ATi, as confirmed by an ATi employee, doesn't support any float or integer blending and filtering. As a result, they cannot do real HDR rendering. HL2's HDR is a simulated effect using the shader pipeline; it is not real HDR and has limitations, but it looks damn close.

And what about those HDR demos on the internet? How can they manage HDR if they aren't specially coded for ATI? Could someone explain?
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
I think you guys are overreacting that they're "overdoing it"
These screenshots are probably just to show off the different exposure level effects you can use with HDR.
Check out the 7th and 8th shot. They are both the same shot, albeit with different exposure levels. One is overexposed, and one is underexposed.
This is kind of like what ATI did in their HDR demo with the marbles.
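The exposure effect being described reduces to a very simple transform: keep the scene in HDR, scale by 2^EV for the chosen exposure, then clamp to the displayable [0, 1] range. The same HDR pixel then comes out blown to white, neutral, or crushed to black depending on the setting. A tiny illustrative Python sketch (the constants are made up, and real renderers use a fancier tone-mapping curve than a plain clamp):

```python
def display(hdr_value, exposure_ev):
    # Scale scene radiance by 2^EV, then clamp to the displayable range.
    scaled = hdr_value * (2.0 ** exposure_ev)
    return max(0.0, min(scaled, 1.0))

hdr_pixel = 0.5   # made-up scene radiance for one pixel

print(display(hdr_pixel, +2))   # 1.0    -- overexposed: blown out to white
print(display(hdr_pixel, 0))    # 0.5    -- neutral exposure
print(display(hdr_pixel, -3))   # 0.0625 -- underexposed: crushed toward black
```

That is why two screenshots of the identical scene can look so different: only the exposure constant changed, not the underlying HDR data.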
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
No, three 5800 Ultras.

Or more specifically would you follow Rollo's upgrade path of 9700P -> 5800U -> 9800P -> 5800U -> 9800P?

That's not collecting, that's lunacy.

I never bought the same video card twice:
9700Pro 7 months
5800NU OCd to U level 2 months
9800 Pro 8 months
5800 OTES (5800 NU with 2ns RAM and OTES [quiet flowfx]) DOA, trolled
Traded 9800 Pro for 5800U 3 months now?

Wanted the OTES because they're very rare, in effect, a quiet Ultra. Wanted the Ultra because they're rare and infamous. Bought the 5800NU because I was bored with 9700P, needed a change.

So I've had a working 9700P, 9800P, 5800NU, and 5800U. What madness possessed me to want to have 4 different video cards in the last two years?! Call the asylum!

Doesn't seem so "crazy" to me.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Did you save the quiet OTES off the DOA card?

And I have to agree your card swapping was a very odd thing. It's one thing to change cards to upgrade, another to go lateral (or downgrade). I don't get bored with a piece of PC hardware. A CPU, HD, ram, mobo, video card, or whatever is there to do a job in the PC. If it is working well, that is all that matters. If I change something, it is to make the PC faster, more reliable, quieter, or something meaningful. It has to have a benefit of some sort. I think that would describe the vast majority of users.

IIRC, your 9800P -> 5800U swap was to prove that the nV30 was as good as an R350. Without searching out the posts, it went something like "to prove that the nV30 is not inferior to the R350, I'll offer to trade my 9800P for one." Your point was no one would trade for the supposedly "superior" 9800P since the nV30 was just as good.

The trade was to prove a point. You did this a couple of times. I never thought the trade idea proved one thing or another, and found the whole idea rather odd (I did post that at the time).

Since the trade did happen, did it prove anything? Did it prove the nV30 is an inferior card?