Originally posted by: Actaeon
The fanboyism in this thread is rampant.
Originally posted by: housecat
I don't know if you're referring to my data (which no one seems able to refute - this happens when someone speaks the truth).. but there's a lot of DOTF on this forum about SM3/DX9C as well.
And by my definition, DOTF is denial of the facts.
Originally posted by: housecat
Originally posted by: Noob
Do you guys think the 6800s will even have the power to run next-generation games that support SM 3.0?
I do.
I can run insane resolutions with HDR on in Far Cry today.
Not only that, but the NV50 will likely be able to do even better.
It's the R520 you should worry about; it's ATI's first-gen SM3 part, while NV will be on their second SM3 revision, with much more experience behind it than ATI has.
So if you are looking for your next card to be SM3, I'd stick with the NV50.
Originally posted by: humey
1st, I never said it couldn't play today's games; I said it's not futureproof - it's not even DX9c.
2nd, please don't flood the screen with lots of quotes. It's a PIA, and I feel for 56k users trying to load two screens of repeats.
Originally posted by: munky
How many generations did it take for NV to get DX9 right? I'll give you a hint - it wasn't 1.
How is this relevant?
Originally posted by: housecat
Originally posted by: pyrosity
The only differences between 2.0 and 3.0 are the allowed instruction lengths and other shader programming differences. None of this results in new features; everything that 3.0 does, 2.0 can do as well, including displacement mapping. I believe Epic claims d-mapping as "3.0 exclusive" because they couldn't have pulled it off without the speed boost of 3.0. The lighting differences shown in Far Cry 1.3 are done with HDR, which, as you know, is supported in PS 2.0. The visual effects and improvements are not achieved by switching shader languages; they are achieved by adding different effects that come from the 2.0 era.
This is so misinformed.
1. PS3.0 offers slightly to moderately better efficiency and will offer negligible to somewhat better performance than PS2.0.
2. Effects may be (somewhat) easier to implement, and those same effects may be slightly less stressing on the hardware through PS3.0.
3. PS3.0 offers full precision (32-bit floating point), while 2.0 does not (24-bit).
Anyone who has studied Nvidia's new pixel shaders will know that they have PLENTY of power to use this as well.
4. VS (vertex shader) 3.0 will bring exponentially greater image quality over VS2.0, and is where we'll see the biggest IQ improvements. This feature finally brings us the displacement mapping that DX9 originally promised.
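For readers keeping score, the dividing line being argued about here is something an engine can actually query at runtime. A minimal sketch of the standard D3D9 caps check - the helper name `SupportsSM3` is ours, not from any post in this thread:

```cpp
// Hypothetical helper: does the installed card expose the full SM3.0
// path? These are real D3D9 calls; error handling is trimmed.
#include <d3d9.h>

bool SupportsSM3(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    // DX9c pairs PS3.0 with VS3.0, so check both caps fields.
    return caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0) &&
           caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
}
```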
Originally posted by: Matthias99
Originally posted by: housecat
Originally posted by: pyrosity
The only differences between 2.0 and 3.0 are the allowed instruction lengths and other shader programming differences. None of this results in new features; everything that 3.0 does, 2.0 can do as well, including displacement mapping. I believe Epic claims d-mapping as "3.0 exclusive" because they couldn't have pulled it off without the speed boost of 3.0. The lighting differences shown in Far Cry 1.3 are done with HDR, which, as you know, is supported in PS 2.0. The visual effects and improvements are not achieved by switching shader languages; they are achieved by adding different effects that come from the 2.0 era.
This is so misinformed.
1. PS3.0 offers slightly to moderately better efficiency and will offer negligible to somewhat better performance than PS2.0.
2. Effects may be (somewhat) easier to implement, and those same effects may be slightly less stressing on the hardware through PS3.0.
Fixed those for you. One big shader is not necessarily faster than multiple small shaders. And a lot of simple shader code just doesn't need much (if any) branching. Frankly, once you start getting into the sizes of shader programs where SM3.0 might make a noticeable (>= 10%) performance difference, it's running so slowly on today's hardware that the difference is mostly irrelevant.
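To put the branching argument in concrete terms, here is the difference sketched in plain C++ (not actual shader assembly; `ExpensivePathA` and `CheapPathB` are stand-ins we made up):

```cpp
// Stand-ins for a long and a short shader path.
float ExpensivePathA(float a) { return a * a * a + a * a + a; }
float CheapPathB(float b)     { return b; }

// PS2.0 has no real dynamic branching: the compiler evaluates both
// sides and selects, so every pixel pays for both paths.
float ShadePS20Style(float mask, float a, float b)
{
    float resultA = ExpensivePathA(a);       // always executed
    float resultB = CheapPathB(b);           // always executed
    return mask > 0.5f ? resultA : resultB;  // select between finished results
}

// PS3.0 can branch per pixel and skip the expensive path entirely --
// which only pays off once the skipped work is substantial.
float ShadePS30Style(float mask, float a, float b)
{
    return (mask > 0.5f) ? ExpensivePathA(a) : CheapPathB(b);
}
```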
I haven't seen developers saying it's worlds easier to write PS3.0 code than PS2.0. Having HLSLs is more of a benefit in terms of development.
3. PS3.0 offers full precision (32-bit floating point), while 2.0 does not (24-bit).
This is just flat-out wrong; PS2.0 offers 32-bit shaders if the hardware supports it (see, for instance, the GeForce5, which can do 32-bit PS2.0); it only REQUIRES that you support at least 24-bit shaders. PS3.0 requires 32-bit shader support. It's unclear how important this is anyway; HDR looks pretty damn good even at 24-bit precision.
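For the curious, the scale of that precision gap can be eyeballed on the CPU. A toy sketch of our own (real FP24 hardware rounds rather than truncates; this only shows the order of magnitude of the error): FP32 keeps a 23-bit mantissa, FP24 a 16-bit one, so we fake FP24 by zeroing the low 7 mantissa bits.

```cpp
// Toy illustration of 24-bit vs 32-bit float precision.
#include <cstdint>
#include <cstdio>
#include <cstring>

float ToFakeFP24(float x)
{
    uint32_t bits;
    std::memcpy(&bits, &x, sizeof bits);  // type-pun safely
    bits &= ~0x7Fu;                       // drop 7 low mantissa bits: 23 -> 16
    std::memcpy(&x, &bits, sizeof bits);
    return x;
}

int main()
{
    float v = 1.0f / 3.0f;
    std::printf("fp32: %.9f\nfp24: %.9f\n", v, ToFakeFP24(v));
    // The values differ only past roughly the 5th decimal place --
    // consistent with HDR still looking good at 24-bit precision.
}
```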
Anyone who has studied Nvidia's new pixel shaders will know that they have PLENTY of power to use this as well.
Sort of. They don't really perform any better than ATI does in shader-heavy games like Far Cry and HL2. Doom3 is a different situation, because NVIDIA can do stencil shadows way, WAY faster than ATI can.
4. VS (vertex shader) 3.0 will bring exponentially greater image quality over VS2.0, and is where we'll see the biggest IQ improvements. This feature finally brings us the displacement mapping that DX9 originally promised.
VS3.0 does add hardware displacement mapping (well, actually, what it adds is letting vertex shaders access texture memory, which among other things lets you do HW-level displacement mapping). However, I do not believe this will run at a usable speed on any (single) GeForce6 card, and so I do not think it is that big a deal. It's no good if it brings $400-500 hardware to its knees on today's games; tomorrow's will be unplayable. Furthermore, no games are currently using this, and I don't know of any before UE3 that are specifically planning on using it.
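Worth noting how an engine even finds out whether this path exists: vertex texture fetch support is a format query, not a simple caps bit. A sketch using the real D3D9 call (the helper name is ours; GeForce 6 is known to accept only FP formats such as R32F here):

```cpp
// Can the HAL device sample a 32-bit float texture from the vertex
// shader? That is the VS3.0 feature behind HW displacement mapping.
#include <d3d9.h>

bool SupportsVertexTextures(IDirect3D9* d3d)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,               // current display format (assumed)
        D3DUSAGE_QUERY_VERTEXTEXTURE,  // "can the VS read this format?"
        D3DRTYPE_TEXTURE, D3DFMT_R32F));
}
```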
Also, people keep bringing up Far Cry's HDR -- which does not use SM3.0 at all. It is implemented through a floating-point framebuffer, which is (so far) an NVIDIA-only feature, and not part of the SM3.0 spec (this is also why you cannot use AA with HDR in Far Cry -- NVIDIA does not support AA on FP framebuffers). If a card has FP framebuffers, even if it does not have SM3.0 support, Far Cry HDR will work on it. Now, you can do HDR via pixel shaders -- HL2 is supposed to be doing this in a patch sometime -- but it is not clear what future games will go for.
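A sketch of the detection an engine would do for that path - again a format query, with nothing SM3.0-specific about it (the helper name is ours):

```cpp
// Does the card offer a blendable FP16 render target (the framebuffer
// format Far Cry's HDR mode needs)? On NV40 this succeeds, but the
// format cannot be multisampled -- hence no AA alongside HDR.
#include <d3d9.h>

bool SupportsFP16Blending(IDirect3D9* d3d)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));
}
```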
Basically, where all these threads wind up is:
PS3.0 does not improve IQ over PS2.0, but may improve performance in some situations. However, the performance boost mostly comes with very long shaders, which will not run at usable speeds on today's cards anyway. Also, ATI's "PS2.0b" offers much of the same improvement for developers wishing to support it (Far Cry did this; the performance difference between PS2.0b and PS3.0 is negligible).
VS3.0 can improve IQ over VS2.0, but the really advanced features will probably run too slowly to be useful on today's hardware. It can also improve performance in some situations through geometry instancing (see the sketch after this list), but again, PS2.0b also offers this.
Far Cry's HDR is not related to SM3.0 at all, but uses an NVIDIA-specific feature. It is unclear if other games will use this as well, and/or if ATI's next-gen hardware will support this. At least some other games will be doing HDR through pixel shaders, which should work on any PS2.0/3.0 card.
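On the instancing point above, this is roughly what the D3D9 setup looks like - a sketch with made-up buffer names (`meshVB`, `instVB`), assuming both vertex buffers are filled and the vertex declaration and index buffer are already set:

```cpp
// Geometry instancing: stream 0 holds the mesh once, stream 1 holds
// one block of per-instance data (e.g. a transform) per copy drawn.
#include <d3d9.h>

void DrawInstanced(IDirect3DDevice9* dev,
                   IDirect3DVertexBuffer9* meshVB, UINT vertexStride,
                   IDirect3DVertexBuffer9* instVB, UINT instanceStride,
                   UINT numInstances, UINT numVerts, UINT numTris)
{
    dev->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);
    dev->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);
    dev->SetStreamSource(0, meshVB, 0, vertexStride);
    dev->SetStreamSource(1, instVB, 0, instanceStride);
    dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, numVerts, 0, numTris);

    // Reset the frequencies, or later non-instanced draws will misbehave.
    dev->SetStreamSourceFreq(0, 1);
    dev->SetStreamSourceFreq(1, 1);
}
```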
IMO, at this point, it seems unlikely that SM3.0 will provide any sizable benefits for GF6 hardware going forwards. Certainly it will not make a 6600GT beat an X800XL/XT (and anyone who says this probably has a fuzzy grasp at best of how graphics hardware works), although I could see a 6600GT maybe getting close to an X800Pro in extremely optimal circumstances. 500+-line shaders and HW displacement mapping are likely to bring any GF6 card to a crawl even with SM3.0.
Originally posted by: hans030390
Um, no, SM3.0 allows like... 65,000 somethings (I forget the term, lol), whereas 2.0 supports 512... so yeah.
REMEMBER!!! SM3.0 supports displacement mapping, and 2.0 doesn't! D-mapping will be used HEAVILY in next-gen games! So 2.0 might try to look like it and will go slow, while 3.0 will have less of a problem pulling it off.
Displacement mapping: instead of making an object merely appear to have 3D bumps, as in bump mapping, d-mapping actually changes the geometry of the object it is applied to (see the sketch at the end of this post). What you get is an extremely detailed model without all the extra polys and modelling, but with real bumps. Not only that, the shadows show the d-mapping bumps as well.
Don't say 2.0 and 3.0 are the same, and that 3.0 is just more efficient with no eye candy... read up on it before you reply; that was the purpose of my first post.
Actually, Far Cry did have a change with 3.0. It rendered lights differently on objects, and I think it applied it to foliage. It made a slight change, but you won't notice the real 2.0-to-3.0 difference unless the game uses all of the 3.0 features.
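The displacement definition above boils down to a few lines of C++ - a sketch with made-up types, showing the whole trick: push each vertex along its normal by a sampled height.

```cpp
// The essence of displacement mapping: actually move the vertex, so
// silhouettes and shadows change (bump mapping only fakes the lighting).
struct Vec3 { float x, y, z; };

Vec3 Displace(const Vec3& pos, const Vec3& normal, float height, float scale)
{
    // In a real VS3.0 shader, 'height' would come from a texture sample.
    return { pos.x + normal.x * height * scale,
             pos.y + normal.y * height * scale,
             pos.z + normal.z * height * scale };
}
```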
Originally posted by: keysplayr2003
That's two "slightly"s and two "somewhat"s. Seems to me these might add up to a single "substantial". Not saying you're right or wrong; I'm saying you don't know, along with the rest of us, informed or not - although I do respect your opinions, of course.
Originally posted by: Rollo
Originally posted by: munky
How many generations did it take for NV to get DX9 right? I'll give you a hint - it wasn't 1.
How is this relevant?
Originally posted by: Ackmed
Originally posted by: housecat
Originally posted by: Noob
Do you guys think the 6800s will even have the power to run next-generation games that support SM 3.0?
I do.
I can run insane resolutions with HDR on in Far Cry today.
Not only that, but the NV50 will likely be able to do even better.
It's the R520 you should worry about; it's ATI's first-gen SM3 part, while NV will be on their second SM3 revision, with much more experience behind it than ATI has.
So if you are looking for your next card to be SM3, I'd stick with the NV50.
Anyone with a monitor that can display a high enough res can run any game at "insane resolutions". That doesn't mean it's going to be smooth, especially since "smooth" is very subjective. Personally, I don't consider 25-30 frames to be smooth, which is what I got with my 6800GT - even lower with lots of action.
Does it really matter if it's their first card for SM3? I don't think so. Look at their first try at DX9 and PS2.0: a MUCH better outing than NV's first.
I also agree with whoever said to wait until we actually see cards to decide which to "stick" with.
Originally posted by: housecat
While this would normally be a great and witty response, I must say I disagree.
ATI's first outing with DX9, the one that was so great, was not really ATI's outing. They had purchased ArtX, who had done the bulk of the development on the R300.
And not that it mattered - how long did it take for DX9 to become a powerful force after the 9700 was released?
Now, now.. don't be a fanboy, you must admit that.. just like SM3/DX9C today! It wasn't useful in many games!!!!!!!!
When the shoe is on the other foot is when you always spot the fanboys.
ATI itself really didn't do anything spectacular (R300) to speak of. We have no idea how ATI would have done with DX9 hardware had they not bought out ArtX; the same technology was used in the GameCube.
I should note, it was said that the 3dfx technology NV acquired was supposed to have been used in the NV30. So we know who got the better deal there.
NV would've been better off continuing to do their own thing. It's probably safe to say that 3dfx going under when they did was a good thing, because their next few GPUs would probably have been horrid (at least in DX9).
Regardless, ATI has a LOT on the line now with R520. I'll briefly outline:
1. They have to have a successful launch of their AMR/SLI setup, and it has to come out on time and be fast.
2. R520 has to be out ON TIME, and IN QUANTITY this time.
3. R520's DX9C/SM3 performance has to at least match the NV40 without being a disgrace. Instead of assuming it's faster, people had better settle for "equal to NV's first SM3 outing". Everyone assumes ATI has all of NV's research on SM3 - but NV not only researched it, they put it out in retail form!
4. This is ATI's first core done on their own since before the R300. That means NEW DRIVERS need to be written after riding the easy train from the 9700 to the X800 - not a good thing for ATI, as they do not excel at this (see the HDTV blunder). And the core has to be damn fast at everything it does (besides DXNext features).
A lot of risk. Needless to say, I will be selling ATI stock and buying Nvidia stock.
When the shoe is on the other foot is when you always spot the fanboys
Originally posted by: housecat
This product is too expensive and has too much advanced technology.
Originally posted by: housecat
Who in the hell is against superior/advanced technology?
Oh wait. I'll answer that.. ATI devotees in this case.
Originally posted by: PrayForDeath
The Splinter Cell 3 demo is an example of an SM3 game that runs horribly on NV4x hardware. I think what the UE3 engine programmers said about the 6600GT is pure marketing BS.
That doesn't mean SM3 is pointless - it actually has some pretty good features. It's just that, IMHO, NV4x isn't fast enough to support it.
Originally posted by: housecat
lol, look at Creig defending ATI like a true champ.
I don't give a damn about NV/ATI. I do not understand the downplaying of SM3/DX9C by everyone here; it makes no sense to me at all.
Bring me two cards with identical speed where one has DX9C hardware, and I'm going to take the DX9C hardware. It's that simple.
The rest is all ATI fanboyism.