Shader model 3.0 (a civilized discussion)

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Now, I am probably known as being one of the biggest shader model 3.0 fans on the forums. Why? Why not? How can you not be? Is it because only Nvidia offers it now and you are still bonded to ATI? Well, I hope I can get my point across to everyone on why it is good. Also, what about the 6xxx series? Will they perform well in next-gen games that use SM3.0?

First off, I want this to be a civilized discussion (unlikely to happen) and not a flame war (likely to happen). This thread is based around my opinion of SM3.0, but my opinions are backed by the facts. (I understand that sounds strange and you want to know "what facts").

I will list many sites and they all have something to do with SM3.0. I recommend you read all of them before posting.

http://www.gamers-depot.com/interviews/dx9b/001.htm (Interviews many game developers and gives you their input on shader model 3.0)

http://www.gamespy.com/articles/510/510938p1.html (Talks a bit about SM3.0 on the 6800 cards.)

www.hardocp.com/article.html?art=NjA5 (Great, thorough explanation of SM3.0 and what it can do. It also puts it up against 2.0. I must also tell you that ANY SM3.0 screenshot on there is really using SM2.0. Disregard the pictures, but the text is what is important)

http://www.nordichardware.com/Articles/?page=2&skrivelse=346 (Simple Far Cry tests running SM3.0... Shows that it slightly improves performance, while also improving eye candy)

http://www.gamespot.com/news/2005/03/09/news_6120091.html (Not really worth reading, but it states that many next generation games on XBOX 2 are using the Unreal Engine 3, which uses SM3.0 heavily. Many Xbox 2 games will be on PC)

http://www.gamespot.com/news/2005/03/09/news_6120126.html (Unreal Engine 3 info. It is important to note that a programmer of the engine mentioned that the 6600GT would run the engine, and games based off of it, easily)

http://www.microsoft.com/whdc/winhec/partners/shadermodel30_NVIDIA.mspx (a list of some SM3.0 features and how they improve over features in 2.0)

This is directly from the Gamespot article about Unreal Engine 3 at GDC:

"An even more pressing question is the kind of hardware you will require to run Unreal Engine 3 games. Next-generation consoles will have no problems with Unreal Engine 3 games, but PC gamers don't need to worry, according to Rein. He said that a video card based on an Nvidia 6600GT, the kind currently available for around $250, will be able to handle games based on the engine easily. However, by the time any games based on Unreal Engine 3 ship next year, those cards will have dropped in price considerably, meaning that PC gamers won't have to drop big bucks on upgrades.

Epic also acknowledged that there are already several third-party projects under way using Unreal Engine 3, including at least one online role-playing game."

Now, this is my thought, but Rein only mentioned a card from the 6xxx series... and they support Shader Model 3.0. In fact, Unreal Engine 3 uses SM3.0 heavily. Not only that, he mentioned that a 6600GT would run the engine easily. Plus, there are several games under way using the engine (most likely to be released early next year).
That alone can partially answer the question, "Will my 6600GT run next-gen games?"
The answer (or at least for UE3) would be YES.
Also, many think SM3.0 games won't be out for a while. They will in fact be released late this year or early next year.

Also, many people say this: "Far Cry has minimal improvement with SM3.0, therefore 3.0 isn't that important."
No, Far Cry was built with 1.1 and 2.0. A game built with a heavy reliance on (or great support for) shader model 3.0 will truly show us what it is capable of.
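For context on why a game built for 3.0 should behave differently: the headline SM3.0 feature is true dynamic branching in the pixel shader. Here's a rough HLSL sketch of the idea (illustrative only; the constant names are made up, and this is not code from any shipping game):

```hlsl
float  LightRange;   // hypothetical per-light constants
float3 LightColor;

// Compiled with ps_3_0, the 'if' is a real run-time branch, so
// pixels outside the light's range skip the lighting math.
// Under ps_2_0 the same effect needs extra passes, or the shader
// computes both sides and masks the result.
float4 PSPointLight(float3 normal   : TEXCOORD0,
                    float3 lightVec : TEXCOORD1) : COLOR
{
    float atten = saturate(1.0 - length(lightVec) / LightRange);

    if (atten <= 0.0)               // early out on unlit pixels
        return float4(0, 0, 0, 1);

    float diffuse = saturate(dot(normalize(normal),
                                 normalize(lightVec)));
    return float4(LightColor * diffuse * atten, 1.0);
}
```

Whether the branch actually saves time depends on the hardware, but that early-out is the kind of thing a 2.0 code path simply can't express.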

There is an official list of games that support (or will support) SM3.0:
Unreal Engine 3,
The Lord of the Rings: The Battle for Middle-earth,
S.T.A.L.K.E.R.: Shadow of Chernobyl,
Vampire: Bloodlines,
Splinter Cell 3,
Tiger Woods 2005,
Madden 2005,
Driver 3,
Grafan,
Medal of Honor: Pacific Assault,
Painkiller (through patch),
Far Cry (through patch),
Half-Life 2 (sometime)

I will also mention a couple of features that will make 3.0 look far superior to 2.0.

Displacement Mapping - This is only available with SM3.0. Unlike normal mapping, where the object only looks changed but isn't physically changed, displacement mapping actually changes the object: the height map creates real height on the surface. Not only that, but the object's shadow will show all of the displacement mapping changes. SM2.0 can't do any of that (or at least, not easily).
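One concrete reason displacement mapping is tied to SM3.0: vertex shader model 3.0 adds vertex texture fetch, so a height map can physically move vertices instead of just faking the lighting. A minimal HLSL sketch, with all names invented for illustration:

```hlsl
texture  HeightMap;                 // hypothetical height texture
sampler  HeightSampler = sampler_state { Texture = <HeightMap>; };
float4x4 WorldViewProj;
float    DisplaceScale;

// tex2Dlod in a vertex shader is legal only in vs_3_0. Because the
// mesh itself is deformed, the silhouette and the shadow change --
// unlike normal mapping, which only alters the shading.
float4 VSDisplace(float4 pos    : POSITION,
                  float3 normal : NORMAL,
                  float2 uv     : TEXCOORD0) : POSITION
{
    float height = tex2Dlod(HeightSampler, float4(uv, 0, 0)).r;
    pos.xyz += normal * height * DisplaceScale;
    return mul(pos, WorldViewProj);
}
```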

Full-Speed 32-Bit Color Precision - To be honest, I don't know exactly what this does. But many game developers find it to be important.
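As far as I understand it (treat this as my reading, not a spec quote): every value a pixel shader computes is stored at some floating-point precision. The GeForce FX ran 32-bit math at a heavy speed penalty, so developers often dropped to 16-bit "half", which can show banding; "full-speed FP32" means the 6xxx series runs the precise type at full rate. In HLSL the choice looks like this:

```hlsl
// Precision hints in HLSL: 'half' is 16-bit, 'float' is 32-bit.
// Long chains of math (pow, repeated blends) are where 16-bit
// precision runs out first and banding appears.
float3 SpecularTerm(float3 N, float3 H, float power)
{
    // On full-speed FP32 hardware there is no longer a speed
    // reason to gamble on 'half' here.
    float nh = saturate(dot(N, H));
    return pow(nh, power).xxx;
}
```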

Many other websites and magazines (when they did a 6800 vs X800 article) mentioned that although the X800 performs better in today's games, the 6800 WILL pull ahead in future games, as early as late 2005. They mentioned that SM3.0 will be a must.

For those of us with the 6xxx series, we are set for future games. It is a fact; it came straight from the programmers of Unreal Engine 3. Sorry to say, you can't argue with that, because I would trust a programmer of that engine over someone who knows a lot less about it.

It seems ATI thinks SM3.0 is important enough to go into the R520 core. So it must be important.

For those of you who think SM3.0 is pointless and should not be a buying factor for the 6xxx series, you are wrong.

BUT, I must add this before you flame me. If you upgrade often, don't worry about 2.0 or 3.0. If you don't (like me) SM3.0 should be enough to make you pick up a 6xxx card now.

I really don't know what point I was trying to get across. It is a mass response to those who disagree with me on my Shader Model 3.0 views. I wanted to get some facts down about it too.

Please discuss in a civilized way with me and others!
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
a video card based on an Nvidia 6600GT, the kind currently available for around $250, will be able to handle games based on the engine easily
I find that very hard to believe.

Nevertheless, I've always been a strong supporter of shaders, but fanbois will always be fanbois. First it was the nVidia ones making "shiny pipes" comments, and now it's the ATi guys claiming nobody needs SM3.0.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
I find it hard to believe too, but if they say it will, I'm gonna trust them. Plus it makes sense... Nvidia paid a LOT of companies (even the UE3 people) to support SM3.0 for the 6xxx series... so I wouldn't doubt it if it runs UE3 at least decently.

Ok, so good so far...no flames yet...
 

sellmen

Senior member
May 4, 2003
459
0
0
Doom III was built around the GeForce 3, AFAIK. A GeForce 3 can run Doom III, albeit at low resolutions and detail levels.

I suspect the same will happen with the 6600GT and Unreal Engine 3. There are games out already that stress a 6600GT; Unreal Engine 3 looks worlds better than any game out today.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Well, if they program and optimize for it, there is a very good chance it will run great on a 6600GT. For example, Gran Turismo 4 not only looks amazing, but supports 1080i WITH 60fps... and this is on a PlayStation 2...

I'm still convinced, as the programmers for UE3 said it would run fine with a 6600GT. I don't think it would if it only had SM2.0, though... I think SM3.0 is the key...
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
All is fine and dandy and SM3.0 will be good. However, given that currently Nvidia cards offer zero performance advantage in SM3.0, I just don't see how they will play SM3.0 games better. You see, the Radeon 9700/9800 series were always faster than the 5900 series in PS2.0 games, and that is how the argument started that the 9800 Pro is > 5900U. This proved to be true (see HL2, where 5900 cards default to 8.1). However, today, 6xxx series cards offer no advantage that can be measured even by benchmarks (not even talking about gaming experience). Also, in some games that use some of the features of SM2.0++ (SM3.0), Nvidia cards become too slow, because SM3.0 in its true form is too intensive for today's cards (i.e. HDR in Far Cry) => Chronicles of Riddick with SM2.0++

"Chronicles of Riddick takes advantage of normal maps, while the game's '2.0++' mode enables soft stencil shadows for GeForce 6 users (although this comes at a remarkable performance hit, which you'll see in our benchmarks)." - FiringSquad

This logic can be applied to current PS2.0 games: even though the 5900 supports DX9.0, it runs faster in DX8.1 mode. So far nothing indicates that just because the 6xxx series supports SM3.0, they will play games faster. If anything, if those cards are forced to run SM3.0, they might run them slower due to the increased complexity SM3.0 brings.

 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Those games using 2.0++ aren't really 3.0. The reason it hurts performance is that they are trying to achieve what 3.0 looks like while only using 2.0. 3.0 is meant to look a lot better than 2.0 while not being as taxing. That is why it performs badly.

I mentioned that Far Cry doesn't improve much with 3.0, but when REAL 3.0 games are available, then you will see the advantage of having 3.0.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: hans030390
Now, I am probably known as being one of the biggest shader model 3.0 fans on the forums. Why? Why not? How can you not be? Is it because only Nvidia offers it now and you are still bonded to ATI? Well, I hope I can get my point across to everyone on why it is good. Also, what about the 6xxx series? Will they perform well in next-gen games that use SM3.0?

First off, I want this to be a civilized discussion (unlikely to happen) and not a flame war (likely to happen). This thread is based around my opinion of SM3.0, but my opinions are backed by the facts. (I understand that sounds strange and you want to know "what facts").

I will list many sites and they all have something to do with SM3.0. I recommend you read all of them before posting.

http://www.gamers-depot.com/interviews/dx9b/001.htm (Interviews many game developers and gives you their input on shader model 3.0)


Bart Kijanka - CTO, Gas Powered Games

GD: A lot of attention has been paid to NVIDIA's support of Pixel Shader 3.0 - can you specifically think of anything PS 3.0 can be used for that can't be done in PS 2.0?

Bart: Currently it appears the impact of 3.0 on developers will be minimal. Therefore it's unlikely the consumer will see significant benefits from 3.0 for quite some time, especially since the improvement of 3.0 over 2.0 isn't as great as 2.0 over 1.0.


GD: Is there ever a need to use 32-bit precision over 24-bit? I'm looking for an example where 32-bit precision shows an obvious superiority over 24-bit - both are considered "Full precision" by the current DX9 spec.

Bart: We don't have an identified need for 32 bit precision at this time.




Rowan Wyborn, lead engineer for Tribes: Vengeance at Irrational Games Canberra

GD: A lot of attention has been paid to NVIDIA's support of Pixel Shader 3.0 - can you specifically think of anything PS 3.0 can be used for that can't be done in PS 2.0?

Rowan: I can not think of any effects that can be done in PS 3.0 that can't be done in PS 2.0. Under PS 2.0 they might take some extra passes, or maybe a few more instructions, but the final result should exactly match the equiv PS 3.0 Shader.


GD: And can you tell me if there's ever a need to use 32-bit precision over 24-bit?

Rowan: If there is, I haven't found it yet :) I'm still finding even 16bit precision is suitable for most real-time shaders at the moment. I imagine that once people start doing all sorts of crazy procedural material generation within the shader that 32bit might be necessary, however I think we are still a ways off from there in terms of required hardware speed.



So not everybody interviewed thinks SM3.0 is necessary.




http://www.gamespy.com/articles/510/510938p1.html (Talks a bit about SM3.0 on the 6800 cards.)



some major developers feel as though PS30 will change the face of gaming while others feel there is little difference between the two. Additionally, NVIDIA recently released screenshots purported to show the difference between PS30 and PS1x, but in a later interview with NVIDIA's VP of Technical Marketing Tony Tamasi, he said, "Yeah, the images that you've seen from Far Cry, the current path, those are actually Shader Model 2.0, and anything that runs Shader Model 2.0 should be able to produce those images." In fact, there is no game currently available that uses SM30, and games that support the features of SM30 aren't expected to show up this year.

So is the inclusion of SM30 important? The correct answer is that it will be; just don't expect games that really use it extensively to come out for around a year. Of course, both cards have full SM2x implementations, and that's what many of the current new games and many of those coming out later this year will be pushing.

Even if you're a fan boy/girl of a particular brand, there's no denying that, for the most part, ATI's Radeon X800 XT Platinum Edition is the speed leader. While it's very close throughout most of the race, the X800 XT Platinum Edition just comes out on top more often. It's important to remember that NVIDIA's cards offer full SM30 support and FP32. It's just that games with these features won't be out for quite some time.

For many (especially those with small form factor cases that are all the rage), such power and space considerations make their next video card purchase a no-brainer. (Remember that as soon as I've got a working GeForce 6800 GT, I will post its numbers.) Sure, the benchmarks were mostly close among the big boys and all these cards are vast improvements over the current generation. However, only two of these four video cards are realistic options for most gamers. ATI has come through again.



And again.



www.hardocp.com/article.html?art=NjA5 (Great, thorough explanation of SM3.0 and what it can do. It also puts it up against 2.0. I must also tell you that ANY SM3.0 screenshot on there is really using SM2.0. Disregard the pictures, but the text is what is important)



So what is the real bottom line here about SM3.0 and the rather misplaced NVIDIA comparisons that have been made public so far? If you are using an older graphics card not capable of supporting SM2.0, like any pre-GeForceFX NVIDIA or pre-Radeon 9500 graphics card, you are going to be in for an upgrade soon if you want to see all the eye candy in the newest games. If your current video card will support SM2.0, like NVIDIA's GeForceFX or ATI's Radeon 9500 series and "better," currently nothing in terms of Shader Model quality is going to be revealed with an upgrade. Of course all this forgoes talking about better AntiAliasing, Anisotropic Filtering, and higher resolutions that play a huge role in gameplay.

UPDATE:

Tech Report posted an interview with Tony Tamasi of NVIDIA, and they comment on the Far Cry SM3.0 screenshots and make some statements that might lead you to believe that no real SM3.0 operations are being done in the "PS3.0 Screenshots" from NVIDIA. We are a bit familiar with Virtual Displacement Mapping and Parallax Mapping, but we were not led to believe by our NVIDIA contacts that is what is going on in the screenshots from NVIDIA. And quite frankly it is not really that important. This seems to simply further our thoughts that SM3.0 is not bringing much in terms of Image Quality to the table currently that cannot be done in SM2.0, and that is what will be important to the gamers buying the video cards.

Going back and reading exactly what we asked NVIDIA, "What exact PS 3.0 features are (Crytek) using?" They responded, "As stated earlier, displacement mapping is used for the walls and stone textures like the Buddha." Funny enough, as noted earlier, Displacement Mapping is a Vertex Shader feature not a Pixel Shader feature. So all in all, I would have to believe what Tamasi is quoted as saying in the TR interview as true.

It seems that once again getting the exact truth out of NVIDIA can be a painstakingly complex exercise.


UPDATE #2:

PC Perspective has posted an interview with the CEO of the makers of FarCry. We found that answers to questions 7 and 8 fully support our conclusions on the subject we have put forth here in our article.


Q: 7) What aspects of the screenshots seen at the launch event are specific examples of the flexibility and power of Shader 3.0?

A: In current engine there are no visible difference between PS2.0 and PS3.0. PS3.0 is used automatically for per-pixel lighting depending on some conditions to improve speed of rendering.

Q: 8) Is the same level of image quality seen when using Shader 3.0 possible using Shader 2.0? If so, what dictates which Shader you decide to use?

A: In current generation engine quality of PS3.0 is almost the same as PS2.0. PS3.0 is used for performance optimization purposes.




And again.




http://www.nordichardware.com/Articles/?page=2&skrivelse=346 (Simple Far Cry tests running Sm3.0...Shows that it slightly improves performance, while also improving eye candy)



Considering how small the margins are between ATi's and nVidia's current cards, Shader Model 3.0 gives nVidia a small nudge. With the full version of DirectX 9.0c and fully developed GeForce 6 drivers for DirectX 9.0c it's an obvious advantage. An advantage far from enormous, but on the other hand not as insignificant as many earlier thought.


"an advantage far from enormous"



http://www.gamespot.com/news/2005/03/09/news_6120091.html (Not really worth reading, but it states that many next generation games on XBOX 2 are using the Unreal Engine 3, which uses SM3.0 heavily. Many Xbox 2 games will be on PC)


Okay... Nothing really to do with SM3.0 performance.



http://www.gamespot.com/news/2005/03/09/news_6120126.html (Unreal Engine 3 info. It is important to note that a programmer of the engine mentioned that the 6600GT would run the engine, and games based off of it, easily)


Again, nothing to do with SM3.0 performance.


http://www.microsoft.com/whdc/winhec/partners/shadermodel30_NVIDIA.mspx (a list of some SM3.0 features and how they improve over features in 2.0)

List of SM3.0 vs SM2.0b specs, but doesn't say which specs are limiting performance.





So basically every comparison you listed was mixed on the subject. Hardly makes SM3.0 a "must have" feature.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Don't get me wrong, it's obvious that SM3.0 is the direction the industry is heading towards. It's just that there's such a performance hit by enabling some SM3.0 features, coupled with the fact that there's not much that can't be done as well with SM2.0b that I think it will be some time before SM3.0 is as important as you're portraying it to be today.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
That's why at the end I said:

"BUT, I must add this before you flame me. If you upgrade often, don't worry about 2.0 or 3.0. If you don't (like me) SM3.0 should be enough to make you pick up a 6xxx card now."

So, it CAN be a must-have. But I'm tired of people who say it is completely pointless, and the last link mentioned that the 6600GT would run UE3 games fine, because of SM3.0.


 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
I am not portraying it to be important today, but I am telling people exactly what it is, and why it will be important. Hopefully, this will change some negative thoughts about 3.0 into positive ones, and hopefully help someone purchase a video card. You see, when I was searching for a video card, everyone would suggest a 9800 or such. But I knew that I would need SM3.0 someday, so I got a 6600GT, as I upgrade once every 5 years or so. It was important for me to get SM3.0.

So, I'm not always just trying to say "it's the best, you need it"... I hope you understand... I really don't lol
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Stating that a 6600GT will run UE3 "fine" because it has SM3.0 doesn't mean that an X800XL having twice as many pixel pipelines and vertex processors can't run it just as fast (or faster) with SM2.0b
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
UPDATE:

More importantly, by looking at the past, we can see that the greatest factor that influences graphics performance is the intensity of shaders in the game (hence the need for PS2.0, 3.0, etc.). If the Radeon 9800 Pro performed faster than the 5900XT in shader-intensive games, we can hypothesize that, at the end of the day, the graphics cards that perform faster overall in shader-intensive games today (regardless of SM2.0 or 3.0) will be more long-lived.

Most shader-intensive games today:

Perimeter
Deus Ex: Invisible War - "Deus Ex: Invisible War prefers cards capable of fast execution of pixel shaders. The RADEON X850 and X800 cards are the best in this respect as the diagrams show."
Thief: Deadly Shadows - This third-person 3D shooter is a pixel-shader-heavy application, so ATI's cards run it faster overall.
Hitman: Contracts - The game has a liking towards graphics cards capable of effectively processing pixel shaders, so the victory of the RADEON X800 family looks deserved.
Far Cry - There are numerous sophisticated pixel shaders in this game, and it has complex geometry and high-resolution textures. The "eye candy" makes any differences between graphics cards stand out more conspicuously, and we can see now that the members of the ATI RADEON X800 family go ahead and increase their advantage over the GeForce 6800 series as the resolution grows.
3DMark03 - Nature Test #4 - There are no surprises here: the high speed of processing complex pixel shaders allows the ATI team to win at all resolutions, in the hardest and most intensive test of 3DMark03 (sure, this is synthetic, but still).

3DMark05 - Test 1 - The first game test of the new 3DMark is a typical scene from modern first-person 3D shooters: closed environments, numerous light sources - ATI cards win.
3DMark05 - Test 2 - The scene features rich dynamically-generated vegetation and numerous spectacular light and shadow effects - ATI cards win.
3DMark05 - Test 3 - This test is the most difficult as it uses sophisticated pixel shaders to render the water and the walls of the canyon.

If anything this can be concluded:

Because "RADEON X800 cards do the maths associated with pixel shaders better than NVIDIA's cards" - XbitLabs.com - they should be more futureproof and play shader-intensive future games faster than 6xxx series cards. Now add the fact that ATI cards are faster in HL2, on which many games will be based, and aren't much slower in Doom 3 thanks to driver updates.

However, I must note that this is somewhat of a trivial point since both cards are likely to be slow at Quake 4 and Unreal 3, etc.

Also, in your case SM3.0 is irrelevant. Anyone who keeps a card for 5 years will know that after 2.5 years you'll have to start playing at 1024x768 or lower (I know, with my 8500), and now I have to dial down to 640x480, or 800x600 tops, for the newest games. Doom 3 or Far Cry: choppy. You think the fact that my card supports PS1.4 (DX8.1) and the GeForce 4 only supported PS1.1 (DX8.0) makes any difference today (or even 2 years after I bought it)? A 6600GT will be a slide show 3-5 years from now.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
It won't have all of the eye candy (like displacement mapping, a feature only in SM3.0), but it should run alright with 2.0 settings. I'm thinking the 6xxx series as a whole will pull away from ATI when SM3.0 games come out... oh, but wait, the R520 cards will be out by then, I think... hmm... well, the 6xxx series will pull away from the X800 series (or X300/600/700/850).

 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
None of those games you listed are really Shader Model 3.0 games... that didn't help us... of course the X800 series is faster now, but I think SM3.0 games will really boost the 6800 past the X800 cards.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: hans030390
None of those games you listed are really Shader Model 3.0 games... that didn't help us... of course the X800 series is faster now, but I think SM3.0 games will really boost the 6800 past the X800 cards.

You completely missed my point. Games of tomorrow will be more shader-intensive than ever. Given that X800 series cards win by a significant margin in every shader-intensive test/game today, not even SM3.0 will help 6800 series cards perform better. If X800 cards are faster in SM2.0 games, and have the ability to perform the same number of operations with the 2.0b extension, why would SM3.0 allow Nvidia cards to run faster, especially in the more shader-intensive applications it brings?
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: Creig
Stating that a 6600GT will run UE3 "fine" because it has SM3.0 doesn't mean that an X800XL having twice as many pixel pipelines and vertex processors can't run it just as fast (or faster) with SM2.0b

--------------------------------------------------------------------------------------------

the whole discussion summarized thus far:

- shader model 3.0 will improve performance over equally-configured shader model 2.0 cards, provided there are no visual quality improvements enabled

- it's possible a card that uses SM2.0 will outperform an SM3.0 card if it's faster

- SM3.0 allows for more shader applications, and thus better visual quality

- IF the 6800GT cost $300 and the X800XL cost $300, you'd be better off getting the SM3.0 card, because it's the same price and it will offer more functionality.

- most SM3.0 games will be able to fall back to SM2.0b or lower, compromising the extra image quality if it's used. We're not sure about Unreal Engine 3.

- if you were choosing between the equal-performance 6800gt ($400) and x800xl ($300) this year, you should get:

NOTE: the boldfaced words MUST be relatively correct and the price difference approximately the same for the following to be true. Remember, if these cards cost the same, you'd obviously want the SM3.0 one. Since that's not the case, refer to the following.

years/upgrade
1 - save the money and get the X800 XL
2 - if you feel like spending $100 for SM3.0 (6800 GT), then do it. it will come up in games eventually, and may be very common by "now" (2007).
2+ - definitely get the 6800 GT for future games

Can we all agree on this? If any of this is misleading, please let me know. The whole reason I posted this was to help inform newcomers with an unbiased look at the advantages of SM3.0. I'm not an expert on this subject, nor am I trying to make blind assumptions. This is only off of what I've read (hopefully reliable sources).
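One mechanical detail behind the fallback point in the summary: with the D3DX effect framework, a game can ship both paths in one file and pick whichever one the card validates (e.g. via ID3DXEffect::ValidateTechnique). A hedged sketch; the shader and technique names are mine, not from any real engine:

```hlsl
// Two techniques over the same material: the engine tries the
// ps_3_0 technique first and drops to the ps_2_b compile target
// on cards without SM3.0 support.
float4 PSFull(float2 uv : TEXCOORD0) : COLOR     { return float4(uv, 0, 1); }
float4 PSFallback(float2 uv : TEXCOORD0) : COLOR { return float4(uv, 0, 1); }

technique Lighting_SM3
{
    pass P0 { PixelShader = compile ps_3_0 PSFull(); }
}

technique Lighting_SM2b
{
    pass P0 { PixelShader = compile ps_2_b PSFallback(); }
}
```

An engine that doesn't ship the second technique (as pyrosity claims for UE3 below) simply has no path for SM2.0-only cards.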
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
A thing to keep in mind is that the most important video card specs are still:

GPU clock speed
Memory clock speed
Memory bus width
Number of pixel pipelines
Number of vertex processors

While SM2.0 and SM3.0 can speed up certain operations or enable specific visuals, it's still the basic hardware specifications that make up the majority of the video card performance.
 

pyrosity

Member
Dec 20, 2004
42
0
0
Originally posted by: Creig
Stating that a 6600GT will run UE3 "fine" because it has SM3.0 doesn't mean that an X800XL having twice as many pixel pipelines and vertex processors can't run it just as fast (or faster) with SM2.0b

Whoa there, buddy. UE3 doesn't support "fallback" shaders, meaning that it runs using Shader Model 3.0 only. Don't jump to the basic logic that more pixel and vertex units will automatically result in better performance. Also, UE3 hardly uses vertex processing, as almost everything is done at the per-pixel level.

Yes, the X800 XL's extra pixel pipes will help it. I'd even go as far as to say that the differences between a card with 8 pipes and a card with 12 pipes will greatly expand, as pixel calculations are ridiculously heavy on the UE3 engine.

Please, continue reading the following as well, to those of you that are about to reply in flames or even reply without reading the rest of this post, as I find it to be important. For starters, I'm an ATi fan at heart. However, I'm buying a 6600 GT soon because of its nice price/performance ratio and its SM 3.0 support. I do not expect the simple support to work wonders in next-generation games, however. Remember how well the 5xxx cards did/do with DX 9.0 games? History could very well repeat itself. Also, people need to realize that no Shader Model 3.0 game exists yet.

"Well, Far Cry has the 3.0 patch..."

No. It is not coded from the ground up in 3.0, therefore the example is a waste of time. Let us debate 3.0 performance difference margins when we have true data, but for now, let us recognize that 3.0 should at least help in the efficiency of the execution of shader paths and instructions.

As hans said (and he may have not said this if it were not for me :p ), if you plan on upgrading between now and "early 2006" (release of Epic's UE3 PC game), then don't worry about it. ATi and Nvidia will surely both support at least 3.0 in their next-gen parts. At the moment, the X800 XL is a freaking amazing buy if it can be found at MSRP ($300 for a card that competes with the competitor's $400 card - and $500 card in HL2?!? WOW. How could yearly-upgraders NOT be jumping all over this?).
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: pyrosity
Originally posted by: Creig
Stating that a 6600GT will run UE3 "fine" because it has SM3.0 doesn't mean that an X800XL having twice as many pixel pipelines and vertex processors can't run it just as fast (or faster) with SM2.0b

Whoa there, buddy. UE3 doesn't support "fallback" shaders, meaning that it runs using Shader Model 3.0 only. Don't jump to the basic logic that more pixel and vertex units will automatically result in better performance. Also, UE3 hardly uses vertex processing, as almost everything is done at the per-pixel level.

Yes, the X800 XL's extra pixel pipes will help it. I'd even go as far as to say that the differences between a card with 8 pipes and a card with 12 pipes will greatly expand, as pixel calculations are ridiculously heavy on the UE3 engine.

Please, continue reading the following as well, to those of you that are about to reply in flames or even reply without reading the rest of this post, as I find it to be important. For starters, I'm an ATi fan at heart. However, I'm buying a 6600 GT soon because of its nice price/performance ratio and its SM 3.0 support. I do not expect the simple support to work wonders in next-generation games, however. Remember how well the 5xxx cards did/do with DX 9.0 games? History could very well repeat itself. Also, people need to realize that no Shader Model 3.0 game exists yet.

"Well, Far Cry has the 3.0 patch..."

No. It is not coded from the ground up in 3.0, therefore the example is a waste of time. Let us debate 3.0 performance difference margins when we have true data, but for now, let us recognize that 3.0 should at least help in the efficiency of the execution of shader paths and instructions.

As hans said (and he may have not said this if it were not for me :p ), if you plan on upgrading between now and "early 2006" (release of Epic's UE3 PC game), then don't worry about it. ATi and Nvidia will surely both support at least 3.0 in their next-gen parts. At the moment, the X800 XL is a freaking amazing buy if it can be found at MSRP ($300 for a card that competes with the competitor's $400 card - and $500 card in HL2?!? WOW. How could yearly-upgraders NOT be jumping all over this?).

Well, Far Cry has the [SM]3.0 patch...!@#$% :p

But seriously, what did Crytek change with the 1.3 patch? Did they only change part of the code to 3.0, or what? :confused:
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
If somebody were to plop down a 6800GT or an X800XL in front of me and ask me which one I wanted, I'd take the 6800GT without a second thought. But I personally don't believe the 6800GT is worth the extra $60 over an X800XL. Now, if Nvidia dropped the 6800GT down to within, say, $20-$30 then I'd most likely swing the other way and recommend the 6800GT.


Yes, the X800 XL's extra pixel pipes will help it. I'd even go as far as to say that the differences between a card with 8 pipes and a card with 12 pipes will greatly expand, as pixel calculations are ridiculously heavy on the UE3 engine.

Were you speaking in general terms when you were referring to pipelines? Because the X800XL has 16 pipelines, not 12.

 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: pyrosity

Whoa there buddy, UE3 doesn't support "fallback" shaders, meaning that it runs using Shader Model 3.0 only.

So you are saying that unless you have an SM3.0-capable card, you won't be able to play UE3 at all?

 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: RussianSensation
All is fine and dandy and SM3.0 will be good. However, given that currently Nvidia cards offer 0 performance advantage in SM3.0, I just don't see how they will play SM3.0 games better.

No. They have a 100% performance advantage.
Because ATI cannot use SM3 at all. Hence NV is the benchmark... with no competition.
 

Dman877

Platinum Member
Jan 15, 2004
2,707
0
0
They demo'd UT3 on a 6800U when nVidia announced the card a year ago, and it ran like ass. I highly doubt you'll be playing UT3 on a 6600GT at anything higher than low/average settings and medium res. I bet it will be like playing UT2K4 on a Ti4200 was.