
Shader model 3.0 (a civilized discussion)

Originally posted by: RussianSensation
Originally posted by: pyrosity

Whoa there buddy, UE3 doesn't support "fallback" shaders, meaning that it runs using Shader Model 3.0 only.

So you are saying that unless you have an SM3.0-capable card, you won't be able to play UE3 at all?

That would be pretty damn stupid, only allowing 5% of people to play it...
 
I was under the impression that Unreal 3 would support SM2.0/SM3.0


Does anybody have a link to some verifiable source on Unreal 3 specifications?
 
Originally posted by: Sc4freak
Originally posted by: RussianSensation
Originally posted by: pyrosity

Whoa there buddy, UE3 doesn't support "fallback" shaders, meaning that it runs using Shader Model 3.0 only.

So you are saying that unless you have an SM3.0-capable card, you won't be able to play UE3 at all?

That would be pretty damn stupid, only allowing 5% of people to play it...

I'd second that, but please recognize that the first Unreal Engine 3 game will likely launch in a year. By then ATi and likely Nvidia as well will have launched new GPUs. From what I hear, ATi's R520 will kick some serious tail. And about UE3 not supporting fallback shaders, here's your quote and link.

Note: After rereading this I realized that I could have misunderstood it in a way, but read ahead and I'll comment later...

"A major design goal of Unreal Engine 3 is that designers should never, ever have to think about "fallback" shaders, as Unreal Engine 2 and past mixed-generation DirectX6/7/8/9 engines relied on. We support everything everywhere, and use new hardware features like PS3.0 to implement optimizations: reducing the number of rendering passes to implement an effect, to reduce the number of SetRenderTarget operations needed by performing blending in-place, and so on. Artists create an effect, and it's up to the engine and runtime to figure out how to most efficiently render it faithfully on a given hardware architecture."
-Tim Sweeney of Epic on Beyond 3d
linky

Back to my note: "we support everything everywhere" seems to say that they will support "fallback shaders," but he's vague about how designers wouldn't "have to think about fallback shaders." He never outright says that UE3 games will only support SM3.0-capable cards. To be honest, I'm not sure.
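One way to read Sweeney's quote is as a runtime policy: artists author one effect, and the engine, not the artist, decides how to render it on the hardware present. Here is a toy sketch of that idea; all function and path names are invented for illustration and are not Epic's:

```python
# Hypothetical illustration of capability-based path selection.
# Nothing here is real UE3 code; it only models the idea that the
# runtime picks a rendering strategy per shader model.

def pick_render_path(effect_layers, shader_model):
    """Return a rough render plan for an effect on hardware
    supporting the given shader model (e.g. 2.0 or 3.0)."""
    if shader_model >= 3.0:
        # PS3.0 flow control: collapse the whole effect into one pass.
        return {"passes": 1, "path": "sm3_single_pass"}
    # Older hardware: same final image, but one pass per effect layer.
    return {"passes": len(effect_layers), "path": "sm2_multi_pass"}

layers = ["diffuse", "specular", "emissive"]
print(pick_render_path(layers, 3.0))  # {'passes': 1, 'path': 'sm3_single_pass'}
print(pick_render_path(layers, 2.0))  # {'passes': 3, 'path': 'sm2_multi_pass'}
```

On this reading, "fallback" paths still exist in the engine; designers simply never author them by hand.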

Originally posted by: Creig
If somebody were to plop down a 6800GT or an X800XL in front of me and ask me which one I wanted, I'd take the 6800GT without a second thought. But I personally don't believe the 6800GT is worth the extra $60 over an X800XL. Now, if Nvidia dropped the 6800GT down to within, say, $20-$30 then I'd most likely swing the other way and recommend the 6800GT.


Yes, the X800 XL's extra pixel pipes will help it. I'd even go as far as to say that the difference between a card with 8 pipes and a card with 12 pipes will grow considerably, as pixel calculations are ridiculously heavy on the UE3 engine.

Were you speaking in general terms when you were referring to pipelines? Because the X800XL has 16 pipelines, not 12.

Yeah, sorry, I was talking in general terms. I should have said something like "a card with 4 pipes and a different card with 8 pipes" or something.
 
Originally posted by: BFG10K
a video card based on an Nvidia 6600GT, the kind currently available for around $250, will be able to handle games based on the engine easily
I find that very hard to believe.

Nevertheless, I've always been a strong supporter of shaders, but fanbois will always be fanbois. First it was the nVidia ones making shiny pipes comments, and now it's the ATi guys claiming nobody needs SM 3.0.

Smack smack smack?
 
Originally posted by: Creig
Don't get me wrong, it's obvious that SM3.0 is the direction the industry is heading. It's just that there's such a performance hit from enabling some SM3.0 features, coupled with the fact that there's not much that can't be done just as well with SM2.0b, that I think it will be some time before SM3.0 is as important as you're portraying it to be today.

What is the "performance hit" of SM3? In my only testing of SM3, in Far Cry, performance either stayed about the same or rose.
 
As I said earlier, there are no games released to the public that use Shader Model 3.0 and were coded from the ground up with SM 3.0 in mind. "Performance hit" isn't something we can accurately judge at the moment, I believe. We can, however, gawk at people from Epic saying that a $200 card from 2004 will run their high-end 2006 game "easily." 😉 The same is true for 9600s on Half-Life 2/Doom 3/Far Cry: they might run it "easily" enough, but the frame rates aren't exactly going to be beautiful.

Also, to readers of this thread: any time you see people associate High Dynamic Range lighting (HDR lighting) or the "2++" shader model of Chronicles of Riddick with SM 3.0, please correct them. Both of those are rendered using Shader Model 2.0, period. Valve is releasing a [Half-Life 2] HDR level only for higher-end cards at some point, and the Source engine doesn't even support SM 3.0 yet.
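For readers unfamiliar with HDR, the core idea really is independent of shader model: lighting is computed in a range that can far exceed the display's 0 to 1, then compressed at the end. A minimal sketch follows, using the Reinhard operator as one common tone-mapping choice; this is illustrative only, not Valve's or Crytek's actual code:

```python
# Toy HDR pipeline: keep scene luminance in floating point (values
# above 1.0 are legal), then tone map into displayable 0..1 range.

def reinhard(luminance):
    """Map [0, inf) luminance into [0, 1) for display (Reinhard operator)."""
    return luminance / (1.0 + luminance)

# Made-up luminances: a dark corner, a wall, a lamp, a sunlit highlight.
scene = [0.05, 0.8, 4.0, 60.0]
print([round(reinhard(l), 3) for l in scene])  # [0.048, 0.444, 0.8, 0.984]
```

Note how the bright values are compressed far more than the dark ones, which is what preserves detail at both ends of the range.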
 
I believe no one could sit here and try to downplay SM3 without being against NV.

Who in the hell is against superior/advanced technology?

Oh wait. I'll answer that.. ATI devotees in this case.
 
One thing I've learned from shader models in this generation is that supporting new shader models doesn't mean sht. I think it should be only a small influence on your purchasing decision. Not the huge 50% of the reason I'm buying the 6x00 series - the other 50% is PureVideo.🙂
 
This is a noob question: has SM 3.0 made an image quality difference yet compared to SM 2.0? I'm under the impression it hasn't (I believe it will in the future), which is why I went with the X800 for its better-performing AA and AF.
 
Originally posted by: pyrosity
Originally posted by: Creig
Stating that a 6600GT will run UE3 "fine" because it has SM3.0 doesn't mean that an X800XL having twice as many pixel pipelines and vertex processors can't run it just as fast (or faster) with SM2.0b

Whoa there buddy, UE3 doesn't support "fallback" shaders, meaning that it runs using Shader Model 3.0 only. Don't assume that more pixel and vertex units will automatically result in better performance. Also, UE3 hardly uses vertex processing, as almost everything is done at the per-pixel level.

Yes, the X800 XL's extra pixel pipes will help it. I'd even go as far as to say that the difference between a card with 8 pipes and a card with 12 pipes will grow considerably, as pixel calculations are ridiculously heavy on the UE3 engine.

Please continue reading the rest of this post before replying in flames (or replying without reading at all), as I find it to be important. For starters, I'm an ATi fan at heart. However, I'm buying a 6600 GT soon because of its nice price/performance ratio and its SM 3.0 support. I do not expect that support alone to work wonders in next-generation games, however. Remember how well the 5xxx cards did/do with DX 9.0 games? History could very well repeat itself. Also, people need to realize that no Shader Model 3.0 game exists yet.

"Well, Far Cry has the 3.0 patch..."

No. It is not coded from the ground up in 3.0, so the example is a waste of time. Let us debate 3.0 performance margins when we have true data; for now, let us recognize that 3.0 should at least improve the efficiency of executing shader paths and instructions.

As hans said (and he may not have said this if it were not for me 😛), if you plan on upgrading between now and "early 2006" (the release of Epic's UE3 PC game), then don't worry about it. ATi and Nvidia will surely both support at least 3.0 in their next-gen parts. At the moment, the X800 XL is a freaking amazing buy if it can be found at MSRP ($300 for a card that competes with the competitor's $400 card - and $500 card in HL2?!? WOW. How could yearly-upgraders NOT be jumping all over this?).
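The efficiency argument for SM3.0 can be made concrete with a toy pass-count model. All numbers and names below are invented and real engines vary, but the shape of the claim is this: without shader flow control, multi-light rendering tends to cost one pass per light, while SM3.0-style loops can fold many lights into a single pass:

```python
# Toy model of render-pass counts with and without dynamic branching.
# Illustrative only; real pass counts depend on the engine and effect.

def passes_without_branching(num_lights):
    # No flow control: one full-screen pass, with its own state setup,
    # per light.
    return num_lights

def passes_with_branching(num_lights, loop_limit=8):
    # Loop over lights inside the shader; spill an extra pass only
    # when the hardware loop limit is exceeded (ceiling division).
    return max(1, -(-num_lights // loop_limit))

for n in (1, 4, 12):
    print(n, passes_without_branching(n), passes_with_branching(n))
# 1 1 1
# 4 4 1
# 12 12 2
```

This is the "reducing the number of rendering passes" optimization from the Sweeney quote earlier in the thread, reduced to arithmetic.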

I've always been under the assumption that the shader model itself is backwards compatible...

That'd be a massive blow, and it seems very unrealistic to me to see the entire X800 series not run games that use the next Unreal engine.
 
Originally posted by: Rollo
Originally posted by: BFG10K
a video card based on an Nvidia 6600GT, the kind currently available for around $250, will be able to handle games based on the engine easily
I find that very hard to believe.

Nevertheless, I've always been a strong supporter of shaders, but fanbois will always be fanbois. First it was the nVidia ones making shiny pipes comments, and now it's the ATi guys claiming nobody needs SM 3.0.

Smack smack smack?


ROFL!! For whom the bell tolls? 😀
 
On the subject of Unreal Engine 3:
Unreal Engine 1 came with Software, DirectX and OpenGL renderers.
Unreal Engine 2 came with Software, DirectX and OpenGL renderers.
Unreal Engine 3 comes with.....? (DX9 for sure....anything else?)
 
SM3.0 isn't a needed feature for me right now. Why? Because no games I play use it. If the Far Cry add-on ever comes out, I will be playing that, but in my experience it drops frames much too low for me to enjoy HDR. At my res of 1920x1200, I wouldn't get playable frames no matter what NV card I had, at least not with the settings cranked up like I like them. SLI doesn't help HDR. Kind of ironic that a feature (HDR) that was, and still is, so loudly supported by NV fans doesn't benefit from SLI, another feature loudly supported by NV fans. But that's another story. PS2.0+ gets the same speed boost as 3.0, so it isn't an issue for me in that game. The only other game on that list (didn't EA pull 3.0?) I would play again is HL2, again when they put out an add-on. I play some DM from time to time, but not much.

Would I like my X850XT/PE to have PS/SM3.0? Yeah, I would. I won't lie about it. But at the same time, I would get zero benefit from it right now. So I am not worried about it. How many times did I actually get to use 3.0 in the six months or so I had my 6800GT? From what I remember, just once: when testing HDR in Far Cry.

If the rumors are true, and the R520 is released in 3-4 months with WGF 1.0 (DX Next), would it really matter to me that the 6800U doesn't support it? Again, not at this time. Why? Same reasons as above: I wouldn't see any benefit from it when it comes out.

Now, if you play one of the supported games, and there is in fact a graphical improvement or a boost in speed over a comparable ATi card, then it's a very valid reason to opt for the NV card. I have yet to see such a game, however. The new SC demo takes a huge (about 50%) performance hit when comparing 1.1 to 3.0. Does it look better? Yeah, I think so. Is it worth the hit on frames? Not to me. But to each their own.
 
Originally posted by: Ackmed
Now, if you play one of the supported games, and there is in fact a graphical improvement or a boost in speed over a comparable ATi card, then it's a very valid reason to opt for the NV card.

This is what it all boils down to in the end. Remember, there aren't any games on the market yet that support full-on Pixel Shader 3.0. For people like me, the decision on which graphics card to buy is a very difficult one. I have limited funds, so I have to be very, very cautious and keep the future in mind as best I can. X800 XL cards at MSRP are prices I can reach, though that would be the absolute highest I'd be willing to pay. Unfortunately I'm limited to the AGP bus, and AGP X800 XL cards that reach MSRP or even go under it (like the Rosewill card at Newegg at the moment at $290...yes, it's in stock, and has free shipping) likely won't show up for a couple more months. I think I might go ahead and buy a 6600 GT.

I'll also start researching the best ways to sell used computer parts 😉 as the next generation of graphics cards rolls around. I'd go from socket 754 to 939, look into a dual-core AMD solution, and probably stick with the $200 card of the next ATi line-up.
 
Remember that the X800 XL only has SM2.0...so performance in today's games might be better, but SM3.0 DOES look better and WILL be needed for next-gen games (unless you prefer the crappy old-style graphics of 2005 lol)

SM3.0 has displacement mapping. Read my first post to know what it is. SM2.0 doesn't have that, and next-gen games rely HEAVILY on it.

Remember, there are two sides to this. If you are going to buy a card and don't upgrade often, you need to either wait for the R520 cards (which are uber expensive) or get a 6xxx card to be safe for the future. If you upgrade every year, it hardly matters what you get. X800 or 6800, they'll both do fine.

Now, you must understand that new ATi and Nvidia cards will most likely be out soon, so the 6xxx cards will be outdated. BUT, I do not want to talk about next-gen cards, as there are really just rumors about their specs and whatnot so far.

One point I want to make: the 6xxx will do better and look better than the X800 cards will in next-gen games. Will the 6xxx do great? I don't know, maybe not. That is for next-gen cards. You are generally assured more "future-proofness" with an SM3.0 card.

um yeah...totally forgot what I was gonna say, but keep up the discussion, maybe I'll remember 😉
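As a rough illustration of what displacement mapping computes (a hypothetical helper, not engine code): each vertex is pushed along its surface normal by a height read from a texture, something SM3.0-class hardware makes practical because vertex shaders gain texture fetch:

```python
# Toy displacement-mapping step on the CPU. In a real engine this runs
# per vertex in the vertex shader, with `height` sampled from a heightmap.

def displace(position, normal, height, scale=0.1):
    """Offset a vertex along its normal by a sampled height value."""
    return tuple(p + n * height * scale for p, n in zip(position, normal))

# A vertex on a flat floor (normal pointing straight up), with a
# heightmap sample of 0.5: only the vertical coordinate moves.
print(displace((1.0, 0.0, 2.0), (0.0, 1.0, 0.0), 0.5))
```

Unlike a bump or normal map, which only fakes lighting detail, this actually moves geometry, which is why it needs vertex-stage texture access.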
 
Originally posted by: hans030390
Remember that the X800 XL only has SM2.0...so performance in today's games might be better, but SM3.0 DOES look better and WILL be needed for next-gen games (unless you prefer the crappy old-style graphics of 2005 lol)

SM3.0 has displacement mapping. Read my first post to know what it is. SM2.0 doesn't have that, and next-gen games rely HEAVILY on it.

Remember, there are two sides to this. If you are going to buy a card and don't upgrade often, you need to either wait for the R520 cards (which are uber expensive) or get a 6xxx card to be safe for the future. If you upgrade every year, it hardly matters what you get. X800 or 6800, they'll both do fine.

Now, you must understand that new ATi and Nvidia cards will most likely be out soon, so the 6xxx cards will be outdated. BUT, I do not want to talk about next-gen cards, as there are really just rumors about their specs and whatnot so far.

One point I want to make: the 6xxx will do better and look better than the X800 cards will in next-gen games. Will the 6xxx do great? I don't know, maybe not. That is for next-gen cards. You are generally assured more "future-proofness" with an SM3.0 card.

um yeah...totally forgot what I was gonna say, but keep up the discussion, maybe I'll remember 😉

If you are saying that a 6800GT may be faster in some games than a X800XL in two years due to SM3.0, I'd agree with you. In that way, the 6800GT is more future proof than the X800XL.

If you are saying that a 6600GT is more future proof than an X800XL - which it seems you are - I'd say you are completely wrong. The raw power of the X800XL makes it a much more future proof card, SM3.0 or not.
 
Originally posted by: sellmen
Originally posted by: hans030390
Remember that the X800 XL only has SM2.0...so performance in today's games might be better, but SM3.0 DOES look better and WILL be needed for next-gen games (unless you prefer the crappy old-style graphics of 2005 lol)

SM3.0 has displacement mapping. Read my first post to know what it is. SM2.0 doesn't have that, and next-gen games rely HEAVILY on it.

Remember, there are two sides to this. If you are going to buy a card and don't upgrade often, you need to either wait for the R520 cards (which are uber expensive) or get a 6xxx card to be safe for the future. If you upgrade every year, it hardly matters what you get. X800 or 6800, they'll both do fine.

Now, you must understand that new ATi and Nvidia cards will most likely be out soon, so the 6xxx cards will be outdated. BUT, I do not want to talk about next-gen cards, as there are really just rumors about their specs and whatnot so far.

One point I want to make: the 6xxx will do better and look better than the X800 cards will in next-gen games. Will the 6xxx do great? I don't know, maybe not. That is for next-gen cards. You are generally assured more "future-proofness" with an SM3.0 card.

um yeah...totally forgot what I was gonna say, but keep up the discussion, maybe I'll remember 😉

If you are saying that a 6800GT may be faster in some games than a X800XL in two years due to SM3.0, I'd agree with you. In that way, the 6800GT is more future proof than the X800XL.

If you are saying that a 6600GT is more future proof than an X800XL - which it seems you are - I'd say you are completely wrong. The raw power of the X800XL makes it a much more future proof card, SM3.0 or not.

assuming the game has SM2.x fallback shaders:

if you're talking about performance future-proof, yes.
but visual quality future-proof, no.
but then again, the 6600GT might not be able to take advantage of its extra visual quality if it doesn't have the performance to run it, making the feature worthless. it's a big mess.

so a good rule of thumb for everyone that upgrades on a 2-year basis on a limited budget:

- get the mid-range card every 2 years and you'll be set.
- don't get such a high-end card today that you can't afford a new more future-proof card tomorrow

so this year, waiting for the price of a 6800NU to drop and then getting it for roughly $225 would be the best choice.

---------------------------------------------------------

Those saying "you'll need to run at 2048x1536 8xAA 16xAF to take advantage of the R520!!!ONE!!" are wrong. I highly doubt a 9800XT (last gen's flagship) can even run Doom 3 at 1280x1024 2xAA with 60+ FPS, which would be the ideal setting for me, or higher. My 9500 PRO is dwarfed by Doom 3 even at 640x480 and medium settings; I can manage maybe 35 FPS. And 60 FPS is the bare minimum. What if a game is optimized for nVidia cards, like Doom 3? Then I'll probably need an R520 to play next year's nVidia-optimized game at decent settings.
 
Originally posted by: xtknight
Originally posted by: sellmen
Originally posted by: hans030390
Remember that the X800 XL only has SM2.0...so performance in today's games might be better, but SM3.0 DOES look better and WILL be needed for next-gen games (unless you prefer the crappy old-style graphics of 2005 lol)

SM3.0 has displacement mapping. Read my first post to know what it is. SM2.0 doesn't have that, and next-gen games rely HEAVILY on it.

Remember, there are two sides to this. If you are going to buy a card and don't upgrade often, you need to either wait for the R520 cards (which are uber expensive) or get a 6xxx card to be safe for the future. If you upgrade every year, it hardly matters what you get. X800 or 6800, they'll both do fine.

Now, you must understand that new ATi and Nvidia cards will most likely be out soon, so the 6xxx cards will be outdated. BUT, I do not want to talk about next-gen cards, as there are really just rumors about their specs and whatnot so far.

One point I want to make: the 6xxx will do better and look better than the X800 cards will in next-gen games. Will the 6xxx do great? I don't know, maybe not. That is for next-gen cards. You are generally assured more "future-proofness" with an SM3.0 card.

um yeah...totally forgot what I was gonna say, but keep up the discussion, maybe I'll remember 😉

If you are saying that a 6800GT may be faster in some games than a X800XL in two years due to SM3.0, I'd agree with you. In that way, the 6800GT is more future proof than the X800XL.

If you are saying that a 6600GT is more future proof than an X800XL - which it seems you are - I'd say you are completely wrong. The raw power of the X800XL makes it a much more future proof card, SM3.0 or not.

assuming the game has SM2.x fallback shaders:

if you're talking about performance future-proof, yes.
but visual quality future-proof, no.
but then again, the 6600GT might not be able to take advantage of its extra visual quality if it doesn't have the performance to run it, making the feature worthless. it's a big mess.

I'd say it's even "visual quality future-proof," because an X800XL will let you turn on AA/AF, detail, and resolution settings that a 6600GT won't. Even if a game has no fallback shaders, odds are the game will look better on the X800XL.

You'd certainly miss out on the SM3.0 image quality enhancements though.
 
Originally posted by: sellmen
Originally posted by: xtknight
Originally posted by: sellmen
Originally posted by: hans030390
Remember that the X800 XL only has SM2.0...so performance in today's games might be better, but SM3.0 DOES look better and WILL be needed for next-gen games (unless you prefer the crappy old-style graphics of 2005 lol)

SM3.0 has displacement mapping. Read my first post to know what it is. SM2.0 doesn't have that, and next-gen games rely HEAVILY on it.

Remember, there are two sides to this. If you are going to buy a card and don't upgrade often, you need to either wait for the R520 cards (which are uber expensive) or get a 6xxx card to be safe for the future. If you upgrade every year, it hardly matters what you get. X800 or 6800, they'll both do fine.

Now, you must understand that new ATi and Nvidia cards will most likely be out soon, so the 6xxx cards will be outdated. BUT, I do not want to talk about next-gen cards, as there are really just rumors about their specs and whatnot so far.

One point I want to make: the 6xxx will do better and look better than the X800 cards will in next-gen games. Will the 6xxx do great? I don't know, maybe not. That is for next-gen cards. You are generally assured more "future-proofness" with an SM3.0 card.

um yeah...totally forgot what I was gonna say, but keep up the discussion, maybe I'll remember 😉

If you are saying that a 6800GT may be faster in some games than a X800XL in two years due to SM3.0, I'd agree with you. In that way, the 6800GT is more future proof than the X800XL.

If you are saying that a 6600GT is more future proof than an X800XL - which it seems you are - I'd say you are completely wrong. The raw power of the X800XL makes it a much more future proof card, SM3.0 or not.

assuming the game has SM2.x fallback shaders:

if you're talking about performance future-proof, yes.
but visual quality future-proof, no.
but then again, the 6600GT might not be able to take advantage of its extra visual quality if it doesn't have the performance to run it, making the feature worthless. it's a big mess.

I'd say it's even "visual quality future-proof," because an X800XL will let you turn on AA/AF, detail, and resolution settings that a 6600GT won't. Even if a game has no fallback shaders, odds are the game will look better on the X800XL.

You'd certainly miss out on the SM3.0 image quality enhancements though.

I doubt AA/AF will compensate for what SM3.0 will bring. Look at the Far Cry screenshots at HardOCP. I think I'd notice that more than I'd notice a little gray on rough edges. Even without AF I still think the SM3.0 would look better. I don't know...if you don't have AF on, floors can look like crap ahead of you. Detail certainly makes a difference though. It all depends on what the 6600GT can do and what it can't. We'll just have to wait and see.
 
I don't know why people are saying ATi must think SM3 is important (otherwise they wouldn't be using it in the R520), or that ATi was downplaying the need for SM3.

They never said you didn't need SM3. They said that you didn't need SM3 NOW, as in the last 6 months or more. They always said they were implementing SM3 in their next gen after the R400 cores.
 
Originally posted by: xtknight
Originally posted by: sellmen
Originally posted by: xtknight
Originally posted by: sellmen
Originally posted by: hans030390
Remember that the X800 XL only has SM2.0...so performance in today's games might be better, but SM3.0 DOES look better and WILL be needed for next-gen games (unless you prefer the crappy old-style graphics of 2005 lol)

SM3.0 has displacement mapping. Read my first post to know what it is. SM2.0 doesn't have that, and next-gen games rely HEAVILY on it.

Remember, there are two sides to this. If you are going to buy a card and don't upgrade often, you need to either wait for the R520 cards (which are uber expensive) or get a 6xxx card to be safe for the future. If you upgrade every year, it hardly matters what you get. X800 or 6800, they'll both do fine.

Now, you must understand that new ATi and Nvidia cards will most likely be out soon, so the 6xxx cards will be outdated. BUT, I do not want to talk about next-gen cards, as there are really just rumors about their specs and whatnot so far.

One point I want to make: the 6xxx will do better and look better than the X800 cards will in next-gen games. Will the 6xxx do great? I don't know, maybe not. That is for next-gen cards. You are generally assured more "future-proofness" with an SM3.0 card.

um yeah...totally forgot what I was gonna say, but keep up the discussion, maybe I'll remember 😉

If you are saying that a 6800GT may be faster in some games than a X800XL in two years due to SM3.0, I'd agree with you. In that way, the 6800GT is more future proof than the X800XL.

If you are saying that a 6600GT is more future proof than an X800XL - which it seems you are - I'd say you are completely wrong. The raw power of the X800XL makes it a much more future proof card, SM3.0 or not.

assuming the game has SM2.x fallback shaders:

if you're talking about performance future-proof, yes.
but visual quality future-proof, no.
but then again, the 6600GT might not be able to take advantage of its extra visual quality if it doesn't have the performance to run it, making the feature worthless. it's a big mess.

I'd say it's even "visual quality future-proof," because an X800XL will let you turn on AA/AF, detail, and resolution settings that a 6600GT won't. Even if a game has no fallback shaders, odds are the game will look better on the X800XL.

You'd certainly miss out on the SM3.0 image quality enhancements though.

I doubt AA/AF will compensate for what SM3.0 will bring. Look at the Far Cry screenshots at HardOCP. I think I'd notice that more than I'd notice a little gray on rough edges. Even without AF I still think the SM3.0 would look better. I don't know...if you don't have AF on, floors can look like crap ahead of you. Detail certainly makes a difference though. It all depends on what the 6600GT can do and what it can't. We'll just have to wait and see.

Let me just copy-paste part of Creig's post regarding Far Cry:

Q: 7) What aspects of the screenshots seen at the launch event are specific examples of the flexibility and power of Shader 3.0?

A: In current engine there are no visible difference between PS2.0 and PS3.0. PS3.0 is used automatically for per-pixel lighting depending on some conditions to improve speed of rendering.

Q: 8) Is the same level of image quality seen when using Shader 3.0 possible using Shader 2.0? If so, what dictates which Shader you decide to use?

A: In current generation engine quality of PS3.0 is almost the same as PS2.0. PS3.0 is used for performance optimization purposes.

So the Far Cry screens are not useful for comparing IQ differences.

I doubt that SM3.0 will provide an image quality enhancement that is greater than AA, AF, detail levels, and resolution combined.
 
Originally posted by: Noob
Do you guys think the 6800s will even have the power to run next-generation games that support SM 3.0?

I do.
I can run insane resolutions with HDR on in Farcry today.

Not only that, but the NV50 will likely be able to do even better.

It's the R520 you should worry about: it is ATI's first-gen SM3 part, while NV will be on their 2nd revision of SM3 hardware, having much more experience with it than ATI.

So if you are looking for your next card to be SM3, I'd stick with NV50.
 
Originally posted by: housecat
Originally posted by: Noob
Do you guys think the 6800s will even have the power to run next-generation games that support SM 3.0?

I do.
I can run insane resolutions with HDR on in Farcry today.

You're running a pair of 6800GT OCs in SLI, not a single 6800GT.



Not only that, but the NV50 will likely be able to do even better.

It's the R520 you should worry about: it is ATI's first-gen SM3 part, while NV will be on their 2nd revision of SM3 hardware, having much more experience with it than ATI.

2nd revision SM3? What exactly is THAT supposed to be? SM3.0 is SM3.0; the specs don't change simply because there's a new core design.



So if you are looking for your next card to be SM3, I'd stick with NV50.

As neither next-generation card has been released yet, it's way too early to start making predictions about which card will be better.
 