Why does AT say the X1900s take the performance crown?


Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: munky
you'd know that either the driver/profile needs more tweaking

I thought crossfire did not need a profile? That it would just work with any game. :roll:

Did you also think that about SLI?

Nope.

But all the ATI Dongle freaks were saying that it does not need a profile, that it has a supa-dupa default rendering mode. Of course, real-world tests have shown otherwise.
 
Apr 17, 2003
37,622
0
76
Originally posted by: keysplayr2003
Originally posted by: M0RPH
Wow, your logic escapes me. So you think that because SLI is faster than Crossfire, Nvidia will have the luxury of keeping their cards expensive. You realize of course that most people can't afford to buy two video cards? And you realize that people are going to be buying a lot of ATI's 1900 cards because they're faster than anything Nvidia has available to sell? Now tell me again why Nvidia could afford to keep their cards expensive.

You're right. Just change the names of the cards and makers, and your statement works perfectly for the last half of 2005 as well. Observe:

Wow, your logic escapes me. So you think that because SLI is faster than Crossfire, ATI will have the luxury of keeping their cards expensive. You realize of course that most people can't afford to buy two video cards? And you realize that people are going to be buying a lot of Nvidia's 7800 cards because they're faster than anything ATI has available to sell? Now tell me again why ATI could afford to keep their cards expensive.

Anyway, I'm just playing. ATI has something like a six-week window to sell these X1900XT/XTX/XTXTX. Very good cards, but they didn't do as well as they were hyped to. A bit disappointing, as I wanted it to absolutely destroy anything out there by wide margins. So, ATI will have six weeks of profit growth, just as Nvidia had about 4 months uncontested in '05 with the 7800s. If I felt the X1900XTX was strong enough to hold off the G71, I would probably buy one. But seeing as it is just ahead of a 512 GTX for the most part, I can't buy it.


You are assuming that ATI isn't going to launch anything in response to G71?
 

Cuular

Senior member
Aug 2, 2001
804
18
81
The OP says:
I gave BF2 to nVidia b/c it's within 1.9fps, and the nVidia rig is running at 8x AA vs ATI's 6x AA).

Where the review he takes his numbers for the comparison says:

In our max quality tests, we will compare ATI parts with 16xAF, 6xAA, adaptive AA, high quality AF and as little catalyst AI as possible enabled to NVIDIA parts with 16xAF, 4x or 8xS AA (depending on reasonable support in the application), transparency AA, and no optimizations (high quality) enabled. In all cases, ATI will have the image quality advantage with angle independent AF and 6x MSAA.

Text bolded by me to bring the point out.

So even though nvidia had the "higher AA number", the quality is lower because of the algorithms used. So if you "give" BF2 back to the one with the higher image quality and higher FPS, then you have ATI winning 4 and nVidia winning 3. <shrug>
 

OvErHeAtInG

Senior member
Jun 25, 2002
770
0
0
Cuular, you just quoted what I was about to quote. The review specifically states that the "Maximum Quality" tests are apples-to-oranges in some regards, since they're just measuring the highest-quality settings you can pretty much run. It is a more realistic test in some respects, but you also can't really compare the framerate results as if they were the same settings. And there's still more to it.

Crossfire at maximum quality is clearly an anomaly. Look at BF2, for example: running Crossfire at maximum quality gets you lower performance than running a single card. However, that is not the case with the "lower quality" (like 4xAA/8xAF) settings, where Crossfire gives you a 50% performance increase over a single XT. This is definitely a problem with in-game code or a driver issue, either of which can be fixed.
 

dredd2929

Senior member
Jun 4, 2005
230
0
0
Originally posted by: Cuular
So even though nvidia had the "higher AA number", the quality is lower because of the algorithms used. So if you "give" BF2 back to the one with the higher image quality and higher FPS, then you have ATI winning 4 and nVidia winning 3. <shrug>

You're absolutely right, Cuular, I apologize for that oversight.

For my own education, will you (or someone out there) please explain to me why ATI's 6x MSAA is better than nVidia's 8xS AA (or at least point me to where I can read about it)? Also, looking at the Black and White 2 images, there is a side-by-side comparison between ATI's and nVidia's "Maximum AA". Is this the "apples to oranges" scenario that the reviewer is referring to? I'm assuming these Maximum AA parameters are not one and the same as those used in the Quake 4 image comparisons, because nVidia's 8X AA is clearly of higher quality than ATI's 6X AA with respect to the screenshots posted in the review.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: dredd2929
For my own education, will you (or someone out there) please explain to me why ATI's 6x MSAA is better than nVidia's 8xS AA (or at least point me where I can read about it)?

Difference in algorithm. The algorithm uses a different 'AA grid' to do the antialiasing perhaps.
 

dredd2929

Senior member
Jun 4, 2005
230
0
0
Originally posted by: xtknight
Difference in algorithm. The algorithm uses a different 'AA grid' to do the antialiasing perhaps.

So ATI doesn't do AA the same way nVidia does? Which one is more computationally intensive, and which one produces the best image quality? Does it depend on the game?

Also, xtknight: your post indicates that the question about AA algorithms was asked by Cuular, but it was me (dredd2929) that asked it. Not that big a deal to me but Cuular may not appreciate being quoted on something he didn't say. :)
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: dredd2929
Originally posted by: xtknight
Difference in algorithm. The algorithm uses a different 'AA grid' to do the antialiasing perhaps.

So ATI doesn't do AA the same way nVidia does? Which one is more computationally intensive, and which one produces the best image quality? Does it depend on the game?

Also, xtknight: your post indicates that the question about AA algorithms was asked by Cuular, but it was me (dredd2929) that asked it. Not that big a deal to me but Cuular may not appreciate being quoted on something he didn't say. :)

ATI's AA is generally better at the 'same' level, but NVIDIA's 8xS (which combines SSAA and MSAA) should be better than ATI's 6xAA (which is just MSAA with six samples per pixel). However, 8xS mode is generally VERY slow.

More detail:

ATI uses 'jittered-grid' antialiasing for its MSAA, which means the samples are offset slightly. This makes it *generally* look better on things that are tilted at a small angle. NVIDIA uses ordered-grid antialiasing, which can sometimes miss antialiasing objects that are slightly tilted, but does about the same on edges at a sharper angle. ATI's MSAA is generally equal, and sometimes does a significantly better job with the 'same' number of samples (i.e., 4xAA versus 4xAA).

They're about equal in computational resources. ATI has more flexible AA capabilities (things like 6x MSAA, Temporal AA, and the SuperAA modes using Crossfire), but they either cannot or will not enable SSAA. NVIDIA cards can use SSAA modes for higher quality (though with an extremely large performance hit, to the point where it is generally unusable except on very old games).

In terms of comparing the numbers in the review: AT said (if you read the beginning part of the review carefully, which apparently many people did NOT) that the 'Maximum Quality' numbers for NVIDIA used 4xAA on some games and 8xS AA on others. Since they don't indicate which one is in use where, you can't really compare between NVIDIA and ATI in those charts.
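
To picture the ordered-grid versus jittered/rotated-grid difference, here is a tiny Python sketch. The 4x sample positions below are made up for illustration and are not either vendor's actual pattern; the sketch just sweeps a near-horizontal edge across one pixel and counts how many distinct coverage shades each pattern can produce.

ORDERED_4X = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
ROTATED_4X = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def coverage_levels(pattern, steps=1000):
    # Sweep a horizontal edge upward through the pixel: everything below y = t
    # counts as covered. Collect every distinct coverage fraction the pattern
    # can report for such an edge.
    levels = set()
    for i in range(steps + 1):
        t = i / steps
        covered = sum(1 for (_, sy) in pattern if sy < t)
        levels.add(covered / len(pattern))
    return sorted(levels)

print("ordered grid:", coverage_levels(ORDERED_4X))  # [0.0, 0.5, 1.0] -> 3 shades
print("rotated grid:", coverage_levels(ROTATED_4X))  # [0.0, 0.25, 0.5, 0.75, 1.0] -> 5 shades

Because the ordered grid's samples share only two distinct heights, a slightly tilted edge can only be drawn in three shades, while the rotated pattern gives five, which is why such edges look smoother with the jittered layout.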
 

hooflung

Golden Member
Dec 31, 2004
1,190
1
0
Originally posted by: dredd2929
For single card performance yes, it is the fastest. But as far as pure performance it seems the 7800GTX 512MB SLI rig takes the cake in most of the "max settings" benchmarks (ATI: 3 vs nVidia: 4; I gave BF2 to nVidia b/c it's within 1.9fps, and the nVidia rig is running at 8x AA vs ATI's 6x AA). I was a little surprised by this, considering the amount of time ATI has had to work on their "GeForce 7 ouster..."

I'm not intending this as flamebait, but why do y'all think this is? The X1900 appears to be WAY faster on paper...


Well, both SLI and Crossfire are novelty technologies at this time. It would take DDR2-1000 to supply enough bandwidth for 2 GPUs over 2x PCIe x16, whether they hop over HyperTransport on AMD or hit the northbridge and then memory on Intel. The problem is that AMD doesn't use DDR2-1000 yet and will not likely support it for at least a year, and Intel might support DDR2-1000 before it kills off the current P4, but then you have the problem of Intel just sucking as a CPU altogether for games and workstation graphics.

So the X1900XT is right now the most cost-effective choice for performance, unless you want to be the idiot who bought into marketing hype and got 2 cards on a tired, saturated data transit system.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: hooflung
Originally posted by: dredd2929
For single card performance yes, it is the fastest. But as far as pure performance it seems the 7800GTX 512MB SLI rig takes the cake in most of the "max settings" benchmarks (ATI: 3 vs nVidia: 4; I gave BF2 to nVidia b/c it's within 1.9fps, and the nVidia rig is running at 8x AA vs ATI's 6x AA). I was a little surprised by this, considering the amount of time ATI has had to work on their "GeForce 7 ouster..."

I'm not intending this as flamebait, but why do y'all think this is? The X1900 appears to be WAY faster on paper...


Well, both SLI and Crossfire are novelty technologies at this time. It would take DDR2-1000 to supply enough bandwidth for 2 GPUs over 2x PCIe x16, whether they hop over HyperTransport on AMD or hit the northbridge and then memory on Intel. The problem is that AMD doesn't use DDR2-1000 yet and will not likely support it for at least a year, and Intel might support DDR2-1000 before it kills off the current P4, but then you have the problem of Intel just sucking as a CPU altogether for games and workstation graphics.

So the X1900XT is right now the most cost-effective choice for performance, unless you want to be the idiot who bought into marketing hype and got 2 cards on a tired, saturated data transit system.

Huh? Why would the video cards be talking to main memory? Any data they have to send back and forth goes either through the bridge/dongle or directly over the PCIe bus.
 

Matte979

Junior Member
Jan 24, 2006
20
0
0
Doesn't anybody care about picture quality? Who cares about FPS if it's over 40? I think the 1900XTX will have the best quality for a single card, and from what I have heard about the G71, it will still not be able to do AA and HDR at the same time, which I think is really strange.

Nvidia needs to address this.

Another factor is the Xbox: game companies will do games optimized for the Xbox 360, and the Xbox 360's GPU is also made by ATI and has 48 shaders.

So my conclusion is that, for me, ATI is the best at this time and for at least a couple of months. I still hope Nvidia will have a better card in the G71, as it will force ATI's hand to get the next card out, and so on. Got to love free enterprise. And looking at Crytek's new engine, we are going to need it...

Let's start saving... :)
 

firewolfsm

Golden Member
Oct 16, 2005
1,848
29
91
There has to be time for better drivers and for games that make better use of the shaders (think of it as being like dual core).
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
NVIDIA uses ordered-grid antialiasing, which can sometimes miss antialiasing objects that are slightly tilted, but does about the same on edges at a sharper angle
Starting with NV4x, nVidia uses rotated-grid AA like ATi.

but they either cannot or will not enable SSAA
ATi's adaptive AA does SSAA on transparent textures. This feature is available on R5xx and higher chipsets.
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Wreckage
Originally posted by: M0RPH
In this case, when they are talking about the "performance crown", they are talking about for single cards. It's as simple as that.

It's not really that simple. Because the Dual GT is the fastest single card on the planet. At least until the dual GTX from Dell comes out.

This thing is SLI slapped onto a single board. It's generally not considered when people talk about who has the fastest single video card. If it makes you feel better, we can say that the X1900XTX takes the crown for the fastest single GPU video card.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Blah blah blah

and X1900XT STILL has the performance crown. What is your point ? Want to "surprise" us with the fact that 7800GTX in SLI is often faster ? DUH ;)

And.."NO" the x1900XT is not "faster on paper"...is it SO HARD to swallow the fact that right now the ATI X1900XT is the f@cken fastest card around ?

YES ? Is it ? Is it ?

Jeeezzzuss...



 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
When a customer wants to have the fastest possible performance, they'll go with the one with the "performance crown..."

exactly. which is the X1900XT.

If i want to compare against SLI..which is frickin TWO cards..then please do me a favor and ALSO include X1900XT(X) in CROSSFIRE

Your statement 'nvidia is faster'..you're talking about Nvidia in SLI ?

The problem then is the tester...but they compared SLI to the SINGLE 1900 just to have an idea...and in ONE or so games the card is actually even faster compared to NV in SLI (FEAR)...OTHERWISE i think a single card comparison against SLI is dumb. And as said..you can ALWAYS get TWO 1900 and run them in crossfire.

And yes, again, the X1900XT(X) is *still* the fastest card, and ATI does not need an asterisk or sidenotes, nor is it cheating or anything whatsoever.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: dredd2929
So ATI doesn't do AA the same way nVidia does? Which one is more computationally intensive, and which one produces the best image quality? Does it depend on the game?

It doesn't depend on the game so much as on whether you do multisampling or supersampling. I believe 8xS is actually 4x MSAA plus 2x SSAA. I'm not sure about ATI's 6xAA. I'm not even sure how you can supersample something and end up with a multiple of something other than a power of two (6?). They must have to make some sacrifices in the blending algorithms.

For alpha textures, supersampling is the only option. Multisampling cannot help there because the aliasing is inside the texture (a 'raster' image), not at a geometric edge defined by vertices. Multisampling blends the edges of polygons so that they match the background better. Supersampling renders at a higher resolution and scales it back down by blending pixels, which generates a smoother image but also requires more memory: 2x SSAA doubles the number of pixels rendered, so it needs twice the memory, and 2x2 SSAA (double the resolution in each dimension) needs 4x (2²) as much as no AA! I'm not certain about MSAA, but I don't think it's a memory hog as much as a GPU one. I'm not completely sure how it works, but if you Google Beyond3D and antialiasing you can probably dig up some really good articles on it.

In any case this looks like a good read: http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=15821
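
As a rough back-of-the-envelope sketch of the cost difference (the 8 bytes per sample for colour plus depth is an assumption, and the figures are illustrative, not measured on any card): supersampling scales the whole render target, so both memory and pixel-shading work grow with the area, while multisampling shades once per pixel and only stores extra samples.

def ssaa_cost(width, height, factor_x, factor_y, bytes_per_pixel=8):
    # Supersampling renders a larger target, so shading work and framebuffer
    # memory both grow with the extra area.
    pixels = width * factor_x * height * factor_y
    return {"shaded_pixels": pixels,
            "buffer_mb": round(pixels * bytes_per_pixel / 2**20, 1)}

def msaa_cost(width, height, samples, bytes_per_sample=8):
    # Multisampling still shades once per pixel; only the stored samples per
    # pixel increase (and colour compression reduces even that in practice).
    pixels = width * height
    return {"shaded_pixels": pixels,
            "buffer_mb": round(pixels * samples * bytes_per_sample / 2**20, 1)}

w, h = 1600, 1200
print("no AA:   ", ssaa_cost(w, h, 1, 1))
print("2x2 SSAA:", ssaa_cost(w, h, 2, 2))  # 4x the area -> roughly 4x memory and shading work
print("4x MSAA: ", msaa_cost(w, h, 4))     # more buffer memory, but the same shading work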

Also, xtknight: your post indicates that the question about AA algorithms was asked by Cuular, but it was me (dredd2929) that asked it. Not that big a deal to me but Cuular may not appreciate being quoted on something he didn't say. :)

Oops. Fixed that.

Originally posted by: BFG10K
Starting with NV4x, nVidia uses rotated-grid AA like ATi.

I thought that was only for the Quadros (for default anyway)? I can go in nHancer and there is a specially labeled rotated grid AA mode while the others don't say rotated grid. Then again, they don't say ordered grid either so I don't know. The fact that just a couple modes were labeled rotated grid made me think it was some kind of exclusive feature.

What I wonder is how the driver determines that the texture has alpha components in it? Does it spend cycles determining that or is it flagged somehow?
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Do you think nVidia has been sitting on their arses for 7 months? How do you think the next nVidia part will perform?

i think you're REALLY just trolling.

Just another post where i have to say, "hey dude, this is a HARDWARE forum and THIS is the video section..and YES i THINK that quite a bunch of people here KNOW about the upcoming NV cards, especially G71...and YES, it WILL be a fast card". What is your point, AGAIN ?

The X1900XT does NOT get slower because NV will come out with G71 soon.

And, no, why on earth should someone think NV sat on their a$$ for 7 months..who said that ? I personally THINK that g71 and X1900XT will be pretty on PAR actually.


 

dredd2929

Senior member
Jun 4, 2005
230
0
0
Originally posted by: flexy
And.."NO" the x1900XT is not "faster on paper"...

Actually yes, it is. It's also faster in real-world benchmarks.

Originally posted by: flexy
...is it SO HARD to swallow the fact that right now the ATI X1900XT is the f@cken fastest card around ?

No. I don't care either way. I don't own stock in either company (ATI or nVidia).

Originally posted by: flexy
If i want to compare against SLI..which is frickin TWO cards..then please do me a favor and ALSO include X1900XT(X) in CROSSFIRE

I did...that's the whole reason for my original post. I was surprised that nVidia's GTX 512 MB SLI rig still won in a few benchmarks.

Originally posted by: flexy
And yes, again, the X1900XT(X) is *still* the fastest card, and ATI does not need an asterisk or sidenotes, nor is it cheating or anything whatsoever.

Only in most cases though, which is what puzzled me. With the specs of the X1900 series, I was expecting a landslide victory, which isn't the case. Also, the asterisk comment was directed toward whoever titled the AT review article, not ATI. And, if you had actually read the entire thread, you would have learned that I was corrected on this by a few of the responses, which I readily admitted.

Originally posted by: flexy
i think you're REALLY just trolling.

That's fine that you think that. You're wrong.

Originally posted by: flexy
What is your point, AGAIN ?

It's that, like many other people, I'm underwhelmed by ATI's new GPUs and I was hoping they'd be faster. But again, as others have stated, ATI's drivers for the X1900 are still in beta, so performance won't go anywhere but up.

Originally posted by: flexy
I personally THINK that g71 and X1900XT will be pretty on PAR actually.

I hope you're right.
 

IlllI

Diamond Member
Feb 12, 2002
4,929
11
81
You also have to take into consideration that nVidia had many months to optimize their drivers, whereas ATI just released a new video card and the drivers are still pretty 'raw', I guess you could say.

 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: BFG10K
NVIDIA uses ordered-grid antialiasing, which can sometimes miss antialiasing objects that are slightly tilted, but does about the same on edges at a sharper angle
Starting with NV4x, nVidia uses rotated-grid AA like ATi.

You sure about that? I was pretty sure that the GF5/6 still used ordered-grid AA, at least. If they changed it in the GF7 they sure haven't talked much about it.

but they either cannot or will not enable SSAA
ATi's adaptive AA does SSAA on transparent textures. This feature is available on R5xx and higher chipsets.

Well, okay, there is that (on R5XX), but you still can't do full-scene SSAA like the NVIDIA boards can. I don't think it's that big a deal (since it's not usable on most newer games), but it's a little odd.

It doesn't depend on the game so much as it depends if you do multisampling or supersampling. I believe 8xS is actually 4xMSAA and 2xSSAA. I'm not sure about ATI's 6xAA. I'm not even sure how you can supersample something and end up with a multiple of something other than a power of two (6?) They must have to make some sacrifices in the blending algorithms.

ATI's 6xAA is MSAA with six samples per pixel, no supersampling (unless you have Transparency AA enabled, and then only on alpha textures). NVIDIA's "8xS" mode is, indeed, 4xMSAA with 2xSSAA applied on top of it.

The numbers don't match up at all once you start comparing the SLI-AA and SuperAA modes. NVIDIA and ATI use different "multipliers" to describe the same combinations of SSAA and MSAA.

What I wonder is how the driver determines that the texture has alpha components in it? Does it spend cycles determining that or is it flagged somehow?

I believe the application specifies whether it is RGB or RGBA. There are a number of texture formats you can use, and only some of them have alpha channels.
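
A tiny Python sketch of that idea, with hypothetical format names rather than any real driver API: the driver can tell from the declared format alone whether an alpha channel is even present, without inspecting the pixel data.

FORMATS_WITH_ALPHA = {"RGBA8", "BGRA8", "DXT3", "DXT5"}   # illustrative names only
FORMATS_WITHOUT_ALPHA = {"RGB8", "BGR8", "DXT1"}          # DXT1's optional 1-bit alpha ignored here

def may_need_transparency_aa(texture_format):
    # The application declared the format when it created the texture, so the
    # driver can tell from that declaration alone whether an alpha channel is
    # present, without spending cycles scanning the pixel data.
    return texture_format in FORMATS_WITH_ALPHA

print(may_need_transparency_aa("RGBA8"))  # True
print(may_need_transparency_aa("RGB8"))   # False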
 

Snakexor

Golden Member
Feb 23, 2005
1,316
16
81
Originally posted by: dredd2929
Originally posted by: MegaWorks

No one is forcing you to buy it.

That doesn't mean I can't wish there was better competition, which would force prices down faster, instead of the 7800GTX costing ~$500 for almost 6 months.

And you still bought one, so what's your point?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
The numbers don't match up at all once you start comparing the SLI-AA and SuperAA modes. NVIDIA and ATI use different "multipliers" to describe the same combinations of SSAA and MSAA.

Indeed they don't. Looking further at the B3D reviews of the X1800 and X1900, you can see that Crossfire has these SuperAA modes: 8x, 10x, 12x, 14x. Among those, only 10x and 14x include 2x SSAA; the others just use more samples for MSAA. So, while we have seen multiple reviews show SuperAA 8x performing much higher than SLI-AA 8x, that can be partially attributed to SLI-AA doing some form of SSAA while Crossfire is not. In the 14x and 16x modes, however, both solutions are doing SSAA to an extent, but I'm not sure if SLI is doing 4x or 2x SSAA. If it were doing 4x SSAA, that again would be part of the reason why SLI at 16x falls so far behind Crossfire's 14x.