Firingsquad Publishes HDR + AA Performance Tests

Page 10

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
How come? I've read it tends to give good performance gains w/o much if any image quality degradation

 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: Frackal
How come? I've read it tends to give good performance gains w/o much if any image quality degradation

Indeed.

AT could find no visual difference with it on or off, IIRC, and I looked with my 9800 Pro, found none, so I always ran with it. A free fps gain, however minor, is a free fps gain ;)
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
PS, thanks to Schneiderguy for the AAA suggestion.

I now think that AAA + 2xAA with the settings I described above is the best combination for me in Oblivion. I may try 4xAA + AAA, but I'm already stretching the card with these super-res texture mods.

2xAA + AAA is really nice looking though!! Better than 4xAA alone, IMO
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Frackal
Neither of what you say is valid in this case.


The same cable from monitor to video card (it's DVI) is being used in both cases

And I am familiar with digital vibrance / saturation, and it's not the case either.


How exactly is it that the ATI card gives a sharper picture in my bios (where I can see each pixel rather than a white semi-blurry letter) if it's just "digital vibrance?"

Read the post man.

I don't know what the difference is, but I still think the ATI "part" responsible is simply higher quality than the nvidia one, which is unacceptable on nvidia's part due to the high cost of the cards.

You're the only one I have ever heard of that has a problem with the signal quality. I swear it is all in your head.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: redbox
Originally posted by: Frackal
Neither of what you say is valid in this case.


The same cable from monitor to video card (it's DVI) is being used in both cases

And I am familiar with digital vibrance / saturation, and it's not the case either.


How exactly is it that the ATI card gives a sharper picture in my bios (where I can see each pixel rather than a white semi-blurry letter) if it's just "digital vibrance?"

Read the post man.

I don't know what the difference is, but I still think the ATI "part" responsible is simply higher quality than the nvidia one, which is unacceptable on nvidia's part due to the high cost of the cards.

You're the only one I have ever heard of that has a problem with the signal quality. I swear it is all in your head.

It isn't just me. Others mentioned it before me. I think perhaps (no offense) it takes a higher quality monitor to highlight the difference. Go check Anand's review of the 2005FPW; it's a top-quality monitor (why I bought it) and has a superb panel. From seeing it directly I can understand why it might not show up on all monitors.


 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Frackal
Originally posted by: josh6079
Originally posted by: josh6079
Okay, here is a clip of me deer hunting in the woods, submerged in foliage near some temple ruins:

Click

Now, here is what happened when I found those deer :) :

Click

I flubbed up and accidentally took out my torch :eek: but I got my bow out soon enough and started shooting. The frames were good enough for me to accurately hit the deer on the run while I was jumping over a rock, then land and hit another deer on the run.

4xAA+HDR, 16xHQAF, 1680x1050, max Oblivion settings except for self shadows and grass shadows, Vsync + Triple Buffering, 2048x2048 LOD mod, enhanced tree shadows, Jarrod's re-texture mod, Natural Environment mod (many other mods, but none else that would affect frame performance, i.e. weapons and armor and such)

Did........um.......no one see.......those................clips..............

My system specs are in my sig, but just in case a certain crow thinks that I'm gaming with a $1000 CPU, I'm not. Opteron 148 Venus presently at stock.


Hey is that "Lord of the Rings" music? How'd you get that?

I replaced the mp3s that Oblivion uses by default with the Lord of the Rings mp3s that I have. I have them set to have the Nine Riders song play when I get into fights and things, as well as the Bridge of Khazad-dûm or whatever you want to call it. Basically, I went through and matched the appropriate songs with the appropriate atmospheres. The Hobbits' Shire song is set to play in cities and sometimes when exploring and things too.
 

imported_michaelpatrick33

Platinum Member
Jun 19, 2004
2,364
0
0
Originally posted by: josh6079
Originally posted by: Frackal
Originally posted by: josh6079
Originally posted by: josh6079
Okay, here is a clip of me deer hunting in the woods, submerged in foliage near some temple ruins:

Click

Now, here is what happened when I found those deer :) :

Click

I flubbed up and accidentally took out my torch :eek: but I got my bow out soon enough and started shooting. The frames were good enough for me to accurately hit the deer on the run while I was jumping over a rock, then land and hit another deer on the run.

4xAA+HDR, 16xHQAF, 1680x1050, max Oblivion settings except for self shadows and grass shadows, Vsync + Triple Buffering, 2048x2048 LOD mod, enhanced tree shadows, Jarrod's re-texture mod, Natural Environment mod (many other mods, but none else that would affect frame performance, i.e. weapons and armor and such)

Did........um.......no one see.......those................clips..............

My system specs are in my sig, but just in case a certain crow thinks that I'm gaming with a $1000 CPU, I'm not. Opteron 148 Venus presently at stock.


Hey is that "Lord of the Rings" music? How'd you get that?

I replaced the mp3s that Oblivion uses by default with the Lord of the Rings mp3s that I have. I have them set to have the Nine Riders song play when I get into fights and things, as well as the Bridge of Khazad-dûm or whatever you want to call it. Basically, I went through and matched the appropriate songs with the appropriate atmospheres. The Hobbits' Shire song is set to play in cities and sometimes when exploring and things too.

Cool stuff! I have 7900GTXs in SLI and I am happy for you with your ATI card. I only hope you are enjoying Oblivion as much as I am! :beer:
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: tuteja1986
This thread needs to die ;( it's a total flame war :!

This forum has become a flamewar. Without moderation it will continue as such.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Frackal
Originally posted by: redbox
Originally posted by: Frackal
Neither of what you say is valid in this case.


The same cable from monitor to video card (it's DVI) is being used in both cases

And I am familiar with digital vibrance / saturation, and it's not the case either.


How exactly is it that the ATI card gives a sharper picture in my bios (where I can see each pixel rather than a white semi-blurry letter) if it's just "digital vibrance?"

Read the post man.

I don't know what the difference is, but I still think the ATI "part" responsible is simply higher quality than the nvidia one, which is unacceptable on nvidia's part due to the high cost of the cards.

You're the only one I have ever heard of that has a problem with the signal quality. I swear it is all in your head.

It isn't just me. Others mentioned it before me. I think perhaps (no offense) it takes a higher quality monitor to highlight the difference. Go check Anand's review of the 2005FPW; it's a top-quality monitor (why I bought it) and has a superb panel. From seeing it directly I can understand why it might not show up on all monitors.

I am looking at a 2005FPW right now. I am going to check out those links that nts posted; maybe I have been out of the loop a little bit. Either way, is it really a make-or-break issue? Is it that bad?

EDIT: OK, I just looked at that link nts gave me, and it looks as though you will notice problems with older LCDs and longer cables, but from those results they are very close.

Nvidia 6800 results:
On the first display, the MSI NX6800 Ultra-T2D256 produces the UXGA resolution at a frequency of only 141 MHz, i.e. with a reduced blanking interval. This can lead to problems with older monitors (see MSI FX-5700 Ultra-TD128). Other than that, the eye diagrams show a very homogeneous distribution of data.
Result: DVI compliant at 162MHz

ATI results:
The ATI Radeon X800 XT PE (Built by ATI) is also fully DVI compliant at 162MHz (UXGA). The eye diagrams show a very homogeneous distribution of the data. An exemplary result.
Result: DVI compliant at 162MHz

I don't know if anything has changed, i.e. the placement of the TMDS chip on the PCB or even the brand of it, but they do say this:
As the tests further on in this article will show, a general statement can't always be made about the quality of the DVI output on ATi and NVIDIA cards. Even if a separate TMDS chip known for its quality is used on a card - that does not automatically mean that every card that features this chip will offer a good DVI signal. Even the placement on the PCB of the graphics card can have a huge impact on the end result!

One thing I found interesting from that article: it looks like you would need a dual-link DVI connection for anything above 1600x1200 on either ATI or Nvidia.
The DVI standard specifies a maximum pixel frequency of 165 MHz (Single Link). Thanks to the ten-fold multiplication of the frequency described above, this results in a peak data rate of 1.65GB/s, which is enough for a resolution of 1600x1200 at 60Hz. If higher resolutions are required, the display would need to be connected via Dual Link DVI, which uses two DVI transmitters together resulting in twice the bandwidth.
I thought we had been using 24" LCDs with just one DVI connection.
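To put rough numbers on that quote, here is a quick Python sketch of the bandwidth math (a minimal sketch, assuming roughly 25% blanking overhead; real CVT or reduced-blanking timings differ a bit, so treat the figures as illustrative):

# Single-link DVI tops out at a 165 MHz pixel clock. Each mode needs
# width x height x refresh pixels per second, plus blanking overhead.
SINGLE_LINK_LIMIT_MHZ = 165

def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.25):
    # blanking_overhead ~1.25 is an assumption, not an exact CVT timing
    return width * height * refresh_hz * blanking_overhead / 1e6

for w, h, hz in [(1600, 1200, 60), (1920, 1200, 60), (2560, 1600, 60)]:
    clk = required_pixel_clock_mhz(w, h, hz)
    verdict = ("fits single link" if clk <= SINGLE_LINK_LIMIT_MHZ
               else "needs reduced blanking or dual link")
    print(f"{w}x{h}@{hz}Hz: ~{clk:.0f} MHz -> {verdict}")

That would also explain the 24" question: 1920x1200 at 60Hz only squeezes under 165 MHz with a reduced-blanking mode, which is presumably what those single-link panels use.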

Overall I don't think that signal quality is a feature to get all wrapped up in, but if that is your cup of tea then all the power to you.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
I wish I could give you a good answer. It is a make-or-break issue for ME, because there is a subtly higher quality to the picture that I am just not willing to give up unless the reasons are really compelling. I see details in BF2 that simply didn't show up on the nvidia card, same for HL2 EP1, as well as in 2D in pictures and such.

When I was debating between the XTX and the 7950GX2, I was borrowing an XTX in place of my 7800GTX OC.

Weighing Anand's review of the GX2 (adding 15% to the XT scores to account for my overclock, and allowing for a possible drop in GX2 scores due to its lower-quality default settings vs. ATI's defaults), plus all the image quality benefits of the ATI card, HDR+AA, HQAF (really good), and, most importantly to me, the picture quality difference, I decided upon the XTX and at this point would not trade.

The GX2 is an awesome card, I wouldn't mind trying one out vs. my XTX.

Also, G80 will probably have an HQAF version (I figure) and almost definitely HDR+AA... so who knows. I will probably hold off for R600 though, although the temptation might get the best of me :D if G80 is a real monster.


Overall though, I will probably not be going back to nvidia unless they:

- Fix ridiculous hit with AA
- Fix huge swings of max/minimum frames
- Fix large performance loss when quality settings are brought to ATI level
- Fix signal/picture quality (whatever you want to call it) given that the damn cards cost 500 bucks or more


It's just a big difference in experience. When I bought the GTX in June '05, it was the undisputed top-end card. I thought it:

- Underperformed for the price, took a tremendous hit with AA enabled (ridiculous)
- Had terrible frame rate swings
- Since I had upgraded from a 6600GT I didn't know about the image quality difference

With my XTX I found:

- It performed better than I expected (subjective); I was blown away to see 120+ fps in BF2 with all image quality settings on high, crazy (this after OC)

- Very little performance hit with AA enabled, extremely impressive

- Excellent 'stamina' for not dropping to ass-low frames when busy, unlike the GTX, which had huge swings in framerate
- Unexpectedly superior signal/picture quality

- Bad fan noise

- HDR+AA + HQ-AF, both great features
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Ya, I am kinda in the same boat as you, Frackal, ignoring the signal issue. My brother's XTX is one beast of a card and it makes everything soooo pretty. He had 7800GTs in SLI before, and the XTX keeps up with them, with better IQ, fewer problems, and it can run his dual 20" widescreens.

I am just waiting for DX10 cards, and for either AMD to drop the price on an X2 or C2D to ship and motherboards for it to mature. I can't wait till X-mas!
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Frackal
Neither of what you say is valid in this case.

The same cable from monitor to video card (it's DVI) is being used in both cases

And I am familiar with digital vibrance / saturation, and it's not the case either.
Signal quality cannot be an issue in a pure digital connection. This isn't open to debate. Logically, you must be tying together unrelated problems and/or seeing things. Then again, you haven't said anything remotely useful in identifying the root cause of your "poor signal quality" problem.

How exactly is it that the ATI card gives a sharper picture in my bios (where I can see each pixel rather than a white semi-blurry letter) if it's just "digital vibrance?"
It is because you're introducing an entirely unrelated "problem." The fonts drawn when in the system BIOS are stored in the video card's BIOS, and they vary between ATI and nVidia cards. It's called a character display mode.

The system BIOS, text-mode OS, boot loader, or whatever plots a string of character codes (i.e. "AGP Multiplier:", "4x") along with a few bits to determine color, background color, and effects (e.g. blinking), leaving the actual font and effect rendering up to the video BIOS. (There are even special ASCII character codes for the lines you see.)
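If you want to see what a text-mode cell actually looks like, here is a minimal sketch (standard VGA colour text mode; the helper function is just illustrative, not any real API):

# Each cell in the legacy 80x25 text buffer is two bytes: the character
# code, then an attribute byte (bits 0-3 foreground colour, 4-6 background
# colour, 7 blink). The glyph bitmap for the character code comes from the
# video BIOS font, which is why two cards can draw the "same" BIOS screen
# with visibly different letters.
def text_cell(char: str, fg: int = 0x7, bg: int = 0x0, blink: bool = False) -> bytes:
    attribute = (fg & 0x0F) | ((bg & 0x07) << 4) | (0x80 if blink else 0)
    return bytes([ord(char), attribute])

print(text_cell("A", fg=0xF, bg=0x1).hex())  # 'A' in bright white on blue -> '411f'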

Now, if you prefer the ATI text-mode fonts, that's entirely up to you, but it has nothing to do with signal quality.

Read the post man.
I did. Know what you're talking about before you set ultimatums for a company, man.

I don't know what the difference is, but I still think the ATI "part" responsible is simply higher quality than the nvidia one, which is unacceptable on nvidia's part due to the high cost of the cards.
I think it's unacceptable to hold a company responsible for an unspecified problem.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: nts
Originally posted by: redbox
You're the only one I have ever heard of that has a problem with the signal quality. I swear it is all in your head.

ATi signal quality > NVIDIA signal quality

Here is an old article, http://www.tomshardware.com/2004/11/29/the_tft_connection/index.html

AFAIK nothing has changed and it is more noticeable now as the resolutions scale up.

Try a shorter DVI cable for NVIDIA cards...
You might try reading the article. FYI,

1. Your article explicitly refutes your conclusion, as redbox pointed out.

2. Your article states that the only possible signal quality problems with DVI are sync and jitter, neither of which apply to Frackal's vague ramblings.

3. Your article only refers to nVidia cards from two generations ago, when DVI was still young, and only alludes to apparently nonexistent future articles exploring modern cards.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Overall I don't think that signal quality is a feature to get all wrapped up in, but if that is your cup of tea then all the power to you.


Actually, I hope this is the case. I gather from that article that it's a "luck of the draw" thing? I'm not quite sure whether that's really what is going on in my case, somehow I think not, but perhaps so. I may want to get the G80 when it comes out, so if I don't have this stipulation, things would be easier.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: nullpointerus
Originally posted by: Frackal
Neither of what you say is valid in this case.

The same cable from monitor to video card (it's DVI) is being used in both cases

And I am familiar with digital vibrance / saturation, and it's not the case either.
Signal quality cannot be an issue in a pure digital connection. This isn't open to debate. Logically, you must be tying together unrelated problems and/or seeing things. Then again, you haven't said anything remotely useful in identifying the root cause of your "poor signal quality" problem.

How exactly is it that the ATI card gives a sharper picture in my bios (where I can see each pixel rather than a white semi-blurry letter) if it's just "digital vibrance?"
It is because you're introducing an entirely unrelated "problem." The fonts drawn when in the system BIOS are stored in the video card's BIOS, and they vary between ATI and nVidia cards. It's called a character display mode.

The system BIOS, text-mode OS, boot loader, or whatever plots a string of character codes (i.e. "AGP Multiplier:", "4x") along with a few bits to determine color, background color, and effects (e.g. blinking), leaving the actual font and effect rendering up to the video BIOS. (There are even special ASCII character codes for the lines you see.)

Now, if you prefer the ATI text-mode fonts, that's entirely up to you, but it has nothing to do with signal quality.

Read the post man.
I did. Know what you're talking about before you set ultimatums for a company, man.

I don't know what the difference is, but I still think the ATI "part" responsible is simply higher quality than the nvidia one, which is unacceptable on nvidia's part due to the high cost of the cards.
I think it's unacceptable to hold a company responsible for an unspecified problem.


If this is correct then that's good news. No one (with sense) wants to be limited to one company over another.


This was the progression of my experience: I take out nvidia card, plug in ATI card. Turn on PC. First thing I notice is the letters in my bios appear sharper and brighter. Whereas with the nvidia card all I saw was a white letter, the ATI card showed each subtle pixel. Nothing to brag about alone, but then:

Fast forward to game, Battlefield 2. I notice on medic packs, supply packs, characters, little tiny texturing details that were not present period on the nvidia card (even on Nvid. HQ mode)
HL2 EP1 had a different look as well, along the same lines.

What I'm talking about is hard to explain, but easy to show; that's why it sounds vague. (Watch your contempt if you actually want to continue this discussion.)



The best way I can characterize would be like this:

If you wear contacts or glasses, look at a wall with your contacts/glasses out/off. You can see the wall, maybe some detail. Then you put your contacts or glasses on, and all of a sudden on the same wall you can see more detail, everything is sharper, less washed out. More clarity. That's the closest I can come to describing what I mean without showing someone side-by-side comparisons.






I am not the only person who noticed this. I can't find the other member's post which mentioned the same thing; he phrased it roughly as 'a little like getting a new LCD' ... this could all be bull, but there is a difference here. Perhaps it's simply as straightforward as different letters in the BIOS, and then regular old better image quality on ATI's part in games.




 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Frackal
This was the progression of my experience: I take out nvidia card, plug in ATI card. Turn on PC. First thing I notice is the letters in my bios appear sharper and brighter. Whereas with the nvidia card all I saw was a white letter, the ATI card showed each subtle pixel.
One glaring omission in my previous post was the mention of output scaling. As you're undoubtedly aware, LCD monitors have a fixed native resolution. AFAIK, video cards still boot at 640x480 resolution, so the output has to be scaled somehow.

This can be done in a number of ways depending on the display, video card, firmware used to select one scaling method over the others, and of course the scaling algorithm to be used. Scaling quality can differ dramatically, and that's probably part of what you saw.

In any case, the scaling algorithm's results in text mode and in a graphical resolution with 3D rendering are not comparable to one another. What's good for text can be bad for graphics, and vice versa. Also, one card may default to using its internal scaler while the other card defaults to using the monitor's scaler. IMO text-mode scaling is not important.
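A small illustrative sketch of why that matters (the 720x400/640x480 boot modes and the 1680x1050 panel are assumptions based on this thread, not measurements): the ratios involved are non-integer, so every scaler has to resample the glyphs somehow, and different scalers resample differently.

# Stretching a boot-time mode onto a fixed 1680x1050 panel never lands on a
# clean integer ratio, so the text is resampled either by the GPU's scaler
# or by the monitor's, and the two can look quite different.
native = (1680, 1050)
for src_w, src_h in [(720, 400), (640, 480)]:
    sx, sy = native[0] / src_w, native[1] / src_h
    print(f"{src_w}x{src_h} -> {native[0]}x{native[1]}: x{sx:.3f} horizontal, x{sy:.3f} vertical")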

Now, if you regularly game at non-native resolutions, then scaling quality might be an issue. But it's not related to signal quality.

Fast forward to game, Battlefield 2. I notice on medic packs, supply packs, characters, little tiny texturing details that were not present period on the nvidia card (even on Nvid. HQ mode)
HL2 EP1 had a different look as well, along the same lines.
It sounds like the game settings are different, either because the game detected a new card and picked defaults for you or because the game has different paths and/or incomparable settings for ATI and nVidia cards. 3D IQ really isn't my thing, though. Wasn't there a thread specifically comparing the IQ of ATI and nVidia cards?

I don't play many FPS's, actually. I do play F.E.A.R. on max CPU and GPU detail (except for the resolution) on a 7900GT, so if you've got a screenshot or a particular memory of an object's quality changing between ATI and nVidia cards, you could post it here, and I'll try to get to it in the game. Otherwise, you'll have to discuss this with someone else.

What I'm talking about is hard to explain, but easy to show; that's why it sounds vague.
Yet you should have been able to pick out one part of a scene, one blotchy-looking part of an XP window or image, or one particular letter that shows what you are talking about. Specifics, man. And if it's a scaling problem, I don't think it'll show up in a screenshot, but even then you should be able to hold your eyes close to the screen and explain the differences as best you can.

(Watch your contempt if you actually want to continue this discussion.)
Reasons for my contempt:

1. You're railing on a company for something you clearly don't understand and can't clearly define. (Hey, how come when I try to make a play on words, it never comes out that well?) Anyway, you gave me a lot of seemingly-unrelated problems with an untenable conclusion in an angrily-worded post and next to no information about the real problem. What can I respond with? What can I even evaluate for myself?

2. I found "Read the post man." to be offensive and hypocritical since you completely ignored what I said to you about signal quality not being your problem.

3. I didn't exactly appreciate having to read that 21-page article and debunk the poster's conclusions. It wouldn't have angered me if he'd simply said, "Here's what I think this article says," but instead he tried to pass it off as proof of imagined (or ill-founded) community consensus. Maybe he just posted the link here because he got it from someone else who originally made the claim, but it's still very irritating.

The best way I can characterize would be like this:

If you wear contacts or glasses, look at a wall with your contacts/glasses out/off. You can see the wall, maybe some detail. Then you put your contacts or glasses on, and all of a sudden on the same wall you can see more detail, everything is sharper, less washed out. More clarity. That's the closest I can come to describing what I mean without showing someone side-by-side comparisons.
And you see this on the Windows desktop? With a digital connection? I doubt it.

If on the other hand signal quality isn't a problem and you're really just referring to, for example, the quality of lighting in 3D scenes, that's something entirely up to you. (But this can also be seen in screenshots, so it should be possible to post proof.)

I am not the only person who noticed this. I can't find the other member's post which mentioned the same thing; he phrased it roughly as 'a little like getting a new LCD' ... this could all be bull, but there is a difference here. Perhaps it's simply as straightforward as different letters in the BIOS, and then regular old better image quality on ATI's part in games.
FWIW, it's also possible that there were some nVidia cards with minor defects.

There was a case recently where an ATI card was incompatible with a particular LCD monitor but not with other monitors, yet a nVidia card worked flawlessly with the same LCD monitor. But I don't see how we can generalize anything about the companies from experiences like these.

I've had ATI and nVidia cards drawing the Windows desktop to the same display at their respective default color settings, and there's literally zero difference between the cards. Well, except for the little taskbar icon. The issue here - if it's anything objective - is not signal quality.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
The glasses analogy was meant to apply to 3D only.


Why, if there is never a difference in quality in these devices, does this TweakGuides article say:


"Display Adapter Scaling - The scaling unit on your graphics card will rescale the image before it reaches your monitor. If you have a high-end graphics card and a relatively middle-to-low end monitor, this option results in the best image quality and is the one most recommended.

" Monitor Scaling - If you have a high-end monitor, try this form of scaling versus the Display Adapter Scaling option above to see which is best. Otherwise usually the scalers in monitors are not as good as those on high end graphics hardware. "


So what if the scaler in an ATI card is a better quality one than the one on the nvidia card? Obviously the author seems to think there is a difference in scalers from high end GPUs to lower end GPUs/monitors... which implies that this type of thing can indeed vary.

What is the image quality variable he is referring to when he says a scaler in the GPU will probably give the "best image quality"? It seems at least to be the same thing I am talking about.

And you're right, I do not know the intricacies of this topic. Because of that, I also do not know whether you know what you're talking about or not. I do know what I've seen, however.




I am hoping you are correct because I'd like to have nvidia cards as an open option. But I flatly notice what I described as well, something I wasn't anticipating either. I've had only nvidia cards for years prior.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
I'm just happy that you're not a fanboy, Frackal, and can make decisions based on actual features, which is more than I can say for some people here. :roll:
Me, I'm more a performance-type of person, so I'm probably gonna go for a 7950GX2. But hey, if you're happy with your X1900XTX, then more power to you ;)
But, I might spring for an X1900XTX if this quality difference is really what you make it out to be. Is HDR+AA really that much better? (in your opinion, of course.)
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Well, here's my view;

HDR+AA is great, but I only use it in Oblivion at the moment.

Now, if you're going to be keeping your 7950GX2 for a long time, keep in mind that many more games may come out that will have HDR+AA, such as Crysis and UT2007. By that time G80 will be out and there will be wider support for it.

I am a performance nut too. It's almost pointless atm for me to go Conroe, but I am because I have a great deal of fun with computer components that are really fast. (Although I do think with this next gen or its refresh that 1600x1200 may become the new 1280x1024, i.e. where CPU limitation can begin to show up.)


So XTX vs. GX2... ?

Hmm... Legit Reviews has a review of an overclocked GX2 versus a stock XTX; they don't test all the games I'd have liked, but the XTX holds its own quite well against the overclocked GX2.

If you go to Anand's review of the GX2, they compare it to a stock XT. I add 15% to the XT's scores because I have my memory at 20% over the XT's stock and my core at 10%, and have found that splitting the difference between memory and core OCs tends to be close to the actual gain (i.e. 10% core, 20% mem = about a 15% boost in performance).
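To make that arithmetic explicit, the rule of thumb above is just an average of the two overclock percentages; a tiny sketch, nothing more scientific than that:

# Estimate overall gain as the mean of the core and memory overclock
# percentages. It's a rough heuristic from my own results, not a model.
def estimated_gain_pct(core_oc_pct, mem_oc_pct):
    return (core_oc_pct + mem_oc_pct) / 2

print(estimated_gain_pct(10, 20))  # -> 15.0, the % I add to the stock XT scores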

Comparing those scores, you'll find the XTX performing very close to the GX2 in most situations. I think you could also figure that since the XT is benched at higher stock image settings and the Gx2 is benched at lower image quality settings the GX2 MAY not perform as well apples to apples as it seems to in the benchmarks.

I looked at it like this to summarize:

Getting XTX over Gx2

Pros: (remember this is in my judgement)

- Possibly very close in performance in many situations (see above)
- HDR+AA in Oblivion and future games (though I doubt I'll keep XTX past G80 or R600)
- Better picture quality per that "scaler" discussion (may not even be worth worrying about, not resolved yet)
- HQ-AF which is pretty nice but not a dealmaker/breaker IMO
- Will put less heat into case

Cons: (reason to go Gx2)

- GX2 ought to be faster overall. If you play CoD2 for instance, the Gx2 seems much faster than anything else. I tend to think it may not be faster in Oblivion because an XTX beats even 7800GTX 512 in SLI, and matches 7900GTX SLI in minimum frames, and probably approaches it in max frames once OC'ed a bit. You can use HDR+AA in Oblivion too with ATI. I think Gx2 should be a bit faster in BF2, although again, most benchies DON'T use equal quality settings, and when they do, Nvidia cards lose a fair chunk of performance (10%-20% oftentimes)
- If you want to overclock XTX, and turn the fan up to do it, it can be loud. That's really only if you want to push it to the limit though. Otherwise, it's not that loud.



It's really tough to say. The above was basically my decision-making process. I decided that the Gx2 was not necessarily all that much faster than the XTX in the games I play. When at equal settings for image quality, a 700MHz/1800MHz 7900GTX gets beaten pretty badly (all things considered) by an XTX at stock; that says something.

Also, the XTX would (should) allow me to use HDR+AA in Crysis, Oblivion and UT2007, was cheaper, and had better image quality across the board. I game at 1680x1050, so an XTX is often overkill; a Gx2 would be more so. I mean, I don't REALLY need 120+ fps in BF2, 92-95 fps in DOD-S, etc. with everything cranked up, but it's fun as hell to see it, which is why I think a Gx2 would be a lot of fun too.


Remember, this post will seem more pro-X1900 because I'm giving you the reasoning I used that led me to pick the X1900 over the Gx2. I had the opportunity to borrow an XTX at the time the Gx2 came out, so I had a unique chance to try the XTX and then decide whether to buy one of my own or go Gx2. I wish I could have tried the Gx2. At this point I definitely wouldn't change my decision because the XTX gives a better visual experience, and will be plenty until G80 or R600 comes along, especially when I put it on water.

If you lived in CO maybe we could trade for a week :D and you could try the XTX and I could try the Gx2, because I am curious about it.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
BTW, there is someone here who bought a GX2 to replace and sell his XTX, but he ended up selling the GX2 instead, IIRC. Maybe he'll notice this thread and mention why he decided to do so.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Frackal
Well, here's my view;

HDR+AA is great, but I only use it in Oblivion at the moment.

Now, if you're going to be keeping your 7950GX2 for a long time, keep in mind that many more games may come out that will have HDR+AA, such as Crysis and UT2007. By that time G80 will be out and there will be wider support for it.

I am a performance nut too. It's almost pointless atm for me to go Conroe, but I am because I have a great deal of fun with computer components that are really fast. (Although I do think with this next gen or its refresh that 1600x1200 may become the new 1280x1024, i.e. where CPU limitation can begin to show up.)


So XTX vs. GX2... ?

Hmm... Legit Reviews has a review of an overclocked GX2 versus a stock XTX; they don't test all the games I'd have liked, but the XTX holds its own quite well against the overclocked GX2.

If you go to Anand's review of the GX2, they compare it to a stock XT. I add 15% to the XT's scores because I have my memory at 20% over the XT's stock and my core at 10%, and have found that splitting the difference between memory and core OCs tends to be close to the actual gain (i.e. 10% core, 20% mem = about a 15% boost in performance).

Comparing those scores, you'll find the XTX performing very close to the GX2 in most situations. I think you could also figure that since the XT is benched at higher stock image settings and the Gx2 is benched at lower image quality settings the GX2 MAY not perform as well apples to apples as it seems to in the benchmarks.

I looked at it like this to summarize:

Getting XTX over Gx2

Pros: (remember this is in my judgement)

- Possibly very close in performance in many situations (see above)
- HDR+AA in Oblivion and future games (though I doubt I'll keep XTX past G80 or R600)
- Better picture quality per that "scaler" discussion (may not even be worth worrying about, not resolved yet)
- HQ-AF which is pretty nice but not a dealmaker/breaker IMO
- Will put less heat into case

Cons: (reason to go Gx2)

- GX2 ought to be faster overall. If you play CoD2 for instance, the Gx2 seems much faster than anything else. I tend to think it may not be faster in Oblivion because an XTX beats even 7800GTX 512 in SLI, and matches 7900GTX SLI in minimum frames, and probably approaches it in max frames once OC'ed a bit. You can use HDR+AA in Oblivion too with ATI. I think Gx2 should be a bit faster in BF2, although again, most benchies DON'T use equal quality settings, and when they do, Nvidia cards lose a fair chunk of performance (10%-20% oftentimes)
- If you want to overclock XTX, and turn the fan up to do it, it can be loud. That's really only if you want to push it to the limit though. Otherwise, it's not that loud.



It's really tough to say. The above was basically my decision-making process. I decided that the Gx2 was not necessarily all that much faster than the XTX in the games I play. When at equal settings for image quality, a 700MHz/1800MHz 7900GTX gets beaten pretty badly (all things considered) by an XTX at stock; that says something.

Also, the XTX would (should) allow me to use HDR+AA in Crysis, Oblivion and UT2007, was cheaper, and had better image quality across the board. I game at 1680x1050, so an XTX is often overkill; a Gx2 would be more so. I mean, I don't REALLY need 120+ fps in BF2, 92-95 fps in DOD-S, etc. with everything cranked up, but it's fun as hell to see it, which is why I think a Gx2 would be a lot of fun too.


Remember, this post will seem more pro-X1900 because I'm giving you the reasoning I used that led me to pick the X1900 over the Gx2. I had the opportunity to borrow an XTX at the time the Gx2 came out, so I had a unique chance to try the XTX and then decide whether to buy one of my own or go Gx2. I wish I could have tried the Gx2. At this point I definitely wouldn't change my decision because the XTX gives a better visual experience, and will be plenty until G80 or R600 comes along, especially when I put it on water.

If you lived in CO maybe we could trade for a week :D and you could try the XTX and I could try the Gx2, because I am curious about it.

Sorry, but I live in NY :D. But that was a very comprehensive and interesting post. I'll take this into consideration when I choose the cards, which should be early next week. It looks like the XTX would provide performance close to the GX2 at better quality settings... However, an eVGA card would offer the advantage of a step-up program, and if the G80 is really going to come out in September, or even October, I'll be able to get it without shelling out all that dough again. Hmmm, I'll have to sleep on it.

 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
IMO, if you are confident enough in the step-up thing and don't need HDR+AA, I'd get the GX2 and wait for G80, which I am waiting for too with some anticipation. (R600 too, but supposedly it'll be out later.)
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Ya, we'll have to wait a few more months for it, based on current rumors. I really can't wait until Crysis. :)