Firingsquad Publishes HDR + AA Performance Tests


beggerking

Golden Member
Jan 15, 2006
1,703
0
0
facts:

1. When the computer boots in VGA mode, the image is scaled by the LCD monitor. This can be verified by pushing the appropriate button on your notebook (standard/expand image button).

2. Gx2 is ~25+% faster than XTX in most situations. XTX does provide HDR+AA and IMO better image quality.

3. Display Adapter Scaling (aka software scaling) = fullscreen in a media player.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: beggerking
facts:



2. Gx2 is ~25+% faster than XTX in most situations. XTX does provide HDR+AA and IMO better image quality.


No, it isn't. Take Anand's GX2 review against merely an XT, let alone an XTX.

All at 1600x1200 with 4xAA:

Battlefield 2: GX2 wins by a mere 12%

Black and White 2: GX2 wins by a mere 12.5%

FEAR: GX2 wins by 30% (weird, but hey)

HL2 Episode One: GX2 wins by 28%

Quake 4: GX2 wins by 25%

SCCT: GX2 wins by 20%

X3: GX2 wins by 15%


Now, an XTX typically gains about 5-7% performance over the XT, so ADD 5-7% to the XT's frames (thanks beggarking ;) ) before comparing against those GX2 wins.

Now remember that if Nvidia's image quality settings were raised to meet ATI's, it would lose even more performance. Subtract a conservative 5% from each of those scores; from my experience with the GTX, I would subtract at least 10%.



Go to the Legit Reviews review of a GX2 OVERCLOCKED to 570/1600MHz pitted against a *STOCK* X1900XTX: the XTX wins in one or two cases and is very close in the others. OC that XTX to match the OC'ed GX2 and you'll see some interesting results, I think.

Legit Reviews tends to raise Nvidia's lower default settings to ATI's level.

I agree with this. A $500-700 top-end card ought to be benched at High Quality settings, like its competitor, not at some bullshit "Quality" setting with various optimizations and lower-quality texture/mipmap settings. Why not bench the ATI card at 2xAA vs. 4xAA for Nvidia? Same thing going on here with this Q vs. HQ business, IMO.




Frankly, you can just glance at benchmarks, or you can be interested in getting accurate information. Given that Nvidia loses performance when moved from Q to HQ, it's relevant.

I truly wonder why some of you continue to be fooled by the fact that Nvidia lowers its image quality settings by default. Remember, both companies have used optimizations to win the benchmark game, and there were controversies about it in the past, so they DO pay attention to this quite a bit! IMO Nvidia probably does this on purpose. Don't ask me why ATI doesn't.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: Frackal

Now, an XTX typically gains about 5-7% performance over the XT, so subtract 5-7% from those wins by the Gx2. I did that next to each score.

Now remember again that if Nvidia's image quality settings were raised to meet ATI's, it would lose even more performance. Subtract a conservative 5% from each of those scores, I would subtract at least 10% at a minimum from my experience with the GTX.

I believe your calculation is incorrect. The 5% increase should be added on top of the XT's frame rate; then you take the difference between the two cards to get the correct performance gap.
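To make the corrected arithmetic concrete, here's a minimal sketch (hypothetical Python, not from any review; the 6% XTX-over-XT figure is just an assumed midpoint of the 5-7% range):

```python
# Normalize everything to the XT = 1.0, add the XTX's gain on top of the XT,
# then recompute the GX2's lead -- rather than subtracting from the GX2's lead.

def gx2_lead_over_xtx(gx2_lead_over_xt: float, xtx_gain_over_xt: float) -> float:
    """Return the GX2's percentage lead over an XTX, given its lead over an XT."""
    gx2 = 1.0 + gx2_lead_over_xt   # GX2 frame rate, normalized to XT = 1.0
    xtx = 1.0 + xtx_gain_over_xt   # XTX frame rate, normalized to XT = 1.0
    return (gx2 / xtx - 1.0) * 100.0

# Battlefield 2 from Anand's numbers: GX2 wins by 12% over the XT.
# Assuming the XTX is 6% faster than the XT:
print(f"{gx2_lead_over_xtx(0.12, 0.06):.1f}%")   # ~5.7%, not simply 12% - 6%
```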
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Whoops, actually you are correct. :eek: Boy, I feel like a dope for that. Anyway, it makes a difference in the scores: in a few of them, not much; in others, mine were off by 6% or more.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: beggerking
2. Gx2 is ~25+% faster than XTX in most situations.
As it should, considering it costs a good deal more: the cheapest X1900XTX costs $424 (with an XT costing $325) and the cheapest GX2 is $529. All these numbers are from AT's RTPE.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
My honest view is that with image quality at EQUAL settings, an XTX at an overclock that most can get (such as mine, maybe not quite as high on mem but definitely core) is going to be pretty close to the GX2 in most situations, and perhaps beat it in Oblivion.


I come to this conclusion based on the Legit Reviews review of an OCed GX2 vs. a stock XTX at what I assume are equal image settings, and on Anand's article, where I add 15% to the XT's frames to account for my overclock and then speculate a 5-15% drop in the GX2's frames to account for the inequality in default image quality settings.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: fierydemise
Originally posted by: beggerking
2. Gx2 is ~25+% faster than XTX in most situations.
As it should, considering it costs a good deal more: the cheapest X1900XTX costs $424 (with an XT costing $325) and the cheapest GX2 is $529. All these numbers are from AT's RTPE.

As usual, you pay a premium for the fastest.

Price/performance-wise, XT > GX2 >> XTX.
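As a rough illustration only (a sketch using the RTPE prices above and the performance deltas claimed in this thread; the helper and exact numbers are purely illustrative):

```python
# Rough price/performance comparison from the figures quoted in this thread.
prices = {"X1900XT": 325, "X1900XTX": 424, "7950GX2": 529}   # USD, from AT's RTPE

# Performance normalized to the XT = 1.00, using the claimed deltas:
# XTX ~6% over the XT, GX2 ~25% over the XTX.
perf = {"X1900XT": 1.00, "X1900XTX": 1.06, "7950GX2": 1.06 * 1.25}

for card in prices:
    value = perf[card] / prices[card] * 100   # relative performance per $100
    print(f"{card:9s} {value:.3f} perf/$100")

# With these numbers the XT clearly leads on value, while the GX2 roughly
# matches or slightly edges the XTX.
```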
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: Frackal
I come to this conclusion based on the Legit Reviews review of an OCed GX2 vs. a stock XTX at what I assume are equal image settings, and on Anand's article, where I add 15% to the XT's frames to account for my overclock and then speculate a 5-15% drop in the GX2's frames to account for the inequality in default image quality settings.

1. The 7950 in Anand's review isn't OCed.
2. A 5-15% drop to account for the settings inequality is invalid here. In High Quality mode with 4X antialiasing and 8X anisotropic filtering, the 7950 can be up to 40% faster in Quake 4, and 35% faster in HL2 at 1,920x1,440 (4X antialiasing, 16X anisotropic filtering).
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: beggerking
Originally posted by: Frackal
I come to this conclusion based on the Legit Reviews review of an OCed GX2 vs. a stock XTX at what I assume are equal image settings, and on Anand's article, where I add 15% to the XT's frames to account for my overclock and then speculate a 5-15% drop in the GX2's frames to account for the inequality in default image quality settings.

1. The 7950 in Anand's review isn't OCed.
2. A 5-15% drop to account for the settings inequality is invalid here. In High Quality mode with 4X antialiasing and 8X anisotropic filtering, the 7950 can be up to 40% faster in Quake 4, and 35% faster in HL2 at 1,920x1,440 (4X antialiasing, 16X anisotropic filtering).



1. I didn't say it was; I'm talking about the Legit Reviews article where a GX2 @ 570/1600 is compared to a STOCK X1900XTX.

2. Can you clarify where they explain the driver quality settings in that review? I didn't see it. For instance, "High Quality mode" in Oblivion (where the XT essentially ties the GX2, btw, at 1600x1200) probably refers to the in-game setting, which is a game setting and not a driver setting.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Frackal
The glasses analogy was meant to apply to 3d only.
OK, cool.

Why, if there is never a difference in quality in these devices, does this Tweakguides article say:


"Display Adapter Scaling - The scaling unit on your graphics card will rescale the image before it reaches your monitor. If you have a high-end graphics card and a relatively middle-to-low end monitor, this option results in the best image quality and is the one most recommended.

" Monitor Scaling - If you have a high-end monitor, try this form of scaling versus the Display Adapter Scaling option above to see which is best. Otherwise usually the scalers in monitors are not as good as those on high end graphics hardware. "


So what if the scaler in an ATI card is a better quality one than the one on the nvidia card? Obviously the author seems to think there is a difference in scalers from high end GPUs to lower end GPUs/monitors... which implies that this type of thing can indeed vary.

What is the variable here in image quality he is referring to when he says a scaler in the GPU will probably have "best image quality?" It seems at least to be the same thing I am talking about.
I already said that scaling could be an issue. It only happens when you set your card to display a nonnative resolution.

A CRT uses an electron gun to draw the screen line by line from one side to the other. It has no native resolution since all that needs to be done to display different resolutions is to vary the timing and duration of the lines. IOW it can shoot a lot of pixels out in small increments to hit a very large resolution such as 1600x1200, or it can shoot a few pixels out in large increments to hit a very small resolution such as 640x480.

An LCD, by contrast, physically has a fixed number of pixels, which corresponds exactly to its native resolution. That is, a 20" nonwidescreen LCD has 1600x1200 pixels, and that's the resolution the display will be tied to forever, period. To get around this, monitors and video cards can upscale the image of a lower resolution into an image at the native resolution. By default, in Windows at least, the video card does the scaling.

But scaling and signal quality are separate issues. Signal quality is only concerned with transmitting the image - scaled or unscaled - to the monitor as accurately as possible. Scaling deals with the actual image enlargement. What I said is that scaling could be your problem - do you play at nonnative resolutions, or not? - but that signal quality cannot be your problem because DVI doesn't have signal quality issues.
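If it helps to picture what a scaler actually does, here's a minimal sketch (hypothetical Python; nearest-neighbour only for brevity - real GPU/monitor scalers use better filters, which is exactly where their quality can differ):

```python
# Resample a rendered frame to the panel's native resolution.
def upscale_nearest(src, src_w, src_h, dst_w, dst_h):
    """src is a row-major list of pixels; returns a dst_w x dst_h resampled copy."""
    dst = []
    for y in range(dst_h):
        sy = y * src_h // dst_h          # map destination row back to a source row
        for x in range(dst_w):
            sx = x * src_w // dst_w      # map destination column back to a source column
            dst.append(src[sy * src_w + sx])
    return dst

# e.g. a 1024x768 game frame stretched to a 1600x1200 native panel:
frame = [0] * (1024 * 768)               # placeholder pixel data
native = upscale_nearest(frame, 1024, 768, 1600, 1200)
# At native resolution (1600x1200 in, 1600x1200 out) every pixel maps 1:1,
# so no resampling happens and no scaler quality difference can show up.
```

None of this touches the DVI signal itself; the GPU scaler works before transmission and the monitor scaler after it, but neither changes how accurately the signal gets across.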

And you're right, I do not know the intimate details of this topic. Because of that I also do not know whether you know what you're talking about or not. I do know what I've seen, however.
I know what I'm talking about. You can trust me! :D

Seriously, if you distrust me that much, you could ask someone else who knows the tech, like maybe xtknightx.

I am hoping you are correct because I'd like to have nvidia cards as an open option. But I flaty notice what I described as well, something I wasn't anticipating either. I've had nvidia cards only on for years prior.
(I think this was mangled by the typo monster.)
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
That last line was indeed.


So why is it that Tweakguides claims there is a difference in image quality based upon the scaler you are using? Is it because different scaler devices will do a better/worse job of simply resizing the picture? So the scaler issue only applies (to an LCD) when running a non-native resolution? It has no impact on image quality when running at a native resolution? Wouldn't it still be scaled by the GPU to 1680x1050, by this same device?

I don't trust or distrust you; I don't have the knowledge to evaluate what you are saying, so I cannot know if you are accurate or inaccurate.


BTW, I always game at the native res.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: Frackal
1. I didn't say it was; I'm talking about the Legit Reviews article where a GX2 @ 570/1600 is compared to a STOCK X1900XTX.


2. Can you clarify where they explain the driver quality settings in that review? I didn't see it. For instance, "High Quality mode" in Oblivion (where the XT essentially ties the GX2, btw, at 1600x1200) probably refers to the in-game setting, which is a game setting and not a driver setting.

1. Usually it's not a 1-to-1 scaling in MHz vs. performance...
2. The title above each graph... max quality settings / high quality settings... crappy review, though.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
1. What are you talking about? I NEVER SAID IT WAS 1:1; in fact, if you read the previous page, I said it was not.

2. I do not see where it shows anything about driver settings in that review. In 2 games:

In Oblivion they say "High Quality mode," which is also a graphics setting in Oblivion, unrelated to drivers, so who knows.

Same for Quake: who knows whether they are referring to in-game or driver settings, or what driver settings were used, etc. How could you possibly use this review to claim that Nvidia losing performance from Q to High Quality is an irrelevant issue?
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Let's try another review? Are you trying to prove your points or find out how these cards really perform?

That last article was baloney, so is this one IMO. 3DMark, Far Cry?

It shows an X1800XT beating an X1900XTX AND a GX2 in Far Cry, and they only bench Far Cry and Doom 3... what the hell for? Why not some modern games?


I honestly don't want to spend the rest of the night arguing about this, especially with these crap-ass review sites that don't even really prove the point you're trying to prove.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Frackal
So why is it that Tweakguides claims that there is a difference in image quality based upon the scaler you are using?
Right-click on your desktop and lower the resolution. Tell me when you think the scaler kicks in. :)

Is becuase different scaler devices will do a better/worse job of simply resizing the picture?
Yes, exactly. So at boot time, scaling factors in. When you get to your Windows desktop, scaling turns off, and everything becomes crystal clear because you're running at a native resolution. But when you load a game at 1024x768, scaling kicks in again.

So the scaler issue only applies (to an LCD) when running a non-native resolution?
Yes, exactly.

It has no impact on image quality when running at a native resolution?
Yes, exactly.

Wouldn't it still be scaled by the GPU to 1680x1050, by this same device?
scaling = resizing

You don't need to resize an image when it's already the correct size.

Note that we're talking about logical size here, not physical size. It wouldn't matter if you had a hypothetical 80" LCD running at 1600x1200 or a 20" LCD running at 1600x1200 - as long as the video card outputs at 1600x1200, the native resolution of both displays, there would be no scaling involved for either of them.
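Put as a trivial rule (hypothetical helper, just restating the point above):

```python
# A scaler is engaged only when the output resolution differs from the panel's
# native resolution; the physical size of the panel is irrelevant.
def scaler_engaged(output_res, native_res):
    return output_res != native_res

print(scaler_engaged((1600, 1200), (1600, 1200)))   # False: desktop/game at native res
print(scaler_engaged((1024, 768), (1600, 1200)))    # True: lower-res game gets upscaled
print(scaler_engaged((640, 480), (1600, 1200)))     # True: boot-time VGA mode gets upscaled
```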
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: nullpointerus
Originally posted by: Frackal
So why is it that Tweakguides claims that there is a difference in image quality based upon the scaler you are using?
Right-click on your desktop and lower the resolution. Tell me when you think the scaler kicks in. :)
.

I appreciate your input, but not your unwarranted contempt and patronizing attitude. It is pissing me off. I know the scaler will kick in when I change the resolution. Aside from that, that line of yours wasn't even related to what I said so I have no clue why you said it.


I don't know the cause of what I observed, or why someone else here said they saw the same.


You could be totally correct in all you say and yet be missing something that would indeed show that the difference in quality I'm talking about does indeed exist.


I see something, you tell me I am not seeing it and explain why. I do not know if your explanation is valid or applicable. I know someone else mentioned they noticed the same thing I did.

Maybe it is the case, maybe not. I guess if I get a G80 I'll be able to compare, but I have sold my GTX already.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Frackal
Originally posted by: nullpointerus
Originally posted by: Frackal
So why is it that Tweakguides claims that there is a difference in image quality based upon the scaler you are using?
Right-click on your desktop and lower the resolution. Tell me when you think the scaler kicks in. :)
.
I appreciate your input, but not your unwarranted contempt and patronizing attitude. It is pissing me off. I know the scaler will kick in when I change the resolution. Aside from that, that line of yours wasn't even related to what I said so I have no clue why you said it.
That wasn't contemptuous, nor was it patronizing. If you had done as I said, you would have seen exactly what scaling does and had a nice demonstration of all your other questions in that post being answered, without any doubt that I was misleading you. I put the smiley there to indicate a total lack of ill will. *shrugs*

Now, I've asked twice, and you still haven't told me whether you game at nonnative resolutions, nor will you actually scale your desktop to see the impact first hand, so I still can't tell you whether scaling has any impact on your game image quality.

I also have no idea whether ATI or nVidia has better scalers or whether there's even a significant difference between the two, nor do I recall claiming to know this.

I don't know the cause of what I observed, or why someone else here said they saw the same.
I said that if you saw anything, it's probably a set of issues unrelated to your conclusion about poor signal quality being the root cause. I think most of your nVidia vs. ATI game IQ questions would have been answered by looking up that thread I suggested - I believe it was entitled "calling all nv7x owners" or something weird like that - where there were screenshots, links, and arguments being exchanged on exactly that subject. Personally, I really can't discuss game IQ at a serious level because I'm not that knowledgeable about the subject, as I said earlier.

You could be totally correct in all you say and yet be missing something that would indeed show that the difference in quality I'm talking about does indeed exist.
I can only address points that I have definite knowledge of. If you really want to get to the bottom of this, maybe you could start a thread on this subject and attract some more knowledgeable people to it? We're way off topic as it is.

I see something, you tell me I am not seeing it and explain why. I do not know if your explanation is valid or applicable. I know someone else mentioned they noticed the same thing I did.
To the contrary, I said that either you were grouping together unrelated issues under a false conclusion (i.e. poor signal quality) or seeing things. That's two possibilities, not one, and it leaves allowance for you seeing things that are true for different reasons than you originally thought. For example, we acknowledged the difference in the BIOS text; I only refuted the idea that it was caused by poor signal quality.

Maybe it is the case, maybe not. I guess if I get a G80 I'll be able to compare, but I have sold my GTX already.
I don't encourage anyone to buy particular hardware in the absence of knowledge about their own circumstances and plausible community consensus on performance and image quality. If you want to buy a G80, feel free, but it doesn't matter either way to me. You don't seem particularly interested in getting to the bottom of this issue.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: nullpointerus
Originally posted by: Frackal
Originally posted by: nullpointerus
Originally posted by: Frackal
So why is it that Tweakguides claims that there is a difference in image quality based upon the scaler you are using?
Right-click on your desktop and lower the resolution. Tell me when you think the scaler kicks in. :)
.
I appreciate your input, but not your unwarranted contempt and patronizing attitude. It is pissing me off. I know the scaler will kick in when I change the resolution. Aside from that, that line of yours wasn't even related to what I said so I have no clue why you said it.
That wasn't contemptuous, nor was it patronizing. If you would have done as I said, you would have seen what exactly scaling does and had a nice demonstration of all your other questions in that post being answered without any doubt that I was misleading you. I put the smiley there to indicate a total lack of ill will. *shrugs*

Now, I've asked twice, and you still haven't told me whether you game at nonnative resolutions, nor will you actually scale your desktop to see the impact first hand, so I still can't tell you whether scaling has any impact on your game image quality.


I'm not sure there's a point to doing that. I game only at native res. If I lower my res as you suggest, it may affect image quality, but what does it matter? I'm comparing ATI to Nvidia, not ATI at 1680x1050 to ATI at a lower, non-native res.

I think we are talking past one another.

I asked:

"So why is it that Tweakguides claims that there is a difference in image quality based upon the scaler you are using?" ie, they are saying (AFAIK) that a scaler in a monitor may be lower quality than that of a GPU.

Changing the resolution as you suggest would not shift between the monitor and GPU scalers, it would just change the res. Not related. It did seem contemptuous, but if you didn't mean it that way let's just move on.




I also have no idea whether ATI or nVidia has better scalers or whether there's even a significant difference between the two, nor do I recall claiming to know this.

But you do say that at an LCD's native resolution, it won't be a factor in picture quality, right?



I don't know the cause of what I observed, or why someone else here said they saw the same.
I said that if you saw anything, it's probably a set of issues unrelated to your conclusion about poor signal quality being the root cause. I think most of your nVidia vs. ATI game IQ questions would have been answered by looking up that thread I suggested - I believe it was entitled "calling all nv7x owners" or something wierd like that - where there were screenshots, links, and arguments being exchanged on exactly that subject. Personally, I really can't discuss game IQ at a serious level because I'm not that knowledgeable about the subject, as I said earlier.

You could be totally correct in all you say and yet be missing something that would indeed show that the difference in quality I'm talking about does indeed exist.
I can only address points that I have definite knowledge of. If you really want to get to the bottom of this, maybe you could start a thread on this subject and attract some more knowledgeable people to it? We're way off topic as it is.

I see something, you tell me I am not seeing it and explain why. I do not know if your explanation is valid or applicable. I know someone else mentioned they noticed the same thing I did.
To the contrary, I said that either you were grouping together unrelated issues under a false conclusion (i.e. poor signal quality) or seeing things. That's two possibilities, not one, and it leaves allowance for you seeing things that are true for different reasons that you originally though. For example, we acknowledged the difference in the BIOS text; I only refuted the idea that it was caused by poor signal quality.

Maybe it is the case, maybe not. I guess if I get a G80 I'll be able to compare, but I have sold my GTX already.
I don't encourage anyone to buy particular hardware in the absence of knowledge about their own circumstances and plausible community consensus on performance and image quality. If you want to buy a G80, feel free, but it doesn't matter either way to me. You don't seem particularly interested in getting to the bottom of this issue.

Well, that's a load of crap; I am interested in getting to the bottom of it or I wouldn't bother with all of this. And I never said you were misleading me, just that you may be right in what you say but I don't know enough to evaluate it, or that you may be right but there may be more going on that you aren't aware of. Or I could be mistaken. If I didn't see what I'm talking about fairly plainly, I wouldn't be this resistant.

 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Frackal
Originally posted by: nullpointerus
Originally posted by: Frackal
Originally posted by: nullpointerus
Originally posted by: Frackal
So why is it that Tweakguides claims that there is a difference in image quality based upon the scaler you are using?
Right-click on your desktop and lower the resolution. Tell me when you think the scaler kicks in. :)
I appreciate your input, but not your unwarranted contempt and patronizing attitude. It is pissing me off. I know the scaler will kick in when I change the resolution. Aside from that, that line of yours wasn't even related to what I said so I have no clue why you said it.
That wasn't contemptuous, nor was it patronizing. If you would have done as I said, you would have seen what exactly scaling does and had a nice demonstration of all your other questions in that post being answered without any doubt that I was misleading you. I put the smiley there to indicate a total lack of ill will. *shrugs*

Now, I've asked twice, and you still haven't told me whether you game at nonnative resolutions, nor will you actually scale your desktop to see the impact first hand, so I still can't tell you whether scaling has any impact on your game image quality.
I'm not sure there's a point to doing that. I do game only at native res, If I lower my res as you suggest it may affect image quality, but what does it matter? I'm comparing ATI to Nvidia, not ATI at 1680x1050 and ATI at lower non-native res.

I think we are talking past one another.

I asked:

"So why is it that Tweakguides claims that there is a difference in image quality based upon the scaler you are using?" ie, they are saying (AFAIK) that a scaler in a monitor may be lower quality than that of a GPU.

Changing the resolution as you suggest would not shift between the monitor and GPU scalers, it would just change the res. Not related. It did seem contemptuous, but if you didn't mean it that way let's just move on.
As I said, changing the desktop resolution would have demonstrated that my answers to your other questions in that particular post were, in fact, correct, and additionally it would allow you to see first-hand how scaling of any kind affects image quality. If you keep questioning why I wrote that, I'll just keep repeating my reason.

The scaling options you were talking about only impact image quality insofar as they:

1. Switch between the monitor scaler, GPU scaler, or no scaler.
2. Prevent resolutions in one aspect ratio from being scaled to another aspect ratio.

3D rendering is the process of projecting 3D geometry onto a 2D image. Scaling only affects the resulting 2D image as a whole; scaling has no impact on the 3D renderer. Is that what you wanted to know?

But you do say that at an LCD's native resolution, it won't be a factor in picture quality, right?
Correct.

Well that's a load of crap, I am interested in getting to the bottom of it or I wouldn't bother with all of this. And I never said you were misleading me, just that you may be right in what you say, but I don't know enough to evaluate it, or, that you may be right but there may be more going on you aren't aware of. Or I could be mistaken. If I didn't see what I'm talking about fairly plainly I wouldn't be this resistant.
You're not going about it the right way. I keep telling you that my knowledge is too limited to get to the bottom of the 3D image quality differences between the cards, yet that is the only issue which remains unexplained. If you really want it resolved, you must get other people involved. Start a new thread, post something here to that effect, and more people will likely join in.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: nullpointerus
Originally posted by: Frackal
Originally posted by: nullpointerus
Originally posted by: Frackal
Originally posted by: nullpointerus
Originally posted by: Frackal
So why is it that Tweakguides claims that there is a difference in image quality based upon the scaler you are using?
Right-click on your desktop and lower the resolution. Tell me when you think the scaler kicks in. :)
I appreciate your input, but not your unwarranted contempt and patronizing attitude. It is pissing me off. I know the scaler will kick in when I change the resolution. Aside from that, that line of yours wasn't even related to what I said so I have no clue why you said it.
That wasn't contemptuous, nor was it patronizing. If you would have done as I said, you would have seen what exactly scaling does and had a nice demonstration of all your other questions in that post being answered without any doubt that I was misleading you. I put the smiley there to indicate a total lack of ill will. *shrugs*

Now, I've asked twice, and you still haven't told me whether you game at nonnative resolutions, nor will you actually scale your desktop to see the impact first hand, so I still can't tell you whether scaling has any impact on your game image quality.
I'm not sure there's a point to doing that. I do game only at native res, If I lower my res as you suggest it may affect image quality, but what does it matter? I'm comparing ATI to Nvidia, not ATI at 1680x1050 and ATI at lower non-native res.

I think we are talking past one another.

I asked:

"So why is it that Tweakguides claims that there is a difference in image quality based upon the scaler you are using?" ie, they are saying (AFAIK) that a scaler in a monitor may be lower quality than that of a GPU.

Changing the resolution as you suggest would not shift between the monitor and GPU scalers, it would just change the res. Not related. It did seem contemptuous, but if you didn't mean it that way let's just move on.
As I said, changing the desktop resolution would have demonstrated that my answers to your other questions in that particular post were, in fact, correct, and additionally it would allow you to see first-hand how scaling of any kind affects image quality. If you keep questioning why I wrote that, I'll just keep repeating my reason.

The scaling options you were talking about only impact image quality insofar as they:

1. Switch between the monitor scaler, GPU scaler, or no scaler.
2. Prevent resolutions in one aspect ratio from being scaled to another aspect ratio.

3D rendering is the process of projecting 3D geometry onto a 2D image. Scaling only affects the resulting 2D image as a whole; scaling has no impact on the 3D renderer. Is that what you wanted to know?



Well, let's just drop that part then. I've already played with different resolutions and switched between Nvidia scaling and monitor scaling in the past. Doing it now wouldn't serve any real purpose, and I don't want to spend time sorting through past posts to figure out which one of us is right on this.

But you do say that at an LCD's native resolution, it won't be a factor in picture quality, right?
Correct.

Well that's a load of crap, I am interested in getting to the bottom of it or I wouldn't bother with all of this. And I never said you were misleading me, just that you may be right in what you say, but I don't know enough to evaluate it, or, that you may be right but there may be more going on you aren't aware of. Or I could be mistaken. If I didn't see what I'm talking about fairly plainly I wouldn't be this resistant.
You're not going about it the right way. I keep telling you that my knowledge is too limited to get to the bottom of the 3D image quality differences between the cards, yet that is the only issue which remains unexplained. If you really want it resolved, you must get other people involved. Start a new thread, post something here to that effect, and more people will likely join in.


Alright, so what we've resolved so far is that the BIOS lettering is not an issue in the sense that I thought it was, but the sharper picture quality in 3D (and IMO 2D) may or may not be a reality. Yes, I will probably start a thread, but later on; I'm tired of posting tonight :)
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: Frackal
Let's try another review? Are you trying to prove your points or find out how these cards really perform?

That last article was baloney, so is this one IMO. 3DMark, Far Cry?

It shows an X1800XT beating an x1900xtx AND Gx2 in Far Cry, and they only bench Far Cry and DOOM 3... what the hell for? Why not some modern games.


I honestly don't want to spend the rest of the night arguing about this, especially with these crap-ass review sites that don't even really prove the point you're trying to prove.

Are you saying Legit Reviews is the only review you deem valid, out of the millions of reviews out there?
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
No, I'm just saying don't go searching for every backwater site out there that you might be able to squeeze out a point with. I've seen different sites with totally opposite numbers where there shouldn't be. I tend to trust anandtech, I think Legit Reviews probably has a decent reputation but can't promise it