
FiringSquad does image quality comparisons...again


Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: CaiNaM
Originally posted by: Gstanfor
Once again Ackmed, yes, the HQ mode does turn OFF all the optimizations, advanced view or not. You are on a hiding to nothing here, trust me.

here's a reasonable question: since nv hardware is incapable of "angle independent" AF (the GF4 could, which is why its AF was so much better than r300's), how do you turn "angle dependent" AF off?

You can't turn angle dependent AF off since it is hardwired into the chip and there is no angle independent AF to enable...
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: CaiNaM
Originally posted by: Gstanfor
Once again Ackmed, yes, the HQ mode does turn OFF all the optimizations, advanced view or not. You are on a hiding to nothing here, trust me.

here's a reasonable question: since nv hardware is incapable of "angle independent" AF (the GF4 could, which is why its AF was so much better than r300's), how do you turn "angle dependent" AF off?

You switch to an ATI card.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
it's tricky, but the optimizations will still be enabled even if you switch to HQ in the NV control panel and you have a third-party OC utility installed (RivaTuner)... the optimizations will still be enabled in the Intellisample tab for both the Direct3D tweaks and the OpenGL tweaks. you will have to manually disable them.
Under HQ it doesn't matter what 3rd party utilities report them as.

When HQ is selected the driver ignores the three settings and forces them off. It also forces LOD clamp on.
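
To make that interaction concrete, here is a minimal sketch (Python, with hypothetical names; not actual NVIDIA/ForceWare driver code) of the behavior BFG10K describes: under HQ the driver disregards the three Intellisample toggles entirely, whatever a tool like RivaTuner reports them as.

```python
# Illustrative sketch only -- not actual NVIDIA driver code. It models the
# claim above: under High Quality the driver ignores the three Intellisample
# optimization toggles, forces them off, and forces the negative-LOD clamp on.

def effective_settings(mode, tri_opt, aniso_mip_opt, aniso_sample_opt, lod_clamp):
    """Return the texture-filtering settings the driver actually applies."""
    if mode == "HQ":
        # The reported state of the three toggles is irrelevant here.
        return {"trilinear_opt": False, "aniso_mip_opt": False,
                "aniso_sample_opt": False, "lod_clamp": True}
    # Under Quality and below, the user-visible toggles are honored as-is.
    return {"trilinear_opt": tri_opt, "aniso_mip_opt": aniso_mip_opt,
            "aniso_sample_opt": aniso_sample_opt, "lod_clamp": lod_clamp}

# Even with every optimization "enabled" by a third-party tool:
print(effective_settings("HQ", True, True, True, False))
# {'trilinear_opt': False, 'aniso_mip_opt': False,
#  'aniso_sample_opt': False, 'lod_clamp': True}
```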
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Gstanfor
Originally posted by: CaiNaM
Originally posted by: Gstanfor
Once again Ackmed, yes, the HQ mode does turn OFF all the optimizations, advanced view or not. You are on a hiding to nothing here, trust me.

here's a reasonable question: since nv hardware is incapable of "angle independent" AF (the GF4 could, which is why its AF was so much better than r300's), how do you turn "angle dependent" AF off?

You can't turn angle dependent AF off since it is hardwired into the chip and there is no angle independent AF to enable...

then that would mean HQ does not turn off all opts, since angle dependent AF (which is an optimization) is "hardwired"...

thank you.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: CaiNaM
Originally posted by: Gstanfor
Once again Ackmed, yes, the HQ mode does turn OFF all the optimizations, advanced view or not. You are on a hiding to nothing here, trust me.

here's a reasonable question: since nv hardware is incapable of "angle independent" AF (the GF4 could, which is why its AF was so much better than r300's), how do you turn "angle dependent" AF off?

Didn't the FX series have angle independent AF? I'm sure the FX series, although being quite slow at DX9 games, did have crisper, hence better, texture quality compared to ATi's R300 angle dependent AF.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: coldpower27
Originally posted by: gersson
This argument keeps going because some people REFUSE to accept that the nvidia default drivers favor speed over image quality. All it takes is a few tweaks and that is changed. However, nvidia currently takes a big hit when doing that.

Since a lot of review sites do not do this, it gives the false impression that the games are running @ equal image quality settings and skews the fps results. NOT in ATI's favor, I might add.

This is ATI's problem, then, that the reviewers do not believe it's enough of an issue to warrant turning Q to HQ on Nvidia video cards.

Image Quality remains a subjective issue since it is a qualitative quantity.


You have no idea why different review sites do what they do.

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Image Quality remains a subjective issue since it is a qualitative quantity.
:confused:
then that would mean HQ does not turn off all opts, since angle dependent AF (which is an optimization) is "hardwired"...
He answers one thing correctly and you throw it in his face with something that can't be helped by any software? While ATI's AF may in fact do more work than Nvidia's, the same could be said in comparison of AA between the two. Does the fact that ATI's AA tops out at 6x mean that it is an optimization since Nvidia's can do 8x?
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: CaiNaM
Originally posted by: Gstanfor
Originally posted by: CaiNaM
Originally posted by: Gstanfor
Once again Ackmed, yes, the HQ mode does turn OFF all the optimizations, advanced view or not. You are on a hiding to nothing here, trust me.

here's a reasonable question: since nv hardware is incapable of "angle independent" AF (the GF4 could, which is why its AF was so much better than r300's), how do you turn "angle dependent" AF off?

You can't turn angle dependent AF off since it is hardwired into the chip and there is no angle independent AF to enable...

then that would mean HQ does not turn off all opts, since angle dependent AF (which is an optimization) is "hardwired"...

thank you.

:confused: I don't think AF is an optimization. It would be like TrAA with Nvidia vs. regular AA: you wouldn't say regular AA is an optimization, but rather just a different way of computing the AA.

If you really want to turn Nvidia's angle dependent AF off, there is a way: turn off all AF.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: CaiNaM
Originally posted by: Gstanfor
Originally posted by: CaiNaM
Originally posted by: Gstanfor
Once again Ackmed, yes, the HQ mode does turn OFF all the optimizations, advanced view or not. You are on a hiding to nothing here, trust me.

here's a reasonable question: since nv hardware is incapable of "angle independent" AF (the GF4 could, which is why its AF was so much better than r300's), how do you turn "angle dependent" AF off?

You can't turn angle dependent AF off since it is hardwired into the chip and there is no angle independent AF to enable...

then that would mean HQ does not turn off all opts, since angle dependent AF (which is an optimization) is "hardwired"...

thank you.

:disgust: No more hardwired than what you'll find in your precious R3xx/R4xx series GPUs...
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Frackal
Originally posted by: coldpower27
Originally posted by: gersson
This argument keeps going because some people REFUSE to accept that the nvidia default drivers favor speed over image quality. All it takes is a few tweaks and that is changed. However, nvidia currently takes a big hit when doing that.

Since a lot of review sites do not do this, it gives the false impression that the games are running @ equal image quality settings and skews the fps results. NOT in ATI's favor, I might add.

This is ATI's problem, then, that the reviewers do not believe it's enough of an issue to warrant turning Q to HQ on Nvidia video cards.

Image Quality remains a subjective issue since it is a qualitative quantity.


You have no idea why different review sites do what they do.

I have quite a good idea of why the review sites review the way they do, thanks.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Cookie Monster
Originally posted by: CaiNaM
Originally posted by: Gstanfor
Once again Ackmed, yes, the HQ mode does turn OFF all the optimizations, advanced view or not. You are on a hiding to nothing here, trust me.

here's a reasonable question: since nv hardware is incapable of "angle independent" AF (the GF4 could, which is why its AF was so much better than r300's), how do you turn "angle dependent" AF off?

Didn't the FX series have angle independent AF? I'm sure the FX series, although being quite slow at DX9 games, did have crisper, hence better, texture quality compared to ATi's R300 angle dependent AF.

yes, the GF4 had the "angle independent" AF. the FX series added some optimizations, however this was "optimized" further in nv40, and yet again in the 7800 series; 7900 did not change one way or the other AFAIK.

 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: josh6079
Image Quality remains a subjective issue since it is a qualitative quantity.
:confused:
then that would mean HQ does not turn off all opts, since angle dependent AF (which is an optimization) is "hardwired"...
He answers one thing correctly and you throw it in his face with something that can't be helped by any software? While ATI's AF may in fact do more work than Nvidia's, the same could be said in comparison of AA between the two. Does the fact that ATI's AA tops out at 6x mean that it is an optimization since Nvidia's can do 8x?

Ok, maybe if I worded it as Qualitative Factor, you wouldn't have a problem with it.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Gstanfor
:disgust: No more hardwired than what you'll find in your precious R3xx/R4xx series GPUs...

so facts disgust you? i can see that from the pattern of your posts.

first of all, the r3/4xx series are not "precious" to me; i don't fall for fanboy loyalty.

secondly, what they did or did not do is completely irrelevant to this topic or our discussion. the GF4 had superb texture filtering, but that's neither here nor there. neither is the fact that, even tho it's more relevant to the discussion, current generation ATI hardware does not force angle dependent filtering (so you can actually turn it all off).

what is relevant is that your steadfast claim that you can turn off opts on current gen NV hardware is utterly incorrect, as it's (by your own admission) "hard-wired" into hardware - NV is incapable of running full texture filtering by design.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
The optimizations themselves are not hardwired in, but rather a different way (compared to previous nvidia GPUs) of performing AF is hardwired into nvidia's GPUs. This was in direct response to ATi's AF optimizations starting with R200 and the way consumers reacted to them. You can read nvidia's statement on this subject for yourself.

Oh, and BFG10K keeps comparing his G71 to his X800 in this thread (and others), so R3/4xx comparisons most certainly are valid for this thread, and precedent for this was NOT set by me...
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Creig
Originally posted by: keysplayr2003
Originally posted by: josh6079
It was a question, not an accusation. Thank you for clarifying an answer.

You still have yet to clarify:

• How it is a game engine defect when the driver settings can determine whether it happens or not.

• How the developers' favor for ATI hardware supports your claim that the problem is nonexistent on Nvidia hardware at default Q settings.

• How OpenGL extensions, device IDs, or capbits relate to the wiggling textures apparent on Nvidia's default driver settings.

• Why you suggest turning the conformant texture clamp on (which is already on to begin with, as well as OpenGL-restricted) yet also claim that it is the game's graphics engine problem and not the driver's.

• Why you are so defensive about what others claim to be poor default driver settings, even if the problem can be fixed within its other settings.

Josh, Josh, Josh..... Is it really worth another two weeks (if you're lucky) to keep this up? I mean, it's as if you don't care, or at least don't remember your vacation. It's not worth it, dude. Soon this argument will intensify into insults, and then flat out become abominable. For pages and pages and pages. Just like what happened with you and beggerking.
Point is........ Know when to call it a day.

I see nothing in Josh's response that is derogatory or inflammatory. Perhaps your time would be better spent warning people like housecat or Rollo.





Oh, wait....

So witty. If only you knew what you were talking about. ;)

Both you and Josh took me out of context. Let me clear this up for you both. I warned ONLY Josh to not continue (where things can get derogatory or inflammatory) because I don't want to see him get another vacation, or worse. Gstanfor on the other hand? Well, the less I have to view his bile posts the better.

 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Gstanfor
The optimizations themselves are not hardwired in, but rather a different way (compared to previous nvidia GPU's) of performing AF is hardwired into nvidia's gpu's. This was in direct response to ATi's AF optimizations starting with R200 and the way consumers reacted to them. You can read nvidia's statement on this subject for yourself.

of course it's in the hardware. by design the hardware is limited to angle dependent AF, which is an optimization.

'angle dependent' means that (at the basic level), rather than applying whatever level of AF designated by the user via the driver to the entire scene, the fx/nv40/g7x cards from nvidia use an angle dependent mechanism in hardware to determine how much filtering a particular area requires.

each poly within the scene is 'examined' and depending on the angle at which it 'slopes', it (the hardware, not the user via the driver cp) chooses an 'appropriate' level of filtering from 2x all the way up to whatever level is specified by the user via the driver cp or the application.
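
As an illustration of that mechanism, here is a toy model (Python; the falloff curve is purely hypothetical, since the actual NV4x/G7x selection logic is undocumented): surfaces near axis-aligned angles keep the user's full AF level, while surfaces near the 45-degree diagonals fall back toward 2x.

```python
import math

def af_level(surface_angle_deg, user_max_af=16):
    """Toy model of angle-dependent AF level selection.

    Hypothetical falloff -- the real hardware heuristic is undocumented.
    It only captures the shape of the behavior described above: the further
    a surface tilts from an axis-aligned angle toward a 45-degree diagonal,
    the lower the applied AF level, never below 2x.
    """
    # 1.0 at 0/90/180/270 degrees, 0.0 at the 45-degree diagonals.
    off_diagonal = abs(surface_angle_deg % 90 - 45) / 45.0
    level = max(2, int(user_max_af * off_diagonal))
    # Snap down to a power of two, since AF comes in 2x/4x/8x/16x steps.
    return 2 ** int(math.log2(level))

for angle in (0, 22.5, 45, 90):
    print(f"{angle:5.1f} deg -> {af_level(angle)}x AF")
# 0 and 90 degrees keep the full 16x; 45 degrees drops to the 2x floor,
# regardless of what the user selected in the control panel.
```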

that is certainly an 'optimization', and it runs regardless of whether HQ is selected or not - meaning the user cannot under any circumstance turn it off. this, despite your reluctance to admit you are wrong, proves you are.

now if you want to actually change the subject and start discussing where this 'method' became prevalent, then yes, it was with the r300 (9700) and continued thru the x850 series (there's actually a couple-year-old thread here where i compared the x800 and 6800 opts, and while they both used it, it was pretty clear nv40 still offered better AF). frankly, until ati's r5xx, the last decent card for texture filtering was the GeForce 4 - all cards from then until x1k offered very poor texture filtering, regardless of whether they were red or green.

however, neither the 6800GT nor the x800 showed the 'shimmering' issue; that came to bear with whatever changes nv made in their 7xxx series. in fact, it was outright terrible until (iirc) one of the 78.xx series drivers was released. this reduced the shimmering when using HQ mode (the default mode was still terrible), however it did not eliminate it.

as far as nvidia's statement, i don't need to read it, as i am very familiar with how this all transpired, having owned each gen of hardware from both ati and nvidia since the geforce2 and radeon 32 DDR.

Oh, and BFG10K keeps comparing his G71 to his X800 in this thread (and others), so R3/4xx comparisons most certainly are valid for this thread, and precedent for this was NOT set by me...

depends on what you're comparing. the x800 doesn't shimmer; the g71 does. OTOH you're participating in this discussion from a fan/loyalist point of view, and rather than looking at the facts objectively, you are jumping around topics, generations, etc. in an effort to make excuses (and failing rather miserably i might add).

ati is not without its faults, but on this particular subject it's nvidia that fails, not ati. your constant rhetoric and fallacies do not change this fact, nor do they change the fact that you are emphatically incorrect in your assertion that selecting HQ from the nv CP turns off all texture filtering optimizations - it can't, as unlike ati's current cards, g70/71's hardware is incapable of applying full AF across the entire scene.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: CaiNaM
'angle dependent' means that (at the basic level), rather than applying whatever level of AF designated by the user via the driver to the entire scene, the fx/nv40/g7x cards from nvidia use an angle dependent mechanism in hardware to determine how much filtering a particular area requires.

I think the filtering was a bit more optimized in the NV30/NV35, but it was still Angle Independent AF; NV40 was the first Nvidia GPU with Angle Dependent Anisotropic Filtering.

http://www.hardocp.com/article.html?art=NDcyLDcsLA==

I hope the G80 fixes the issue and reintroduces Angle Independent Filtering.

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
NV40 didn't have shimmering issues? What are you smoking, CaiNaM? It was a large issue for nv40 and much discussed on forums early on, until nvidia sorted the issue about a third of the way into the 60 series drivers. Frankly I find it highly amusing that the usual fanatics are busily dredging up the nearly 3-year-old past on this subject...
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Ok, maybe if I worded it as Qualitative Factor, you wouldn't have a problem with it.
I don't have a problem with it anyway, it was just a paradox; I was confused.
NV is incapable of running full texture filtering by design.
Sure, we know ATI has the AF crown right now. Does the fact that ATI can't do 8xAA mean that its own 6xAA is an optimization as well?
Both you and Josh took me out of context. Let me clear this up for you both. I warned ONLY Josh to not continue (where things can get derogatory or inflammatory) because I don't want to see him get another vacation, or worse.
ah.......keys...............really?.....................:heart: :thumbsup:........thanks bro...............sorry for misreading..........
I don't think AF is an optimization. It would be like TrAA with Nvidia vs. regular AA: you wouldn't say regular AA is an optimization, but rather just a different way of computing the AA.
Agreed. An optimization is a way of modifying a system to improve its efficiency, and there will always be tradeoffs. It is hard to modify something that is embedded into the GPU and has a hardware limitation. Saying that the AF for Nvidia's 7 series is an optimization would only be a valid point of view if one were comparing multiple generations of GPUs in which different algorithms were used. When glancing at a point in time where only one kind of each company's GPUs is compared, it shouldn't be considered an optimization but a feature. Features for GPUs change, but they cannot be forced to do more than they are capable of. Being able to change certain optimizations within a certain feature set is closer to the context of what we are discussing.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: CaiNaM

of course it's in the hardware. by design the hardware is limited to angle dependant AF, which is an optimization.

'angle dependant' means that (at the basic level), rather than applying whatever level of AF designated by the user via the driver to the entire scene, the fx/nv40/g7x cards from nvidia use an angle dependant mechanism in hardware to determine how much filtering a particular area requires.

each poly within the scene is 'examined' and depending on the angle at which it 'slopes', it (the hardware, not the user via the driver cp) chooses an 'appropriate' level of filtering from 2x all the way up to whatever level is specified by the user via the driver cp or the application.
A correction: you're confusing an adaptive filtering technique with the real difference between angle dependent and independent filtering, which is the math. The Euclidean method for calculating distance is independent, while the Manhattan method is dependent; however, the latter is easier to calculate at the hardware level and is faster as a result. Adaptive filtering is a further layer on that, which decides ahead of time if something is worth filtering or not (and to what degree), but adaptive filtering has little to do with angles other than that it takes angles into consideration in its calculations.

Ultimately, the reason NV4x/G7x can't do independent filtering is that they're using the Manhattan method, which limits them to dependent filtering no matter what they do.
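
A quick numeric illustration of ViRGE's point (Python; toy math, not GPU code). Both metrics agree on axis-aligned vectors, but the Manhattan distance overestimates the length of a unit vector by up to ~41% along the diagonals, so any filtering calculation built on it inherently varies with angle:

```python
import math

def euclidean(dx, dy):
    # Rotation-invariant: depends only on the true length of the vector,
    # so a filtering footprint computed from it is the same at any angle.
    return math.hypot(dx, dy)

def manhattan(dx, dy):
    # Cheaper in hardware (no multiplies or square root), but the result
    # varies with direction even for vectors of identical true length.
    return abs(dx) + abs(dy)

# Measure the same unit-length step at several angles:
for deg in (0, 22.5, 45):
    dx, dy = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    print(f"{deg:5.1f} deg  euclidean={euclidean(dx, dy):.3f}  "
          f"manhattan={manhattan(dx, dy):.3f}")
#   0.0 deg  euclidean=1.000  manhattan=1.000
#  22.5 deg  euclidean=1.000  manhattan=1.307
#  45.0 deg  euclidean=1.000  manhattan=1.414
```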
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Gstanfor
Once again Ackmed, yes, the HQ mode does turn OFF all the optimizations, advanced view or not. You are on a hiding to nothing here, trust me.


After reading BFG's post, it does seem I am wrong. I didn't think HQ turned the LOD clamp on. I thought I had to do that manually, even with HQ. Did they just start this? In any event, a post like this, or his, is much better than your first with all the insults. You see, I can admit when wrong.

Originally posted by: schneiderguy

what am i doing here? all i'm saying is that you're incorrect

Yes, I was. As are you with this silly statement: "there is no visible shimmering on nvidia cards, unless you're purposely putting your face an inch from the monitor to look for it."

The fact is, you don't have a clue. Perhaps you don't see shimmering on your small monitor, or in the games you play. On my large monitor, and in the games I played, I did see it easily. I didn't put my face an inch from the monitor, and I didn't go looking for it. It was still easily visible.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Ackmed


Originally posted by: schneiderguy

what am i doing here? all i'm saying is that you're incorrect

Yes, I was. As are you with this silly statement: "there is no visible shimmering on nvidia cards, unless you're purposely putting your face an inch from the monitor to look for it."

The fact is, you don't have a clue. Perhaps you don't see shimmering on your small monitor, or in the games you play. On my large monitor, and in the games I played, I did see it easily. I didn't put my face an inch from the monitor, and I didn't go looking for it. It was still easily visible.

you forgot part of my quote. :confused:

what i said was: with all optimizations OFF, there is no visible shimmering on nvidia cards, unless you're purposely putting your face an inch from the monitor to look for it

there is shimmering on Quality mode (default settings) and lower. i never said there wasn't.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
We were talking about HQ, I didn't forget anything. Even with HQ, there is still visible shimmering to me. Get over that fact. Yes, there is much less than with Q, but there is still shimmering with HQ. Again, your small monitor doesn't show shimmering nearly as badly as a large monitor, such as my 24" LCD. For you to claim that there is no shimmering is complete ignorance. You simply do not have a clue, as you do not use a large monitor.

edit:
There you go, yes, I said it, "shimmering." Specifically "texture crawling," caused by either aggressive filtering or really bad LOD. I notice it using the Dell 2405FPW LCD. I believe the brighter contrast and crisper image, coupled with the fact that the screen is just physically larger at a higher resolution, all amplifies the problem and makes it extremely visible. I don't notice it in all games, but there are a couple of games in which it did negatively impact my overall level of gaming immersion. While this is another raging Green vs. Red argument found many places on the Net, we have never specifically addressed it as we have never truly seen it impact our gameplay, but that is simply not the case with our 24" widescreen display.

http://enthusiast.hardocp.com/article.html?art=MTAwMSwxOCwsaGVudGh1c2lhc3Q=

As you can see, they say the same LCD as I have shimmers worse than smaller screens. I do not recall the article, but they compare the 2001FP to the 2405FPW as well, and again say that shimmering is more noticeable on the larger screen.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Which only proves that large screen LCDs have serious issues not seen with alternative (and superior) technologies...
EDIT: I've had my 7900GT running on a mate's LG 50" HD plasma TV, 1366x768 res (using component input, naturally). I didn't notice an appreciable difference in shimmering compared to my normal monitor at all. I think the large screen argument is just a silly red herring.
EDIT: I've had my 7900GT running on a mates LG 50" HD plasma TV, 1366x768 res (using component input, naturally). I didn't notice an appreciable difference in shimmering compared to my normal monitor at all. I think the large screen argument is just a silly red herring.