AT's ATI X1950 review: one factor made it a great disappointment to me

dug777

Lifer
I can't see why anyone would buy a 7950GX2 or a 7900GTX and run it at the default Quality (Q) setting, since it is almost universally agreed that Q provides awful IQ, particularly with regard to texture crawl/shimmer. More to the point, Q offers a significant performance benefit over High Quality (HQ) in the tests both I and others have done... I'd like to be able to see what the card is capable of at the settings people are actually going to run it at, and that includes driver settings.

It's sad that the reviewers haven't bothered to note that this has been getting significant discussion & interest in their own forums... in fact, the IQ issue appears to be almost totally ignored in the review.

Just my 2c at the end of the day, but FWIW I'm not impressed.
 

lopri

Elite Member
FiringSquad did an in-depth analysis of the image quality differences between ATI and NV:

http://www.firingsquad.com/hardware/ati_nvidia_image_quality_showdown_august06/

I don't see a problem with using the default settings for both cards. I don't have a problem with reviewers using HQ for NV cards, either. Maybe it'll change the order on the graphs by one place at most, and if that's so important to you while reading reviews, instead of getting the whole picture, then that's up to you. There are reviews that use HQ for NV cards, and you can find them easily online. Also, according to the AT review, it seems clear that the new X1950 provides the best single-GPU gaming experience. (And that's what I mean by the 'whole picture'.)

 

dug777

Lifer
Originally posted by: lopri
FiringSquad did an in-depth analysis of the image quality differences between ATI and NV:

http://www.firingsquad.com/hardware/ati_nvidia_image_quality_showdown_august06/

I don't see a problem with using the default settings for both cards. I don't have a problem with reviewers using HQ for NV cards, either. Maybe it'll change the order on the graphs by one place at most, and if that's so important to you while reading reviews, instead of getting the whole picture, then that's up to you. There are reviews that use HQ for NV cards, and you can find them easily online. Also, according to the AT review, it seems clear that the new X1950 provides the best single-GPU gaming experience. (And that's what I mean by the 'whole picture'.)

Yeah, well, I'd expect AT to pick up on it & do it right.

I have a problem with it because, as a 6600GT owner, I'm well aware of the (IMHO, but I'm certainly not alone in this) extremely average default IQ settings of Nvidia's drivers, as well as the not-insignificant impact of running HQ (which essentially removes the vast majority of the IQ issues I have)...

As I said, it's just my 2c at the end of the day.
 

blckgrffn

Diamond Member
Originally posted by: dug777
Yeah, well, I'd expect AT to pick up on it & do it right.

I have a problem with it because, as a 6600GT owner, I'm well aware of the (IMHO, but I'm certainly not alone in this) extremely average default IQ settings of Nvidia's drivers, as well as the not-insignificant impact of running HQ (which essentially removes the vast majority of the IQ issues I have)...

As I said, it's just my 2c at the end of the day.


More like "mid day", it being 3:30 here when you posted :p

I too wish they would at least run a couple of the benches at HQ, to give us a good idea of what the performance difference is, in a testing manner we can all agree on.

Once it is documented by a site like anandtech, the "huge" performance drop for good IQ by enabling HQ mode that is, hopefully nvidia would work to make their Q mode more HQ and less performance. I mean, Quality is supposed to be quality, not high performance as it seemingly is right now.

Some sort of happy medium between Q and HQ would be cool... that's why there are 4 quality settings, right?
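Just to illustrate what I mean by documenting it, here's a rough Python sketch of how you'd quantify the Q-to-HQ drop from repeated timedemo runs (the FPS numbers are made-up placeholders, not real benchmark results):

# Rough sketch: quantify the performance cost of HQ vs Q from recorded
# benchmark runs. All FPS numbers below are hypothetical placeholders.

def mean_fps(runs):
    """Average FPS across repeated runs of the same timedemo."""
    return sum(runs) / len(runs)

def pct_drop(q_runs, hq_runs):
    """Percent of performance lost going from Q to HQ."""
    q, hq = mean_fps(q_runs), mean_fps(hq_runs)
    return (q - hq) / q * 100

# Hypothetical numbers, purely for illustration:
q_runs = [61.2, 60.8, 61.5]    # driver set to Quality (Q)
hq_runs = [53.9, 54.3, 54.1]   # driver set to High Quality (HQ)

print(f"Q average:  {mean_fps(q_runs):.1f} fps")
print(f"HQ average: {mean_fps(hq_runs):.1f} fps")
print(f"Drop from Q to HQ: {pct_drop(q_runs, hq_runs):.1f}%")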
 

LittleNemoNES

Diamond Member
The problem can be 'solved' by ATI, from now on, making their driver defaults Cat AI Advanced and the Quality (not HQ) mipmap setting. Easy! And we can lie to ourselves and say that both companies give us what we want.

:(

At least it will be fair.
 

dug777

Lifer
Originally posted by: blckgrffn
More like "midday", it being 3:30 here when you posted :p

I too wish they would at least run a couple of the benches at HQ, to give us a good idea of what the performance difference is, in a testing manner we can all agree on.

Once the "huge" performance drop for good IQ (that is, from enabling HQ mode) is documented by a site like AnandTech, hopefully Nvidia would work to make their Q mode more about quality and less about performance. I mean, Quality is supposed to mean quality, not high performance, as it seemingly does right now.

Some sort of happy medium between Q and HQ would be cool... that's why there are four quality settings, right?

Indeed, and as I found in my testing here, to all intents and purposes Q is as fast as HP, and HP/P/Q are essentially redundant as separate settings:

http://forums.anandtech.com/messageview...atid=31&threadid=1914268&enterthread=y
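The gist of that comparison, as a quick sketch (with hypothetical placeholder numbers, not my actual results):

# Sketch: compare average FPS across all four driver settings to see
# which ones are actually distinct. Numbers are hypothetical placeholders.

settings = {
    "HP": [68.1, 67.9],  # High Performance
    "P":  [67.8, 68.0],  # Performance
    "Q":  [67.5, 67.7],  # Quality
    "HQ": [59.2, 59.0],  # High Quality
}

for name, runs in settings.items():
    avg = sum(runs) / len(runs)
    print(f"{name:>2}: {avg:.1f} fps")

# If HP, P, and Q all land within run-to-run noise of each other,
# they are redundant as separate settings; only HQ stands apart.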
 

APCOR

Member
I don't think NV or ATI cares about the default driver settings as they pertain to benchmarks. Both companies are big boys and know their cards are being reviewed. Let them determine where the default settings should be. It is getting harder with each review to test on an even playing field, with all of the varying factors (resolutions, in-game settings, drivers). That is why there are so many conflicting reviews out now.
 

blckgrffn

Diamond Member
Originally posted by: dug777
Indeed, and as I found in my testing here, to all intents and purposes Q is as fast as HP, and HP/P/Q are essentially redundant as separate settings:

http://forums.anandtech.com/messageview...atid=31&threadid=1914268&enterthread=y


I noticed this to some degree too. I mean, you would expect better performance out of HP than Q, right? But you don't really get it, so what is the point?
 

Ackmed

Diamond Member
Originally posted by: lopri
FiringSquad did an in-depth analysis of the image quality differences between ATI and NV.

Not really. Their next one will be:
"In Part 2 we'd like to explore the control panel options offered by ATI and NVIDIA in more depth, rather than sticking to the basic settings. Be on the lookout for that article in the weeks ahead."

 

dug777

Lifer
Originally posted by: Ackmed
Originally posted by: lopri
FiringSquad did an in-depth analysis of the image quality differences between ATI and NV.

Not really. Their next one will be:
"In Part 2 we'd like to explore the control panel options offered by ATI and NVIDIA in more depth, rather than sticking to the basic settings. Be on the lookout for that article in the weeks ahead."

That will be interesting.