
HD6800 image quality settings vs. HD5800

tviceman

Diamond Member
I'm seriously NOT trying to start a fanboy flame war, I just want to know the truth. This was ripped from the comments on Anandtech's 6800 review:

"German magazine "PC Games Hardware" states the 68xx need "high quality" driver settings in order to reach 58xx image quality. Supposedly AMD confirmed changes regarding the driver's default settings.
Therefore they've tested in "high quality" mode and got less convincing results.

Details (german): http://www.pcgameshardware.de/aid,795021/Radeon-HD-6870-und-HD-6850-im-Test-AMDs-zweite-DirectX-11-Generation/Grafikkarte/Test/?page=4"

The one and only thing I am getting at is that I think comparisons should be done at equal levels - and to accurately compare these new cards vs. cypress/evergreen/gtx400, the exact same image quality settings should be used for all cards. I'm not as concerned with the benchmarks in the link provided as I am with the general statement. I just think this should be answered definitively: is this true? Are the default settings of the 6800 series lower than the default settings of the 5800 series? And if so, are the reviews around the web running the 5800 series benchmarks at higher image quality settings?
 
There is already a thread for HD6xxx reviews, you know... you might want to move this there. AFAIK there are some tweaks on by default in HD6k that do not visibly reduce IQ but give a slight speed bump. NV has been crying bloody murder and asking that reviewers turn that off. Or something. See Beyond3D for a discussion there about this very topic: http://forum.beyond3d.com/showthread.php?t=55313&page=156 esp. Dave Baumann's replies.
 
This isn't a thread about the reviews themselves - this is a thread about whether or not the default driver settings for the new 6k series cards are the same as the default driver settings for the 5k series.

EDIT: Thanks for the link.
 
Wow...if true, this is bad.

No need to bury it in the review thread, this is a whole different topic.


I'd like to see other sources on this.
 
I'll wait to see what BFG10K has to say before I make a judgment. It is sneaky of them, though. TBH, if either nVidia or AMD does this and I don't see a difference, then I don't mind, given the extra performance.
 
There is already a thread for HD6xxx reviews, you know... you might want to move this there.

A driver-related IQ topic deserves a separate thread IMO.

However, as such, the thread needs to stay focused and not wander into the topic-space covered by the other threads.

I'll wait to see what BFG10K has to say before I make a judgment. It is sneaky of them, though. TBH, if either nVidia or AMD does this and I don't see a difference, then I don't mind, given the extra performance.

Agreed.

And if anyone missed it, here's his latest IQ QA article (posted Oct 19): http://alienbabeltech.com/main/?p=21497
 
Wow...if true, this is bad.

No need to bury it in the review thread, this is a whole different topic.


I'd like to see other sources on this.

Indeed, it's bad.

I mean, I wouldn't class it as the same kind of bad as Nvidia's Batman AA fiasco, or the really bad and apparent money-under-the-desk HAWX2/Ubisoft benchmark buyout by Nvidia, or even in the same class as knowing you had faulty cards and still selling them to customers like me (2 laptops, which are not Asus, Dell or HP, that I just had to throw away after 2 years of use..) as Nvidia did.

But it's bad.
Or, well, you know what: it's bad if the average Joe or an average forum member like me and you notices it in every game.

If I don't notice it in a screen-to-screen comparison, I wouldn't care much if it speeds up AMD's or Nvidia's product. I mean, the most important thing is still a GPU's ability to push out graphics, right?

Of course we can argue that heat is an issue, power draw is an issue and image quality is an issue, and they are. But let's not butcher the cow until we know if it's sick.

As I recall, none of the major review sites (like Anandtech and HardOCP) even had the correct driver update with that setting.

So: if the HD 6870 beats the GTX 460 and even the 470 in just about every benchmark NOW, it's apparent that with even more aggressive drivers, AMD will lead by even more with their Barts XT and Cayman GPUs.

And imagine the drivers maturing over the coming year, like they did for the GTX 480/470. Good times ahead.

All in all, I'm reserving judgment on this until I see the reviews. If I notice an image quality loss, expect me to hold on to my GTX 460.
 
If Barts is just a tweaked Cypress, as is being said, I don't think the 6870 and 6850 are going to see much driver improvement.

It doesn't matter if you don't notice it all the time. If AMD is altering the IQ in order to appear faster in benchmarks, that is about as low as it gets.

I need to hear it from multiple sources before I get the pitchfork and torch out though 😉
 
It would make sense, as this doesn't appear to be a huge upgrade from the 5000 series. I would imagine the underlying hardware issue is still there, causing texture filtering to be poorer than Nvidia's.
 
If Barts is just a tweaked Cypress, as is being said, I don't think the 6870 and 6850 are going to see much driver improvement.

It doesn't matter if you don't notice it all the time. If AMD is altering the IQ in order to appear faster in benchmarks, that is about as low as it gets.

I need to hear it from multiple sources before I get the pitchfork and torch out though 😉

If it's as bad as you make it out to be, then I'm right behind you. Xbit are very thorough with their reviews. They are the ones that found that the AA in Metro2033 lowered texture detail. They also found that the textures were lower detail with GTX400 cards (when no other review could catch it; I don't remember seeing you with any pitchforks).

"Quality" is the new default setting, ie, the target render Replacements active. We noticed at the beginning of the test show that the anisotropic filter of the HD 6800 cards compared with a Radeon HD 5870 flickers stronger. We noticed at the beginning of the test show that the anisotropic filter of the HD 6800 cards compared with a Radeon HD 5870 flickers stronger. After consulting with AMD informed us that the default "quality" aggressive filtering than the previous driver standard (AI standard, where the texture filtering optimizations in the HD 5800 series have already disabled). After consulting with AMD informed us that the default "quality" aggressive filtering than the previous standard driver (AI standard, where the texture filtering optimizations in the HD 5800 series have already disabled). Only the "High Quality" bring the improvements to previously unknown levels, and is similar to the previous AI standard of HD 5800 cards. Only the "High Quality" bring the improvements to previously unknown levels, and is similar to the previous AI standard of HD cards 5800th The HD 6000 cards filter that is standard on the level of a HD 5000 card with AI Advanced / Advanced. The HD 6000 cards that filter is standard on the level of a HD 5000 card with AI Advanced / Advanced. AMD gives its new cards a fps advantage at the expense of image quality. AMD gives its new cards a fps advantage at the expense of image quality. Is "High quality" is activated, the AF unsightly banding disappears almost completely, the flicker is also reduced - the frame rate of course. Is "High quality" is activated, the AF unsightly banding disappears almost completely, so the flicker is reduced - the frame rate of course.

In short: "High Quality" on a Radeon HD 6000 is based on our assessment, most likely with Nvidia Drivers Standard (Quality / Quality including "Trilinear Optimization") is comparable - and so therefore we test. In short: "High Quality" on a Radeon HD 6000 is based on our assessment, most likely with Nvidia Drivers Standard (Quality / Quality including "Trilinear Optimization") is comparable - and so Therefore we tested.

The bottom line is that the development is positive: The bottom line is that the development is positive:
- The banding problem of evergreens, according to AMD, a hardware bug is fixed, - The banding problem of evergreens, according to AMD, a hardware bug is fixed,
- The AF-flicker is not as strong as with a HD 4000 and the render target optimizations can be switched off if desired. - The AF-flicker is not as strong as with a HD 4000 and the render target optimizations can be switched off if desired.
- The users are given more freedom than ever. - The users are given more freedom than ever.

While Nvidia is better as regards the AFs remain as close as the Radeon HD 6000 AMD was the Geforce cards but not a long time. While Nvidia is better as regards the AFs remain as close as the Radeon HD 6000 AMD what the Geforce cards but not a long time. If you value as clean as possible is filtered textures, but not around a Geforce including HQ AF. If you value as clean as possible is filtered textures, but not around a Geforce including HQ AF.

We will see in the AF under expanded testing in more detail and publish articles. We will see in the AF under expanded testing in more detail and publish articles. As part of the few test days this is our preliminary conclusion. As part of the test few days this is our preliminary conclusion.

Essentially they are saying that the default driver profile for the 6800 series is the same as Catalyst AI set to Advanced on the 5800 series (which everyone used anyway).

They even say that it's the same as nVidia's default driver settings.
 
Essentially they are saying that the default driver profile for the 6800 series is the same as Catalyst AI set to Advanced on the 5800 series (which everyone used anyway).

Well that's fine then. All reviewers should have been running benchmarks with Catalyst AI set to advanced on all Radeon cards in the first place. Not doing so is about as silly as leaving vsync on.
 
Well that's fine then. All reviewers should have been running benchmarks with Catalyst AI set to advanced on all Radeon cards in the first place. Not doing so is about as silly as leaving vsync on.

Yeah, especially when the optimizations in the nV driver are on by default and it's hard to turn ALL of them off.

I don't mind either way really, I don't notice these things anyway. I mean I can't even see the difference between 4xAA and 8xAA.
 
I'm seriously NOT trying to start a fanboy flame war, I just want to know the truth. This was ripped from the comments on Anandtech's 6800 review:

"German magazine "PC Games Hardware" states the 68xx need "high quality" driver settings in order to reach 58xx image quality. Supposedly AMD confirmed changes regarding the driver's default settings.
Therefore they've tested in "high quality" mode and got less convincing results.

Details (german): http://www.pcgameshardware.de/aid,795021/Radeon-HD-6870-und-HD-6850-im-Test-AMDs-zweite-DirectX-11-Generation/Grafikkarte/Test/?page=4"

The one and only thing I am getting at is that I think comparisons should be done at equal levels - and to accurately compare these new cards vs. cypress/evergreen/gtx400, the exact same image quality settings should be used for all cards. I'm not as concerned with the benchmarks in the link provided as I am with the general statement. I just think this should be answered definitively: is this true? Are the default settings of the 6800 series lower than the default settings of the 5800 series? And if so, are the reviews around the web running the 5800 series benchmarks at higher image quality settings?

History did not start yesterday. I think once nvidia got caught cheating on benchmarks all bets were off.
 
I don't see the problem if AMD gives you the option to turn off the optimizations. This is just analogous to another level of fine tuning the balance between performance and IQ - much the same as having the option to choose between 2x, 4x, 8x, 16x AF. Or choosing among the various levels of AA.
 
I don't see the problem if AMD gives you the option to turn off the optimizations. This is just analogous to another level of fine tuning the balance between performance and IQ - much the same as having the option to choose between 2x, 4x, 8x, 16x AF. Or choosing among the various levels of AA.

I don't really have an issue with it either. It's just a tiny bit sneaky to try to get better benchmark scores with the 'same' defaults. 58xx default IQ is better than 68xx default IQ, it would seem, but both are capable of the same IQ once you customize your settings; at least, that's what I'm gathering from this.
 
History did not start yesterday. I think once nvidia got caught cheating on benchmarks all bets were off.

You must not have finished reading what I said in my first post:

The one and only thing I am getting at is that I think comparisons should be done at equal levels - and to accurately compare these new cards vs. cypress/evergreen/gtx400, the exact same image quality settings should be used for all cards.
 
Well tviceman, you do know that they aren't the exact same cards.. right?

AMD was first out with superior image quality (and has actually been the brand with the best image quality if you go back and check all the way from the Radeon 9700, etc.), and Nvidia caught up to the 5xxx series with their nice new 400 series AA type.

There will be differences, but look at the number of games where anyone has noticed a difference between the image quality of AMD/Nvidia. Very few, and that's how it's gonna stay. Unless AMD really messes with this MOO AA thing 🙂 let's wait and see in a few days / a week
 
Well tviceman, you do know that they aren't the exact same cards.. right?

AMD was first out with superior image quality (and has actually been the brand with the best image quality if you go back and check all the way from the Radeon 9700, etc.), and Nvidia caught up to the 5xxx series with their nice new 400 series AA type.

There will be differences, but look at the number of games where anyone has noticed a difference between the image quality of AMD/Nvidia. Very few, and that's how it's gonna stay. Unless AMD really messes with this MOO AA thing 🙂 let's wait and see in a few days / a week

I was aware of all of this; I just wanted to make sure all benchmarks were apples-to-apples comparisons, that is all.
 
The Guru3D review touches on IQ. "Identical" sums it up if you are comparing AMD 68xx IQ vs. nVidia 4xx IQ.

The reviews are there; IQ isn't being debated as a hot topic. The topic of this thread is grasping at straws in an attempt to sabotage what AMD has achieved with its recent release of the 6800 series.

The 6870 and 6850 are fast and provide X IQ (substitute whatever word you would use to describe nVidia IQ in its current generation). The 68xx isn't getting its speed from what is being subtly called a "cheat" in this thread.
 
The one and only thing I am getting at is that I think comparisons should be done at equal levels - and to accurately compare these new cards vs. cypress/evergreen/gtx400, the exact same image quality settings should be used for all cards.
That's often not possible given hardware differences between the cards. About the only thing you can do is set all of the parts to their highest quality settings, then analyze any observable differences. I operate my cards like that when I game, so I benchmark exactly the same way.

As for the specific case of the 6000 series, from what I've seen so far, the highest quality settings are at least as good as nVidia's at surface filtering, while also having superior angle coverage.

It's possible the default 6000 settings look worse, but that's irrelevant given what I posted about using the highest quality settings, and also because I doubt any real performance change will be observed. nVidia's default optimizations also offer little to no performance gain, so there's absolutely no point in using them.

I don’t have a 6000 part yet but when I do, rest assured I’ll be taking a good look at its image quality as well as its performance.
 
The Guru3D review touches on IQ. "Identical" sums it up if you are comparing AMD 68xx IQ vs. nVidia 4xx IQ.
No disrespect to Guru3D, but it's tough to take a website seriously on the topic of IQ if they imply that MAA is analogous to CSAA.
 
- The banding problem of Evergreen, which according to AMD was a hardware bug, is fixed,

This makes me so mad. Personally, I think there should be a recall on the 5800s since the card is essentially broken hardware-wise.
 