GTX 970 vs. RX 480 Doom... peculiar image quality discrepancy


boozzer

Golden Member
Jan 12, 2012
Seems like testing it on a 7970 or 280X would answer the question. Good suggestion.
 

Yakk

Golden Member
May 28, 2016
The engine itself does that by design.

I would assume so. And I'm saying this compromise may have been solved by AMD by being able to use compute threads through the async scheduler, while nvidia is stuck using context switching in its work queue, and the higher nvidia clocks cannot compensate for the multitasking AMD GPUs are able to do.
 

Leyawiin

Diamond Member
Nov 11, 2008
He should go into the Nvidia Control Panel and set "Texture Filtering" from Quality to High Quality. This turns off any texture-related performance optimizations. If the issue disappears, he'd have his answer (driver vs. architecture). If it's the driver, contact NV and let them know how glaringly poor their IQ is vs. AMD's because of the texture loading optimizations.
 

dogen1

Senior member
Oct 14, 2014
He should go into the Nvidia Control Panel and set "Texture Filtering" from Quality to High Quality. This turns off any texture-related performance optimizations. If the issue disappears, he'd have his answer (driver vs. architecture). If it's the driver, contact NV and let them know how glaringly poor their IQ is vs. AMD's because of the texture loading optimizations.

This isn't related to texture filtering.
 

Stuka87

Diamond Member
Dec 10, 2010
He should go into the Nvidia Control Panel and set "Texture Filtering" from Quality to High Quality. This turns off any texture-related performance optimizations. If the issue disappears, he'd have his answer (driver vs. architecture). If it's the driver, contact NV and let them know how glaringly poor their IQ is vs. AMD's because of the texture loading optimizations.

The fact that the default setting is not High Quality is an issue in itself. Every benchmarker uses the default driver settings (as they should), which means the benchmark is run with lower quality settings than on an AMD card.
 

Leyawiin

Diamond Member
Nov 11, 2008
This isn't related to texture filtering.

You don't know for certain what optimizations it controls from game to game. It's a quick check.

The fact that the default setting is not High Quality is an issue in itself. Every benchmarker uses the default driver settings (as they should), which means the benchmark is run with lower quality settings than on an AMD card.

That's not the point.
 

Thala

Golden Member
Nov 12, 2014
The fact that the default setting is not High Quality is an issue in itself. Every benchmarker uses the default driver settings (as they should), which means the benchmark is run with lower quality settings than on an AMD card.

Is this an assumption or a fact? If this is true, you would not be comparing like with like when running a benchmark.
 

3DVagabond

Lifer
Aug 10, 2009
I'm saying I don't know, but I don't expect it to.

Well, there are obviously other ways of determining performance than simply FPS. If, for example, we consider frame time variations unacceptable because they hurt immersion, why wouldn't we consider this unacceptable and "poor performance" as well?
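
To illustrate the point, here is a minimal C++ sketch (with invented frame time traces) of how two runs can report the same average FPS while one of them stutters badly; the 99th-percentile frame time exposes what the average hides:

    // Illustrative only: two hypothetical frame time traces with the same
    // average FPS but very different consistency.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    static void report(const char* name, std::vector<double> ms) {
        double total = 0.0;
        for (double t : ms) total += t;
        double avgFps = 1000.0 * ms.size() / total;  // frames per second over the run
        std::sort(ms.begin(), ms.end());
        double p99 = ms[static_cast<size_t>(ms.size() * 0.99)];  // 99th-percentile frame time
        std::printf("%s: avg %.1f fps, 99th-percentile frame time %.1f ms\n", name, avgFps, p99);
    }

    int main() {
        std::vector<double> smooth(100, 16.7);   // every frame ~16.7 ms (~60 fps)
        std::vector<double> stutter(100, 13.0);  // same total time, but...
        for (size_t i = 9; i < stutter.size(); i += 10) stutter[i] = 50.0;  // ...every 10th frame spikes
        report("smooth ", smooth);   // avg ~59.9 fps, p99 16.7 ms
        report("stutter", stutter);  // avg ~59.9 fps, p99 50.0 ms
    }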
 

antihelten

Golden Member
Feb 2, 2012
This whole thing reminds me of the texture popping issues of Rage (another id software game, I might add).
 

dogen1

Senior member
Oct 14, 2014
Well, there are obviously other ways of determining performance than simply FPS. If, for example, we consider frame time variations unacceptable because they hurt immersion, why wouldn't we consider this unacceptable and "poor performance" as well?

We should. I just tend to say framerate or fps out of habit, instead of frametime.
 

Stuka87

Diamond Member
Dec 10, 2010
Is this an assumption or a fact? If this is true, you would not be comparing like with like when running a benchmark.

Me saying "every" benchmarker may be pushing it a bit. But the majority of big sites do not change driver settings in the control panels. On occasions where they do, they note what they changed.
 

antihelten

Golden Member
Feb 2, 2012
I understand people want to blame any issues on the Dev. If it was the game, then it would happen on all cards.

In this case it may actually very well be the developer's "fault", although it wouldn't really be a "fault" as such, since it would be the intended behavior.

In the case of RAGE (id Tech 5), the issue was with the new megatexture technology and the fact that the texture cache was arguably set too low (later fixed by a patch).

I don't think DOOM (id Tech 6) is using megatextures, but it obviously still streams in textures. It may very well be that in DOOM the texture cache is set dynamically based on the reported VRAM amount, and the 970 just so happens to end up on the low end with its 4 GB.

It would be very interesting to see how a 4 GB RX 480 behaves compared to the 8 GB version, and also how a 970 with a manually raised texture cache size performs (id software may have set the cache size too conservatively).
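
If the cache really is sized from reported VRAM, the policy might look something like this hypothetical C++ sketch; the reserve and clamp numbers are invented for illustration, not taken from id's code:

    // Hypothetical sketch of a VRAM-based texture cache policy -- not id's
    // actual code, just the kind of heuristic being speculated about above.
    #include <algorithm>
    #include <cstdio>

    // Returns a texture streaming cache budget in MB for a reported VRAM size.
    // The reserve and clamp values below are invented for illustration.
    static int textureCacheBudgetMB(int reportedVramMB) {
        const int reservedMB = 2048;  // render targets, geometry, driver overhead
        int budget = reportedVramMB - reservedMB;
        return std::clamp(budget, 1536, 6144);  // conservative floor and ceiling
    }

    int main() {
        for (int vram : {4096, 6144, 8192, 12288}) {
            // A 4 GB card ends up with by far the smallest streaming budget.
            std::printf("%5d MB VRAM -> %4d MB texture cache\n", vram, textureCacheBudgetMB(vram));
        }
    }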
 

Erenhardt

Diamond Member
Dec 1, 2012
No wonder gtx 1080 is faster :D
[image: BYrMJOM.jpg]

VRAM issue? This card has twice as much as the Fury, and 2 GB more than the 980 Ti, both of which render in full quality.
 

Deders

Platinum Member
Oct 14, 2012
Could it be because Doom wants 6 GB of VRAM for full detail, something the tested Nvidia cards don't have?
 

gamervivek

Senior member
Jan 17, 2011
No wonder gtx 1080 is faster :D
[image: BYrMJOM.jpg]

VRAM issue? This card has twice as much as the Fury, and 2 GB more than the 980 Ti, both of which render in full quality.

What I don't get is why nvidia cards show such anomalies when I can see them using a full gig or even more memory than the Fury X in his videos. Lower LoD is fine, but this is straight-up blurry textures, which shouldn't happen at any setting.

I remember similar stuff happening with a Titan X vs. Fury X comparison in BF4 last year, where the tester somehow managed to increase the performance gap after turning on higher quality settings in the nvidia panel, when it should have decreased. :sneaky:
 

Flapdrol1337

Golden Member
May 21, 2014
No wonder gtx 1080 is faster :D
[image: BYrMJOM.jpg]

VRAM issue? This card has twice as much as the Fury, and 2 GB more than the 980 Ti, both of which render in full quality.

Looks like the 0x anisotropic filtering again, though the rest of the scene seems unaffected. Maybe it's just a bug.

People shouldn't set the nvidia control panel to max performance. Keep it on quality. I just set it to max quality and disable the anisotropic sample optimization thingy. It doesn't really make a performance difference anyway.
 
Last edited:
Apr 30, 2016
DOOM's Vulkan implementation uses Async compute to speed up the MegaTexture texture streaming.

The GTX 970 is not running Async compute.

It's interesting to see idTech 5's virtual texturing finally become somewhat viable a few years later, with the features of the new APIs such as Async Compute and Tiled Resources, and of course the faster hardware and memory of today.
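
In rough outline, the submission pattern being described might look like the following Vulkan sketch; this is a hedged illustration, not id's actual code, and all handles (queues, command buffers, the semaphore) are assumed to be created elsewhere:

    // Rough Vulkan sketch of the async compute idea: texture transcode work is
    // submitted on a dedicated compute queue, and the graphics queue waits on a
    // semaphore only at the fragment stage. Instance/device/command buffer
    // setup is omitted; every handle here is assumed to be valid.
    #include <vulkan/vulkan.h>

    void submitStreamingWork(VkQueue computeQueue, VkQueue graphicsQueue,
                             VkCommandBuffer transcodeCmd, VkCommandBuffer drawCmd,
                             VkSemaphore transcodeDone) {
        // 1. Transcode/upload streamed texture pages on the compute queue.
        VkSubmitInfo computeSubmit{};
        computeSubmit.sType = VK_STRUCTURE_TYPE_SUBMIT_INFO;
        computeSubmit.commandBufferCount = 1;
        computeSubmit.pCommandBuffers = &transcodeCmd;
        computeSubmit.signalSemaphoreCount = 1;
        computeSubmit.pSignalSemaphores = &transcodeDone;
        vkQueueSubmit(computeQueue, 1, &computeSubmit, VK_NULL_HANDLE);

        // 2. On hardware with multiple hardware queues the graphics work can
        //    overlap with the transcode; only fragment shading waits for it.
        VkPipelineStageFlags waitStage = VK_PIPELINE_STAGE_FRAGMENT_SHADER_BIT;
        VkSubmitInfo graphicsSubmit{};
        graphicsSubmit.sType = VK_STRUCTURE_TYPE_SUBMIT_INFO;
        graphicsSubmit.waitSemaphoreCount = 1;
        graphicsSubmit.pWaitSemaphores = &transcodeDone;
        graphicsSubmit.pWaitDstStageMask = &waitStage;
        graphicsSubmit.commandBufferCount = 1;
        graphicsSubmit.pCommandBuffers = &drawCmd;
        vkQueueSubmit(graphicsQueue, 1, &graphicsSubmit, VK_NULL_HANDLE);
    }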
 

TheRyuu

Diamond Member
Dec 3, 2005
No wonder gtx 1080 is faster :D
[image: BYrMJOM.jpg]

VRAM issue? This card has twice as much as the Fury, and 2 GB more than the 980 Ti, both of which render in full quality.

You can't use a YouTube video as a valid comparison. The video compression artifacts make any comparison meaningless, since they taint the results. The compression can make three identical scenes look different side by side; there's just no way to tell. You need to be comparing lossless screenshots.
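
As a minimal example of such a comparison, here is a small C++ tool built on the public single-header stb_image library (github.com/nothings/stb) that reports the mean per-channel difference between two lossless captures; the file names are placeholders:

    // Minimal pixel-diff between two lossless screenshots. Identical
    // captures report 0; video-compressed frames generally would not.
    #define STB_IMAGE_IMPLEMENTATION
    #include "stb_image.h"
    #include <cstdio>
    #include <cstdlib>

    int main(int argc, char** argv) {
        if (argc != 3) {
            std::fprintf(stderr, "usage: %s a.png b.png\n", argv[0]);
            return 1;
        }
        int w1, h1, c1, w2, h2, c2;
        unsigned char* a = stbi_load(argv[1], &w1, &h1, &c1, 3);  // force RGB
        unsigned char* b = stbi_load(argv[2], &w2, &h2, &c2, 3);
        if (!a || !b || w1 != w2 || h1 != h2) {
            std::fprintf(stderr, "load failed or dimensions differ\n");
            return 1;
        }
        long long totalDiff = 0;
        const long long n = 3LL * w1 * h1;
        for (long long i = 0; i < n; ++i)
            totalDiff += std::abs(a[i] - b[i]);
        std::printf("mean abs per-channel diff: %.3f (0 = identical)\n",
                    double(totalDiff) / double(n));
        stbi_image_free(a);
        stbi_image_free(b);
    }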