sirmo
Golden Member
yes and they were offline: https://www.reddit.com/r/Amd/comments/4m692q/concerning_the_aots_image_quality_controversy/d3tvaax
Looks like I do, yay me
It's a bit embarrassing because you don't seem to know what the word conjecture means.
So the 1080's better showing over the 980Ti/Titan X in DX12 is due to a dodgy driver that skips rendering shaders??
Wow. Cheating busted, where's the tech press up in arms about IHVs cheating with drivers?
*crickets chirping*
That's right. USA tech sites, all bought out. Leave it to EU tech sites to cover this.
Still waiting for your arguments. Do you have any evidence that contradicts what he says?
Can you explain where AMD got that press-only driver? Makes us wonder where that Geforce GTX 1080 came from, if a hardware website 'close' to them provided a review sample.
Does where AMD got it from have any relevance at all? It was the driver that NV released in order to benchmark the 1080. Once the NDA was lifted, I can't see any reason for AMD not to have it, since that was the driver NV gave out for reviews of their card. AMD was comparing the 480 to what had been reviewed in the press.
As for where they got it: well, it was released to the public days before AMD's presentation at Computex. Instead of your insinuation that it came from some nefarious source, how about the simplest answer: they bought a few from a store.
Yeah. He called him out for misquoting right away. That was the only evidence he presented. So the post is invalid.
Retail comes with a newer version, so no. Where did they get it? Was it from someone who received a card from NVIDIA to review? They didn't even bother to update it.
Once again, let me ask: does wherever AMD got the driver from have any relevance to the benchmark comparisons that they did? Was it in any way unfair for AMD to use the driver that NV gave to the press for reviewing the 1080 when benchmarking the 480 against the 1080?
Conspiracy theories aside, was AMD's comparison of the 480 in CF against the 1080 in ANY way an unfair benchmark?
I wouldn't call it unfair, but considering how everybody knows this title favours their hardware and provides better than average multi-GPU scaling, it was very likely a best case scenario. It doesn't even make sense as a choice; they missed the opportunity to make it look like great perf/$ by comparing it to the Geforce GTX 960/970 instead.
I don't know, are there any comparisons out? The fact that they weren't using up-to-date drivers tells me they weren't paying much attention to the NVIDIA system.
https://twitter.com/dankbaker/status/739880981612625920
No it's not, but let's pretend it is. Lots of valid points in his posts.
It may be capable in AotS, but that might have been Oxide's intention. "AMD can do AC, NV can't, so we'll put that in the game". Not necessarily true, but it is possible in a GE title.
I can't see how it's a best case scenario. AMD does do better on AotS than NV, but from everything that I've read, that's not because the game is tilted to AMD, but because DX12 allows AMD's hardware to unlock its full potential in a way that wasn't available in DX11. Additionally, NV had full access to the source code and was able to optimize it for their hardware. If anything, since both AMD and NV have been able to see and modify source code to optimize performance on their hardware since at least back to the Beta version, I tend to think of this as pretty much a good apples-to-apples comparison for what is possible on both brands of hardware under DX12.
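To make the driver overhead point concrete: under DX12 the application records command lists on as many threads as it likes and then hands them to the queue in one cheap call, instead of funnelling every draw through the driver's single DX11 immediate context. Rough sketch below (obviously not Ashes' actual code; error handling is omitted and the thread count is just an example):

[CODE]
// Sketch: multithreaded D3D12 command list recording. Link with d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    const int kThreads = 4;  // made-up number, one list per worker thread
    ComPtr<ID3D12CommandAllocator> allocators[kThreads];
    ComPtr<ID3D12GraphicsCommandList> lists[kThreads];
    for (int i = 0; i < kThreads; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Each worker records its own chunk of the frame in parallel.
    // (Real code would set PSOs, root signatures and issue draws here.)
    std::vector<std::thread> workers;
    for (int i = 0; i < kThreads; ++i) {
        workers.emplace_back([&, i] {
            // ... record this thread's portion of the scene ...
            lists[i]->Close();
        });
    }
    for (auto& t : workers) t.join();

    // One cheap submission of everything that was built in parallel.
    ID3D12CommandList* raw[kThreads];
    for (int i = 0; i < kThreads; ++i) raw[i] = lists[i].Get();
    queue->ExecuteCommandLists(kThreads, raw);
    return 0;
}
[/CODE]

That per-frame submission work living in the app instead of the driver is basically what the Oxide quote further down means by "CPU driver overhead reductions".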
It may be capable in AotS, but that might have been Oxide's intention. "AMD can do AC, NV can't, so we'll put that in the game". Not necessarily true, but it is possible in a GE title.
On the other hand, this game seems to make good use of AC, so maybe they just wanted to test it out.
http://www.overclock.net/t/1575638/...able-legends-dx12-benchmark/110#post_24475280
Saying we heavily rely on async compute is a pretty big stretch. We spent a grand total of maybe 5 days on Async Shader support. It essentially entailed moving some (a grand total of 4, IIRC) compute jobs from the graphics queue to the compute queue and setting up the dependencies. Async compute wasn't available when we began architecting (is that a word?) the engine, so it just wasn't an option to build around even if we wanted to. I'm not sure where this myth is coming from that we architected around Async compute. Not to say you couldn't do such a thing, and it might be a really interesting design, but it's not OUR current design.
Saying that Multi-Engine (aka Async Compute) is the root of performance increases on Ashes between DX11 and DX12 on AMD is definitely not true. Most of the performance gains in AMD's case are due to CPU driver overhead reductions. Async is a modest perf increase relative to that. Weirdly, though there is a marketing deal on Ashes with AMD, they never did ask us to use async compute. Since it was part of D3D12, we just decided to give it a whirl.
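For anyone wondering what "moving some compute jobs from the graphics queue to the compute queue and setting up the dependencies" actually looks like in D3D12, here's a rough sketch. It's not Oxide's code; the command lists are left empty and the fence values are placeholders, but the two-queue/fence pattern is the gist of it:

[CODE]
// Sketch: one graphics (direct) queue, one compute queue, dependency via fence.
// Link with d3d12.lib; error handling omitted.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // Two queues the hardware can schedule independently.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> gfxQueue, compQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&compQueue));

    // Command lists for each queue (actual draws/dispatches left out).
    ComPtr<ID3D12CommandAllocator> gfxAlloc, compAlloc;
    ComPtr<ID3D12GraphicsCommandList> gfxList, compList;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT, IID_PPV_ARGS(&gfxAlloc));
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_COMPUTE, IID_PPV_ARGS(&compAlloc));
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT, gfxAlloc.Get(), nullptr, IID_PPV_ARGS(&gfxList));
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_COMPUTE, compAlloc.Get(), nullptr, IID_PPV_ARGS(&compList));
    gfxList->Close();   // real code: record the graphics work that produces the compute job's input
    compList->Close();  // real code: record the Dispatch() that was moved off the graphics queue

    // The fence expresses the dependency: compute may not start until the
    // graphics work it depends on has signalled completion.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

    ID3D12CommandList* gfxLists[]  = { gfxList.Get() };
    ID3D12CommandList* compLists[] = { compList.Get() };

    gfxQueue->ExecuteCommandLists(1, gfxLists);
    gfxQueue->Signal(fence.Get(), 1);    // graphics queue: "my part is done"
    compQueue->Wait(fence.Get(), 1);     // compute queue: GPU-side wait for that signal
    compQueue->ExecuteCommandLists(1, compLists);
    compQueue->Signal(fence.Get(), 2);   // graphics could Wait() on 2 before consuming the results
    return 0;
}
[/CODE]

The dependency is expressed with a fence between the two queues, so the GPU is free to overlap the compute job with other graphics work instead of serializing it on the graphics queue, which is where the (modest, per the quote above) async gains come from.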
Can you explain where AMD got that press-only driver? Makes us wonder where that Geforce GTX 1080 came from, if a hardware website 'close' to them provided a review sample.
What exactly is wrong with this though?
I love how this has turned from 'the IQ was off due to a rendering issue in Nvidia's drivers' into a conspiracy about how AMD got the bad rendering driver in the first place. If there were only a few % difference in performance, no one would have noticed, and all the reviews would have been done with the bad driver.
But because people were so dead set on making AMD look bad when they thought AMD was rendering incorrectly, they are now spinning the issue to once again try to make AMD look bad.