"Except you absolutely do have a way of telling how much vaseline it applies to what parts of the image, namely due to the fact that no encoder in existence will selectively "vaseline" specific textures and nothing else on the image."
So how does the encoder decide how much detail to remove from a scene when compressing it to a certain bitrate?
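For what it's worth, this is roughly how it works (a toy Python sketch of the general idea only, not any real codec's implementation): the encoder picks a quantizer so the stream fits the bit budget, and coarser quantization rounds away fine detail wherever it is most expensive to keep.

```python
# Toy sketch of encoder rate control -- illustrates the general idea only,
# not any real codec's implementation.

def encode_frame(blocks, bit_budget):
    qp = 20                                    # quantizer: low = sharp, high = blurry
    while True:
        sizes = [bits_needed(b, qp) for b in blocks]
        if sum(sizes) <= bit_budget or qp >= 51:
            # every block is squeezed by the same global pressure;
            # detailed areas simply lose the most visible information
            return [(b, qp) for b in blocks]
        qp += 1                                # tighten globally until we fit

def bits_needed(block, qp):
    # stand-in cost model: more detail costs more bits, coarser QP costs fewer
    return int(block["detail"] * 100 / (qp + 1)) + 8

frame = [{"detail": d} for d in (5, 40, 90)]   # flat wall, grass, dense foliage
print(encode_frame(frame, bit_budget=100))
```

In other words, the encoder very much does decide to drop detail to hit the bitrate; where that loss is most visible depends on the content.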
"Repeating this doesn't make it any more relevant or correct. Lossless images are still not necessary, no matter how many times you keep claiming it."
I didn't say it was a necessity; I said I disagree with using them at this point in the discussion. Or did I actually say that? If so, I apologise; that's not what I meant to imply.
"The only notable difference between video compression and image compression is the possibility of interframe compression with videos, but again that will never result in the kind of errors we are seeing here."
Exactly. So in the video, are you comparing an I-frame, a B-frame, or a P-frame? (A rough way to check is sketched at the end of this post.)
Again, let me quote myself:
"But then, again, if you all want to discuss this based on Youtube footage, then I don't have a problem with this. I just agree with the criticisms that now an initial investigation has been done with Youtube footage, someone should get looking at lossless pictures."
So you guys can keep arguing with me if you like, but it won't change my opinion on this discussion.
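For anyone who actually wants to answer the I-frame/B-frame/P-frame question for a given clip, here's a rough sketch (Python calling ffprobe; assumes ffmpeg/ffprobe is installed, and "clip.mp4" is just a placeholder name):

```python
# Rough sketch: list the frame type (I/P/B) of each video frame with ffprobe.
# Assumes ffmpeg/ffprobe is installed; "clip.mp4" is a placeholder.
import subprocess

def frame_types(path):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "frame=pict_type", "-of", "csv=p=0", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line for line in out.splitlines() if line]

print(frame_types("clip.mp4")[:30])  # e.g. ['I', 'B', 'B', 'P', ...]
```

Bear in mind YouTube re-encodes everything on upload, so what you inspect locally is YouTube's encode, not the uploader's original.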
"A few people have mentioned the RX 480 looks sharper and slightly different colour than the 1060 in some games. The reason for this is due to recording the display from a camcorder and at different times for each card. The white balance is different and focal range may not be exactly the same.
"
First comment on the video, by the uploader:
https://www.youtube.com/watch?v=NQmbtVz972A
Of course people would ignore this "little fact" just to bash the other company. I'm sure if this were some "nvidia reddit" they would do the same, but still, it's getting out of hand, people and their bias against one company.
Looks like it's not just limited to Doom... people are finding the same issue in Witcher 3 and Hitman on the 1060:
https://www.reddit.com/r/Amd/comments/4ttkbg/rx_480_vs_gtx_1060_image_quality/
What do you think there is to gain at this point looking at lossless images? Do you think that will somehow help show what is causing the issue?
Your point is valid, but what's happening with that handkerchief has nothing to do with video compression.
Hmm, this is getting more and more interesting...
https://youtu.be/NQmbtVz972A?t=180
Time is the same in game. Not sure why using a camcorder would only make Nvidia look worse, not AMD.
Has anyone confirmed that the missing snow bug was ever fixed in Nvidia drivers for Ashes? Seems like lots of reviews have shown AMD with higher IQ, but it never gets investigated and either dispelled or confirmed.
"It's definitely blurry but being out of focus doesn't magically make texture detail disappear. Look at the ground; the 1060 has basically no detail in that texture."
You can see that the text is blurry on the Nvidia side, so it's nothing to do with rendering, and even if it were, you couldn't tell from this video.
He stated that he captured the two videos at different times; clearly the focus was more accurate in the AMD video than in the Nvidia one. TBH, the question is why he didn't use a capture card like any normal person, instead of showing a half-assed, out-of-focus video.
"https://youtu.be/NQmbtVz972A?t=180
Time is the same in game. Not sure why using a camcorder would only make Nvidia look worse, not AMD.
Has anyone confirmed that the missing snow bug was ever fixed in Nvidia drivers for Ashes? Seems like lots of reviews have shown AMD with higher IQ, but it never gets investigated and either dispelled or confirmed."
It's definitely blurry but being out of focus doesn't magically make texture detail disappear. Look at the ground; the 1060 has basically no detail in that texture.
Interestingly enough, this looks like an example where video encoding at some stage of the video's creation has had an effect. It looks like the one on the right was encoded at a much lower rate than the one on the left before being combined and uploaded.
The encoding artifacts are huge on the right, yet non-existent on the left.
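If someone wants to put a number on that rather than eyeball it, here's a rough sketch along the same lines. It only helps if the two source clips, before they were stitched side by side, ever surface; the filenames are placeholders:

```python
# Rough sketch: average compressed frame size of a clip via ffprobe.
# A clip encoded at a much lower rate will show much smaller frames.
# Filenames are placeholders; assumes ffmpeg/ffprobe is installed.
import subprocess

def mean_frame_bytes(path):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "frame=pkt_size", "-of", "csv=p=0", path],
        capture_output=True, text=True, check=True,
    ).stdout
    sizes = [int(s) for s in out.splitlines() if s]
    return sum(sizes) / len(sizes)

print(mean_frame_bytes("left_clip.mp4"))
print(mean_frame_bytes("right_clip.mp4"))
```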
https://www.youtube.com/watch?v=NQmbtVz972A
glancing through the video:
see the Witcher 3 difference, but the Nvidia side is more zoomed in than the AMD side
see some kind of colour/gamma/contrast/brightness difference between AMD and Nvidia in Hitman and Thief
reading that the game footage is based on a camera recording instead of in-game capture, so there's no point putting much stock in the visual differences
shouldn't be hard to take very similar still screenshots on both cards for comparison, if someone cares enough to do so
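And if anyone does grab matching lossless screenshots, comparing them is straightforward. A rough sketch using Pillow and NumPy (filenames are placeholders; the shots need to be taken at the same spot with the same settings):

```python
# Rough sketch: pixel-difference two lossless screenshots, one per card.
# Filenames are placeholders; both images must be the same resolution.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("rx480.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("gtx1060.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b).max(axis=2)   # per-pixel worst-channel difference
print("mean abs diff:", diff.mean())
print("pixels off by more than 8/255:", int((diff > 8).sum()))

# Amplified difference map: bright areas are where the cards disagree.
Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("diff.png")
```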
Lol at people trying to discredit the legendary GTX 970, I don't remember a GPU that sold more than it... looking at the Steam surveys, the GTX 970 sells more than AMD's entire GPU line put together.
If this is not ownage, I don't know what else it could be. Also, these threads always come out after someone misses their goals by a huge amount... remember Fury X vs GTX 980 Ti? The Fury X got owned, and a thread about image quality differences between those two in BF4 came out... and surprise surprise, the same happens here: the RX 480 gets owned by the GTX 1060 and this thread comes out... again, nothing to see here, just pure trolling...
I don't get this topic.
This is what megatexture has always done. If AMD-specific Vulkan codepaths handle it faster, good for them, but it's not up to Nvidia to "fix" this "issue".
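For those unfamiliar, this is roughly how virtual texturing behaves (a toy sketch of the general technique, not id Tech's actual code): the renderer works out which texture pages the current view needs, streams them in asynchronously, and shows a lower-resolution fallback until the high-res page arrives, which is exactly when you see temporarily blurry patches.

```python
# Toy sketch of virtual texturing ("megatexture") paging -- illustrates the
# general technique only, not id Tech's implementation.

loaded = {}        # page id -> best resolution level currently resident
pending = set()    # pages requested from disk but not yet arrived

def sample(page, wanted_level):
    have = loaded.get(page, 0)            # 0 = always-resident coarse fallback
    if have < wanted_level and page not in pending:
        pending.add(page)                 # kick off an async load from disk
    return have                           # lower level than wanted = blurry

def on_page_loaded(page, level):          # called when the async load finishes
    loaded[page] = level
    pending.discard(page)
```

How long the blurry fallback stays on screen depends on how fast pages stream in, so transfer-speed or driver differences can plausibly show up as short-lived texture sharpness differences.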
Both screens look like crap to me, of course more so the Nvidia one. However, I am very familiar with Witcher 3 and played it extensively on my 970, and it never looked remotely like either of these screens. Everything was razor sharp throughout my gameplay, and if one of the best-looking games had ever looked like that, I would have given up gaming. Plus, the right screen is so obviously bad it would have caused a scandal by now. Yet here we are, two years later, and someone comes up with this.
Texture times! :thumbsup: It looks like we need to start benchmarking texture loads.