This is what your post said before you edited it:
Indeed, I guessed wrong. I figured the one on the right was the GTX680 because that side has better IQ, and it's a TWIMTBP title. To me it seemed like the left side had better-looking Ambient Occlusion.
So, splain dat wun.
I'm not sure what the issue is. First, he still has what you posted in his post, it's just now hidden as a spoiler. Second, his initial comment was about IQ, not about smoothness, which is the point of the video. It looks to me like he made his initial guess based on the image quality, since it is an Nvidia-sponsored game and he couldn't tell a difference in the smoothness. Then he followed that up with a comment on the smoothness at 25% speed...
For what it's worth, I noticed the extra stutter on one side over the other, but couldn't have guessed which card was which for sure.
First of all, don't answer for him. Second, look harder and you'll see what I'm referring to. But you have to look at it with your eyes open.
Gotcha, missed the little difference there.
NH did quite a few of these videos. The one thing that stands out for me is just how much sharper and crisper the image looks on the Radeons.
If AMD optimised the way Nvidia does, they could be a lot faster.
This shows what I'm getting at more clearly.
These results also seem to correlate quite well with reports from other sites such as Tech Report, which examined the phenomenon.
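Tech Report's approach boils down to a couple of frame-time metrics: the 99th-percentile frame time and the total time spent beyond a spike threshold. For anyone who wants to run the same math on their own logs, here's a minimal sketch in Python -- the file names are placeholders, and it assumes one per-frame render time in milliseconds per line (FRAPS logs cumulative timestamps, so you'd diff those first):

```python
import statistics

def frame_time_report(path, spike_ms=50.0):
    # Expects one per-frame render time in milliseconds per line.
    with open(path) as f:
        times = sorted(float(line) for line in f if line.strip())
    avg = statistics.mean(times)
    # 99% of frames finished at or under this many milliseconds.
    p99 = times[min(len(times) - 1, int(len(times) * 0.99))]
    # Total milliseconds spent past the spike threshold -- the "stutter budget".
    beyond = sum(t - spike_ms for t in times if t > spike_ms)
    print(f"{path}: avg {avg:.1f} ms ({1000 / avg:.0f} fps), "
          f"99th pct {p99:.1f} ms, beyond {spike_ms:.0f} ms: {beyond:.1f} ms")

# File names are made up -- point these at your own logs.
frame_time_report("gtx680_frametimes.txt")
frame_time_report("hd7970_frametimes.txt")
```

Two cards can average the same fps and still split wide apart on the last two numbers, which is exactly the smoothness difference the videos are trying to show.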
which means:

"It is possible to notice the effects during regular gameplay, but there are likely many who would not see any problems unless they were pointed out to them.

How much one notices and is affected by uneven rendering times is very individual. For my own part it is not something that has felt like an acute problem, and when I, for example, played back our blind test a few days after the video rendering, I was at first unsure which card was which."

Sums it up pretty nicely, methinks.
I agree with all that. It is just worth pointing out that this is hardly an End of the World as We Know It situation, contrary to what certain trolls are trying to make of it. I think we have a lot more important things to worry about with our graphics cards.

Imho, there is no doubt that many wouldn't notice -- many don't see texture aliasing either. But the key is that some do notice, and it may be important -- having more data to inform the choice is welcome.
Looks like NordicHardware first needs to learn how to operate a camera and then how to encode, because none of that greenish hue on Nvidia shows up in Techreport's comparison capture.
Secondly, you need uncompressed or at least high-bitrate video to spot the anisotropic-filtering difference between the two manufacturers.
If YouTube with its 5 Mbps limit is able to display the difference (which BTW has never ever been flattering for AMD), you're doing it all wrong.
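The bitrate point is easy to check rather than argue about: grab the same frame from the raw capture and from the YouTube re-encode and compute the PSNR between them. A rough sketch, assuming Pillow and numpy are installed and with made-up file names:

```python
import numpy as np
from PIL import Image

def psnr(path_a, path_b):
    # Frames must be the same resolution for a pixel-wise comparison.
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

# Placeholder file names: the same frame before and after re-encoding.
print(f"PSNR vs. source: {psnr('frame_raw.png', 'frame_youtube.png'):.1f} dB")
```

If the re-encoded frame scores poorly against the source, the encoder has already eaten the fine texture detail you'd need to judge anisotropic filtering.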
I've said the same for years, get ready to be called a fanboy :awe:

Wow! Big difference in terms of quality!
Great videos! TBH, I think both hitch now and then, and it's tough for me to say which one does more. I do think too much quality is lost in YouTube's compression, especially due to the "smoothing" effects applied when they re-encode, and therefore it's difficult to draw an accurate conclusion. What's more obvious to me is the difference in the color temperature of the cards (assuming they're using the same PC, monitor, and camera for the trials).
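The colour-temperature question, at least, doesn't have to stay subjective: average each RGB channel over a screenshot from each card and see whether the green mean really is higher on one side. A quick sketch with invented file names, needing Pillow and numpy:

```python
import numpy as np
from PIL import Image

def channel_means(path):
    # Mean of each RGB channel across every pixel in the image.
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    return img.reshape(-1, 3).mean(axis=0)

# Placeholder file names -- one screenshot per card, same scene.
for name in ("capture_gtx680.png", "capture_hd7970.png"):
    r, g, b = channel_means(name)
    print(f"{name}: R={r:.1f} G={g:.1f} B={b:.1f}")
```

A clearly higher G mean on one capture would back up the "greenish hue" observation; near-identical means would point at the camera or the encode instead.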
I think we're moving into a level of specificity that's going to depend on the subjective experience more than anything else. Some people may see no difference, some people may see a huge difference. Like I said, I've seen over the last few years that AMD cards produce a sharper image on my 1600p monitor than their NVIDIA counterparts. Someone else may disagree completely. I think having some scientifically sound (and hopefully irrefutable) data will help iron out some details, but it's still largely going to depend on the user experience. For example, people are comparing the smoothness of Skyrim at ~70 FPS. I generally play Skyrim at 20-30 FPS because of all the graphics mods I have. The game is very smooth and perfectly playable (thanks, motion blur and Bokeh!), but someone else may very well hate those settings.

To me the fact that there is a difference is important enough and would definitely dictate which card I purchase. With that being said, I'm still inclined to believe this may be on a game-by-game basis. I have owned both a 680 and multiple 7970s, and neither stuck out as being smoother than the other. Could be that I'm an avid frame-limiter kind of guy, which has its own smoothing advantages.
It's fairly likely they know how to do both better than you.
NVDA image quality sure looks like it could use some work, maybe they've sacrificed some IQ for frame rates or something... :\
Link to IQ tests: HD7970 GHz Edition versus GTX680
http://www.youtube.com/watch?feature=player_detailpage&v=CqTmx1V47WE
How do I get the green ambient lighting on my GTX 680? :|
IQ imo is an important enough criterion to ditch one GPU over another in a heartbeat. I don't care if a card is 100% faster than another, I will not even consider it if IQ is even a tiny bit off. It's very easy to get (superficial) differences if there are the slightest driver-setting differences in contrast or gamma, giving the impression that something is sharper or more crisp. These can always be fine-tuned to one's preferences. I have a 6850 in one machine and a (newly bought) GTX 660 Ti in my main PC. Can't say I prefer one over the other in motionless scenery, but I sometimes add about 5% contrast and -2 gamma to get that sharper, crisper look (and AFx16).
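To illustrate how superficial that kind of "crisper" look is, here's a rough sketch of a contrast/gamma nudge like the one described above, done in software -- the values and file names are purely illustrative, and it needs Pillow and numpy:

```python
import numpy as np
from PIL import Image

def tweak(path, contrast=1.05, gamma=0.9):
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64) / 255.0
    img = (img - 0.5) * contrast + 0.5      # ~5% more contrast around mid-grey
    img = np.clip(img, 0.0, 1.0) ** gamma   # gamma below 1 brightens and "crispens"
    return Image.fromarray((img * 255).astype(np.uint8))

# Placeholder file names: same frame, before and after the nudge.
tweak("screenshot.png").save("screenshot_tweaked.png")
```

Flip between the two outputs and one will read as "sharper" even though no extra detail exists in either.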
Quite a few peeps here own/have owned both the 680 and the 7970. I don't think any of them have bought into this IQ issue.