Techreport 7950 vs. GTX 660 Ti "Smoothness" videos


Hitman928

Diamond Member
Apr 15, 2012
6,663
12,300
136
This is what your post said before you edited it:
Indeed, I guessed wrong. I figured the one on the right was the GTX680 because that side has better IQ, and it's a TWIMTBP title. To me it seemed like the left side had better looking Ambient Occlusion.

So, splain dat wun. :colbert:

I'm not sure what the issue is. First, he still has what you posted in his post, it's just now hidden in a spoiler. Second, his initial comment was about IQ, not about smoothness, which is the point of the video. It looks to me like he made his initial guess based on the image quality, since it's an Nvidia-sponsored game and he couldn't tell a difference in the smoothness. Then he followed that up with a comment on the smoothness at 25% speed. . .

For what it's worth, I noticed the extra stutter on one side over the other, but couldn't have guessed for sure which card was which.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I'm not sure what the issue is. First, he still has what you posted in his post, it's just now hidden in a spoiler. Second, his initial comment was about IQ, not about smoothness, which is the point of the video. It looks to me like he made his initial guess based on the image quality, since it's an Nvidia-sponsored game and he couldn't tell a difference in the smoothness. Then he followed that up with a comment on the smoothness at 25% speed. . .

For what it's worth, I noticed the extra stutter on one side over the other, but couldn't have guessed for sure which card was which.

First of all, don't answer for him. Second, look harder and you'll see what I'm referring to. But you have to look at it with your eyes open.
 

Hitman928

Diamond Member
Apr 15, 2012
6,663
12,300
136
First of all, don't answer for him. Second, look harder and you'll see what I'm referring to. But you have to look at it with your eyes open.

Gotcha, missed the little difference there. Still could be an honest mistake though, I guess we'll have to wait for his response.
 
Last edited:

ocre

Golden Member
Dec 26, 2008
1,594
7
81
YouTube smooths out hitches, and this is why they are posting slow-motion videos. The reason is that YouTube videos are compiled at a much lower frame rate than what the actual user experiences. So if you're playing a game at 60 to 120 fps and capture the video, by the time it's compiled for YouTube at 30 fps a lot is lost. You are no longer seeing what the game played like, as it has been filtered down. Recording a video at high speed helps to show the hitching because it stretches out the hitches. Record a video at 240 fps, and when it's compiled down to 30 fps you will see hitching that would otherwise be lost in the sampling. Slow motion gives 3x more frames, so the 30 fps sampling catches more of the hitching that is otherwise lost.

It's not really exaggerating at all. The consistently low frame rate of YouTube videos smooths out frame inconsistencies. Even the slowed-down videos played back through YouTube will be smoother. You are not seeing what the original experience was like at all in these YouTube videos, and at 30 fps any gamer can see that it looks nothing like what we experience while playing directly. At 30 fps both can look odd or off, because we are used to playing in real time. These videos are a long way from real time.

The point they are trying to make is that there are noticeable differences. While YouTube videos can be used, they are really too poor to represent the actual experience. Watching a full-HD AVI straight from the camcorder would be many times better. Magnitudes. But even that samples at a steady frame rate and will not be capable of representing the actual experience 100%. It would be far closer, but its steady sample rate can still smooth out hitches. It samples at a very consistent frame time that differs from the actual experience. The computer displaying the game will not have rock-solid frame times like the camera used to capture it. At the same time, our brains and eyes can perceive a lot faster.

So every time you sample and compile, something is changed. By the time we get these videos onto YouTube, a lot is lost. It's not the same as what was experienced at all. I am not saying that it's way worse or anything. I am just saying that it's really hard to show what they are trying to show in this manner.
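The sampling argument above can be sketched in a few lines of Python. All numbers here are hypothetical, not measured data: simulate a run at roughly 60 fps with one 50 ms hitch, resample it the way a 30 fps capture would, and compare a real-time capture against a 4x slow-motion capture.

```python
# Sketch with made-up numbers: why a 30 fps re-encode hides frame-time
# spikes, and why slow-motion capture preserves them.

def frame_starts(times_ms):
    """Timestamps (ms) at which each game frame appears on screen."""
    starts, t = [], 0.0
    for ft in times_ms:
        starts.append(t)
        t += ft
    return starts, t  # (start times, total duration)

def capture_30fps(times_ms):
    """Mimic a 30 fps capture: every 33.3 ms, record whichever game
    frame is on screen at that instant."""
    starts, total = frame_starts(times_ms)
    interval = 1000.0 / 30
    samples, s, idx = [], 0.0, 0
    while s < total:
        # advance to the game frame visible at time s
        while idx + 1 < len(starts) and starts[idx + 1] <= s:
            idx += 1
        samples.append(idx)
        s += interval
    return samples

def longest_repeat(samples):
    """Longest run of identical frames in the capture -- the visible
    length of a stutter after re-encoding."""
    best = run = 1
    for a, b in zip(samples, samples[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

# ~60 fps gameplay with a single 50 ms hitch in the middle.
game = [16.7] * 30 + [50.0] + [16.7] * 29

realtime = capture_30fps(game)
# 4x slow motion is equivalent to stretching every frame time 4x
# before the 30 fps resample, so the hitch spans more samples.
slowmo = capture_30fps([ft * 4 for ft in game])

print("longest repeat, real-time capture:", longest_repeat(realtime))
print("longest repeat, 4x slow-mo capture:", longest_repeat(slowmo))
```

In this toy model the real-time 30 fps capture shows the 50 ms hitch as at most a frame or two of repetition, barely distinguishable from normal sampling, while the slow-motion capture stretches the same hitch across several consecutive samples.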
 
Last edited:

Leadbox

Senior member
Oct 25, 2010
744
63
91
NH did quite a few of these videos. The one thing that stands out for me is just how much sharper and crisper the image looks on the Radeons.
If AMD optimised the way Nvidia does they could be a lot faster
This shows what I'm getting at more clearly
 

cmaMath13

Platinum Member
Feb 16, 2000
2,154
0
60
NH did quite a few of these videos. The one thing that stands out for me is just how much sharper and crisper the image looks on the Radeons.
If AMD optimised the way Nvidia does they could be a lot faster
This shows what I'm getting at more clearly

Wow! Big difference in terms of quality!
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
NH did quite a few of these videos. The one thing that stands out for me is just how much sharper and crisper the image looks on the Radeons.
If AMD optimised the way Nvidia does they could be a lot faster
This shows what I'm getting at more clearly

Looks like NordicHardware first needs to learn how to operate a camera and then how to encode, because none of that greenish hue on Nvidia shows up in Techreport's comparison capture.

Secondly, you need uncompressed or at least high-bitrate video to spot anisotropic-filtering differences between the two manufacturers.
If YouTube with its 5 Mbps limit is able to display the difference (which, BTW, has never been flattering for AMD), you're doing it all wrong.
 
Last edited:

Rikard

Senior member
Apr 25, 2012
428
0
0
It also says:
Det går att märka av effekterna vid vanligt spelande, men troligtvis finns det många som inte skulle se några problem om de inte pekades ut.

Det är väldigt individuellt hur mycket man märker av och påverkas av ojämna renderingstider. För undertecknads del är det inte något som känts som ett akut problem och när jag exempelvis spelade upp vårt blindtest några dagar efter videorenderingen blev jag först osäker på vilket kort som var vilket.
which means:
It is possible to detect the differences during regular game play, but there are likely many that would not notice any problem unless they were pointed out to them.

How much one notices and is affected by irregular rendering times is very personal. For my own part this is not something that has felt like an acute problem, and when I for example played through our blind test a couple of days after the video rendering I was uncertain which card was which.
Sums it up pretty nicely me thinks.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Imho,

There is no doubt that many wouldn't notice -- many don't see texture aliasing either. But the key is that some do notice, and it may be important -- having more data to inform one's choice is welcome.
 

Rikard

Senior member
Apr 25, 2012
428
0
0
Imho,

There is no doubt that many wouldn't notice -- many don't see texture aliasing either. But the key is that some do notice, and it may be important -- having more data to inform one's choice is welcome.
I agree with all that. It is just worth pointing out that this is hardly an End of the World as We Know It situation, contrary to what certain trolls are trying to make of it. I think we have a lot more important things to worry about with our graphics cards.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
To me, this is certainly not the end of the world, but I do appreciate the overall awareness of smooth frame rates and of going beyond just raw performance for gamers. I've personally been consistent on this for many, many years - specifically when comparing AFR with single-GPU performance.

The awareness only helps gamers, as nVidia and AMD may improve their products for gamers -- the bigger picture.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
To me, the fact that there is a difference is important enough and would definitely dictate which card I were to purchase. With that being said, I'm still inclined to believe this may be on a game-by-game basis. I have owned both a 680 and multiple 7970s, and neither stuck out as being smoother than the other. Could be that I'm an avid frame-limiter kind of guy, which has its own smoothing advantages.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Looks like NordicHardware first needs to learn how to operate a camera and then how to encode, because none of that greenish hue on Nvidia shows up in Techreport's comparison capture.

Secondly, you need uncompressed or at least high-bitrate video to spot anisotropic-filtering differences between the two manufacturers.
If YouTube with its 5 Mbps limit is able to display the difference (which, BTW, has never been flattering for AMD), you're doing it all wrong.

I'm sure they shot the videos in exactly the same way, so I'm not sure what they need to do differently for the Nvidia setup :\
Secondly, your links point to texture shimmering/flickering, nothing to do with how much richer the detail is on the Radeon.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Well, I am sorry that those links don't confirm your "richer detail on the Radeon" thesis, but right there you have NV/AMD IQ comparisons in multiple games via uncompressed videos and images.

But if you think that instead of those, a < 5 Mbps YouTube clip should be the basis for an IQ discussion, more power to ya xD
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
NH did quite a few of these videos. The one thing that stands out for me is just how much sharper and crisper the image looks on the Radeons.
If AMD optimised the way Nvidia does they could be a lot faster
This shows what I'm getting at more clearly
Wow! Big difference in terms of quality!
I've said the same for years, get ready to be called a fanboy :awe:
Great videos! TBH, I think both hitch now and then, and it's tough for me to say which one more so. I do think too much quality is lost in YouTube's compression, especially due to the "smoothing" effects applied when they re-encode, and therefore it's difficult to make an accurate conclusion. What's more obvious to me is the difference in the color temperature of the cards (assuming they're using the same PC, monitor, and camera for the trials).
To me, the fact that there is a difference is important enough and would definitely dictate which card I were to purchase. With that being said, I'm still inclined to believe this may be on a game-by-game basis. I have owned both a 680 and multiple 7970s, and neither stuck out as being smoother than the other. Could be that I'm an avid frame-limiter kind of guy, which has its own smoothing advantages.
I think we're moving into a level of specificity that's going to depend on the subjective experience more than anything else. Some people may see no difference, some people may see a huge difference. Like I said, I've seen over the last few years that AMD cards produce a sharper image on my 1600p monitor than NVIDIA counterparts. Someone else may disagree completely. I think having some scientifically-sound (and hopefully irrefutable) data will help iron out some details, but it's still largely going to depend on the user-experience. For example, people are comparing the smoothness of Skyrim at ~70FPS. I generally play Skyrim at 20-30FPS because of all the graphics mods I have. The game is very smooth and perfectly playable (thanks motion blur and Bokeh!), but someone else may very well hate those settings.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Looks like NordicHardware first needs to learn how to operate camera and then how to encode, because none of that greenish hue on Nvidia shows on Techreport's comparison capture.
It's fairly likely they know how to do both better than you.
NVDA image quality sure looks like it could use some work, maybe they've sacrificed some IQ for frame rates or something... :\

Link to IQ tests HD7970 GHz Edition Versus GTX680

http://www.youtube.com/watch?feature=player_detailpage&v=CqTmx1V47WE
 

amenx

Diamond Member
Dec 17, 2004
4,429
2,754
136
IQ, imo, is an important enough criterion to ditch one GPU over another in a heartbeat. I don't care if a card is 100% faster than another; I will not even consider it if IQ is even a tiny bit off. It is very easy to get (superficial) differences if there are the slightest driver-setting differences in contrast or gamma, giving the impression that something is sharper or more crisp. These can always be fine-tuned to one's preferences. I have a 6850 in one machine and a (newly bought) GTX 660 Ti in my main PC. Can't say I prefer one over the other in motionless scenery, but I sometimes add about 5% contrast and -2 gamma to give that sharper, crisper look (and 16x AF).

Quite a few peeps here own/have owned both a 680 and a 7970. I don't think any of them have bought into this IQ issue.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
IQ, imo, is an important enough criterion to ditch one GPU over another in a heartbeat. I don't care if a card is 100% faster than another; I will not even consider it if IQ is even a tiny bit off. It is very easy to get (superficial) differences if there are the slightest driver-setting differences in contrast or gamma, giving the impression that something is sharper or more crisp. These can always be fine-tuned to one's preferences. I have a 6850 in one machine and a (newly bought) GTX 660 Ti in my main PC. Can't say I prefer one over the other in motionless scenery, but I sometimes add about 5% contrast and -2 gamma to give that sharper, crisper look (and 16x AF).

Quite a few peeps here own/have owned both a 680 and a 7970. I don't think any of them have bought into this IQ issue.

Image quality battle field, to me, is with motion anyway!
 

GotNoRice

Senior member
Aug 14, 2000
329
5
81
NH did quite a few of these videos. The one thing that stands out for me is just how much sharper and crisper the image looks on the Radeons.

Do we know that they even had the color adjustments on the two monitors calibrated identically? Since this is someone pointing a camera at an LCD...

Since that was not the focus of the article, I would say it's quite possible that they were not identically calibrated, at which point trying to make image quality comparisons between the two would be pointless.

People say smoothness doesn't matter, but many people are buying 7970s over GTX680s because when using the latest beta drivers you might get a handful more FPS with the 7970. Why would someone care about a handful of extra FPS? Smoothness. So yeah, people do care. The people who are buying 7970s for that slight performance edge should know that the beta drivers where that performance came from also cause severe frame latency issues. Would people still make the same decision if they had all the facts?

AMD knows people make purchases primarily based on benchmarks, and that is why they released the beta drivers that sacrificed smoothness for FPS right before the holiday shopping season. Anyone else wonder why there haven't been WHQL versions of the betas? It's pretty obvious...