[TECH Report] As the second turns: the web digests our game testing methods


Keysplayr

Elite Member
Jan 16, 2003
@Imouto

Tech Report did an excellent job. And now other sites are looking to contribute,
add to, and even improve upon methods of testing frame latency and gameplay smoothness, whatever the FPS is.
And you're protesting too much. Way too much. ;) You know what they say.
Your entire post above reads like "head in the sand" type stuff. Sorry.

"Unplayable settings ain't in a bencher dictionary."

It is now. :D
 

willomz

Senior member
Sep 12, 2012
If TechReport did such a sloppy job, why are they being taken seriously by respected figures from all over the place, including AMD themselves?

Fermi cards are terrible at running current games at top quality settings; is anyone surprised? So are legacy AMD cards. This is why serious gamers regularly upgrade their graphics cards.

Don't assume that everyone who is interested in frame latencies is some rabid Nvidia fanboy. This should interest anyone who is keen on graphics technology. In fact, it should interest people with AMD hardware more, as they will get the benefit of the improved drivers.
 

Keysplayr

Elite Member
Jan 16, 2003
willomz said:
If TechReport did such a sloppy job, why are they being taken seriously by respected figures from all over the place, including AMD themselves?

Fermi cards are terrible at running current games at top quality settings; is anyone surprised? So are legacy AMD cards. This is why serious gamers regularly upgrade their graphics cards.

Don't assume that everyone who is interested in frame latencies is some rabid Nvidia fanboy. This should interest anyone who is keen on graphics technology. In fact, it should interest people with AMD hardware more, as they will get the benefit of the improved drivers.

They didn't. Imouto is just taking it as far back as he thinks people are taking it too far forward. He doesn't understand that the cat has scratched its way out of the bag and is now the center of benchmark attention for quite a few sites, with more sure to follow. Imouto can call us, and them, ridiculous all he wants.
 

willomz

Senior member
Sep 12, 2012
You know that isn't an argument, right? No more than your post about elephants was an argument.

I take it that you are trying to say you don't notice stutter. Well, that's fine, but it doesn't mean that other people don't. Let people make up their own minds, and if most people are like you then it won't matter, will it? But there isn't any harm in researching the issue and seeing if we can find a better measure than FPS to benchmark performance.
 

f1sherman

Platinum Member
Apr 5, 2011
This is The Emperor's New Clothes tale all over again.

Help yourself.

Imouto, do you mean:

Don't mind the naked king! *7950*
Look at those poorly-clothed peasants *570*, and their dirty half-naked children. *560 Ti, 470*
In that picture from last year's village festivities.

THAT'S THE REAL SCANDAL HERE
 

Leadbox

Senior member
Oct 25, 2010
willomz said:
If TechReport did such a sloppy job, why are they being taken seriously by respected figures from all over the place, including AMD themselves?

Fermi cards are terrible at running current games at top quality settings; is anyone surprised? So are legacy AMD cards. This is why serious gamers regularly upgrade their graphics cards.

Don't assume that everyone who is interested in frame latencies is some rabid Nvidia fanboy. This should interest anyone who is keen on graphics technology. In fact, it should interest people with AMD hardware more, as they will get the benefit of the improved drivers.
I think they saw the variances in their frame-time graphs and went looking to quantify them.
Skyrim is a game of unlimited possibilities; is a camera panning over some brush the best they could do to highlight smoothness?
 

Imouto

Golden Member
Jul 6, 2011
No f1sherman, what I mean is:

The emperor (NV users) is looking for a new dress. The swindlers (TR and the ones to follow) will make him one (cherry-picked benches at unplayable settings that, on top of that, don't mean a thing when you play). Once the dress is done, everyone at the court (the vocal posters at forums) tells the emperor how beautiful his new dress is. Then, outside, the children (skeptical graphics reviewers, AMD and Fermi users, or just non-bubble boys) raise their voices about the emperor being naked. The emperor starts to think the very same thing but continues his walk.

It's pretty sad having to explain this when it's a tale for children.
 

BrightCandle

Diamond Member
Mar 15, 2007
Imouto said:
No f1sherman, what I mean is:

The emperor (NV users) is looking for a new dress. The swindlers (TR and the ones to follow) will make him one (cherry-picked benches at unplayable settings that, on top of that, don't mean a thing when you play). Once the dress is done, everyone at the court (the vocal posters at forums) tells the emperor how beautiful his new dress is. Then, outside, the children (skeptical graphics reviewers, AMD and Fermi users, or just non-bubble boys) raise their voices about the emperor being naked. The emperor starts to think the very same thing but continues his walk.

It's pretty sad having to explain this when it's a tale for children.

I take it, therefore, that you don't see microstutter. That is great; you would be more than happy with these cards. But your opinion on whether it's there or not isn't very useful, because you can't see it. To those who can see it and find it a problem, it's a severe issue.

If I told you games were perfectly playable at 15 fps, you would quite rightly point out that you were not happy with that and that I was likely very wrong. You might say 30 fps is smooth enough, or that anything less than 60 is not smooth. The same argument applies to microstutter. My threshold is about 6 ms, which seems to be on the lower side compared to most. Yours is presumably much higher; if you have never seen it on Skyrim with a 7000-series card, then it's above 20 ms. Now, I personally think it's unlikely you couldn't tell the difference between 0 and 20 ms of microstutter, but it doesn't matter, because it doesn't bother you.

But just because you think that is OK does not mean everyone does. The subjective feeling of a lack of smoothness preceded TechReport's frame-time method by many years. TechReport made the measure objective after years of people subjectively having issues.

Do you realise you are naked yet, sire?
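To put numbers on the threshold talk above: a minimal sketch, assuming a made-up frame-time trace and the 6 ms / 20 ms figures mentioned in this post. This is not TR's tooling, just an illustration of frame-to-frame deltas as one rough proxy for microstutter:

```python
# Hypothetical frame-time trace in milliseconds (e.g. parsed from a FRAPS frametimes log).
frame_times_ms = [16.7, 16.9, 16.5, 38.2, 16.6, 16.8, 29.4, 16.7, 16.6, 16.9]

def stutter_deltas(frame_times):
    """Absolute change between consecutive frame times, in ms."""
    return [abs(b - a) for a, b in zip(frame_times, frame_times[1:])]

def jumps_over(frame_times, threshold_ms):
    """Count frame-to-frame jumps larger than a personal visibility threshold."""
    return sum(1 for d in stutter_deltas(frame_times) if d > threshold_ms)

deltas = stutter_deltas(frame_times_ms)
print("worst frame-to-frame jump: %.1f ms" % max(deltas))
print("jumps over  6 ms:", jumps_over(frame_times_ms, 6.0))
print("jumps over 20 ms:", jumps_over(frame_times_ms, 20.0))
```

With these made-up numbers, a 6 ms threshold flags four jumps while a 20 ms threshold flags only two, which is exactly the kind of disagreement between viewers being described here.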
 

willomz

Senior member
Sep 12, 2012
Leadbox said:
I think they saw the variances in their frame-time graphs and went looking to quantify them.
Skyrim is a game of unlimited possibilities; is a camera panning over some brush the best they could do to highlight smoothness?

The graphs already quantify it.

As you know, they tested multiple games and more than one area within Skyrim, and found stuttering in many of those games.

Don't try to make out that stuttering only occurs in one portion of one game.
If that were true, why would AMD go to all the hassle of making a patch?
 

SirPauly

Diamond Member
Apr 28, 2009
Yes, it is pretty clear that the 560 Ti and 470 are unplayable in that benchmark at those settings. This is exactly why people choose to upgrade old cards, or at least dial the settings back a bit. If you only have a 560 Ti, you might want to turn off Ultra and maybe some other options.

The hardware does operate properly; it just doesn't have the grunt available. Compared to the 470, the 7950 has almost double the memory bandwidth and almost triple the texel rate; it is supposed to be better.

If a 2-3 year old card doesn't perform well you shouldn't be surprised; if a high-end current-generation card doesn't, you should.

And the newer SKUs have more RAM, too!
 

Leadbox

Senior member
Oct 25, 2010
willomz said:
The graphs already quantify it.

As you know, they tested multiple games and more than one area within Skyrim, and found stuttering in many of those games.

Don't try to make out that stuttering only occurs in one portion of one game. If that were true, why would AMD go to all the hassle of making a patch?
Did they retest that section from their initial review?
NH did some testing too, confirming some of TR's findings, but couldn't tell which was which in their blind testing.
As for AMD, well, it would be silly not to be seen to be doing something, given people's perception of their drivers.
 

Keysplayr

Elite Member
Jan 16, 2003
Leadbox said:
Did they retest that section from their initial review?
NH did some testing too, confirming some of TR's findings, but couldn't tell which was which in their blind testing.
As for AMD, well, it would be silly not to be seen to be doing something, given people's perception of their drivers.

The graphs don't lie (unless specific graphs are posted purposefully to mislead), but people's eyes sometimes lie. Those are the people who game on blissfully, and rightly so.
Perhaps the blind tests were completed and the people who took the test could not tell? That is a possibility, and one that can't be dismissed.
IMHO, the only people who should take the blind test are those people who CAN see the microstutter. Once it has been established that the panel of people taking the test are ABLE to see the latency problem, then let the tests commence.
It is fair to say that perhaps the editors of H and TR are able to see the effect, but perhaps the editor of NH is unable to, and is therefore skeptical.
And on we go.
 

f1sherman

Platinum Member
Apr 5, 2011
Keysplayr said:
The graphs don't lie (unless specific graphs are posted purposefully to mislead), but people's eyes sometimes lie. Those are the people who game on blissfully, and rightly so.
Perhaps the blind tests were completed and the people who took the test could not tell? That is a possibility, and one that can't be dismissed.
IMHO, the only people who should take the blind test are those people who CAN see the microstutter. Once it has been established that the panel of people taking the test are ABLE to see the latency problem, then let the tests commence.
It is fair to say that perhaps the editors of H and TR are able to see the effect, but perhaps the editor of NH is unable to, and is therefore skeptical.
And on we go.

Diablo 3. Millions of players.
And yet they all miss the freaking Stone of Jordan ring graphic glitch.
Nvidia. Hundreds of millions of users, yet somehow they all miss the memory leak bug (or perhaps don't care), so again it's me who has to report it.
Among all my gamer friends, there is one guy whom I don't consider completely blind. The world is populated with blindos :'(

the thresholds will definitely be different for different people, which is why the quartile graph that Scott uses is actually the most interesting IMHO. Time of 50ms is actually a pretty generous metric, as is 99% of frame times (at 60fps, that can still allow a spike of arbitrary magnitude every 2 seconds!).

Personally, I find jitters fairly obvious - that is to say that when they are frequent enough the game appears to be running at a lower frame rate than benchmarking claims. HardOCP has mentioned this in a pile of their articles (i.e. "the numbers say this, but I can tell you this one feels a lot smoother") and lots of people have noted issues with crossfire/SLI which obviously have another magnitude of this problem.


--Andrew Lauritzen--
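For anyone who hasn't run into these metrics, the two figures in that quote are easy to sketch from a list of frame times. A minimal sketch, assuming made-up data and one common reading of the "time spent beyond 50 ms" figure (summing only the portion of each frame that runs past the threshold); this is not TR's actual tooling:

```python
# Hypothetical frame times in milliseconds for a short benchmark run.
frame_times_ms = [16.7, 17.1, 16.4, 52.3, 16.8, 16.5, 41.0, 16.9, 16.6, 17.0]

def percentile(values, pct):
    """Nearest-rank percentile, no external libraries needed."""
    ordered = sorted(values)
    rank = max(1, int(round(pct / 100.0 * len(ordered))))
    return ordered[rank - 1]

def time_spent_beyond(values, threshold_ms):
    """Milliseconds spent past the threshold, summed over all offending frames."""
    return sum(t - threshold_ms for t in values if t > threshold_ms)

print("99th-percentile frame time: %.1f ms" % percentile(frame_times_ms, 99))
print("time spent beyond 50 ms:    %.1f ms" % time_spent_beyond(frame_times_ms, 50.0))
```

The 99th percentile ignores the worst 1% of frames entirely, which is Lauritzen's point: at roughly 60 fps that is about one frame every couple of seconds that can be arbitrarily bad without moving the number.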
 

Leadbox

Senior member
Oct 25, 2010
Keysplayr said:
The graphs don't lie (unless specific graphs are posted purposefully to mislead), but people's eyes sometimes lie. Those are the people who game on blissfully, and rightly so.
Perhaps the blind tests were completed and the people who took the test could not tell? That is a possibility, and one that can't be dismissed.
IMHO, the only people who should take the blind test are those people who CAN see the microstutter. Once it has been established that the panel of people taking the test are ABLE to see the latency problem, then let the tests commence.
It is fair to say that perhaps the editors of H and TR are able to see the effect, but perhaps the editor of NH is unable to, and is therefore skeptical.
And on we go.

Different sites are going to be testing very different sections of games, which is going to throw up different, possibly conflicting results, at which point we're right back to FPS.
That's where we're going. It will be a fun ride, buckle up!
 

Keysplayr

Elite Member
Jan 16, 2003
Not a tetrachromat? I have bad news for you.

And the bad news is that you'd like everyone and all tech sites to grow up and not talk about this anymore? Does that include AMD? Cause they're talking about it too.
 

SirPauly

Diamond Member
Apr 28, 2009
I'm talking about this

I use Force Vsync-ON in NVIDIA CP, because Blizzard's vsync is bugged and causes chat to scroll too-fast.


Haven't seen that yet; thanks for the awareness.

What I did see were shadow and water 3D stereo artifacts/limitations with the default 3D stereo, and thankfully there is this mod from Helix:

http://helixmod.blogspot.com/2012/05/diablo-iii.html

Hehe, Emperor's Clothes to Tetrachromat! :)

I do agree with Andrew as well:

the thresholds will definitely be different for different people, which is why the quartile graph that Scott uses is actually the most interesting IMHO. Time of 50ms is actually a pretty generous metric, as is 99% of frame times (at 60fps, that can still allow a spike of arbitrary magnitude every 2 seconds!).

Personally, I find jitters fairly obvious - that is to say that when they are frequent enough the game appears to be running at a lower frame rate than benchmarking claims. HardOCP has mentioned this in a pile of their articles (i.e. "the numbers say this, but I can tell you this one feels a lot smoother") and lots of people have noted issues with crossfire/SLI which obviously have another magnitude of this problem.

--Andrew Lauritzen--

The key is that some may disagree and have no problem!
 

willomz

Senior member
Sep 12, 2012
Some may disagree, but that doesn't mean they can tell the rest of us what we think.

I don't like talent shows on TV, but instead of campaigning against them and saying others shouldn't watch them, I just don't watch them myself.

Esoteric knowledge is not the same as making a legitimate point.
 

Keysplayr

Elite Member
Jan 16, 2003
willomz said:
Some may disagree, but that doesn't mean they can tell the rest of us what we think.

I don't like talent shows on TV, but instead of campaigning against them and saying others shouldn't watch them, I just don't watch them myself.

Esoteric knowledge is not the same as making a legitimate point.

Agreed. I do like watching "The Voice" though. Interesting for me. Anyway...
Those who are opposed to this, or who think it's silly to pursue, shouldn't even be posting. I mean, why would they?
 

Imouto

Golden Member
Jul 6, 2011
Keysplayr said:
And the bad news is that you'd like everyone and all tech sites to grow up and not talk about this anymore? Does that include AMD? Cause they're talking about it too.

Never said that. It's dangerous to go outside alone, take this:

Dec 2011: http://techreport.com/review/21982/today-mid-range-gpus-in-battlefield-3/6




What I've been saying this whole time is that this has been an issue with Fermi cards since TR started this kind of testing back in late 2011.

I don't understand why it wasn't an issue with Fermi cards then, yet it's such a big deal with the 7950 now, which is not as abysmal as Fermi was back then. Everyone agreed that Fermi was way better than anything AMD had at that time.

I'm not the guy with double standards here.
 

willomz

Senior member
Sep 12, 2012
It was an issue at the time, although it was an issue with both AMD and Nvidia, depending on the game.

I think the reason it is such an issue now is that it was a surprising, even shocking, finding. The general consensus was that the 7950 ought to thrash the 660 Ti, and thus there was surprise when this didn't happen.

If you read what they write, they really lay into the GeForce GPUs:
"The GeForces crank out so many frames in 40-ms or more, it's not pretty. In fact, it's much, much uglier than the show the Radeons put on in Fear no evil. Even when we buff out the very highest peaks with our 99th-percentile calculation, the GeForces come out looking very weak. It's not just about numbers, either. Playing this section with a GeForce, the latency spikes were very palpable, causing animations seemingly to speed up and slow down wantonly."

If anything, this is proof that Tech Report has no Nvidia bias whatsoever.
 