Asking Ryan Smith of AT if a special examination could be done?


Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Wrong what? You can have automated tools to "minimize" the one example I gave, but nothing is foolproof. Hell, you could misinstall hardware or whatever as well.

How is it not repeatable? You can do the same run over and over again.

And if what you say is true, tell that to Ryan and TR. You are very smug about this, but I suspect that you should not be... if it were that simple, why didn't NV mention this to TR when they talked to them about the insufficiency of FRAPS data for capturing the end-user experience?

Drop the smugness; it is not endearing, and you are likely wrong about this. My suggestion may be painstaking, since you advance frame by frame, and it is not automated, but it would get the job done.

Dude, I'm not smug about it at all. NV probably did mention it to them, as it has been used by game developers for some time now. Very few people are interested in this, and that is probably one of the reasons it never got publicity. What I'm describing is hooking into a process and debugging it onwards. Your method doesn't give me a complete picture.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
FPS as a measure of performance is open to abuse because it's an average. For example, my card could produce 60 frames within 16 ms and then nothing for the rest of the second; the result would be 60 FPS, but it would also be totally unplayable. We are using FPS as a proxy for what we actually want, which is that the image moves smoothly and our eyes perceive it as motion.
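Something like this toy Python sketch shows the idea (hypothetical frame-time traces, not real measurements):

```python
def avg_fps(frame_times_ms):
    # Average FPS over a trace: frames rendered divided by total seconds.
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# Steady trace: 60 frames at ~16.7 ms each.
even_trace = [1000.0 / 60.0] * 60
# Burst trace: 60 frames crammed into ~16 ms, then nothing for the rest of the second.
burst_trace = [16.0 / 60.0] * 59 + [1000.0 - 59 * 16.0 / 60.0]

print(avg_fps(even_trace), max(even_trace))    # 60.0 FPS, longest frame ~16.7 ms
print(avg_fps(burst_trace), max(burst_trace))  # 60.0 FPS, longest frame ~984 ms
```

Both traces report exactly the same average FPS; only looking at the frame-time distribution (here, the longest frame) reveals that the second one is unplayable.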

Frame times are better, because you can start to look within the second and determine whether frames are being delivered on a reliable schedule. That still doesn't tell the whole story, however, because there is a trade-off with latency, and a frame is clearly older than just the GPU part of the pipeline. But I don't yet know of a way to measure latency practically without resorting to the monitor and high-speed cameras, and realistically that isn't a reliable or cost-effective way for the industry to move forward.

So while we can all point at the benchmarks showing that AMD's 7970 GHz Edition is faster than the GTX 680, just understand that FPS isn't what you want measured; it is what currently is measured, and there is a big gulf between what you want and what you are getting in actual data. TechReport at least are looking inside the second and showing problems on both manufacturers' cards, and I think the rest of the sites need to start doing something similar and work out how to push further.
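One of the inside-the-second style metrics TR reports is "time spent beyond 50 ms", and it falls straight out of a list of frame times; a minimal sketch on made-up numbers:

```python
def time_beyond(frame_times_ms, threshold_ms=50.0):
    # Sum the portion of each frame time that exceeds the threshold.
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

# Made-up trace: mostly smooth frames plus two long hitches.
trace = [16.7] * 50 + [70.0, 120.0]
print(time_beyond(trace))  # 20 + 70 = 90.0 ms spent beyond 50 ms
```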

Microstutter isn't a myth, and it is a problem. Some people don't seem to mind it, but that doesn't mean it shouldn't be fixed, because once it is they will notice the improvement.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Dude, I'm not smug about it at all. NV probably did mention it to them, as it has been used by game developers for some time now. Very few people are interested in this, and that is probably one of the reasons it never got publicity. What I'm describing is hooking into a process and debugging it onwards. Your method doesn't give me a complete picture.

TR did not mention this.

Ryan did not mention this.

If I were a betting man, I'd bet that your software solution has some flaw--maybe not the exact same flaw but some flaw--that prevents it from being a perfect stand-in for end-user eyeballs, which is why neither TR nor Ryan mentioned it.

Also, feel free to message Ryan and TR about this if you think they somehow did not know this obvious solution, if in fact it is a solution. You'd think that NVidia would have discussed this to death with TR already if they had one of their guys talk to TR for the frame time article. http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking/11
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I am downloading CUDA 5.0, which contains the NVIDIA profiler, and I'll see what it can do and what it can tell me about stutter. I am seeing some in PlanetSide 2, and I'll check whether Fraps shows it in its data and whether the CUDA 5.0 profiler does in its.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
All it would take to resolve the issue once and for all is a high-speed camera, and I find it hard to believe that a site as big as AnandTech can't afford one. Suddenly, when AMD takes the lead, average FPS doesn't matter, and even minimum FPS doesn't matter either.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
TR did not mention this.

Ryan did not mention this.

If I were a betting man, I'd bet that your software solution has some flaw--maybe not the exact same flaw but some flaw--that prevents it from being a perfect stand-in for end-user eyeballs, which is why neither TR nor Ryan mentioned it.

Also, feel free to message Ryan and TR about this if you think they somehow did not know this obvious solution, if in fact it is a solution. You'd think that NVidia would have discussed this to death with TR already if they had one of their guys talk to TR for the frame time article. http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking/11

Yes, there is a huge catch: your application needs to be configured to use PerfHUD, otherwise anybody could modify your D3D routines. You also need additional parameters for the CreateDevice function.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
All it would take to resolve the issue once and for all is a high-speed camera, and I find it hard to believe that a site as big as AnandTech can't afford one. Suddenly, when AMD takes the lead, average FPS doesn't matter, and even minimum FPS doesn't matter either.
Tbh my understanding was that only AFR introduces microstutter, but that TR article is sure weird.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Yes, there is a huge catch: your application needs to be configured to use PerfHUD, otherwise anybody could modify your D3D routines. You also need additional parameters for the CreateDevice function.

And you think that TR did not know about this? I think it's more likely that NV's representative either mentioned this workaround to TR and it was shot down as not being effective for some reason, or else NV made no mention of this workaround to them because it isn't a perfect workaround. I would think that if it were so easy to do, NV would have mentioned it and others would have hopped onto it already. Stranger things have happened, though. Ryan, if you are still on this thread, feel free to comment on Jaydip's supposed miracle cure.

All it would take to resolve the issue once and for all is a high-speed camera, and I find it hard to believe that a site as big as AnandTech can't afford one. Suddenly, when AMD takes the lead, average FPS doesn't matter, and even minimum FPS doesn't matter either.

FPS has never been that great of a measure because the human eye can see things faster than once per second.

I think minimums matter as well.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
And you think that TR did not know about this? I think it's more likely that NV's representative either mentioned this workaround to TR and it was shot down as not being effective for some reason, or else NV made no mention of this workaround to them because it isn't a perfect workaround. I would think that if it were so easy to do, NV would have mentioned it and others would have hopped onto it already. Stranger things have happened, though. Ryan, if you are still on this thread, feel free to comment on Jaydip's supposed miracle cure.



FPS has never been that great of a measure because the human eye can see things faster than once per second.

I think minimums matter as well.

Dude, you are not listening. This is what happens when you are persistent about your "camera" method ;) . Read my post: the application needs to be recompiled to make it work. Also, the tool is used to find CPU/GPU bottlenecks, and not everyone is comfortable using it.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Tbh my understanding was that only AFR introduces microstutter, but that TR article is sure weird.

Nope, single GPUs also don't deliver frames at exactly even intervals. A frame every 16.7 ms with no deviation would be ideal for a 60 Hz monitor.
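A quick way to put a number on that deviation is mean absolute deviation from the ideal 60 Hz frame time; a toy sketch with made-up traces:

```python
def jitter_vs_60hz(frame_times_ms):
    # Mean absolute deviation from the ideal ~16.67 ms frame time of a 60 Hz display.
    ideal = 1000.0 / 60.0
    return sum(abs(t - ideal) for t in frame_times_ms) / len(frame_times_ms)

steady = [1000.0 / 60.0] * 4
uneven = [10.0, 23.3, 12.0, 21.3]  # similar average, uneven pacing

print(jitter_vs_60hz(steady))  # 0.0
print(jitter_vs_60hz(uneven))  # several ms of deviation per frame
```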
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Nope, single GPUs also don't deliver frames at exactly even intervals. A frame every 16.7 ms with no deviation would be ideal for a 60 Hz monitor.

That is almost impossible. No two frames are exactly the same.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Dude, you are not listening. This is what happens when you are persistent about your "camera" method ;) . Read my post: the application needs to be recompiled to make it work. Also, the tool is used to find CPU/GPU bottlenecks, and not everyone is comfortable using it.

Er? I was listening. Your proposal, if it would work, would require recompiling, sure, but assuming it would be equally fair to both AMD and NV cards (as in, AMD has an equivalent), shouldn't it be possible to compare the two modified programs anyway? And are you 100.0% sure that it would 100.0% track the end-user experience? Because I'm not so sure. Like I said, feel free to message TR and Ryan about this if you feel sure.
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
Just to be sure about something: the frame time fluctuates with the fps, right? For example, at 30 fps the frame time is higher (each frame takes longer to draw) than at 60 fps (each frame doesn't take as long to draw)...is that correct?

Vesku, thanks for overlaying the fps on your frame time graph but would it be possible to put the fps on a separate Y axis (I know MS Office can do it)? It's just hard to see the frame time fluctuation when put on the same axis as the fps.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Er? I was listening. Your proposal, if it would work, would require recompiling, sure, but assuming it would be equally fair to both AMD and NV cards (as in, AMD has an equivalent), shouldn't it be possible to compare the two modified programs anyway? And are you 100.0% sure that it would 100.0% track the end-user experience? Because I'm not so sure. Like I said, feel free to message TR and Ryan about this if you feel sure.

It would be fair, but the issue is getting hold of the source code :) Which game devs are going to give away their code base so they can make some "nerds" happy? :cool: When you are doing a production build, all the hooks for the performance counters are removed, and as a result you need the source code to put them back in. And it would only work for NV, though there must be something equivalent for AMD as well. It tracks how much time each frame takes to get displayed, along with numerous other counters. This tool has been used by UT3, Crysis, and others (can't remember them all).
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Just to be sure about something: the frame time fluctuates with the fps, right? For example, at 30 fps the frame time is higher (each frame takes longer to draw) than at 60 fps (each frame doesn't take as long to draw)...is that correct?

Vesku, thanks for overlaying the fps on your frame time graph but would it be possible to put the fps on a separate Y axis (I know MS Office can do it)? It's just hard to see the frame time fluctuation when put on the same axis as the fps.

Correct. All frames are different.
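For reference, frame time and instantaneous frame rate are just reciprocals of each other:

```python
def frame_time_ms(fps):
    # Frame time implied by an instantaneous frame rate.
    return 1000.0 / fps

print(frame_time_ms(30))  # ~33.3 ms per frame
print(frame_time_ms(60))  # ~16.7 ms per frame
```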
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
My camera suggestion does not have horribad human-error potential or anything. You start recording before the start of a run and stop after the end of the run. Then you have thousands of frames of raw material to work with and can get actual end-user frame times to the closest 1/1200 sec or whatever you decide to use.

Hey, what about using an oscilloscope? Is there a pin in the DVI (or DisplayPort, or VGA) signal that changes to indicate a new frame?

Is there any electrical signal you can identify in the VGA, DVI, or DisplayPort signal that would indicate when one frame switches, and then use a scope to capture its output over time?

I remember years ago the scopes in our EE lab had a record function and you could save the results to a *floppy* disk. Surely someone has access to a scope or a lab with one; the question is whether we can tap one of the pins on the video signal to reveal this info.

Just saying, a scope would be nearly immune to the types of issues that might affect a camera-based capture of frames.
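For what it's worth, if some pin did pulse once per frame flip (an assumption, nothing here confirms it), turning the scope capture into frame times would be simple edge detection; a toy sketch with a made-up sampled signal:

```python
SAMPLE_RATE_HZ = 100_000  # hypothetical scope sample rate

def frame_intervals_ms(samples, threshold=0.5):
    # Find rising edges in a sampled pulse train and return the
    # intervals between consecutive edges in milliseconds.
    edges = [i for i in range(1, len(samples))
             if samples[i - 1] < threshold <= samples[i]]
    return [(b - a) * 1000.0 / SAMPLE_RATE_HZ for a, b in zip(edges, edges[1:])]

# Toy signal: one-sample pulses 1667 samples apart (~16.67 ms at 100 kHz).
signal = ([0.0] * 1666 + [1.0]) * 3
print(frame_intervals_ms(signal))  # two intervals of ~16.67 ms each
```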
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Vesku, thanks for overlaying the fps on your frame time graph but would it be possible to put the fps on a separate Y axis (I know MS Office can do it)? It's just hard to see the frame time fluctuation when put on the same axis as the fps.

mtXKV.png
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
That is almost impossible.No two frames are exactly same.

With a single GPU, sure, it's not possible unless it's a very old game running at 200 fps without V-sync or a frame rate cap. It should be possible even in new games with enough graphics power (4 GPUs) and some clever driver tricks, but it would add input lag for sure.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
No, there was no stuttering or jitteriness that I could notice. The left axis is frame time, and even during the noisy section most of that variation is a few milliseconds. Not that FRAPS can tell the complete picture, as is being discussed in this thread; FRAPS can't follow the frame processing completely.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
It would be fair, but the issue is getting hold of the source code :) Which game devs are going to give away their code base so they can make some "nerds" happy? :cool: When you are doing a production build, all the hooks for the performance counters are removed, and as a result you need the source code to put them back in. And it would only work for NV, though there must be something equivalent for AMD as well. It tracks how much time each frame takes to get displayed, along with numerous other counters. This tool has been used by UT3, Crysis, and others (can't remember them all).

Could run it on open-source games, freemium games, and/or games that have depreciated to the point where the publishers are okay with giving them away free from time to time. Heck, does it even need to be a game per se? Not really, right? So open-source game-like software might work too, as long as it had D3D support.

This is all assuming that the metrics tracked do in fact track the end-user experience and there are no problems like you get with FRAPS.

Did you even read his post?

Did you read mine? A camera solution is ugly and slow to analyze the output from, but effective. I think Ryan would rather have something more automated and quicker to do, and I don't blame him; however, I'm not sure how feasible something like his proposal is. If it would require a lot of custom software or even hardware, then it could be a real chore.
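To be fair, once the footage is reduced to a per-tick record of which on-screen frame is visible, the analysis part is mechanical; a toy sketch assuming a hypothetical 1200 fps camera:

```python
CAMERA_FPS = 1200  # hypothetical high-speed camera

def frame_times_from_capture(visible_frame_ids):
    # visible_frame_ids: which on-screen frame each camera tick shows.
    # Frame times are the gaps between ticks where the image changes,
    # quantized to the camera's 1/1200 s resolution.
    changes = [i for i in range(1, len(visible_frame_ids))
               if visible_frame_ids[i] != visible_frame_ids[i - 1]]
    return [(b - a) * 1000.0 / CAMERA_FPS for a, b in zip(changes, changes[1:])]

# Toy stream: frame 0 visible for 20 ticks, frame 1 for 20 ticks, then frame 2.
stream = [0] * 20 + [1] * 20 + [2] * 40
print(frame_times_from_capture(stream))  # one measurable ~16.7 ms frame
```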

Hey, what about using an oscilloscope? Is there a pin in the DVI (or DisplayPort, or VGA) signal that changes to indicate a new frame?

Is there any electrical signal you can identify in the VGA, DVI, or DisplayPort signal that would indicate when one frame switches, and then use a scope to capture its output over time?

I remember years ago the scopes in our EE lab had a record function and you could save the results to a *floppy* disk. Surely someone has access to a scope or a lab with one; the question is whether we can tap one of the pins on the video signal to reveal this info.

Just saying, a scope would be nearly immune to the types of issues that might affect a camera-based capture of frames.

Interesting idea, I don't know the answer, but perhaps someone here does. I hope Ryan will look into this as well.
 

omeds

Senior member
Dec 14, 2011
646
13
81
3-way is certainly far smoother than 2-way. I can't remember where the review is, but even in my own testing frame times are far more evenly distributed, and it is noticeable in gameplay.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81

So the solution to MS is to just buy 4 GPUs, problem solved. :D I bought a 4-way setup because I had heard that it has markedly less MS than 2-way, not to mention increased performance. I've heard that even 3-way CF is much better than 2-way CF when it comes to MS. I wonder how much better 4-way is compared to 3-way.
But the 4000 and 5000 series had much worse CF scaling than the 6000 series, so it's hard to say how valid this review is for me. Really, though, I want to stay clear of multi-GPU solutions in the future, not because of MS but because I hate relying on driver and profile support to play a game. But if a really fast single GPU doesn't come out on the market, my next multi-GPU setup will certainly have more than 2 GPUs. There's probably no chance that a single GPU produced at 28 nm would beat my cards when they scale properly, and I don't want to gain some and lose some, so I will have to wait for the next die shrink.
 