PCPer on Crossfire problems in the Titan review


BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I don't understand, there are 3 frames shown in the screenshot with 2 lines of tearing. The 3 frames are overlapping each other in sequence, right?

3 frames are shown, and between those 3 frames there are the appropriate lines of tearing, so 2 tear lines. That is correct. I wouldn't call a buffer swap an overlap; you need to understand double buffering and how it works to understand how and why the final picture looks the way it does.

It takes about 16ms for the graphics card to write out a full screen to the monitor, starting in the top left and ending in the bottom right. This process is often called scan out, and it's the bit that goes from the GPU to the monitor over the cable. The graphics card also has another area of memory it's using to write the next frame, so while scan out is occurring the GPU is busy rendering the next scene in parallel. With vsync off, when the graphics card has completed the new frame it swaps what is being sent to the monitor over to that new frame, so the next pixel sent to the monitor comes from a different image taken at a different moment. In this case the pink frame lasts just 4 or 5 lines in total, so something like 5/1080 × 16ms ≈ 0.07ms, i.e. a very short period of time.
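
A back-of-the-envelope sketch of that scan-out arithmetic (my own illustration, using the round figures above of 1080 lines and 16ms per scan out; the 5-line runt is the pink sliver in PCPer's screenshot):

```cpp
#include <iostream>

int main() {
    // Assumed numbers from the post above: 1080 visible lines per frame,
    // ~16 ms to scan a whole frame out to the monitor.
    const double linesPerFrame = 1080.0;
    const double scanOutMs     = 16.0;
    const double msPerLine     = scanOutMs / linesPerFrame;

    // A "runt" frame that only owns ~5 scan lines before the next buffer swap.
    const double runtLines = 5.0;
    std::cout << "Runt frame visible for ~" << runtLines * msPerLine
              << " ms of a " << scanOutMs << " ms scan out\n";
    // Prints roughly 0.074 ms: the frame was fully rendered by the GPU,
    // but it only occupies a sliver of one refresh on screen.
    return 0;
}
```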

A normal frame on a 60Hz monitor with a game at 60 fps will survive to completion and get to write all its 1080 lines of output, but because vsync is off it will more than likely be split across two different images on the monitor, half on one and half on the other: the bottom of frame 100 and the top of frame 101, say.

Does that help?
 

omeds

Senior member
Dec 14, 2011
646
13
81
With vsync off, when the graphics card has completed the new frame it swaps what is being sent to the monitor over to that new frame, so the next pixel sent to the monitor comes from a different image taken at a different moment. In this case the pink frame lasts just 4 or 5 lines in total, so something like 5/1080 × 16ms ≈ 0.07ms, i.e. a very short period of time.

Yes, the next pixel is from a different frame at a different moment and the GPU has rendered 3 full frames, so isn't it a problem of timings rather than AMD "cheating" with runt frames? The runt frames, despite being only briefly shown on the display, still cost a full frame's worth of rendering work, right?
 

Rikard

Senior member
Apr 25, 2012
428
0
0
I find this confusing and it appears I am not alone. Is this sketch "accurate"? Assuming a single graphics card for simplicity's sake.
[attached sketch: 2z5u840.png]

If so, I think it would be interesting to measure the frame latency plus the input lag. There is no way I would use PCPer's configuration in real life without fixing that tearing, but I would like to know how much longer it makes the total time between the instructions reaching the GPU and the frame appearing on screen.

The stutter here occurs because there is nothing new to display at refresh 3 in the vsync-off case.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
There are more answers than Vsync. As for people playing without Vsync because they don't care in the slightest about screen tearing...

People play with vsync off because it reduces the latency of the image to the screen by quite a lot, to the tune of at least 16ms and up to 32ms. That is not an insignificant number. If you have been following the Oculus Rift you'll know that John Carmack has been saying that anything above 20ms of total input latency on head movement makes some people sick, because the world isn't following along correctly; a 32ms reduction on a pipeline that is 50ms+ long is dramatic. Tearing gives worse IQ, but IQ isn't the point of turning vsync off, and latency is why all these competitive FPS players do it.
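
As a rough sketch of where a 16-32ms figure can come from (these are my own worst-case assumptions for a plain double-buffered swap chain on a 60Hz panel, not numbers from the post or the article):

```cpp
#include <iostream>

int main() {
    const double refreshMs = 1000.0 / 60.0;   // one refresh interval at 60 Hz

    // Vsync off: the buffer swap happens the moment the frame is finished,
    // so no added wait (at the cost of tearing).
    const double addedVsyncOff = 0.0;

    // Vsync on, double buffered, worst case: the frame just missed a vblank
    // (wait up to ~1 interval for the swap) and the GPU had to wait for a
    // free back buffer before it could even start (~1 more interval).
    const double addedVsyncOnWorst = 2.0 * refreshMs;

    std::cout << "Added latency, vsync off: " << addedVsyncOff << " ms\n";
    std::cout << "Added latency, vsync on:  up to " << addedVsyncOnWorst << " ms\n";
    // Roughly 17-33 ms extra, which is the 16-32 ms range quoted above.
    return 0;
}
```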

...don't know why they would care about stuttering either, even more so when that stuttering is under 16.6ms, the minimum that you aim to see on a 60Hz display.

If that stutter lasts any longer than 16.6ms you're going to have a 33.3ms stutter, and even more than that up to 50ms, but it looks like reviewers don't even know how their monitors work.

The entire graphics pipeline from game world to monitor is not synced at all. What happens is that the GPU/DX blocks the CPU and game engine from producing more frames while it's busy with existing ones. We want it to take exactly 16.6ms to render a frame so that the CPU gets released for the next frame and the game world moves in 16.6ms steps. But from the GPU's perspective it is writing the results of rendering into a buffer and then swapping that buffer at the end in accordance with the vsync setting. The two sides, the game telling the GPU what to do and the GPU producing an image, are separate, and the process of swapping the buffers for what goes out to the monitor is separate again.

However, if that blocking time differs between frames, then the moment a picture of the game world is taken differs from when it will actually be displayed. Stuttering affects the game world and how it produces the next frame, which causes a shorter or longer jump in game-world time, which then gets rendered and displayed on the fixed 16.6ms screen schedule. So even with vsync on, stutter changes the perception of motion, because the game world isn't being sampled on a nice even schedule: it's taking pictures at 5ms, then 10ms, then 46ms, and those three pictures go out to the screen at 16, 32 and 48 milliseconds. The motion that appears on the monitor won't match when the snapshots were actually taken, because those snapshots were not on an even 16.6ms schedule, and that schedule is controlled mostly by the GPU.
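
A small sketch of that mismatch, reusing the illustrative timings above (world sampled at 5, 10 and 46ms, frames displayed on the 16ms refresh grid; the numbers are examples, not measurements):

```cpp
#include <cstdio>

int main() {
    // Moments the game world was sampled (ms): uneven, paced by the GPU.
    const double sampledMs[]   = {5.0, 10.0, 46.0};
    // Refresh slots at which those frames actually reach the screen (ms).
    const double displayedMs[] = {16.0, 32.0, 48.0};

    for (int i = 0; i < 3; ++i) {
        // How stale the animation is at the moment it is shown.
        std::printf("frame %d: sampled %.0f ms, shown %.0f ms, error %.0f ms\n",
                    i, sampledMs[i], displayedMs[i],
                    displayedMs[i] - sampledMs[i]);
    }
    // The errors (11, 22, 2 ms) are not constant; it is that variation,
    // not the absolute delay, that reads as stutter even with vsync on.
    return 0;
}
```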

And no, they can't test it on a 120Hz monitor because their capture hardware is capped to 60Hz. Wanna test it again when they get new hardware? OK, but don't show this as it is right now as the holy grail of "user experience testing".

I hope I have explained why that is irrelevant. More to the point, the great majority of users have 60Hz monitors, and those with 120Hz would still see a similar problem, just with twice as many frames. 120Hz would reduce the magnitude of the issue but certainly not eliminate it.

Again, the new problem with this method is that having a faster graphics solution is indeed worse than a slower one tear-wise. There are only problems with this new testing method and we are in the hands of utterly incompetent people.

Where? Anyway 60Hz displays ain't the issue here, TR and PCPer methodology is.

Hopefully I have now explained why that is not the case, why this test is highly indicative of player perception, and why the stuttering is very visibly a problem. They are far from incompetent; this mechanism of testing is actually kind of cool and very solid. What we have here is electronic eyes. It's even better than going via the monitor with high-speed cameras, because it can eliminate problems with the monitor, and it could one day maybe even tell us how old a frame is, from inception in the CPU game world to the moment it's output onto the screen. That would even tell us the amount of lag we have in the graphics pipeline.
 

omeds

Senior member
Dec 14, 2011
646
13
81
It should but it gets interrupted by the next cycle of the refresh rate.

Wouldn't that only matter to the display and the image it presents, not the work the GPU is actually doing - rendering frames on its end?
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76

Sort of, but I think that is a picture trying to explain why we get latency with vsync, not why stuttering is bad, and also why we might get a frame that is very short in the crossfire case. My other issue with it is that it's not making clear that at some point the CPU is getting blocked; there is too much parallel behavior happening in that picture which in reality doesn't happen. For the DX present modes see http://msdn.microsoft.com/en-gb/library/windows/desktop/bb172585(v=vs.85).aspx for the possibilities. Quite enlightening about the ways the game can ask to present the frame to the GPU and what waiting, if any, should be done.

The way I view it is:

1) The CPU takes a time snapshot of now and starts producing the game world updates (physics, any user input, moving the camera, changing health, you name it: everything that describes the world's state).

2) Then the CPU starts to render that frame via DX: it starts a frame, issues a load of DX calls for the world and everything in it and where it is, and eventually tells DX the frame is done, which is a call to Present in the DX API.

3) With the frame done the GPU takes the frame, renders it and draws it out into a buffer in memory.

4) When the frame is done it can be swapped such that it becomes what is scanned out to the monitor. With vsync off that happens immediately; with vsync on it waits until the previous one is finished before it's swapped.

Always) Simultaneously while the GPU is rendering it has a second buffer which it is using to send image data from a previous render to the monitor.

But in (2) the DX API can block the user's game thread if it doesn't have space for another picture yet, i.e. let's say for simplicity that the previous render hasn't finished yet. So the CPU just waits at the end of the frame inside Present, doing nothing but sleeping. When the GPU finishes and swaps the buffer it takes the frame from the CPU, which then goes back to (1) and gets on with producing the next game world picture.
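
A minimal sketch of that loop (not real D3D calls, just the shape of steps (1)-(4) above; Simulate, IssueDrawCalls and Present are hypothetical stand-ins for the engine and API entry points):

```cpp
#include <cstdio>

// Hypothetical stand-ins for the engine/API calls described above.
void Simulate(double nowMs) { std::printf("simulate world at t=%.1f ms\n", nowMs); }
void IssueDrawCalls()       { std::printf("  issue DX draw calls\n"); }
void Present() {
    // In the real API this is where the game thread can block: if no back
    // buffer is free because the previous frame hasn't been swapped out yet,
    // the call waits, and that wait is what paces the CPU to the GPU.
    std::printf("  Present (may block until a back buffer is free)\n");
}

int main() {
    double tMs = 0.0;
    for (int frame = 0; frame < 3; ++frame) {
        Simulate(tMs);    // (1) advance the game world to "now"
        IssueDrawCalls(); // (2) describe the frame to DX
        Present();        // end of (2): hand off to the GPU, possibly blocking
        // (3), (4) and "Always" happen GPU-side: render into a back buffer,
        // swap it per the vsync setting, scan the front buffer out.
        tMs += 16.6;      // ideally the block releases the CPU every ~16.6 ms
    }
    return 0;
}
```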
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
It should but it gets interrupted by the next cycle of the refresh rate.

I wouldn't say that. I would say that the scan out is interrupted by a buffer swap to the new frame. That new frame came along very quickly after the previous one, even though all 3 were fully formed. The GPU doesn't produce partial frames, it only produces full complete images, but the process by which they are swapped into the monitor's scan out buffer determines whether an image goes out as it was rendered or gets split across multiple screens.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
BrightCandle man, stop trying to explain anything with terminology you don't even know. It didn't make you see 6ms differences between frames in your 60Hz monitor and doesn't make your point any better.

You're just making no sense.
 

omeds

Senior member
Dec 14, 2011
646
13
81
I wouldn't say that. I would say that the scan out is interrupted by a buffer swap to the new frame. That new frame came along very quickly after the previous one, even though all 3 were fully formed. The GPU doesn't produce partial frames, it only produces full complete images, but the process by which they are swapped into the monitor's scan out buffer determines whether an image goes out as it was rendered or gets split across multiple screens.

This is what I was getting at, thanks.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
BrightCandle man, stop trying to explain anything with terminology you don't even know. It didn't make you see 6ms differences between frames in your 60Hz monitor and doesn't make your point any better.

You're just making no sense.

You're more than welcome to explain the DX11 API better than me if you choose to. I have written 3D programs with various types of blocking and frame rating, have you? I ran a massively multiplayer online turn-based strategy game with 3D graphics and make my money programming. So if you think I have it wrong, have at it.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Imouto sure is trying *really* hard to make this not about the weird stuff going on with CFX and about vsync vs no vsync...

Still.

At this point it's either willful ignorance or malice, because there is always tearing without vsync.

Judging by the posts he made in this thread http://forums.anandtech.com/showthread.php?t=2293380 , I'm going to go with malice. He's trying really hard to get no one to pay attention to anything like this.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
I'm trying to say that this methodology is awfully wrong. I'm just tired of dealing with BrightCandle, his superpowers, and his way of deflecting by invoking DX11 and other stuff that isn't related to the issue being discussed or even the topic.

He's just a delusional random using jargon to get away with his point and deceive people.

He's doing the very same thing in this thread as he did in the previous one about this methodology.

http://forums.anandtech.com/showpost.php?p=34543547&postcount=743

That thread is full of his bullshit. I just highlighted one of his posts; anyone who is interested can read it and find out he's blatantly lying about the stuff he can "see".

And you, Ferzderp. I don't care about testing that is flawed at its roots, or about your comments about my malice when you're preaching against the very same thing you're doing right now. You just don't accept what I'm saying and try to throw me out of the thread. Cool.
 

dqniel

Senior member
Mar 13, 2004
650
0
76
BrightCandle man, stop trying to explain anything with terminology you don't even know. It didn't make you see 6ms differences between frames in your 60Hz monitor and doesn't make your point any better.

You're just making no sense.

“The cure for a fallacious argument is a better argument, not the suppression of ideas.”

Instead of just saying he's wrong over and over again, please do us all a favor and explain why your theory is right. If you can't, or won't even attempt it, then there's no point in telling him to stop explaining his theory.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
It isn't me who is defining "noticeable". Read any article or scientific paper about perceptive latency and you'll likely come to the same conclusion.

When I make a statement, don't think for a second that I haven't researched it to an extensive degree.

Links to some of said papers:

http://www.stuartcheshire.org/papers/LatencyQuest.html

http://www.ncbi.nlm.nih.gov/pubmed/16639613 (link to PDF @ source)

http://www.perceptionweb.com/abstract.cgi?id=p240749

If you have an account @ Sciverse / Science Direct, there are several other articles in their archives which discuss the same thing. Unfortunately, I can't post links to them due to copyright terms but you'll quickly get the idea after Googling a bit.

However, I will distill it down into plain speak: in extensive lab tests with and without human subjects, a rate of 48 frames per second (i.e. 20.83ms per frame) was deemed to be the threshold at which the majority of people and test instruments saw completely smooth display images or refreshes.

One test actually went so far as to rapidly flash a lightbulb at variable speeds in order to demonstrate when people would see it "flicker". That threshold for 95% of the participants was at 50 flashes per second (ie: exactly 20ms).

Not sure how much more information anyone could be looking for here.... ;)
http://www.xtremesystems.org/forums/showthread.php?285164-PCPer-Frame-Rating-Part-3-First-Results
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Pretty optimistic interpretation. Xfire isn't new, they must know about these frames and the effect they have on the all important fps charts which despite all the advances in reviewing are still probably what sells the most graphics cards. I take a more realistic line - they knew, they left them there and didn't tell reviewers about them.

This in my book is pretty bad - they effectively lied about how well their graphics cards perform in Xfire as they knew what reviewers reported wasn't correct. You could even ask - did AMD put them there intentionally? I'm sure this will all come out over time - e.g. testing old drivers to see when the phantom frames first appeared. I await the official AMD response to see what they've got to say.

This "AMD cheating" gets old. When TR found the frame latency issue it was said that "AMD cheated" with their last drivers and that's what caused it. Now, "AMD is cheating" again.

When [H] finds out that Titan is throttling in games as the card warms up, thus making benchmarks done when the card is cool not representative of the performance people who actually play games will see, that's a safety feature.

If AMD hasn't done the same tests PCPer are doing how would they know? Those are not frames that are being purposely inserted. They are frames that because of the uneven distribution of frame times are not given a chance to completely appear before the next frame is rendered.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I don't understand the problem here. Who is to say this problem doesn't also appear with Vsync on? While gaming, my fps often are below the refresh rate, so microstutter still occurs with Vsync active.

This method may not be directly relevant for people playing with Vsync, but that doesn't mean that you cannot deduct useful information from it. And I know of many people who would never play with Vsync due to input lag. They are important, too.

What PCPer is showing occurred when frame rates were higher than the refresh rate. I don't know if it occurs when FPS drop below the refresh rate, or not. They haven't shown that.
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,330
251
126
What PCPer is showing occurred when frame rates were higher than the refresh rate. I don't know if it occurs when FPS drop below the refresh rate, or not. They haven't shown that.

I experienced this with the HD-6000 series and it was awful. It made me lose faith in any dual GPU setup. The HD-6000s were supposed to be the first to have near perfect scaling... what a load of crap that was. My FPS counter felt like a complete lie pretty much all of the time.

It's not something you notice going from a single GPU to dual-GPU, because you get a large enough improvement in performance that of course it feels better right away. But when you go from last generation's dual-GPU to the current or next generation single GPU that's supposed to provide the same performance, only then do you really feel it.

But to answer the question: you don't have to be above the refresh rate. In fact, it became worse the lower the frame rate got. Once I hit 30-40fps it just became completely unplayable, whereas a single GPU would at least still provide a very playable experience.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I experienced this with the HD-6000 series and it was awful. It made me lose faith in any dual GPU setup. The HD-6000s were supposed to be the first to have near perfect scaling... what a load of crap that was. My FPS counter felt like a complete lie pretty much all of the time.

It's not something you realize going from single GPU to dual-GPU right away because you'll get a large enough improvement in performance, so of course it will feel better right away. But when you go from dual-GPU to the next generation single GPU that's supposed to provide the same performance, only then will you really feel it.

I'm not saying it's not a problem. It's been reported before by review sites (namely [H]) that crossfire doesn't appear as smooth as the frame rates would imply, and this seems likely to be the cause. We haven't been given enough info to know anything yet, though. A few screen captures taken above the screen's refresh rate with vsync off aren't enough info.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,193
2
76
This "AMD cheating" gets old. When TR found the frame latency issue it was said that "AMD cheated" with their last drivers and that's what caused it. Now, "AMD is cheating" again.

When [H] finds out that Titan is throttling in games as the card warms up, thus making benchmarks done when the card is cool not representative of the performance people who actually play games will see, that's a safety feature.

If AMD hasn't done the same tests PCPer are doing how would they know? Those are not frames that are being purposely inserted. They are frames that because of the uneven distribution of frame times are not given a chance to completely appear before the next frame is rendered.

I originally viewed it as cheating when I momentarily forgot how GPUs output content onto a monitor. Now that a few people have reminded me that the screen is drawn from left to right, top to bottom, one row of pixels at a time, I know that the reported FPS is correct and not cheating. It is, however, very poorly spaced, which seems to be the major issue with crossfire setups.

Those of you who are calling this cheating need to remember that that sliver of a frame is actually rendered all the way from the top, but it was either delayed or the next frame came early. This resulted in the next frame being rendered top to bottom, covering most of the previous frame.

So, AMD's dual GPU solution scales very well, but until they get their frames to be rendered more evenly it will continue to feel like basically half the FPS that is reported. I would think that with alternate frame rendering it would be easier for the GPUs to render evenly spaced images, but it's obviously more complicated than I think.
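
A toy illustration of that pacing point (my own assumed numbers, not anything measured: two GPUs in alternate frame rendering, each taking ~33ms per frame, so ~60fps combined):

```cpp
#include <cstdio>

int main() {
    const double perGpuMs = 33.3;   // each GPU takes ~33 ms per frame

    // Well-paced AFR: the second GPU's frames land half a frame time after
    // the first GPU's, giving an even ~16.7 ms cadence.
    const double goodOffset = perGpuMs / 2.0;
    // Badly paced AFR: the second GPU's frames land ~1 ms behind the first's.
    const double badOffset  = 1.0;

    std::printf("well paced gaps:  %.1f, %.1f, %.1f, %.1f ms\n",
                goodOffset, perGpuMs - goodOffset,
                goodOffset, perGpuMs - goodOffset);
    std::printf("badly paced gaps: %.1f, %.1f, %.1f, %.1f ms\n",
                badOffset, perGpuMs - badOffset,
                badOffset, perGpuMs - badOffset);
    // Both patterns average ~60 fps, but the second one alternates ~1 ms and
    // ~32 ms gaps: the 1 ms frames are the runts that barely reach the screen.
    return 0;
}
```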
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I originally viewed it as cheating when I momentarily forgot how GPUs output content onto a monitor. Now that a few people have reminded me that the screen is drawn from left to right, top to bottom, one row of pixels at a time, I know that the reported FPS is correct and not cheating. It is, however, very poorly spaced, which seems to be the major issue with crossfire setups.

Those of you who are calling this cheating need to remember that that sliver of a frame is actually rendered all the way from the top, but it was either delayed or the next frame came early. This resulted in the next frame being rendered top to bottom, covering most of the previous frame.

So, AMD's dual GPU solution scales very well, but until they get their frames to be rendered more evenly it will continue to feel like basically half the FPS that is reported. I would think that with alternate frame rendering it would be easier for the GPUs to render evenly spaced images, but it's obviously more complicated than I think.

It's also the knock-on effect this has in making the game engine render unevenly as well. Games use the blocking nature of the Present call (whether in immediate mode or not) to ensure that the game world is produced exactly the right number of times. In the GPU-limited case (the most common one) the game assumes this to always be an even interval, but for AMD cards (crossfire or not) it isn't. So while removing the runt frames from the frame time chart is one way of adjusting it to compensate, we also know the content of the frames is wrong and is being displayed at the wrong time. I am fairly confident that the approach means the adjusted FPS chart is right so long as the data is correct. I don't know if adjusting the frame time chart in the way PCPer have is accurate, because we are trying to measure two things with the same measure: when the frame went out and for what moment it was created.

The ideal game would produce a frame, and 16ms later it would be rendered and the buffer swapped immediately. Thus the game world, as it's displayed, is simply delayed by the render time (16ms), the scan out (16ms) and the monitor (~16ms or less). The world is sampled every 16ms, and despite the delay that cadence matches the screen it is displayed on.

But if you have a situation where the world is being produced haphazardly, at 0, 10 and 33ms for the monitor's slots at 0, 16 and 33ms, then things are much worse. The frame for 16ms doesn't actually exist; one was created for 10ms, but it can never reach the monitor at that moment because the monitor only has slots at 0, 16 and 33. So the animation in the second frame, designed for display at 10ms, actually appears at 16ms, and is therefore off by 6ms for the user. This is why frame times matter even now: they tell us how often this problem is occurring and by how much it is varying.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
This "AMD cheating" gets old. When TR found the frame latency issue it was said that "AMD cheated" with their last drivers and that's what caused it. Now, "AMD is cheating" again.

When [H] finds out that Titan is throttling in games as the card warms up, thus making benchmarks done when the card is cool not representative of the performance people who actually play games will see, that's a safety feature.

If AMD hasn't done the same tests PCPer are doing how would they know? Those are not frames that are being purposely inserted. They are frames that because of the uneven distribution of frame times are not given a chance to completely appear before the next frame is rendered.

What PCPer shows is Fraps reporting one fps figure while the game is actually being rendered to the screen at a much lower fps, because Fraps counts dodgy runt frames as real frames. That means the real fps scores for xfire cards are wrong. It's not a problem nvidia have, so clearly those runt frames don't need to be there. Hence that's cheating in my book: if I buy two cards because I read they get 100fps in game X when in fact they only get 75fps, I am buying based on a lie.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
You aren't only getting 75. You are getting a poorly rendered 100. Cheating implies doing something unfair to gain an advantage, and there's no evidence that AMD was ever aware of this. There's also very little information so far. Let it play out a bit.