[pcper] frame metering review 690 vs. 7970 CF vs. Titan

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
http://www.pcper.com/reviews/Graphi...eForce-GTX-690-Radeon-HD-7990-HD-7970-CrossFi

Probably the website pushing the most thorough analysis of frame times has posted its latest update today. They are using their own unique solution to test frame times (not FCAT).

The results shouldn't surprise you, and they won't surprise AMD anymore either: if released today, the HD 7990 would not perform well in our tests. AMD has told me that they are working on an option to meter frames in the way NVIDIA does it, while offering users the option to enable or disable it, but we are months away from that fix. Until then, any dual-GPU Radeon HD 7000-series card is going to show these problems as runts and dropped frames. We have many more pages of results to go over for the HD 7950/7870/7850/etc., and those will be published in the coming days, but as you'll find, the story will look very similar.

In all honesty, when AMD told me they were planning this card release, I recommended they hold off until the driver fix is in place; other reviewers and I are going to be hitting them hard on these issues until then, and any dual-GPU option with the Radeon name is going to struggle to live up to expectations.

Final Thoughts

The second part of our final reveal of the Frame Rating performance methods has shown us some interesting results for the $999-and-above card lineup as it stands today. The Radeon HD 7970s in CrossFire, representing the currently available and upcoming HD 7990s, don't look great in our testing as we mentioned above, and you should seriously consider your buying decision before picking up this configuration.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
How is this guy evaluating the unreleased 7990 using two 7970s and giving conclusions already? We've already heard from GDC, where they demoed Battlefield 4 on the 7990, that it is the new Malta card; we don't even know if they are the same 7970 Tahiti cores... We also have no idea about the drivers or what hardware is on the 7990 PCB...

Haha, what a joke. Why didn't he confine his review conclusions to exactly what he was reviewing, instead of trying to hypothesize results for unreleased hardware using completely different hardware? He might as well have benchmarked a GTX 280 and multiplied his results by 3.86 to get his Titan numbers. :D

I get that PCPer has been pretty shady about releasing data before everyone else, using NVIDIA's frame metering tools without transparency or disclosure that they were working with NVIDIA, but now this is just getting silly. Reviewing an unreleased card and unreleased drivers using different hardware and drivers... Maybe he is high and/or drunk.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Based on the AT article/discussion with AMD, pcper seems to be saying that because NV went for a higher-latency approach (pacing frames) rather than a lower-latency approach (which produces runt frames), NV's way is better, assuming what AMD said about their decisions is accurate.

AMD have said they will give users a choice in the future, which is the most sensible option.
The reviewer seems to have decided that one method is better than the other, and therefore NV is better than AMD because it uses that method.

Also, correct me if I'm wrong, but if your display shows 60fps, your observable framerate is 60fps.
Whether you see two exact half frames 50%/50%, or 90%/10%, you're still only seeing one full frame's worth of information at a time.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Someone explain to me why vsync doesn't turn the runt frames into full frames synced with the refresh of the monitor?

Basically this issue is for non-vsync CF users who want awful screen tearing and screwed-up visuals?

I'm sorry, but anyone who wants tearing over input lag needs their head looked at.

Also, would a 120Hz monitor with vsync turned on also fix this runt problem and give 100% of the CF performance?
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Someone explain to me why vsync doesn't turn the runt frames into full frames synced with the refresh of the monitor?

Basically this issue is for non-vsync CF users who want awful screen tearing and screwed-up visuals?

I'm sorry, but anyone who wants tearing over input lag needs their head looked at.

Also, would a 120Hz monitor with vsync turned on also fix this runt problem and give 100% of the CF performance?

They want their tearing evenly spaced out.
So instead of having those 3 frames overlapping near the top, they should be evenly spaced, each taking a third of the screen, which gives you a tear a third of the way up and a third of the way down.

[attached image: frcf3.jpg]


No thanks, I'll just take vsync.

And it's actually output lag.
 

parvadomus

Senior member
Dec 11, 2012
685
14
81
I don't understand one thing: if they can make the GPU wait for monitor sync, why don't they make it wait at least X milliseconds when vsync is off? Or better, taking the last N frames into account, you have an average of M FPS over those frames, so make the GPU wait at least a little less than 1000/M milliseconds. That would be an easy fix...
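Roughly, that suggestion could look like this in a render loop. Just a sketch of the idea above, not how any actual driver works; the FramePacer name, the 20-frame window and the 0.9 "a little less" factor are all made up for illustration, and a real fix would live in the driver rather than in game code:

```cpp
#include <chrono>
#include <cstddef>
#include <deque>
#include <numeric>
#include <thread>

// Sketch of the suggestion above: with vsync off, hold each frame back so it is
// never presented much sooner than the recent average frame interval.
class FramePacer {
public:
    explicit FramePacer(std::size_t window = 20, double fraction = 0.9)
        : window_(window), fraction_(fraction) {}

    // Call right before presenting each frame.
    void waitBeforePresent() {
        using clock = std::chrono::steady_clock;
        using ms = std::chrono::duration<double, std::milli>;
        auto now = clock::now();

        if (!history_.empty()) {
            // Average interval over the last N frames (1000/M ms in the post's terms).
            double avgMs = std::accumulate(history_.begin(), history_.end(), 0.0)
                           / static_cast<double>(history_.size());
            auto target = lastPresent_ + ms(avgMs * fraction_);  // wait "a little less" than the average
            if (now < target) {
                std::this_thread::sleep_until(target);
                now = clock::now();
            }
        }
        if (lastPresent_ != clock::time_point{}) {
            history_.push_back(ms(now - lastPresent_).count());
            if (history_.size() > window_) history_.pop_front();
        }
        lastPresent_ = now;
    }

private:
    std::size_t window_;                           // how many recent frames to average
    double fraction_;                              // present slightly sooner than the average
    std::deque<double> history_;                   // recent frame intervals in ms
    std::chrono::steady_clock::time_point lastPresent_{};
};
```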
 
Feb 19, 2009
10,457
10
76
http://www.pcper.com/reviews/Graphi...eForce-GTX-690-Radeon-HD-7990-HD-7970-CrossFi

Probably the website pushing the most thorough analysis of frame times has posted its latest update today. They are using their own unique solution to test frame times (not FCAT).

Another website that's doing it wrong: screen tearing galore without vsync in games, and if adaptive vsync (no sudden drops to 30 fps) is wanted, Radeon users have that option too with RadeonPro.

Beating a stupid horse.
 
Feb 19, 2009
10,457
10
76
Here is something from their last article that includes vsync data. http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Tes-11

And the site fails to mention RadeonPro, which Radeon users, and especially CF users, often run. It enables adaptive vsync and frame smoothing for AMD cards.

If enabling vsync adds to the input lag, then NV's default solution also adds input lag. Read AT's article: AMD focus on minimal input lag at the expense of frame time consistency because they don't feel that many users can notice millisecond variations. But having the option in CCC will be good later.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Here is something from their last article that includes vsync data. http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Tes-11

So basically vsync fixes not only runt frames but also the famous frame latency problem.

Vsync should be turned on at all times because there is little to no benefit in running 90fps on a 60Hz monitor.

Games should be set at performance settings so you can get 60fps 100% of the time.

16ms is fast enough to get your twitch shots in BF3 or whatever shooter you play. If you need more, then get a 120Hz monitor and vsync to the higher refresh rate.

So there is NO PROBLEM with CF after all. Move along.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The thing is, making this more even really doesn't add much of a problem. Let's say the game produces a new game world really quickly, say in 2ms. So when the game first starts, the second frame comes just 2ms after the first one, so the two cards are always rendering almost at the same time with just 2ms between them, and hence lots of runt frames.

All AMD has to do is move the second frame +8ms at 60Hz (or +4ms at 120Hz) and the frames are evenly spaced. No additional lag has been added except to the single frame that was initially delayed to even things out; everything else stays at minimal latency.

Now from there it might drift due to momentary changes in processing time and you might have to readjust to even them out, but it's unlikely you'd ever have to move it as much as half a frame again. As a technique it really doesn't add much latency at all; in fact it reduces the apparent input latency because you can actually see the frame that comes out! It's not hard; NVIDIA has been metering for a very long time.
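As a very rough sketch of that one-off correction (illustrative only; the half-interval target is just the +8ms/+4ms figure above, the function name is invented, and a real driver would key this off measured frame times and handle drift):

```cpp
#include <chrono>
#include <thread>

// One-time AFR phase correction as described above: if the second GPU's frame
// is ready only a couple of milliseconds after the first GPU's, hold it back
// until the two cards are half a frame interval apart. Only this one frame is
// delayed; every later frame inherits the even spacing.
void phaseCorrectSecondGpu(double refreshHz,
                           std::chrono::steady_clock::time_point gpu0Ready,
                           std::chrono::steady_clock::time_point& gpu1Present) {
    using namespace std::chrono;
    const duration<double, std::milli> frameInterval{1000.0 / refreshHz}; // ~16.7ms at 60Hz
    const duration<double, std::milli> half = frameInterval / 2.0;        // the +8ms / +4ms offset
    const duration<double, std::milli> gap = gpu1Present - gpu0Ready;
    if (gap < half) {
        const auto delay = duration_cast<steady_clock::duration>(half - gap);
        std::this_thread::sleep_for(delay);   // delay this single frame...
        gpu1Present += delay;                 // ...so the pair is now evenly spaced
    }
}
```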

Vsync on is a horrible option: for anyone who games competitively you either have to accept a lot of latency (+50% with triple buffering) or 30/60 fps stepping issues. Neither of these is really an acceptable option, especially considering that in the past triple buffering has crashed certain games.

AMD needs to fix this for its CrossFire to be viable, but in the meantime people will have to get by using vsync and all the additional stuttering that introduces. It's still better than without vsync.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
AMD needs to fix this for its CrossFire to be viable, but in the meantime people will have to get by using vsync and all the additional stuttering that introduces. It's still better than without vsync.

I think you're confusing that with the NV 6xxx vsync stuttering issue.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
The thing is, making this more even really doesn't add much of a problem. Let's say the game produces a new game world really quickly, say in 2ms. So when the game first starts, the second frame comes just 2ms after the first one, so the two cards are always rendering almost at the same time with just 2ms between them, and hence lots of runt frames.

All AMD has to do is move the second frame +8ms at 60Hz (or +4ms at 120Hz) and the frames are evenly spaced. No additional lag has been added except to the single frame that was initially delayed to even things out; everything else stays at minimal latency.

Now from there it might drift due to momentary changes in processing time and you might have to readjust to even them out, but it's unlikely you'd ever have to move it as much as half a frame again. As a technique it really doesn't add much latency at all; in fact it reduces the apparent input latency because you can actually see the frame that comes out! It's not hard; NVIDIA has been metering for a very long time.

Vsync on is a horrible option: for anyone who games competitively you either have to accept a lot of latency (+50% with triple buffering) or 30/60 fps stepping issues. Neither of these is really an acceptable option, especially considering that in the past triple buffering has crashed certain games.

AMD needs to fix this for its CrossFire to be viable, but in the meantime people will have to get by using vsync and all the additional stuttering that introduces. It's still better than without vsync.

That's what AMD have indicated they are going to do: delay frames to smooth out the display and reduce/remove runt frames.
The problem, though, is that you are delaying frames, meaning what you see no longer corresponds to your inputs, which is where the latency issue comes in.
Your displayed frames are delayed relative to your inputs, potentially by an irregular amount, resulting in latency in gameplay, but no "runt" frames.

NV already seem to do it this way, which is why pcper believes their performance to be superior: they think that non-runt framerates are better than framerates with runt frames, even though those non-runt framerates contain delayed frames, which introduces lag.

So either you have high "observed" FPS in their terms, meaning a larger portion of each frame is displayed (but not necessarily the entire frame), or you have lower "observed" FPS but potentially higher actual FPS (not that it's relevant either way), without the delayed frames that can cause latency.

pcper believes the first option is better, seemingly just because they believe it. You are seeing more of more frames, even though you are still only seeing 60 screens' worth of frames (on a 60Hz display), and how much of each frame you see is irrelevant, because seeing 50% of two frames vs 90/10 of two frames doesn't make any real difference.
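For anyone unfamiliar with the terminology, "observed FPS" in pcper's sense basically throws away frames that only ever occupy a sliver of the screen before counting. A toy illustration; the 20-scanline runt threshold and the alternating full/runt pattern are made up for the example, not pcper's actual numbers:

```cpp
#include <iostream>
#include <vector>

int main() {
    const int runtThreshold = 20;   // scanlines; an arbitrary illustrative cutoff

    // On-screen height (in scanlines of a 1080-line display) that each rendered
    // frame occupied during one second of capture. The alternating full/runt
    // pattern is invented to mimic the CrossFire behavior discussed above.
    std::vector<int> frameHeights;
    for (int i = 0; i < 60; ++i) {
        frameHeights.push_back(1070);   // a "real" frame
        frameHeights.push_back(10);     // a runt from the second GPU
    }

    int observed = 0;
    for (int h : frameHeights)
        if (h > runtThreshold) ++observed;

    std::cout << "FRAPS-style FPS: " << frameHeights.size() << "\n"; // counts everything: 120
    std::cout << "Observed FPS:    " << observed << "\n";            // drops runts: 60
}
```

Whether the filtered number or the raw number better reflects what you actually see on a 60Hz panel is exactly the disagreement here.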
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
SLI with Kepler has no higher input lag than CrossFire. nVidia just has a better output system for the frames.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The GPUs meter the game simulation as well. The Present call made from the game to the GPU driver via DirectX blocks the game's thread until the frame can be accepted by the context queue. It's this that regulates the production of new game-sim frames, since there isn't space until the GPU starts processing the next frame. On a single card at 60fps this means the game sim only gets to finish a Present call every 16ms, and hence it produces nicely evenly spaced steps. With two cards the queue is being cleared by two different cards, and in that case it's important that each takes from the context queue when the other card is halfway through its frame, so that release back to the game sim is evenly spaced.

So while the initial delay adds delay to one frame, the result is nice evenly spaced frames and evenly spaced game steps. Frame metering does not introduce simulation stutter that then persists forever; it's a momentary correction that fixes both problems.

I know a lot of people here haven't programmed DirectX and don't understand how that works, and unfortunately Anandtech also didn't in their explanation of the problem and of Fraps. But it's a really important aspect of understanding stuttering behavior, because it's not sufficient to just fix the output to the screen. TechReport in their FCAT review showed a 50ms spike in Skyrim on high-speed video that was basically gone in FCAT, but the point was that Fraps had captured an animation stutter: the game sim had been impacted by the GPUs' uneven rendering behavior. So that stutter was very much something you could see. Unfortunately Anandtech's lack of understanding here means they don't like Fraps as a tool, but it's not really for measuring stutter coming out on the monitor; it's for measuring stutter going into the GPUs.

There is no reason not to meter; it is basically better all around except for the one correction frame that gets delayed, and the impact is lower apparent input latency and a higher perceived frame rate. From then on, the only time the metering has to adjust is when there is a big change in frame rate; then it potentially needs to space the cards out again, but again this should impact just one frame. Alternatively, they could spread the adjustment over a short period and gradually push one card until it's evenly spaced; either strategy could work. But the current situation is the very worst possible outcome: the second card in essence only adds animation stutter without providing additional framerate. It's all around worse than having a single card on its own.
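A toy model of what's being described; the numbers are invented (33ms per GPU render, 2ms of game-sim work) and this is nothing like real driver code, it just shows that a single half-interval hold on the second GPU evens out both the on-screen spacing and the game-sim steps:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <vector>

// Toy two-GPU AFR model: Present() blocks until one of the GPUs can accept the
// frame, and the game builds its next sim step 2ms after Present() returns.
// meterDelayMs is a one-off hold applied to the second GPU's first frame.
static void simulate(const char* label, double meterDelayMs) {
    double gpuFree[2] = {0.0, 0.0};    // when each GPU becomes idle
    double simTime = 0.0;              // when the game has its next frame ready
    std::vector<double> onScreen;      // scan-out time of each frame
    std::vector<double> simSteps;      // when each game-sim step starts

    for (int frame = 0; frame < 8; ++frame) {
        const int gpu = frame % 2;                        // alternate-frame rendering
        double start = std::max(simTime, gpuFree[gpu]);   // Present() blocks here
        if (frame == 1) start += meterDelayMs;            // the one-off metering correction
        gpuFree[gpu] = start + 33.0;                      // 33ms render time per GPU
        onScreen.push_back(gpuFree[gpu]);
        simTime = start + 2.0;                            // game sim resumes after Present()
        simSteps.push_back(simTime);
    }

    std::printf("%s on-screen gaps (ms):", label);
    for (std::size_t i = 1; i < onScreen.size(); ++i)
        std::printf(" %5.1f", onScreen[i] - onScreen[i - 1]);
    std::printf("\n%s sim-step gaps (ms): ", label);
    for (std::size_t i = 1; i < simSteps.size(); ++i)
        std::printf(" %5.1f", simSteps[i] - simSteps[i - 1]);
    std::printf("\n");
}

int main() {
    simulate("unmetered", 0.0);    // gaps alternate ~2ms / ~31ms: runts and bunched sim steps
    simulate("metered  ", 14.5);   // one 14.5ms hold (2ms + 14.5 = half of 33ms): even ~16.5ms
}
```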
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
The thing is, making this more even really doesn't add much of a problem. Let's say the game produces a new game world really quickly, say in 2ms. So when the game first starts, the second frame comes just 2ms after the first one, so the two cards are always rendering almost at the same time with just 2ms between them, and hence lots of runt frames.

All AMD has to do is move the second frame +8ms at 60Hz (or +4ms at 120Hz) and the frames are evenly spaced. No additional lag has been added except to the single frame that was initially delayed to even things out; everything else stays at minimal latency.

Now from there it might drift due to momentary changes in processing time and you might have to readjust to even them out, but it's unlikely you'd ever have to move it as much as half a frame again. As a technique it really doesn't add much latency at all; in fact it reduces the apparent input latency because you can actually see the frame that comes out! It's not hard; NVIDIA has been metering for a very long time.

Vsync on is a horrible option: for anyone who games competitively you either have to accept a lot of latency (+50% with triple buffering) or 30/60 fps stepping issues. Neither of these is really an acceptable option, especially considering that in the past triple buffering has crashed certain games.

AMD needs to fix this for its CrossFire to be viable, but in the meantime people will have to get by using vsync and all the additional stuttering that introduces. It's still better than without vsync.

I'm sorry, but competitive gamers who THINK they are super fast actually aren't. Vsync should be turned on at all times.

Reaction times aren't important in BF3, since the kill box is so far off due to latency that your reaction speed is moot. Game lag will kill your so-called superhuman twitch response. Ping is normally 70ms+, so forget about 16ms of frame time latency.

Now if we get into real competitive games such as Counter-Strike, where players think they need superhuman twitch, they turn the resolution down to 800x600 to make the headshot kill box bigger. They can also get a 120Hz monitor and run vsynced 24/7; even on max settings this is possible.

So please explain to me where vsync needs to be turned off?
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
So I was wondering since the BF4 demo ran on a 7990, how come I didn't see any tearing or stuttering? Is it only apparent when you are sitting in front of the monitor and seeing it with your own eyes and somehow doesn't show up if you record the gameplay?

Honestly I thought the BF4 demo was really smooth and I didn't experience any tearing/stuttering watching the video.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
So I was wondering since the BF4 demo ran on a 7990, how come I didn't see any tearing or stuttering? Is it only apparent when you are sitting in front of the monitor and seeing it with your own eyes and somehow doesn't show up if you record the gameplay?

Honestly I thought the BF4 demo was really smooth and I didn't experience any tearing/stuttering watching the video.

It's because they clearly had vsync turned on. Like all sensible gamers.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Yeah, "like all sensible gamers". With Vsync you have 2 frames input lag.

But "like all sensible gamers" input lag is nothing what they can feel, i guess...
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Yeah, "like all sensible gamers". With Vsync you have 2 frames input lag.

But "like all sensible gamers" input lag is nothing what they can feel, i guess...

Why is it two frames? Surely the game engine updates every 16ms, so the longest you would wait is until the next frame if you just missed one by a few ms.

16ms isn't long to wait either, since you're normally waiting 100ms of ping to see what the other player is doing on screen.

Without vsync you could also be shooting at a previous frame where the player has since moved, because the screen could tear twice, showing 3 frames in a single screenshot. Where do you aim? Top, middle or bottom? This is all theoretical, since you aren't quick enough for it to make a difference anyway.

If you do have an issue, there are always 120Hz displays, where you get all the benefits and none of the downsides except TN panels.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
How do you think "Alternate Frame Rendering" works?

AFR introduces 1+ frame of input lag. Combine this with vsync and you have 2 frames.
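Rough numbers, assuming a 60Hz display: one refresh is about 16.7ms, so the extra frame AFR keeps in flight adds roughly one refresh of delay, and vsync can add up to one more while the finished frame waits for the next refresh, i.e. on the order of 33ms on top of a single-GPU, vsync-off baseline.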
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
How do you think "Alternate Frame Rendering" works?

AFR introduces 1+ frame of input lag. Combine this with vsync and you have 2 frames.

Except this isn't the case where each GPU handles 50% of the frame.

I'll be honest, I'm not sure which rendering method is used, but I was under the impression that SLI used to cut the image in two and render 50% on each card.
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Yeah, "like all sensible gamers". With Vsync you have 2 frames input lag.

But "like all sensible gamers" input lag is nothing what they can feel, i guess...

Doesn't this affect both vendors, though, when it comes to CrossFire/SLI?
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Who actually plays BF3 multiplayer with vsync on? And for that matter, who plays it without using the built-in framerate cap, gametime.maxvariablefps, set to a few frames above their monitor's refresh rate or whatever FPS level their system can hold on average?

I've tried playing BF3 without the framerate cap and the experience is horrible, as your framerate skyrockets and then drops suddenly when an explosion goes off. Same goes for CS or CS:S; who doesn't use fps_max via the console to steady their frame rate against big swings? There's some pseudo 'competitive gamer' logic being posted in this thread. The experience is probably hampered even more by a framerate constantly leaping and falling than by the lag from vsync. fps_max was a go-to feature for competitive CS gaming.

In any single-player game vsync is always on, because it makes the experience vastly better.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Except this isn't the case where each GPU handles 50% of the frame.

I'll be honest, I'm not sure which rendering method is used, but I was under the impression that SLI used to cut the image in two and render 50% on each card.

If that is what you think, then you need to do some homework. I'll give you a hint... it's called AFR.